Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more. (A minimal example of these transformations follows the repository list below.)
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
🔥 A tool for visualizing and tracking your machine learning experiments. This repo contains the CLI and Python API.
The Unified AI Framework
JAX implementation of instant-ngp (NeRF part)
Accelerate your training with this open-source library. Optimize performance through streamlined training and serving options built on JAX. 🚀
PennyLane is a cross-platform Python library for quantum computing, quantum machine learning, and quantum chemistry. Train a quantum computer the same way as a neural network; a QNode sketch appears after the list.
Probabilistic programming with NumPy powered by JAX for autograd and JIT compilation to GPU/TPU/CPU (a small model sketch follows the list).
⚡️SwanLab: your ML experiment notebook. Track and visualize your entire machine-learning workflow.
A JAX/Flax-based transformer language model trained on a Japanese dataset, built on the official Flax example code (lm1b).
A differentiable physics engine and multibody dynamics library for control and robot learning.
Machine learning algorithms for many-body quantum systems
🤗 Diffusers: State-of-the-art diffusion models for image and audio generation in PyTorch and Flax.
causalimages: An R package for performing causal inference with image and image sequence data
TFDS is a collection of datasets ready to use with TensorFlow, JAX, ... (a loading example follows the list).
Flax is a neural network library for JAX that is designed for flexibility; a small module example follows the list.
A retargetable MLIR-based machine learning compiler and runtime toolkit.
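The topic description above names JAX's three signature transformations. A minimal sketch of how they compose, assuming only that jax is installed; the loss function and array shapes are arbitrary illustrations:

```python
import jax
import jax.numpy as jnp

def loss(w, x):
    # Scalar-valued function of the weights: squared norm of a linear map.
    return jnp.sum((x @ w) ** 2)

w = jnp.ones((3,))
xs = jnp.arange(12.0).reshape(4, 3)  # a batch of 4 inputs

# grad differentiates loss with respect to w, vmap vectorizes that
# gradient over the batch axis of xs, and jit compiles the whole
# composition with XLA for CPU/GPU/TPU.
batched_grad = jax.jit(jax.vmap(jax.grad(loss), in_axes=(None, 0)))
print(batched_grad(w, xs).shape)  # (4, 3): one gradient per batch row
```

The same pattern applies to any pure function, which is what makes the transformations composable.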
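For the PennyLane entry, a minimal QNode sketch of "training a quantum computer like a neural network", assuming pennylane is installed; the single-qubit device, circuit, and rotation angle are arbitrary choices for illustration:

```python
import pennylane as qml
from pennylane import numpy as pnp  # differentiable NumPy wrapper

dev = qml.device("default.qubit", wires=1)  # simulator backend

@qml.qnode(dev)
def circuit(theta):
    qml.RX(theta, wires=0)
    return qml.expval(qml.PauliZ(0))

theta = pnp.array(0.3, requires_grad=True)
print(circuit(theta))            # expectation value of Z
print(qml.grad(circuit)(theta))  # gradient, as for any differentiable model
```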
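The probabilistic-programming entry matches NumPyro's description; under that assumption, a minimal model sketch with a made-up Gaussian model and toy data (pip install numpyro):

```python
import jax
import jax.numpy as jnp
import numpyro
import numpyro.distributions as dist
from numpyro.infer import MCMC, NUTS

def model(y):
    # Priors on the mean and scale of a Gaussian likelihood.
    mu = numpyro.sample("mu", dist.Normal(0.0, 10.0))
    sigma = numpyro.sample("sigma", dist.HalfNormal(1.0))
    numpyro.sample("obs", dist.Normal(mu, sigma), obs=y)

y = jnp.array([0.9, 1.1, 1.3, 0.8])  # toy observations
mcmc = MCMC(NUTS(model), num_warmup=500, num_samples=1000)
mcmc.run(jax.random.PRNGKey(0), y=y)  # NUTS gradients come from JAX autograd
mcmc.print_summary()
```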
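For the TFDS entry, a minimal loading sketch, assuming tensorflow-datasets (and TensorFlow) are installed; MNIST stands in for any dataset, and tfds.as_numpy yields NumPy arrays that JAX consumes directly:

```python
import tensorflow_datasets as tfds

# Load a dataset as (image, label) pairs and iterate as NumPy arrays.
ds = tfds.load("mnist", split="train", as_supervised=True)
for image, label in tfds.as_numpy(ds.take(1)):
    print(image.shape, int(label))  # (28, 28, 1) and a class index
```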
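And for the Flax entry, a minimal flax.linen module, sketched on the assumption that flax is installed; the layer sizes are arbitrary:

```python
import jax
import jax.numpy as jnp
import flax.linen as nn

class MLP(nn.Module):
    features: int  # hidden width

    @nn.compact
    def __call__(self, x):
        x = nn.relu(nn.Dense(self.features)(x))
        return nn.Dense(1)(x)

model = MLP(features=16)
x = jnp.ones((4, 8))
params = model.init(jax.random.PRNGKey(0), x)  # initialize parameters
y = model.apply(params, x)                     # pure forward pass
print(y.shape)  # (4, 1)
```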