Enabling PyTorch on XLA Devices (e.g. Google TPU)
Updated May 23, 2024 - C++
Zero-copy MPI communication of JAX arrays, for turbo-charged HPC applications in Python ⚡
A curated list of resources for JAX (https://github.com/google/jax).
GoMLX -- Accelerated ML Libraries for Go
Access the Xspec models and corresponding JAX/XLA ops.
S + Autograd + XLA :: S-parameter based frequency domain circuit simulations and optimizations using JAX.
Mine Verus Coin on ARM devices such as Raspberry Pi, tablets, and mobile.
The purpose of this repo is to make it easy to get started with JAX, Flax, and Haiku. It contains my "Machine Learning with JAX" series of tutorials (YouTube videos and Jupyter Notebooks) as well as the content I found useful while learning about the JAX ecosystem.
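As a taste of what such tutorials cover, the core JAX workflow (transforming a pure Python function with `jax.grad` and compiling it with XLA via `jax.jit`) can be sketched as follows. This is a generic illustration, not code from the repo above:

```python
# Minimal JAX sketch (generic illustration, assumes the `jax` package
# is installed): `jax.grad` builds the derivative of a pure function,
# and `jax.jit` compiles it with XLA.
import jax
import jax.numpy as jnp

def loss(w):
    # Simple quadratic loss; `w` is an illustrative parameter vector.
    return jnp.sum((w - 1.0) ** 2)

# XLA-compiled gradient of the loss
grad_loss = jax.jit(jax.grad(loss))

w = jnp.zeros((3,))
g = grad_loss(w)  # analytic gradient: 2 * (w - 1)
```

Transformations compose freely, which is the pattern the "Machine Learning with JAX" series builds on.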
Official scala pool repository
Presents comprehensive benchmarks of XLA-compatible pre-trained models in Keras.
Modern graph-mode TensorFlow implementation of the Super-Resolution GAN (SRGAN).
Tutorial on how to make slow TensorFlow training faster.
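The usual trick such speed-up tutorials revolve around is XLA compilation; in TensorFlow that is `tf.function(jit_compile=True)`. The same idea is sketched below in JAX (assumed installed), where `jax.jit` compiles a function once and reuses the compiled executable on later calls:

```python
# Sketch of XLA compilation as a speed-up, using JAX (assumed installed).
# The first call traces and compiles; later calls with the same
# shapes/dtypes reuse the compiled executable instead of running
# op-by-op in Python.
import jax
import jax.numpy as jnp

def step(w, x, y):
    # One illustrative gradient-descent step on mean squared error;
    # all names here are hypothetical, not from any repo above.
    pred = x @ w
    grad = 2.0 * x.T @ (pred - y) / x.shape[0]
    return w - 0.1 * grad

fast_step = jax.jit(step)  # XLA-compiled version of the same function

w = jnp.zeros((3,))
x = jnp.ones((4, 3))
y = jnp.ones((4,))
w_eager = step(w, x, y)
w_jit = fast_step(w, x, y)  # numerically identical to the eager result
```

Compilation pays off when the step is called many times with fixed shapes, which is exactly the training-loop setting these tutorials target.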
Example of how to train GPT-2 (XLA + AMP), export it to SavedModel, and serve it with TensorFlow Serving.
ALBERT model pretraining and fine-tuning using TensorFlow 2.0.
Contains materials for my talk "You don't know TensorFlow".
Provides code to serialize the different models involved in Stable Diffusion as SavedModels and to compile them with XLA.