function transforms (aka torch.func, functorch)
Page Maintainers: @zou3519
Learning objectives:
- Understand what composable function transforms are and their most common use cases (see the short sketch after this list)
- Understand what the DynamicLayerStack is and how it is used to implement composition of function transforms
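For orientation before the readings: transforms like grad and vmap each take a function and return a new function, so they compose by nesting. Below is a minimal sketch of one of the most common use cases (per-sample gradients); it assumes a PyTorch build where functorch is exposed as torch.func, and the function and variable names are illustrative rather than taken from the tutorials.

```python
import torch
from torch.func import grad, vmap  # on older installs: from functorch import grad, vmap

# A toy scalar loss; function transforms operate on plain Python functions.
def loss(weight, example):
    return (example @ weight).sum()

weight = torch.randn(3)
examples = torch.randn(8, 3)  # a batch of 8 examples

# grad(loss) is a new function computing d loss / d weight.
# vmap(..., in_dims=(None, 0)) maps it over the batch dimension of `examples`
# while broadcasting `weight`, yielding one gradient per sample.
per_sample_grads = vmap(grad(loss), in_dims=(None, 0))(weight, examples)
print(per_sample_grads.shape)  # torch.Size([8, 3])
```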
- Read through the whirlwind tour
- Read through the advanced autodiff tutorial
- Read through the per-sample-gradients tutorial
- Read through the model ensembling tutorial
Exercise: The advanced autodiff tutorial explains how to compute Jacobians via a composition of vmap and vjp.
- Without looking at the source code for jacfwd or torch.autograd.functional.jacobian, write a function that computes the Jacobian using forward-mode AD and a for-loop. Note that forward-mode AD (jvp) computes Jacobian-vector products, while reverse-mode AD (vjp, grad) computes vector-Jacobian products.
- Write a function to compute the Jacobian by composing vmap and jvp.
The APIs should have the following signature:

```python
def jacobian(f, *args):
    pass
```

You can assume that `f` accepts multiple Tensor arguments and returns a single Tensor.
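One possible sketch of both exercises follows; it is not a reference solution. For brevity it simplifies the signature to a single Tensor argument (handling `*args` is part of the exercise), and the helper names are made up for illustration.

```python
import torch
from torch.func import jvp, vmap  # on older installs: from functorch import jvp, vmap

def jacobian_fwd_loop(f, x):
    # Forward-mode AD computes Jacobian-vector products J @ v, so pushing
    # each standard basis vector through jvp yields one column of J;
    # a for-loop over x.numel() basis vectors recovers the full Jacobian.
    out = f(x)
    cols = []
    for i in range(x.numel()):
        basis = torch.zeros(x.numel(), dtype=x.dtype, device=x.device)
        basis[i] = 1.0
        _, col = jvp(f, (x,), (basis.reshape(x.shape),))
        cols.append(col.reshape(-1))
    # cols[j] holds d f / d x_j; stack the columns as the trailing dims.
    return torch.stack(cols, dim=-1).reshape(*out.shape, *x.shape)

def jacobian_fwd_vmap(f, x):
    # The same computation with vmap replacing the for-loop: map jvp over
    # the whole batch of basis vectors at once (this is the idea behind
    # jacfwd, just as vmap-over-vjp underlies the reverse-mode Jacobian
    # from the advanced autodiff tutorial).
    basis = torch.eye(x.numel(), dtype=x.dtype, device=x.device)
    basis = basis.reshape(x.numel(), *x.shape)

    def push(v):
        _, col = jvp(f, (x,), (v,))
        return col

    cols = vmap(push)(basis)  # shape: (x.numel(), *f(x).shape)
    return cols.movedim(0, -1).reshape(*f(x).shape, *x.shape)
```

A quick sanity check is to compare against jacfwd, e.g. `torch.testing.assert_close(jacobian_fwd_vmap(torch.sin, x), torch.func.jacfwd(torch.sin)(x))` for some 1-D `x`.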
Read through this gdoc.