We have a convenient `fit` function to train GPs against objectives. It would still be good, e.g. in the regression notebook, to show a simple Python for loop and a simple `lax.scan` training loop, to demonstrate that users can write their own training loops. This would give insight that, e.g., the `ConjugateMLL` is something you can take `jax.grad` of and just do gradient descent on.
It would then be good to link this to a more extensive notebook exposing users to stopping gradients, bijector transformations, etc., and showing how to add a training bar to the loop.
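A minimal sketch of both loop styles. The quadratic `loss` below is a hypothetical stand-in for the negative `ConjugateMLL` (so the snippet runs standalone); in the notebook it would be replaced by the actual objective evaluated at the model's parameters:

```python
import jax
import jax.numpy as jnp

# Stand-in objective: in the notebook this would be the (negative)
# ConjugateMLL at the model parameters. Minimised at params == 3.0.
def loss(params):
    return jnp.sum((params - 3.0) ** 2)

grad_fn = jax.grad(loss)
lr = 0.1

# 1) Plain Python for-loop training loop.
params = jnp.array([0.0, 0.0])
for _ in range(100):
    params = params - lr * grad_fn(params)

# 2) Equivalent lax.scan training loop: the carry is the parameter
# state, and we emit no per-step outputs.
def step(carry, _):
    return carry - lr * grad_fn(carry), None

scanned, _ = jax.lax.scan(step, jnp.array([0.0, 0.0]), None, length=100)
# Both loops converge to the same optimum.
```

The for-loop version is easiest to read and debug; the `lax.scan` version stays inside JAX's tracing machinery, so the whole loop can be `jit`-compiled as one program.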
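Two of those topics can be sketched with the same toy quadratic stand-in for the GP objective (the parameter names here are hypothetical, not GPJax's):

```python
import jax
import jax.numpy as jnp

# 1) Stopping gradients: lax.stop_gradient freezes a parameter, e.g. a
# fixed observation-noise term, during training.
def loss(params):
    noise = jax.lax.stop_gradient(params["noise"])  # gradient path blocked
    return (params["lengthscale"] - 2.0) ** 2 + noise**2

grads = jax.grad(loss)({"lengthscale": 0.0, "noise": 1.0})
# grads["noise"] is 0.0: the frozen parameter receives no gradient.

# 2) Bijector-style transforms: optimise an unconstrained raw value and
# map it through softplus so the constrained parameter stays positive.
def constrained_loss(raw):
    lengthscale = jax.nn.softplus(raw)  # always > 0
    return (lengthscale - 2.0) ** 2

raw = 0.0
for _ in range(200):
    raw = raw - 0.1 * jax.grad(constrained_loss)(raw)
# jax.nn.softplus(raw) now approximates the optimum 2.0, while raw
# itself was free to range over the whole real line.
```

This is the usual pattern behind bijector libraries: gradient descent happens in unconstrained space, and the forward transform guarantees the constraint is never violated mid-optimisation.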