We want to provide high-level gradient-descent-based optimization procedures. The idea is that a user constructs some model structure with parameters $\theta$, has a dataset $\mathcal{D}$, and can now maximize $p(\mathcal{D} | \theta)$ with any backend. An example method could look like the following:
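As a rough illustration, such a method for the PyTorch backend might look like the sketch below. The name `fit`, its signature, and the assumption that `model(batch)` returns per-sample log-likelihoods $\log p(x | \theta)$ are illustrative, not fixed API:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset


def fit(model, data, optimizer, epochs=10, batch_size=32):
    """Maximize p(D | theta) by mini-batch gradient ascent on the log-likelihood."""
    loader = DataLoader(TensorDataset(data), batch_size=batch_size, shuffle=True)
    for epoch in range(epochs):
        for (batch,) in loader:
            optimizer.zero_grad()
            # Assumes model(batch) returns per-sample log-likelihoods log p(x | theta).
            nll = -model(batch).mean()  # negative log-likelihood
            nll.backward()
            optimizer.step()
```

A user would then call it with a backend-specific optimizer, e.g. `fit(model, data, torch.optim.Adam(model.parameters()), epochs=20)`.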
which then uses `optimizer` to maximize the data likelihood batch-wise for `epochs` epochs.
This is not supposed to be very flexible, but it should give users a simple way to train a model in a specific backend. More advanced users will most likely write their own optimization procedure.
Since tensorly does not provide any dataset/optimizer system, this needs to be implemented in all supported backends.
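One way to structure this is a thin dispatch on `tensorly.get_backend()`, with one training loop per backend. The helper names below are hypothetical; only the dispatch pattern is the point:

```python
import tensorly as tl


def _fit_pytorch(model, data, optimizer, epochs, batch_size):
    ...  # PyTorch-specific loop, e.g. as sketched above


def fit(model, data, optimizer, epochs=10, batch_size=32):
    # Dispatch to a backend-specific procedure, since tensorly provides
    # neither data loaders nor optimizers of its own.
    backend = tl.get_backend()
    if backend == "pytorch":
        return _fit_pytorch(model, data, optimizer, epochs, batch_size)
    raise NotImplementedError(f"No fit() implementation for backend '{backend}'.")
```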