How is Neural Prophet being trained? #247

Hi @SkanderHn , thank you for your question.
All components are trained jointly by an AdamW optimizer with a OneCycle learning-rate schedule.
See code reference here.
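As a rough illustration of that training setup, here is a minimal PyTorch sketch combining `AdamW` with `OneCycleLR` (the toy model, data, and hyperparameter values are placeholders, not NeuralProphet's actual configuration):

```python
import torch
from torch import nn

# Hypothetical toy model standing in for NeuralProphet's joint components.
model = nn.Linear(10, 1)

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3, weight_decay=1e-2)
epochs, steps_per_epoch = 5, 20
scheduler = torch.optim.lr_scheduler.OneCycleLR(
    optimizer, max_lr=1e-2, epochs=epochs, steps_per_epoch=steps_per_epoch
)

loss_fn = nn.MSELoss()
for _ in range(epochs):
    for _ in range(steps_per_epoch):
        x = torch.randn(32, 10)
        y = torch.randn(32, 1)
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimizer.step()
        scheduler.step()  # OneCycleLR is stepped per batch, not per epoch
```

Note that `OneCycleLR` ramps the learning rate up to `max_lr` and then anneals it down over the full run, which is why it is stepped after every batch.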

This is possible because all components are defined as model parameters, which PyTorch recognizes as trainable weights.
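To illustrate the idea (this is a simplified sketch, not NeuralProphet's actual module), defining each component as an `nn.Parameter` inside an `nn.Module` is enough for PyTorch to expose it via `model.parameters()` and train everything jointly:

```python
import torch
from torch import nn

class TinyForecaster(nn.Module):
    """Hypothetical model with a linear trend and an AR component."""

    def __init__(self, n_lags: int):
        super().__init__()
        self.trend_k = nn.Parameter(torch.zeros(1))          # trend slope
        self.trend_m = nn.Parameter(torch.zeros(1))          # trend offset
        self.ar_weights = nn.Parameter(torch.zeros(n_lags))  # AR weights

    def forward(self, t: torch.Tensor, lags: torch.Tensor) -> torch.Tensor:
        trend = self.trend_k * t + self.trend_m
        ar = lags @ self.ar_weights
        return trend + ar

model = TinyForecaster(n_lags=3)
# All three components appear as trainable parameters:
param_names = [name for name, _ in model.named_parameters()]
```

A single optimizer over `model.parameters()` then updates the trend and AR weights together in each step.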

Regarding AR regularization: instead of the original AR-Net regularization, we use a toned-down, optional regularization of the AR weights to ensure stable training when combined with the other model components. It should still behave similarly; if you encounter any difficulties, please report back to us!
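The general pattern looks like the sketch below: a sparsity-encouraging penalty on the AR weights is added to the data loss. Note that the L1-style penalty and the `strength` knob here are illustrative assumptions, not NeuralProphet's exact regularization formula:

```python
import torch

def ar_reg_penalty(ar_weights: torch.Tensor, strength: float = 0.1) -> torch.Tensor:
    # Illustrative sparsity penalty on AR weights (an assumption for this
    # sketch, not NeuralProphet's exact formula). Scaled by an optional
    # strength factor; strength=0 disables the regularization entirely.
    return strength * ar_weights.abs().mean()

ar_weights = torch.tensor([0.5, -0.2, 0.0], requires_grad=True)
data_loss = torch.tensor(1.0)
total_loss = data_loss + ar_reg_penalty(ar_weights)
```

Because the penalty is just another differentiable term in the loss, the same optimizer step shrinks uninformative AR weights toward zero while fitting the data.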

Answer selected by ourownstory