Code to create orthogonal optimisers from torch.optim.Optimizer

Orthogonalised Optimisers

Code for orthogonalising gradients to speed up neural network optimisation.

Installation

git clone https://github.com/MarkTuddenham/Orthogonal-Optimisers.git
cd Orthogonal-Optimisers
pip install .

or

pip install git+https://github.com/MarkTuddenham/Orthogonal-Optimisers.git#egg=orth_optim

Usage

Then, at the top of your main Python script:

from orth_optim import hook
hook()

Now the torch optimisers have an orth option, e.g.:

torch.optim.SGD(model.parameters(),
                lr=1e-3,
                momentum=0.9,
                orth=True)
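A plausible reading of what hook() does, sketched below: monkey-patch every optimiser class on a module so that existing call sites pick up the orthogonalised versions without further code changes. The names and mechanism here are assumptions for illustration, not the package's actual code, and a trivial stand-in decorator is used in place of the real one.

```python
import types

def tag_orth(cls):
    # Stand-in for orth_optim's real decorator: it just marks the
    # class so the patching below is observable.
    cls.orth_patched = True
    return cls

def hook(module, decorator=tag_orth):
    # Monkey-patching pattern: rebind every class found on `module`
    # to its decorated version. Callers that later look up
    # module.SGD (etc.) transparently get the wrapped class.
    for name, obj in list(vars(module).items()):
        if isinstance(obj, type):
            setattr(module, name, decorator(obj))

# Demo against a stand-in for torch.optim:
fake_optim = types.SimpleNamespace(SGD=type("SGD", (), {}))
hook(fake_optim)
assert fake_optim.SGD.orth_patched  # SGD was transparently wrapped
```

This is why hook() must run at the top of the script: the patching has to happen before any optimiser is constructed.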

Custom Optimisers

If you have a custom optimiser, you can apply the orthogonalise decorator directly:

from orth_optim import orthogonalise

@orthogonalise
class LARS(torch.optim.Optimizer):
	...
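What does orthogonalising a gradient mean here? The idea is to replace each layer's gradient matrix with a nearby matrix whose columns are orthonormal (an SVD-based variant maps G = USVᵀ to UVᵀ). As a hedged, dependency-free illustration, the sketch below uses classical Gram-Schmidt instead; it is a stand-in, not the package's implementation, and all names are invented for the example.

```python
def gram_schmidt(mat):
    # Return a same-shape matrix (list of rows) whose columns are
    # orthonormal: unit length and mutually perpendicular.
    cols = [list(c) for c in zip(*mat)]
    basis = []
    for col in cols:
        for q in basis:  # subtract components along earlier columns
            proj = sum(a * b for a, b in zip(q, col))
            col = [a - proj * b for a, b in zip(col, q)]
        norm = sum(a * a for a in col) ** 0.5
        basis.append([a / norm for a in col])
    return [list(r) for r in zip(*basis)]

# A toy 2x2 "gradient": its columns (1, 0) and (1, 1) are not orthogonal.
grad = [[1.0, 1.0],
        [0.0, 1.0]]
orth = gram_schmidt(grad)  # -> [[1.0, 0.0], [0.0, 1.0]]
```

An SVD-based version would instead return UVᵀ, which is the orthogonal matrix nearest to G in Frobenius norm; a decorated optimiser would apply such a transform to each multi-dimensional gradient before its usual step.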
