pytorch-image-models/timm/optim
Latest commit: 1042b8a146 "Add non fused LAMB optimizer option" by Ross Wightman, 3 years ago
File             | Last commit message                                                                                     | Age
__init__.py      | adabelief                                                                                               | 3 years ago
adabelief.py     | Fix adabelief imports, remove prints, preserve memory format is the default arg for zeros_like          | 3 years ago
adafactor.py     | Add Adafactor and Adahessian optimizers, cleanup optimizer arg passing, add gradient clipping support.  | 4 years ago
adahessian.py    | Add Adafactor and Adahessian optimizers, cleanup optimizer arg passing, add gradient clipping support.  | 4 years ago
adamp.py         | Add `adamp` and 'sgdp' optimizers.                                                                      | 4 years ago
adamw.py         | Add RAdam, NovoGrad, Lookahead, and AdamW optimizers, a few ResNet tweaks and scheduler factory tweak.  | 5 years ago
lamb.py          | Add non fused LAMB optimizer option                                                                     | 3 years ago
lookahead.py     | Fix some attributions, add copyrights to some file docstrings                                           | 4 years ago
nadam.py         | Big re-org, working towards making pip/module as 'timm'                                                 | 5 years ago
novograd.py      | Add RAdam, NovoGrad, Lookahead, and AdamW optimizers, a few ResNet tweaks and scheduler factory tweak.  | 5 years ago
nvnovograd.py    | Add Nvidia's NovogGrad impl from Jasper (cleaner/faster than current) and Apex Fused optimizers         | 5 years ago
optim_factory.py | Add non fused LAMB optimizer option                                                                     | 3 years ago
radam.py         | Add RAdam, NovoGrad, Lookahead, and AdamW optimizers, a few ResNet tweaks and scheduler factory tweak.  | 5 years ago
rmsprop_tf.py    | Fix some attributions, add copyrights to some file docstrings                                           | 4 years ago
sgdp.py          | Add `adamp` and 'sgdp' optimizers.                                                                      | 4 years ago