pytorch-image-models/timm/optim

Latest commit: 288682796f by Ross Wightman, 4 years ago
"Update benchmark script to add precision arg. Fix some downstream (DeiT) compat issues with latest changes. Bump version to 0.4.7"
__init__.py: Update benchmark script to add precision arg. Fix some downstream (DeiT) compat issues with latest changes. Bump version to 0.4.7 (4 years ago)
adafactor.py: Add Adafactor and Adahessian optimizers, cleanup optimizer arg passing, add gradient clipping support. (4 years ago)
adahessian.py: Add Adafactor and Adahessian optimizers, cleanup optimizer arg passing, add gradient clipping support. (4 years ago)
adamp.py: Add `adamp` and `sgdp` optimizers. (4 years ago)
adamw.py: Add RAdam, NovoGrad, Lookahead, and AdamW optimizers, a few ResNet tweaks and scheduler factory tweak. (5 years ago)
lookahead.py: Fix some attributions, add copyrights to some file docstrings (4 years ago)
nadam.py: Big re-org, working towards making pip/module as 'timm' (5 years ago)
novograd.py: Add RAdam, NovoGrad, Lookahead, and AdamW optimizers, a few ResNet tweaks and scheduler factory tweak. (5 years ago)
nvnovograd.py: Add Nvidia's NovoGrad impl from Jasper (cleaner/faster than current) and Apex Fused optimizers (5 years ago)
optim_factory.py: Update benchmark script to add precision arg. Fix some downstream (DeiT) compat issues with latest changes. Bump version to 0.4.7 (4 years ago)
radam.py: Add RAdam, NovoGrad, Lookahead, and AdamW optimizers, a few ResNet tweaks and scheduler factory tweak. (5 years ago)
rmsprop_tf.py: Fix some attributions, add copyrights to some file docstrings (4 years ago)
sgdp.py: Add `adamp` and `sgdp` optimizers. (4 years ago)
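
The files above are the optimizer implementations shipped in timm/optim at this version; in practice they are usually reached through the factory in optim_factory.py rather than imported one by one. Below is a minimal usage sketch, assuming the create_optimizer(args, model) entry point exported by __init__.py and the argparse-style hyperparameter names (opt, lr, weight_decay, momentum) that timm's training scripts pass in; the 'lookahead_' prefix is how optim_factory wraps a base optimizer in the Lookahead implementation from lookahead.py.

```python
# Minimal sketch: constructing one of the optimizers in this directory via
# the factory. The argument names below mirror timm's training-script args
# and are assumptions about this version, not guaranteed by this listing.
from types import SimpleNamespace

import torch.nn as nn
from timm.optim import create_optimizer  # exported by __init__.py

model = nn.Linear(10, 2)  # stand-in for any timm model

args = SimpleNamespace(
    opt='lookahead_adamp',  # AdamP (adamp.py) wrapped by Lookahead (lookahead.py)
    lr=1e-3,
    weight_decay=0.01,
    momentum=0.9,  # only read for momentum-based opts such as sgd/sgdp
)
optimizer = create_optimizer(args, model)
```

Each module can also be imported directly (e.g. from timm.optim import RAdam, RMSpropTF) when the factory's argument plumbing is not needed.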