pytorch-image-models/timm/optim
Latest commit f944242cb0 by Ross Wightman, 4 years ago:
Fix #262, num_classes arg mixup. Make vision_transformers a bit closer to other models wrt get/reset classifier/forward_features. Fix torchscript for ViT.
File              Last commit                                                                                                                            Age
__init__.py       Add Adafactor and Adahessian optimizers, cleanup optimizer arg passing, add gradient clipping support.                                 4 years ago
adafactor.py      Add Adafactor and Adahessian optimizers, cleanup optimizer arg passing, add gradient clipping support.                                 4 years ago
adahessian.py     Add Adafactor and Adahessian optimizers, cleanup optimizer arg passing, add gradient clipping support.                                 4 years ago
adamp.py          Add `adamp` and `sgdp` optimizers.                                                                                                     4 years ago
adamw.py          Add RAdam, NovoGrad, Lookahead, and AdamW optimizers, a few ResNet tweaks and scheduler factory tweak.                                 5 years ago
lookahead.py      Fix some attributions, add copyrights to some file docstrings                                                                          4 years ago
nadam.py
novograd.py       Add RAdam, NovoGrad, Lookahead, and AdamW optimizers, a few ResNet tweaks and scheduler factory tweak.                                 5 years ago
nvnovograd.py     Add Nvidia's NovoGrad impl from Jasper (cleaner/faster than current) and Apex Fused optimizers                                         5 years ago
optim_factory.py  Fix #262, num_classes arg mixup. Make vision_transformers a bit closer to other models wrt get/reset classifier/forward_features. Fix torchscript for ViT.  4 years ago
radam.py          Add RAdam, NovoGrad, Lookahead, and AdamW optimizers, a few ResNet tweaks and scheduler factory tweak.                                 5 years ago
rmsprop_tf.py     Fix some attributions, add copyrights to some file docstrings                                                                          4 years ago
sgdp.py           Add `adamp` and `sgdp` optimizers.                                                                                                     4 years ago
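The modules above each implement an optimizer class that `__init__.py` re-exports under `timm.optim` (e.g. `AdamP`, `RAdam`, `NovoGrad`, `Lookahead`, `RMSpropTF`). A minimal sketch of using them directly is below; it assumes the re-exports described above, and the toy model plus the hyper-parameter values are illustrative assumptions, not settings recommended by this repo.

```python
import torch
import torch.nn as nn
from timm.optim import AdamP, Lookahead

# Toy model purely for illustration (not from this repo)
model = nn.Linear(128, 10)

# AdamP from adamp.py; lr/weight_decay values here are arbitrary examples
optimizer = AdamP(model.parameters(), lr=1e-3, weight_decay=1e-2)

# lookahead.py provides a wrapper that goes around any inner optimizer;
# alpha and k are the Lookahead slow-weights interpolation and sync period
optimizer = Lookahead(optimizer, alpha=0.5, k=6)

# Standard PyTorch training step with random data
inputs = torch.randn(4, 128)
targets = torch.randint(0, 10, (4,))
optimizer.zero_grad()
loss = nn.functional.cross_entropy(model(inputs), targets)
loss.backward()
optimizer.step()
```

Alternatively, optim_factory.py exposes a factory (`create_optimizer`) that builds one of these optimizers from training-script arguments; the direct-construction form above is shown because its signatures are the most stable across versions.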