Ross Wightman
c207e02782
MOAR optimizer changes. Woo!
3 years ago
Ross Wightman
a426511c95
More optimizer cleanup. Change all to no longer use .data. Improve (b)float16 use with adabelief. Add XLA compatible Lars.
3 years ago
Ross Wightman
a6af48be64
add madgradw optimizer
3 years ago
Ross Wightman
55fb5eedf6
Remove experiment from lamb impl
3 years ago
Ross Wightman
8a9eca5157
A few optimizer comments, dead import, missing import
3 years ago
Ross Wightman
ac469b50da
Optimizer improvements, additions, cleanup
...
* Add MADGRAD code
* Fix Lamb (non-fused variant) to work w/ PyTorch XLA
* Tweak optimizer factory args (lr/learning_rate and opt/optimizer_name), may break compat
* Use newer fn signatures for all add, addcdiv, addcmul in optimizers
* Use upcoming PyTorch native Nadam if it's available
* Cleanup lookahead opt
* Add optimizer tests
* Remove novograd.py impl as it was messy, keep nvnovograd
* Make AdamP/SGDP work in channels_last layout
* Add rectified adabelief mode (radabelief)
* Support a few more PyTorch optim, adamax, adagrad
3 years ago
Ross Wightman
1042b8a146
Add non fused LAMB optimizer option
3 years ago
Ross Wightman
cd3dc4979f
Fix adabelief imports, remove prints, preserve memory format is the default arg for zeros_like
4 years ago
juntang
addfc7c1ac
adabelief
4 years ago
Ross Wightman
37c71a5609
Some further create_optimizer_v2 tweaks, remove some redundant code, add back safe model str. Benchmark step times per batch.
4 years ago
Ross Wightman
288682796f
Update benchmark script to add precision arg. Fix some downstream (DeiT) compat issues with latest changes. Bump version to 0.4.7
4 years ago
Ross Wightman
0e16d4e9fb
Add benchmark.py script, and update optimizer factory to be more friendly to use outside of the argparse interface.
4 years ago
Jasha
7c56c718f3
Configure create_optimizer with args.opt_args
...
Closes #301
4 years ago
Ross Wightman
30ab4a1494
Fix issue in optim factory with sgd / eps flag. Bump version to 0.3.1
4 years ago
Ross Wightman
f944242cb0
Fix #262, num_classes arg mixup. Make vision_transformers a bit closer to other models wrt get/reset classifier/forward_features. Fix torchscript for ViT.
4 years ago
Ross Wightman
477a78ed81
Fix optimizer factory regression for optimizers like sgd/momentum that don't have an eps arg
4 years ago
Ross Wightman
a4d8fea61e
Add model based wd skip support. Improve cross version compat of optimizer factory. Fix #247
4 years ago
Ross Wightman
80078c47bb
Add Adafactor and Adahessian optimizers, cleanup optimizer arg passing, add gradient clipping support.
4 years ago
Ross Wightman
7995295968
Merge branch 'logger' into features. Change 'logger' to '_logger'.
4 years ago
Ross Wightman
6c17d57a2c
Fix some attributions, add copyrights to some file docstrings
4 years ago
Sangdoo Yun
e93e571f7a
Add `adamp` and `sgdp` optimizers.
...
Update requirements.txt
Update optim_factory.py
Add `adamp` optimizer
Update __init__.py
copy files of adamp & sgdp
Create adamp.py
Update __init__.py
Create sgdp.py
Update optim_factory.py
Update optim_factory.py
Update requirements.txt
Update adamp.py
Update sgdp.py
Update sgdp.py
Update adamp.py
4 years ago
Ross Wightman
e6f24e5578
Add 'momentum' optimizer (SGD w/o nesterov) for stable EfficientDet training defaults
5 years ago
Ross Wightman
64966f61f7
Add Nvidia's NovoGrad impl from Jasper (cleaner/faster than current) and Apex Fused optimizers
5 years ago
Ross Wightman
fac58f609a
Add RAdam, NovoGrad, Lookahead, and AdamW optimizers, a few ResNet tweaks, and a scheduler factory tweak.
...
* Add some of the trendy new optimizers. Decent results but not clearly better than the standards.
* Can create a None scheduler for constant LR
* ResNet defaults to zero_init of last BN in residual
* add resnet50d config
5 years ago
Ross Wightman
aa4354f466
Big re-org, working towards making pip/module as 'timm'
5 years ago