Commit Graph

35 Commits (9da7e3a79924a05f62544e4a67742d1cac317f1e)

Author SHA1 Message Date
Ross Wightman b1b024dfed Scheduler update: add v2 factory method, support scheduling on updates instead of just epochs. Add LR to summary CSV. Add lr_base scaling calculations to train script. Fix #1168 (2 years ago)
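The scheduler v2 factory above can step per optimizer update rather than per epoch. A minimal sketch, assuming a recent timm where `create_scheduler_v2` accepts `step_on_epochs`/`updates_per_epoch`; the model, loader length, and hyperparameters are placeholders:

```python
import torch
from timm.scheduler import create_scheduler_v2

model = torch.nn.Linear(10, 2)  # placeholder model
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

updates_per_epoch = 100  # assumed len(train_loader)
scheduler, num_epochs = create_scheduler_v2(
    optimizer,
    sched='cosine',
    num_epochs=90,
    warmup_epochs=5,
    step_on_epochs=False,              # schedule on updates, not epochs
    updates_per_epoch=updates_per_epoch,
)

num_updates = 0
for epoch in range(num_epochs):
    for _ in range(updates_per_epoch):
        # ... forward / backward / optimizer.step() ...
        num_updates += 1
        scheduler.step_update(num_updates=num_updates)
    scheduler.step(epoch + 1)  # still called at epoch end (metrics/plateau)
```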
Ross Wightman 2a296412be Add Adan optimizer (2 years ago)
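Like the other optimizers in this log, Adan is selectable by name through timm's factory. A minimal sketch; the learning rate and weight decay are illustrative, not recommended values:

```python
import torch
from timm.optim import create_optimizer_v2

model = torch.nn.Linear(10, 2)  # placeholder model

# 'adan' selects the Adan optimizer added in commit 2a296412be;
# unknown kwargs are forwarded to the optimizer constructor.
optimizer = create_optimizer_v2(model, opt='adan', lr=1e-3, weight_decay=0.02)
```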
Ross Wightman 33e30f8c8b Remove layer-decay print (2 years ago)
Ross Wightman 0557c8257d Fix bug introduced in non-layer_decay weight_decay application. Remove debug print, fix arg desc. (2 years ago)
Ross Wightman 372ad5fa0d Significant model refactor and additions. (2 years ago)
Mi-Peng cdcd0a92ca Fix LARS (2 years ago)
Ross Wightman a16a753852 Add lamb/lars to optim init imports, remove stray comment (3 years ago)
Ross Wightman c207e02782 MOAR optimizer changes. Woo! (3 years ago)
Ross Wightman a426511c95 More optimizer cleanup. Change all to no longer use .data. Improve (b)float16 use with adabelief. Add XLA-compatible LARS. (3 years ago)
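The LAMB and LARS implementations referenced in this series are pure PyTorch and selectable by name. A minimal sketch with placeholder hyperparameters:

```python
import torch
from timm.optim import create_optimizer_v2

model = torch.nn.Linear(10, 2)  # placeholder model

# 'lamb' and 'lars' map to the pure-PyTorch impls added in these commits
optimizer = create_optimizer_v2(model, opt='lamb', lr=5e-3, weight_decay=0.02)
# LARS is momentum-based; 'momentum' is forwarded to its constructor
# optimizer = create_optimizer_v2(model, opt='lars', lr=1.0, momentum=0.9)
```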
Ross Wightman 9541f4963b One more scalar -> tensor fix for lamb optimizer (3 years ago)
Ross Wightman 8f68193c91 Update lamb.py comment (3 years ago)
Ross Wightman a6af48be64 Add madgradw optimizer (3 years ago)
Ross Wightman 55fb5eedf6 Remove experiment from lamb impl (3 years ago)
Ross Wightman 8a9eca5157 A few optimizer comments, dead import, missing import (3 years ago)
Ross Wightman ac469b50da Optimizer improvements, additions, cleanup (3 years ago)
Ross Wightman 1042b8a146 Add non-fused LAMB optimizer option (3 years ago)
Ross Wightman cd3dc4979f Fix adabelief imports, remove prints, make preserving memory format the default arg for zeros_like (3 years ago)
juntang addfc7c1ac Add AdaBelief optimizer (3 years ago)
Ross Wightman 37c71a5609 Some further create_optimizer_v2 tweaks, remove some redundant code, add back safe model str. Benchmark step times per batch. (3 years ago)
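The `create_optimizer_v2` factory referenced here is usable directly, without the train script's argparse plumbing. A minimal sketch; the opt name and hyperparameters are illustrative:

```python
import torch
from timm.optim import create_optimizer_v2

model = torch.nn.Linear(10, 2)  # placeholder model

optimizer = create_optimizer_v2(
    model,
    opt='adamw',         # optimizer selected by lowercase name
    lr=5e-4,
    weight_decay=0.05,   # bias/norm params are excluded from decay by default
    betas=(0.9, 0.999),  # unknown kwargs are forwarded to the constructor
)
```

Passing the model itself (rather than a parameter list) lets the factory build the no-weight-decay parameter groups from the model.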
Ross Wightman 288682796f Update benchmark script to add precision arg. Fix some downstream (DeiT) compat issues with latest changes. Bump version to 0.4.7 (3 years ago)
Ross Wightman 0e16d4e9fb Add benchmark.py script, and update optimizer factory to be more friendly to use outside of argparse interface. (3 years ago)
Jasha 7c56c718f3 Configure create_optimizer with args.opt_args (4 years ago)
Ross Wightman 30ab4a1494 Fix issue in optim factory with sgd / eps flag. Bump version to 0.3.1 (4 years ago)
Ross Wightman f944242cb0 Fix #262, num_classes arg mixup. Make vision_transformers a bit closer to other models wrt get/reset classifier/forward_features. Fix torchscript for ViT. (4 years ago)
Ross Wightman 477a78ed81 Fix optimizer factory regression for optimizers like sgd/momentum that don't have an eps arg (4 years ago)
Ross Wightman a4d8fea61e Add model-based weight-decay skip support. Improve cross-version compat of optimizer factory. Fix #247 (4 years ago)
Ross Wightman 80078c47bb Add Adafactor and Adahessian optimizers, cleanup optimizer arg passing, add gradient clipping support. (4 years ago)
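Adahessian is a second-order optimizer, so its training loop differs slightly from the first-order ones above. A minimal sketch, assuming current timm where `dispatch_clip_grad` handles the clipping modes; the data and hyperparameters are placeholders:

```python
import torch
import torch.nn.functional as F
from timm.optim import create_optimizer_v2
from timm.utils import dispatch_clip_grad

model = torch.nn.Linear(10, 2)  # placeholder model
optimizer = create_optimizer_v2(model, opt='adahessian', lr=0.1)

x, y = torch.randn(4, 10), torch.randn(4, 2)  # placeholder batch
loss = F.mse_loss(model(x), y)

# Adahessian estimates the Hessian diagonal from gradients, so the
# autograd graph must be kept alive through backward
loss.backward(create_graph=True)
dispatch_clip_grad(model.parameters(), value=1.0, mode='norm')
optimizer.step()
optimizer.zero_grad()
```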
Ross Wightman 7995295968 Merge branch 'logger' into features. Change 'logger' to '_logger'. (4 years ago)
Ross Wightman 6c17d57a2c Fix some attributions, add copyrights to some file docstrings (4 years ago)
Sangdoo Yun e93e571f7a Add `adamp` and `sgdp` optimizers. (4 years ago)
Ross Wightman e6f24e5578 Add 'momentum' optimizer (SGD w/o Nesterov) for stable EfficientDet training defaults (4 years ago)
Ross Wightman 64966f61f7 Add Nvidia's NovoGrad impl from Jasper (cleaner/faster than current) and Apex fused optimizers (5 years ago)
Ross Wightman ba3c97c3ad Some Lookahead cleanup and fixes (5 years ago)
Ross Wightman fac58f609a Add RAdam, NovoGrad, Lookahead, and AdamW optimizers, a few ResNet tweaks and scheduler factory tweak. (5 years ago)
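Lookahead is a wrapper around a base optimizer rather than a standalone one. A minimal sketch, assuming current timm where the factory recognizes a 'lookahead_' prefix; the inner optimizer and hyperparameters are illustrative:

```python
import torch
from timm.optim import Lookahead, RAdam, create_optimizer_v2

model = torch.nn.Linear(10, 2)  # placeholder model

# explicit wrapping: sync slow weights every k=6 steps, interpolate with alpha=0.5
optimizer = Lookahead(RAdam(model.parameters(), lr=1e-3), alpha=0.5, k=6)

# equivalent selection via the factory's 'lookahead_' prefix
optimizer = create_optimizer_v2(model, opt='lookahead_radam', lr=1e-3)
```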
Ross Wightman aa4354f466 Big re-org, working towards packaging the module for pip as 'timm' (5 years ago)