romamartyanov
ff1a657f57
fixing config reader
5 years ago
romamartyanov
fa2e5c6f16
Added set_deterministic function
...
This function can be useful for checking the representativeness of an experiment.
5 years ago
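A minimal sketch of what a set_deterministic helper of this kind typically looks like; the body below (seed sources, default value) is an assumption for illustration, not the committed code:

```python
import os
import random

import numpy as np
import torch


def set_deterministic(seed: int = 42) -> None:
    """Seed the common RNG sources and force deterministic cuDNN kernels.

    Hypothetical sketch; the committed function may differ in detail.
    """
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    torch.cuda.manual_seed_all(seed)
    torch.backends.cudnn.deterministic = True
    torch.backends.cudnn.benchmark = False
    os.environ['PYTHONHASHSEED'] = str(seed)
```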
romamartyanov
41e1f8d282
Moved all argparse configs to .yaml files
...
All configs are parsed from the .yaml file. If necessary, any parameter can still be passed on the command line as before, and those values override the ones from the .yaml file. All args are also stored as a string in the args_text variable.
5 years ago
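The YAML-defaults-plus-CLI-override pattern described above can be sketched as follows; the --config flag and option names here are illustrative, not necessarily the ones the commit introduced:

```python
import argparse

import yaml

# A tiny parser that only knows about --config, so the config file
# can be read before the full argument parse.
config_parser = argparse.ArgumentParser(add_help=False)
config_parser.add_argument('--config', default='', help='path to a YAML config file')

parser = argparse.ArgumentParser(parents=[config_parser])
parser.add_argument('--lr', type=float, default=0.1)
parser.add_argument('--batch-size', type=int, default=128)

args_config, remaining = config_parser.parse_known_args()
if args_config.config:
    with open(args_config.config) as f:
        # YAML values become the new defaults...
        parser.set_defaults(**yaml.safe_load(f))

# ...so anything given explicitly on the command line still wins.
args = parser.parse_args(remaining)
# Keep a serialized copy of the resolved args for the experiment dir.
args_text = yaml.safe_dump(vars(args), default_flow_style=False)
```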
Ross Wightman
80078c47bb
Add Adafactor and Adahessian optimizers, cleanup optimizer arg passing, add gradient clipping support.
5 years ago
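Gradient clipping slots into the training step as below; the clip_grad value standing in for a command-line argument is an assumption here:

```python
import torch

model = torch.nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
clip_grad = 1.0  # e.g. the value of a --clip-grad style argument (assumed name)

inputs = torch.randn(4, 10)
loss = model(inputs).sum()

optimizer.zero_grad()
loss.backward()
# Rescale gradients so their global norm does not exceed clip_grad.
torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=clip_grad)
optimizer.step()
```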
Ross Wightman
47a7b3b5b1
More flexible mixup mode, add 'half' mode.
5 years ago
Ross Wightman
532e3b417d
Reorg of utils into separate modules
5 years ago
Ross Wightman
751b0bba98
Add global_pool (--gp) arg changes to allow passing 'fast' easily for train/validate to avoid channels_last issue with AdaptiveAvgPool
5 years ago
Ross Wightman
9c297ec67d
Cleanup Apex vs native AMP scaler state save/load. Cleanup CheckpointSaver a bit.
5 years ago
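The scaler state save/load referenced above follows the standard pattern of persisting scaler.state_dict() alongside model and optimizer state; the checkpoint key names below are illustrative:

```python
import torch

model = torch.nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scaler = torch.cuda.amp.GradScaler(enabled=torch.cuda.is_available())

# Save: keep the AMP scaler state next to model/optimizer state.
checkpoint = {
    'state_dict': model.state_dict(),
    'optimizer': optimizer.state_dict(),
    'amp_scaler': scaler.state_dict(),
}
torch.save(checkpoint, 'checkpoint.pth.tar')

# Load: restore all three together on resume.
resume = torch.load('checkpoint.pth.tar', map_location='cpu')
model.load_state_dict(resume['state_dict'])
optimizer.load_state_dict(resume['optimizer'])
scaler.load_state_dict(resume['amp_scaler'])
```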
Ross Wightman
c2cd1a332e
Improve torch amp support and add channels_last support for train/validate scripts
5 years ago
datamining99
5f563ca4df
fix save_checkpoint bug with native amp
5 years ago
datamining99
d98967ed5d
add support for native torch AMP in torch 1.6
5 years ago
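The native AMP API added in torch 1.6 that this commit targets follows this standard shape (a CUDA device is assumed, since torch.cuda.amp is GPU-only):

```python
import torch

device = 'cuda'  # native AMP targets CUDA
model = torch.nn.Linear(10, 2).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scaler = torch.cuda.amp.GradScaler()

inputs = torch.randn(4, 10, device=device)
optimizer.zero_grad()
# Forward pass runs in mixed precision under autocast.
with torch.cuda.amp.autocast():
    loss = model(inputs).sum()
# Scale the loss to avoid fp16 gradient underflow, then unscale and step.
scaler.scale(loss).backward()
scaler.step(optimizer)
scaler.update()
```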
Ross Wightman
8c9814e3f5
Final cleanup of mixup/cutmix. Element/batch modes working with both collate (prefetcher active) and without prefetcher.
5 years ago
Ross Wightman
f471c17c9d
More cutmix/mixup overhaul, ready to kick-off some trials.
5 years ago
Ross Wightman
92f2d0d65d
Merge branch 'master' into cutmix. Fixup a few issues.
5 years ago
Ross Wightman
fa28067704
Add more augmentation arguments, including a no_aug disable flag. Fix #209
5 years ago
Ross Wightman
7995295968
Merge branch 'logger' into features. Change 'logger' to '_logger'.
5 years ago
Ross Wightman
1998bd3180
Merge branch 'feature/AB/logger' of https://github.com/antoinebrl/pytorch-image-models into logger
5 years ago
Ross Wightman
6c17d57a2c
Fix some attributions, add copyrights to some file docstrings
5 years ago
Antoine Broyelle
78fa0772cc
Leverage Python hierarchical logger
...
With this update one can tune the kind of logs generated by timm, but
training and inference traces are unchanged
5 years ago
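A small sketch of the hierarchical-logging pattern this commit leverages; the module name in the comment is illustrative:

```python
import logging

# Inside library modules, log through a child of the 'timm' root logger.
_logger = logging.getLogger(__name__)  # e.g. resolves to 'timm.data.loader'
_logger.info('dataset ready')

# An application can then tune timm's verbosity in one place, without
# touching the training/inference scripts' own log output:
logging.getLogger('timm').setLevel(logging.WARNING)
```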
Ross Wightman
6441e9cc1b
Fix memory_efficient mode for DenseNets. Add AntiAliasing (Blur) support for DenseNets and create one test model. Add lr cycle/mul params to train args.
5 years ago
AFLALO, Jonathan Isaac
a7f570c9b7
Added MultiEpochsDataLoader
5 years ago
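The idea behind a MultiEpochsDataLoader is to keep worker processes alive across epochs instead of re-spawning them each time. A condensed sketch of that idea, assuming a repeating batch sampler (the committed implementation may differ in detail):

```python
import torch.utils.data


class _RepeatSampler:
    """Yields batches from the wrapped sampler forever."""

    def __init__(self, sampler):
        self.sampler = sampler

    def __iter__(self):
        while True:
            yield from iter(self.sampler)


class MultiEpochsDataLoader(torch.utils.data.DataLoader):
    """DataLoader whose worker processes survive across epochs."""

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        # DataLoader forbids reassigning batch_sampler after init,
        # so bypass its __setattr__ guard.
        object.__setattr__(self, 'batch_sampler', _RepeatSampler(self.batch_sampler))
        # One long-lived iterator shared by every epoch.
        self.iterator = super().__iter__()

    def __len__(self):
        return len(self.batch_sampler.sampler)

    def __iter__(self):
        # An "epoch" is just the next len(self) batches of the endless stream.
        for _ in range(len(self)):
            yield next(self.iterator)
```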
Ross Wightman
13cf68850b
Remove poorly named metrics from torch imagenet example origins. Use top1/top5 in csv output for consistency with existing validation results files, acc elsewhere. Fixes #111
5 years ago
Ross Wightman
27b3680d49
Revamp LR noise, move logic to scheduler base. Fixup PlateauLRScheduler and add it as an option.
5 years ago
Ross Wightman
514b0938c4
Experimenting with per-epoch learning rate noise w/ step scheduler
5 years ago
Ross Wightman
43225d110c
Unify drop connect vs drop path under 'drop path' name, switch all EfficientNet/MobilenetV3 refs to 'drop_path'. Update factory to handle new drop args.
5 years ago
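For reference, the drop-path (stochastic depth) technique that this renaming unifies is commonly written as below; this is the general formulation, not a claim about the exact committed code:

```python
import torch


def drop_path(x: torch.Tensor, drop_prob: float = 0., training: bool = False) -> torch.Tensor:
    """Randomly zero whole residual branches per sample (stochastic depth)."""
    if drop_prob == 0. or not training:
        return x
    keep_prob = 1.0 - drop_prob
    # One keep/drop decision per sample, broadcast over the remaining dims.
    shape = (x.shape[0],) + (1,) * (x.ndim - 1)
    mask = x.new_empty(shape).bernoulli_(keep_prob)
    # Scale kept paths by 1/keep_prob so the expected output is unchanged.
    return x.div(keep_prob) * mask
```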
Ross Wightman
b3cb5f3275
Working on CutMix impl as per #8, integrating with Mixup, currently experimenting...
6 years ago
Andrew Lavin
b72013def8
Added command-line argument --validation-batch-size-multiplier with default set to 1.
6 years ago
Ross Wightman
5b7cc16ac9
Add warning about using sync-bn with zero initialized BN layers. Fixes #54
6 years ago
Ross Wightman
d9a6a9d0af
Merge pull request #74 from rwightman/augmix-jsd
...
AugMix, JSD loss, SplitBatchNorm (Auxiliary BN), and more
6 years ago
Ross Wightman
3eb4a96eda
Update AugMix, JSD, etc comments and references
6 years ago
Ross Wightman
7547119891
Add SplitBatchNorm. AugMix, Rand/AutoAugment, Split (Aux) BatchNorm, Jensen-Shannon Divergence, RandomErasing all working together
6 years ago
Ross Wightman
40fea63ebe
Add checkpoint averaging script. Add headers, shebangs, exec perms to all scripts
6 years ago
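Checkpoint averaging reduces to element-wise averaging of state dicts. A simplified sketch (paths are hypothetical, all tensors including integer buffers are treated as floats, and the bundled script is more featureful):

```python
import torch

# Hypothetical paths; the real script discovers and ranks checkpoints itself.
paths = ['checkpoint-1.pth.tar', 'checkpoint-2.pth.tar']

avg_state = None
for p in paths:
    state = torch.load(p, map_location='cpu')
    state = state.get('state_dict', state)
    if avg_state is None:
        avg_state = {k: v.float().clone() for k, v in state.items()}
    else:
        for k, v in state.items():
            avg_state[k] += v.float()

avg_state = {k: v / len(paths) for k, v in avg_state.items()}
torch.save({'state_dict': avg_state}, 'model_avg.pth.tar')
```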
Ross Wightman
4666cc9aed
Add --pin-mem arg to enable dataloader pin_memory (showing more benefit in some scenarios now), also add --torchscript arg to validate.py for testing models with jit.script
6 years ago
Ross Wightman
232ab7fb12
Working on an implementation of AugMix with JensenShannonDivergence loss that's compatible with my AutoAugment and RandAugment impl
6 years ago
Ross Wightman
5719b493ad
Missed update dist-bn logic for EMA model
6 years ago
Ross Wightman
a435ea1327
Change reduce_bn to distribute_bn, add ability to choose between broadcast and reduce (mean). Add crop_pct arg to allow selecting validation crop while training.
6 years ago
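A sketch of the broadcast-vs-reduce choice described above, syncing BN running stats across ranks before eval/save; function name and signature are assumptions:

```python
import torch.distributed as dist


def distribute_bn(model, world_size: int, reduce: bool = False):
    """Sync BN running stats across ranks: mean-reduce, or broadcast rank 0's."""
    for name, buf in model.named_buffers(recurse=True):
        if 'running_mean' in name or 'running_var' in name:
            if reduce:
                # Average the stats across all ranks.
                dist.all_reduce(buf, op=dist.ReduceOp.SUM)
                buf /= float(world_size)
            else:
                # Make every rank adopt rank 0's stats.
                dist.broadcast(buf, 0)
```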
Ross Wightman
3bff2b21dc
Add support for keeping running bn stats the same across distributed training nodes before eval/save
6 years ago
Ross Wightman
1f39d15f15
Allow float decay epochs arg for training, works out with step lr math
6 years ago
Ross Wightman
7b83e67f77
Pass drop connect arg through to EfficientNet models
6 years ago
Ross Wightman
4748c6dff2
Fix non-prefetch variant of Mixup. Fixes #50
6 years ago
Ross Wightman
187ecbafbe
Add support for loading args from yaml file (and saving them with each experiment)
6 years ago
Ross Wightman
b750b76f67
More AutoAugment work. Ready to roll...
6 years ago
Ross Wightman
3d9c8a6489
Add support for new AMP checkpointing support w/ amp.state_dict
6 years ago
Ross Wightman
fac58f609a
Add RAdam, NovoGrad, Lookahead, and AdamW optimizers, a few ResNet tweaks and scheduler factory tweak.
...
* Add some of the trendy new optimizers. Decent results but not clearly better than the standards.
* Can create a None scheduler for constant LR
* ResNet defaults to zero_init of last BN in residual
* add resnet50d config
6 years ago
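Of the optimizers listed, AdamW ships with torch.optim; RAdam, NovoGrad, and Lookahead live in timm's own optim modules (not shown). A usage sketch, with the None-scheduler guard pattern assumed for the constant-LR case:

```python
import torch

model = torch.nn.Linear(10, 2)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3, weight_decay=0.05)

# A 'None' scheduler means constant LR: no scheduler object is created,
# so scheduler steps are simply guarded (pattern assumed, names illustrative).
lr_scheduler = None
if lr_scheduler is not None:
    lr_scheduler.step()
```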
Ross Wightman
66634d2200
Add support to split random erasing blocks into randomly selected number with --recount arg. Fix random selection of aspect ratios.
6 years ago
Ross Wightman
e7c8a37334
Make min-lr and cooldown-epochs cmdline args, change dash in color_jitter arg for consistency
6 years ago
Ross Wightman
c6b32cbe73
A number of tweaks to arguments, epoch handling, config
...
* reorganize train args
* allow resolve_data_config to be used with dict args, not just argparse
* stop incrementing epoch before save, more consistent naming vs csv, etc
* update resume and start epoch handling to match above
* stop auto-incrementing epoch in scheduler
6 years ago
Ross Wightman
b20bb58284
Distributed tweaks
...
* Support PyTorch native DDP as fallback if APEX not present
* Support SyncBN for both APEX and Torch native (if torch >= 1.1)
* EMA model does not appear to need DDP wrapper, no gradients, updated from wrapped model
6 years ago
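The APEX-with-native-fallback arrangement described above can be sketched like this; the wrapper function and its arguments are illustrative:

```python
import torch

try:
    from apex.parallel import DistributedDataParallel as ApexDDP
    has_apex = True
except ImportError:
    has_apex = False


def wrap_model_ddp(model, local_rank: int):
    # SyncBN on the native path (torch >= 1.1) would be applied first, e.g.:
    # model = torch.nn.SyncBatchNorm.convert_sync_batchnorm(model)
    if has_apex:
        # APEX DDP; delay_allreduce trades comm/compute overlap for simplicity.
        return ApexDDP(model, delay_allreduce=True)
    # PyTorch native DDP as the fallback when APEX is not installed.
    return torch.nn.parallel.DistributedDataParallel(
        model, device_ids=[local_rank])
```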
Ross Wightman
6fc886acaf
Remove all prints, change most to logging calls, tweak alignment of batch logs, improve setup.py
6 years ago
Ross Wightman
aa4354f466
Big re-org, working towards making pip/module as 'timm'
6 years ago