Ross Wightman
de6046e213
Initial commit for dataset / parser reorg to support additional datasets / types
4 years ago
Ross Wightman
2ed8f24715
A few more changes for the 0.3.2 maint release. Linear layer change for mobilenetv3 and inception_v3; support a no-bias option in the Linear wrapper.
4 years ago
Ross Wightman
460eba7f24
Work around casting issue with combination of native torch AMP and torchscript for Linear layers
4 years ago
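The casting workaround amounts to doing the dtype alignment inside the layer itself, since autocast is not applied once a model is scripted. A minimal sketch of that pattern (close to, but not guaranteed identical to, the timm layer):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Linear(nn.Linear):
    """nn.Linear wrapper: when scripted, cast weight/bias to the input dtype.

    Under torchscript, torch.cuda.amp autocast is not applied, so a float16
    activation can meet a float32 weight; casting here avoids the mismatch.
    """
    def forward(self, input: torch.Tensor) -> torch.Tensor:
        if torch.jit.is_scripting():
            bias = self.bias.to(dtype=input.dtype) if self.bias is not None else None
            return F.linear(input, self.weight.to(dtype=input.dtype), bias=bias)
        return F.linear(input, self.weight, self.bias)
```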
Ross Wightman
27bbc70d71
Add back old ModelEma and rename new one to ModelEmaV2 to avoid compat breaks in dependent code. Shuffle train script, add a few comments, remove DataParallel support, support experimental torchscript training.
4 years ago
Ross Wightman
9214ca0716
Simplifying EMA...
4 years ago
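The EMA simplification boils down to a plain decay-weighted copy of the model's state. A sketch of the ModelEmaV2-style update, assuming the usual ema = decay * ema + (1 - decay) * model rule:

```python
import copy
import torch
import torch.nn as nn

class ModelEmaV2(nn.Module):
    """Track an exponential moving average of model weights."""
    def __init__(self, model: nn.Module, decay: float = 0.9999):
        super().__init__()
        self.module = copy.deepcopy(model).eval()  # EMA copy, never trained directly
        self.decay = decay
        for p in self.module.parameters():
            p.requires_grad_(False)

    @torch.no_grad()
    def update(self, model: nn.Module):
        for ema_v, model_v in zip(self.module.state_dict().values(),
                                  model.state_dict().values()):
            if ema_v.dtype.is_floating_point:
                ema_v.mul_(self.decay).add_(model_v, alpha=1.0 - self.decay)
            else:
                ema_v.copy_(model_v)  # e.g. BN num_batches_tracked
```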
Ross Wightman
80078c47bb
Add Adafactor and Adahessian optimizers, cleanup optimizer arg passing, add gradient clipping support.
4 years ago
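Gradient clipping slots between backward and the optimizer step; a minimal sketch of the train-loop change (function and argument names are illustrative):

```python
import torch

def train_step(model, loss_fn, optimizer, x, y, clip_grad=None):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    if clip_grad is not None:
        # clip the global gradient norm before stepping
        torch.nn.utils.clip_grad_norm_(model.parameters(), clip_grad)
    optimizer.step()
    return loss.detach()
```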
Ross Wightman
47a7b3b5b1
More flexible mixup mode, add 'half' mode.
4 years ago
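As I read this commit, 'half' applies per-element mixing to only half of the batch and leaves the rest untouched; the sketch below shows that reading next to the plain 'batch' mode (names and exact semantics are my approximation, not the timm code verbatim):

```python
import torch

def mixup_batch(x: torch.Tensor, lam: float) -> torch.Tensor:
    # 'batch' mode: one lambda for the whole batch, mixed with the flipped batch
    return x * lam + x.flip(0) * (1.0 - lam)

def mixup_half(x: torch.Tensor, lam: torch.Tensor) -> torch.Tensor:
    # 'half' mode (approximate): per-element lambdas mix each sample in the
    # first half with its mirror; the second half passes through unmixed
    out = x.clone()
    n = x.shape[0] // 2
    for i in range(n):
        j = x.shape[0] - i - 1
        out[i] = x[i] * lam[i] + x[j] * (1.0 - lam[i])
    return out
```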
Ross Wightman
532e3b417d
Reorg of utils into separate modules
4 years ago
Ross Wightman
751b0bba98
Change the global_pool (--gp) arg so 'fast' can be passed easily for train/validate, avoiding the channels_last issue with AdaptiveAvgPool
4 years ago
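The 'fast' global pool is just a mean over the spatial dims, which plays well with channels_last where AdaptiveAvgPool did not; a sketch, not the timm module verbatim:

```python
import torch
import torch.nn as nn

class FastAdaptiveAvgPool2d(nn.Module):
    """Global average pool via a plain reduction; equivalent to
    nn.AdaptiveAvgPool2d(1) + flatten for the global case."""
    def __init__(self, flatten: bool = True):
        super().__init__()
        self.flatten = flatten

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x.mean((2, 3)) if self.flatten else x.mean((2, 3), keepdim=True)
```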
Ross Wightman
9c297ec67d
Cleanup Apex vs native AMP scaler state save/load. Cleanup CheckpointSaver a bit.
4 years ago
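The cleanup keeps the loss-scaler state in the checkpoint alongside model and optimizer state, so a resumed run continues with the same loss scale. A sketch with assumed key names:

```python
import torch

def save_checkpoint(path, model, optimizer, scaler=None, epoch=0):
    state = {
        'epoch': epoch,
        'state_dict': model.state_dict(),
        'optimizer': optimizer.state_dict(),
    }
    if scaler is not None:  # torch.cuda.amp.GradScaler (or Apex equivalent)
        state['amp_scaler'] = scaler.state_dict()
    torch.save(state, path)

def resume_checkpoint(path, model, optimizer, scaler=None):
    state = torch.load(path, map_location='cpu')
    model.load_state_dict(state['state_dict'])
    optimizer.load_state_dict(state['optimizer'])
    if scaler is not None and 'amp_scaler' in state:
        scaler.load_state_dict(state['amp_scaler'])
    return state.get('epoch', 0)
```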
Ross Wightman
c2cd1a332e
Improve torch amp support and add channels_last support for train/validate scripts
4 years ago
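A sketch of what the combination looks like in a train step, using the torch 1.6-era native AMP API (names illustrative):

```python
import torch

# one-time setup:
# model = model.to(device='cuda', memory_format=torch.channels_last)
# scaler = torch.cuda.amp.GradScaler()

def train_step_amp(model, loss_fn, optimizer, scaler, x, y):
    # NHWC layout tends to be faster for conv nets under mixed precision
    x = x.contiguous(memory_format=torch.channels_last)
    optimizer.zero_grad()
    with torch.cuda.amp.autocast():
        loss = loss_fn(model(x), y)
    scaler.scale(loss).backward()
    scaler.step(optimizer)   # unscales grads, skips the step on inf/nan
    scaler.update()
    return loss.detach()
```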
datamining99
5f563ca4df
Fix save_checkpoint bug with native amp
4 years ago
datamining99
d98967ed5d
Add support for native torch AMP in torch 1.6
4 years ago
Ross Wightman
8c9814e3f5
Final cleanup of mixup/cutmix. Element/batch modes working with both collate (prefetcher active) and without prefetcher.
4 years ago
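The cutmix half of that overhaul centers on sampling a box whose area fraction matches 1 - lambda; the standard recipe, with my own variable names:

```python
import math
import numpy as np

def rand_bbox(img_h, img_w, lam):
    """Sample a CutMix box covering roughly (1 - lam) of the image,
    clipped to the image bounds."""
    cut_ratio = math.sqrt(1.0 - lam)
    cut_h, cut_w = int(img_h * cut_ratio), int(img_w * cut_ratio)
    cy, cx = np.random.randint(img_h), np.random.randint(img_w)
    y1, y2 = np.clip(cy - cut_h // 2, 0, img_h), np.clip(cy + cut_h // 2, 0, img_h)
    x1, x2 = np.clip(cx - cut_w // 2, 0, img_w), np.clip(cx + cut_w // 2, 0, img_w)
    return y1, y2, x1, x2

# usage: paste the flipped batch into the box, then correct lam to the
# realized (clipped) box area:
#   x[:, :, y1:y2, x1:x2] = x.flip(0)[:, :, y1:y2, x1:x2]
#   lam = 1.0 - (y2 - y1) * (x2 - x1) / (img_h * img_w)
```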
Ross Wightman
f471c17c9d
More cutmix/mixup overhaul, ready to kick-off some trials.
4 years ago
Ross Wightman
92f2d0d65d
Merge branch 'master' into cutmix. Fixup a few issues.
4 years ago
Ross Wightman
fa28067704
Add more augmentation arguments, including a no_aug disable flag. Fix #209
4 years ago
Ross Wightman
7995295968
Merge branch 'logger' into features. Change 'logger' to '_logger'.
4 years ago
Ross Wightman
1998bd3180
Merge branch 'feature/AB/logger' of https://github.com/antoinebrl/pytorch-image-models into logger
4 years ago
Ross Wightman
6c17d57a2c
Fix some attributions, add copyrights to some file docstrings
4 years ago
Antoine Broyelle
78fa0772cc
Leverage Python's hierarchical logger
...
With this update one can tune the kinds of logs generated by timm, while training and inference traces are unchanged
4 years ago
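The hierarchical-logger change means every module logs under a name rooted at 'timm', so applications can tune the library's verbosity without touching their own logging setup. A sketch of the pattern:

```python
import logging

# module-level logger named after the import path, e.g. 'timm.data.loader';
# the later 'logger' -> '_logger' rename keeps it private by convention
_logger = logging.getLogger(__name__)

def setup_default_logging(level=logging.INFO):
    logging.basicConfig(format='%(asctime)s %(name)s: %(message)s', level=level)

# a downstream user can then quiet the whole library selectively:
# logging.getLogger('timm').setLevel(logging.WARNING)
```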
Ross Wightman
6441e9cc1b
Fix memory_efficient mode for DenseNets. Add AntiAliasing (Blur) support for DenseNets and create one test model. Add lr cycle/mul params to train args.
5 years ago
AFLALO, Jonathan Isaac
a7f570c9b7
Added MultiEpochsDataLoader
5 years ago
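MultiEpochsDataLoader keeps DataLoader worker processes alive across epochs instead of tearing them down at each epoch boundary. A sketch of the well-known recipe (close to what such a loader typically looks like, not guaranteed verbatim):

```python
import torch

class _RepeatSampler:
    """Wrap a batch sampler so iteration never terminates."""
    def __init__(self, sampler):
        self.sampler = sampler

    def __iter__(self):
        while True:
            yield from iter(self.sampler)

class MultiEpochsDataLoader(torch.utils.data.DataLoader):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self._DataLoader__initialized = False   # allow swapping the batch_sampler
        self.batch_sampler = _RepeatSampler(self.batch_sampler)
        self._DataLoader__initialized = True
        self.iterator = super().__iter__()      # one long-lived worker pool

    def __len__(self):
        return len(self.batch_sampler.sampler)  # batches per epoch

    def __iter__(self):
        for _ in range(len(self)):
            yield next(self.iterator)
```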
Ross Wightman
13cf68850b
Remove poorly named metrics inherited from the torch imagenet example. Use top1/top5 in csv output for consistency with existing validation results files, 'acc' elsewhere. Fixes #111
5 years ago
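The top1/top5 numbers come from the usual top-k accuracy helper; a sketch of the standard formulation:

```python
import torch

def accuracy(output: torch.Tensor, target: torch.Tensor, topk=(1, 5)):
    """Top-k accuracy in percent for each requested k."""
    maxk = max(topk)
    batch_size = target.size(0)
    _, pred = output.topk(maxk, dim=1, largest=True, sorted=True)
    pred = pred.t()                                   # (maxk, batch)
    correct = pred.eq(target.reshape(1, -1).expand_as(pred))
    return [correct[:k].reshape(-1).float().sum(0).mul_(100.0 / batch_size)
            for k in topk]
```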
Ross Wightman
27b3680d49
Revamp LR noise, move logic to scheduler base. Fixup PlateauLRScheduler and add it as an option.
5 years ago
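LR noise perturbs the scheduled value inside a chosen epoch window, seeded so every process applies the same perturbation. A simplified sketch (uniform noise; parameter names are illustrative):

```python
import random

def add_lr_noise(lr, t, noise_range=(10, 90), noise_pct=0.67, seed=42):
    """Perturb a scheduled lr at epoch t, only inside [noise_range)."""
    if noise_range[0] <= t < noise_range[1]:
        g = random.Random(seed + t)                   # deterministic per epoch
        noise = 2.0 * (g.random() - 0.5) * noise_pct  # uniform in [-pct, +pct)
        return lr * (1.0 + noise)
    return lr
```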
Ross Wightman
514b0938c4
Experimenting with per-epoch learning rate noise w/ step scheduler
5 years ago
Ross Wightman
43225d110c
Unify 'drop connect' and 'drop path' under the 'drop path' name, switch all EfficientNet/MobilenetV3 refs to 'drop_path'. Update factory to handle new drop args.
5 years ago
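'Drop path' (stochastic depth) zeroes whole residual branches per sample during training; the standard formulation the unified name refers to:

```python
import torch
import torch.nn as nn

def drop_path(x: torch.Tensor, drop_prob: float = 0., training: bool = False):
    if drop_prob == 0. or not training:
        return x
    keep_prob = 1.0 - drop_prob
    # one Bernoulli mask value per sample, broadcast over all remaining dims
    shape = (x.shape[0],) + (1,) * (x.ndim - 1)
    mask = x.new_empty(shape).bernoulli_(keep_prob)
    return x * mask / keep_prob  # rescale to keep the expected value unchanged

class DropPath(nn.Module):
    def __init__(self, drop_prob: float = 0.):
        super().__init__()
        self.drop_prob = drop_prob

    def forward(self, x):
        return drop_path(x, self.drop_prob, self.training)
```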
Ross Wightman
b3cb5f3275
Working on CutMix impl as per #8, integrating with Mixup, currently experimenting...
5 years ago
Andrew Lavin
b72013def8
Added command-line argument validation-batch-size-multiplier with default set to 1.
5 years ago
Ross Wightman
5b7cc16ac9
Add warning about using sync-bn with zero initialized BN layers. Fixes #54
5 years ago
Ross Wightman
d9a6a9d0af
Merge pull request #74 from rwightman/augmix-jsd
...
AugMix, JSD loss, SplitBatchNorm (Auxiliary BN), and more
5 years ago
Ross Wightman
3eb4a96eda
Update AugMix, JSD, etc comments and references
5 years ago
Ross Wightman
7547119891
Add SplitBatchNorm. AugMix, Rand/AutoAugment, Split (Aux) BatchNorm, Jensen-Shannon Divergence, RandomErasing all working together
5 years ago
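Split (auxiliary) BatchNorm keeps separate normalization statistics for the clean and augmented portions of each batch, while eval uses only the main stats. A compact sketch of the idea (assumes the batch size divides evenly into the splits):

```python
import torch
import torch.nn as nn

class SplitBatchNorm2d(nn.BatchNorm2d):
    def __init__(self, num_features, num_splits=2, **kwargs):
        super().__init__(num_features, **kwargs)
        assert num_splits > 1
        self.num_splits = num_splits
        self.aux_bn = nn.ModuleList(
            nn.BatchNorm2d(num_features, **kwargs) for _ in range(num_splits - 1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if self.training:
            splits = torch.chunk(x, self.num_splits, dim=0)
            out = [super().forward(splits[0])]            # main stats: clean split
            out += [bn(s) for bn, s in zip(self.aux_bn, splits[1:])]
            return torch.cat(out, dim=0)
        return super().forward(x)                         # eval: main stats only
```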
Ross Wightman
40fea63ebe
Add checkpoint averaging script. Add headers, shebangs, exec perms to all scripts
5 years ago
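The averaging script's core is a running sum of state dicts divided by the checkpoint count; a sketch (the 'state_dict' key is an assumption):

```python
import torch

def average_checkpoints(paths, out_path):
    avg = None
    for p in paths:
        sd = torch.load(p, map_location='cpu')['state_dict']
        if avg is None:
            avg = {k: v.clone().float() for k, v in sd.items()}
        else:
            for k, v in sd.items():
                avg[k] += v.float()
    # note: integer buffers (e.g. num_batches_tracked) would need special
    # handling in a production version; this sketch just averages everything
    avg = {k: v / len(paths) for k, v in avg.items()}
    torch.save({'state_dict': avg}, out_path)
```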
Ross Wightman
4666cc9aed
Add --pin-mem arg to enable dataloader pin_memory (showing more benefit in some scenarios now), also add --torchscript arg to validate.py for testing models with jit.script
5 years ago
Ross Wightman
232ab7fb12
Working on an implementation of AugMix with JensenShannonDivergence loss that's compatible with my AutoAugment and RandAugment impl
5 years ago
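The AugMix-style objective is cross-entropy on the clean split plus a Jensen-Shannon consistency term across the clean and augmented splits; a sketch following the AugMix paper (the split convention and alpha weight are assumptions):

```python
import torch
import torch.nn.functional as F

def jsd_cross_entropy(logits, target, num_splits=3, alpha=12.0):
    split = logits.shape[0] // num_splits
    loss = F.cross_entropy(logits[:split], target[:split])  # clean split only
    probs = [F.softmax(l, dim=1) for l in torch.chunk(logits, num_splits, dim=0)]
    # log of the mixture distribution, clamped for numerical safety
    log_mix = torch.clamp(torch.stack(probs).mean(0), 1e-7, 1.0).log()
    jsd = sum(F.kl_div(log_mix, p, reduction='batchmean') for p in probs) / num_splits
    return loss + alpha * jsd
```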
Ross Wightman
5719b493ad
Missed updating dist-bn logic for the EMA model
5 years ago
Ross Wightman
a435ea1327
Change reduce_bn to distribute_bn, add ability to choose between broadcast and reduce (mean). Add crop_pct arg to allow selecting validation crop while training.
5 years ago
Ross Wightman
3bff2b21dc
Add support for keeping running bn stats the same across distributed training nodes before eval/save
5 years ago
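Keeping BN running stats consistent across nodes is a loop over the BN buffers, either averaging them or broadcasting rank 0's copy; a sketch of the broadcast-vs-reduce choice, not the exact timm helper:

```python
import torch
import torch.distributed as dist

def distribute_bn(model, world_size, reduce=False):
    for name, buf in model.named_buffers(recurse=True):
        if 'running_mean' in name or 'running_var' in name:
            if reduce:
                dist.all_reduce(buf, op=dist.ReduceOp.SUM)  # mean across nodes
                buf /= float(world_size)
            else:
                dist.broadcast(buf, 0)                      # rank 0 wins
```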
Ross Wightman
1f39d15f15
Allow a float decay-epochs arg for training; works out with the step LR math
5 years ago
Ross Wightman
7b83e67f77
Pass drop connect arg through to EfficientNet models
5 years ago
Ross Wightman
4748c6dff2
Fix non-prefetch variant of Mixup. Fixes #50
5 years ago
Ross Wightman
187ecbafbe
Add support for loading args from yaml file (and saving them with each experiment)
5 years ago
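The usual pattern for yaml-backed args is a two-stage parse: a small parser pulls --config, the yaml contents become defaults, and explicit command-line flags still win. A sketch (flag names assumed):

```python
import argparse
import yaml

config_parser = argparse.ArgumentParser(add_help=False)
config_parser.add_argument('-c', '--config', default='', metavar='FILE')

parser = argparse.ArgumentParser(description='Training')
parser.add_argument('--lr', type=float, default=0.1)
# ... remaining train args ...

def parse_args():
    args_config, remaining = config_parser.parse_known_args()
    if args_config.config:
        with open(args_config.config) as f:
            parser.set_defaults(**yaml.safe_load(f))
    args = parser.parse_args(remaining)
    # re-serialize so the resolved config can be saved with the experiment
    args_text = yaml.safe_dump(vars(args), default_flow_style=False)
    return args, args_text
```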
Ross Wightman
b750b76f67
More AutoAugment work. Ready to roll...
5 years ago
Ross Wightman
3d9c8a6489
Add support for new AMP checkpointing w/ amp.state_dict
5 years ago
Ross Wightman
fac58f609a
Add RAdam, NovoGrad, Lookahead, and AdamW optimizers, a few ResNet tweaks and a scheduler factory tweak.
...
* Add some of the trendy new optimizers. Decent results but not clearly better than the standards.
* Can create a None scheduler for constant LR
* ResNet defaults to zero_init of last BN in residual
* add resnet50d config
5 years ago
Ross Wightman
66634d2200
Add support to split random erasing into a randomly selected number of blocks with the --recount arg. Fix random selection of aspect ratios.
5 years ago
Ross Wightman
e7c8a37334
Make min-lr and cooldown-epochs command-line args; change dash in color_jitter arg for consistency
5 years ago
Ross Wightman
c6b32cbe73
A number of tweaks to arguments, epoch handling, config
...
* reorganize train args
* allow resolve_data_config to be used with dict args, not just argparse
* stop incrementing epoch before save, more consistent naming vs csv, etc
* update resume and start epoch handling to match above
* stop auto-incrementing epoch in scheduler
5 years ago
Ross Wightman
b20bb58284
Distributed tweaks
...
* Support PyTorch native DDP as a fallback if APEX is not present
* Support SyncBN for both APEX and torch native (if torch >= 1.1)
* EMA model does not appear to need a DDP wrapper; it has no gradients and is updated from the wrapped model
5 years ago
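A sketch of that fallback logic: try APEX first, otherwise use torch-native DDP and SyncBN (the args fields here are illustrative):

```python
import torch

def setup_distributed(model, args):
    try:
        from apex.parallel import DistributedDataParallel as ApexDDP
        from apex.parallel import convert_syncbn_model
        has_apex = True
    except ImportError:
        has_apex = False

    if args.sync_bn:
        model = (convert_syncbn_model(model) if has_apex else
                 torch.nn.SyncBatchNorm.convert_sync_batchnorm(model))
    model = model.cuda()
    if has_apex:
        model = ApexDDP(model, delay_allreduce=True)
    else:
        model = torch.nn.parallel.DistributedDataParallel(
            model, device_ids=[args.local_rank])
    # note: the EMA copy is left unwrapped, per the commit body above
    return model
```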