Commit Graph

77 Commits (fd962c4b4a5214650a8678a2a987d1853933e1c0)

Author SHA1 Message Date
Ross Wightman 27bbc70d71 Add back old ModelEma and rename new one to ModelEmaV2 to avoid compat breaks in dependent code. Shuffle train script, add a few comments, remove DataParallel support, support experimental torchscript training.
4 years ago
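For context on what these EMA commits maintain: an exponential moving average of the model's weights, tracked beside the live model and used for evaluation and checkpointing. Below is a minimal sketch of the idea in plain PyTorch; the class name and the 0.9999 decay are illustrative, not timm's exact ModelEmaV2 API.

    from copy import deepcopy

    import torch
    import torch.nn as nn


    class ModelEmaSketch:
        """Track an exponential moving average of a model's weights and buffers."""

        def __init__(self, model: nn.Module, decay: float = 0.9999):
            self.module = deepcopy(model).eval()   # EMA copy; never trained directly
            self.decay = decay
            for p in self.module.parameters():
                p.requires_grad_(False)

        @torch.no_grad()
        def update(self, model: nn.Module):
            ema_state = self.module.state_dict()
            for name, value in model.state_dict().items():
                ema = ema_state[name]
                if ema.dtype.is_floating_point:
                    # new_ema = decay * ema + (1 - decay) * current
                    ema.mul_(self.decay).add_(value, alpha=1.0 - self.decay)
                else:
                    ema.copy_(value)   # integer buffers, e.g. num_batches_tracked

Typical usage is to call update(model) after every optimizer step and to validate and save the EMA copy rather than the live weights.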
Ross Wightman 9214ca0716 Simplifying EMA...
4 years ago
Ross Wightman 80078c47bb Add Adafactor and Adahessian optimizers, cleanup optimizer arg passing, add gradient clipping support.
4 years ago
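The gradient clipping added here is conventionally a global-norm clip applied between backward and step. A sketch using PyTorch's standard clip_grad_norm_; the helper name and the max_norm value are illustrative.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F


    def train_step(model: nn.Module, optimizer: torch.optim.Optimizer,
                   inputs: torch.Tensor, targets: torch.Tensor,
                   clip_grad: float = 1.0) -> float:
        """One optimization step with global-norm gradient clipping."""
        optimizer.zero_grad()
        loss = F.cross_entropy(model(inputs), targets)
        loss.backward()
        # Rescale all gradients so their combined L2 norm is at most clip_grad.
        torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=clip_grad)
        optimizer.step()
        return loss.item()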
Ross Wightman 47a7b3b5b1 More flexible mixup mode, add 'half' mode.
4 years ago
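Mixup blends pairs of samples and their (one-hot or soft) labels with a Beta-distributed weight; the 'half' mode in this commit is one variant of how that mixing is applied across the batch. A batch-mode sketch of the core idea, with illustrative names and alpha.

    import torch


    def mixup_batch(x: torch.Tensor, y_onehot: torch.Tensor, alpha: float = 0.2):
        """Batch-mode mixup: blend the batch with its flipped copy."""
        lam = float(torch.distributions.Beta(alpha, alpha).sample())
        x_mixed = x * lam + x.flip(0) * (1.0 - lam)
        y_mixed = y_onehot * lam + y_onehot.flip(0) * (1.0 - lam)
        return x_mixed, y_mixed

Element mode draws a separate lam per sample instead of one per batch; 'half' mixes only part of the batch.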
Ross Wightman 532e3b417d Reorg of utils into separate modules
4 years ago
Ross Wightman 751b0bba98 Change global_pool (--gp) arg handling to allow easily passing 'fast' for train/validate, avoiding a channels_last issue with AdaptiveAvgPool
4 years ago
Ross Wightman 9c297ec67d Cleanup Apex vs native AMP scaler state save/load. Cleanup CheckpointSaver a bit.
4 years ago
Ross Wightman c2cd1a332e Improve torch amp support and add channels_last support for train/validate scripts
4 years ago
datamining99 5f563ca4df Fix save_checkpoint bug with native amp
4 years ago
datamining99 d98967ed5d Add support for native torch AMP in torch 1.6
4 years ago
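Native AMP in torch 1.6 pairs torch.cuda.amp.autocast with a GradScaler, and the scaler itself carries state that the save_checkpoint fix above has to persist via scaler.state_dict(). A minimal sketch of the pattern these commits wire into the train script; the function name is illustrative.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    scaler = torch.cuda.amp.GradScaler()


    def amp_step(model: nn.Module, optimizer: torch.optim.Optimizer,
                 inputs: torch.Tensor, targets: torch.Tensor) -> None:
        optimizer.zero_grad()
        with torch.cuda.amp.autocast():        # forward runs in mixed precision
            loss = F.cross_entropy(model(inputs), targets)
        scaler.scale(loss).backward()          # scale loss to avoid fp16 underflow
        scaler.step(optimizer)                 # unscales grads, skips step on inf/nan
        scaler.update()                        # adapt the scale factor
        # scaler.state_dict() / load_state_dict() belong in checkpoints too.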
Ross Wightman 8c9814e3f5 Final cleanup of mixup/cutmix. Element/batch modes working with both collate (prefetcher active) and without prefetcher.
4 years ago
Ross Wightman f471c17c9d More cutmix/mixup overhaul, ready to kick-off some trials.
4 years ago
Ross Wightman 92f2d0d65d Merge branch 'master' into cutmix. Fixup a few issues.
4 years ago
Ross Wightman fa28067704 Add more augmentation arguments, including a no_aug disable flag. Fix #209
4 years ago
Ross Wightman 7995295968 Merge branch 'logger' into features. Change 'logger' to '_logger'.
4 years ago
Ross Wightman 1998bd3180 Merge branch 'feature/AB/logger' of https://github.com/antoinebrl/pytorch-image-models into logger
4 years ago
Ross Wightman 6c17d57a2c Fix some attributions, add copyrights to some file docstrings
4 years ago
Antoine Broyelle 78fa0772cc Leverage Python's hierarchical logger
4 years ago
Ross Wightman 6441e9cc1b Fix memory_efficient mode for DenseNets. Add AntiAliasing (Blur) support for DenseNets and create one test model. Add lr cycle/mul params to train args.
5 years ago
AFLALO, Jonathan Isaac a7f570c9b7 Added MultiEpochsDataLoader
5 years ago
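The contributed MultiEpochsDataLoader avoids tearing down and respawning worker processes at every epoch boundary: the batch sampler is wrapped so it repeats forever, and one long-lived iterator is reused across epochs. A sketch of the trick; details may differ from the committed version.

    import torch


    class _RepeatSampler:
        """Wrap a batch sampler so it repeats forever; workers never see the end."""

        def __init__(self, sampler):
            self.sampler = sampler

        def __iter__(self):
            while True:
                yield from iter(self.sampler)


    class MultiEpochsDataLoader(torch.utils.data.DataLoader):
        """DataLoader that keeps its worker processes alive across epochs."""

        def __init__(self, *args, **kwargs):
            super().__init__(*args, **kwargs)
            self._DataLoader__initialized = False    # allow mutating batch_sampler
            self.batch_sampler = _RepeatSampler(self.batch_sampler)
            self._DataLoader__initialized = True
            self.iterator = super().__iter__()       # workers spin up once, here

        def __len__(self):
            return len(self.batch_sampler.sampler)   # batches per epoch

        def __iter__(self):
            for _ in range(len(self)):
                yield next(self.iterator)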
Ross Wightman 13cf68850b Remove poorly named metrics from torch imagenet example origins. Use top1/top5 in csv output for consistency with existing validation results files, acc elsewhere. Fixes #111
5 years ago
Ross Wightman 27b3680d49 Revamp LR noise, move logic to scheduler base. Fixup PlateauLRScheduler and add it as an option.
5 years ago
Ross Wightman 514b0938c4 Experimenting with per-epoch learning rate noise w/ step scheduler
5 years ago
Ross Wightman 43225d110c Unify drop connect vs drop path under 'drop path' name, switch all EfficientNet/MobilenetV3 refs to 'drop_path'. Update factory to handle new drop args.
5 years ago
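'Drop path' (stochastic depth) zeroes a residual branch for randomly chosen samples during training, rescaling the survivors so expected activations are unchanged. A sketch of the standard formulation this rename converges on.

    import torch


    def drop_path(x: torch.Tensor, drop_prob: float = 0.0,
                  training: bool = False) -> torch.Tensor:
        """Stochastic depth: drop whole samples' residual-branch output."""
        if drop_prob == 0.0 or not training:
            return x
        keep_prob = 1.0 - drop_prob
        # One Bernoulli draw per sample, broadcast over all remaining dims.
        shape = (x.shape[0],) + (1,) * (x.ndim - 1)
        mask = x.new_empty(shape).bernoulli_(keep_prob)
        return x * mask / keep_prob   # rescale so the expectation is unchanged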
Ross Wightman b3cb5f3275 Working on CutMix impl as per #8, integrating with Mixup, currently experimenting...
5 years ago
Andrew Lavin b72013def8 Added command-line argument validation-batch-size-multiplier with default set to 1.
5 years ago
Ross Wightman 5b7cc16ac9 Add warning about using sync-bn with zero initialized BN layers. Fixes #54
5 years ago
Ross Wightman d9a6a9d0af Merge pull request #74 from rwightman/augmix-jsd
5 years ago
Ross Wightman 3eb4a96eda Update AugMix, JSD, etc comments and references
5 years ago
Ross Wightman 7547119891 Add SplitBatchNorm. AugMix, Rand/AutoAugment, Split (Aux) BatchNorm, Jensen-Shannon Divergence, RandomErasing all working together
5 years ago
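The Jensen-Shannon divergence term in these commits follows the AugMix recipe: softmax the logits of one clean and two augmented views, form their mixture M, and average the KL divergences to M. A sketch of that loss term, assuming logits for three views of the same batch.

    import torch
    import torch.nn.functional as F


    def jsd_loss(logits_clean: torch.Tensor, logits_aug1: torch.Tensor,
                 logits_aug2: torch.Tensor) -> torch.Tensor:
        """Jensen-Shannon divergence across one clean and two augmented views."""
        p_clean, p_aug1, p_aug2 = (F.softmax(l, dim=1) for l in
                                   (logits_clean, logits_aug1, logits_aug2))
        # Log of the mixture distribution M, clamped for numerical stability.
        log_m = torch.clamp((p_clean + p_aug1 + p_aug2) / 3.0, 1e-7, 1.0).log()
        return (F.kl_div(log_m, p_clean, reduction='batchmean') +
                F.kl_div(log_m, p_aug1, reduction='batchmean') +
                F.kl_div(log_m, p_aug2, reduction='batchmean')) / 3.0

In the AugMix setup this term is added, with a weighting factor, to the cross-entropy on the clean view.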
Ross Wightman 40fea63ebe Add checkpoint averaging script. Add headers, shebangs, exec perms to all scripts
5 years ago
Ross Wightman 4666cc9aed Add --pin-mem arg to enable dataloader pin_memory (showing more benefit in some scenarios now), also add --torchscript arg to validate.py for testing models with jit.script
5 years ago
Ross Wightman 232ab7fb12 Working on an implementation of AugMix with JensenShannonDivergence loss that's compatible with my AutoAugment and RandAugment impl
5 years ago
Ross Wightman 5719b493ad Fix missed dist-bn logic update for EMA model
5 years ago
Ross Wightman a435ea1327 Change reduce_bn to distribute_bn, add ability to choose between broadcast and reduce (mean). Add crop_pct arg to allow selecting validation crop while training.
5 years ago
Ross Wightman 3bff2b21dc Add support for keeping running bn stats the same across distributed training nodes before eval/save
5 years ago
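The two preceding commits keep BatchNorm running stats consistent across distributed ranks before eval/save: either every rank adopts rank 0's stats (broadcast) or the stats are averaged across ranks (reduce). A sketch of the mechanism, assuming torch.distributed is already initialized; the function name is illustrative.

    import torch
    import torch.distributed as dist


    def distribute_bn_stats(model: torch.nn.Module, world_size: int,
                            reduce: bool = False) -> None:
        """Sync BN running stats: broadcast from rank 0, or average all ranks."""
        for name, buf in model.named_buffers(recurse=True):
            if 'running_mean' in name or 'running_var' in name:
                if reduce:
                    dist.all_reduce(buf, op=dist.ReduceOp.SUM)  # mean over ranks
                    buf /= float(world_size)
                else:
                    dist.broadcast(buf, 0)   # every rank adopts rank 0's stats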
Ross Wightman 1f39d15f15 Allow float decay epochs arg for training, works out with step lr math
5 years ago
Ross Wightman 7b83e67f77 Pass drop connect arg through to EfficientNet models
5 years ago
Ross Wightman 4748c6dff2 Fix non-prefetch variant of Mixup. Fixes #50
5 years ago
Ross Wightman 187ecbafbe Add support for loading args from yaml file (and saving them with each experiment)
5 years ago
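The pattern in this YAML-args commit is to parse a config-file flag first, feed the file's contents in as argparse defaults so explicit command-line flags still win, then dump the resolved arguments back to YAML alongside the experiment. A sketch with a couple of illustrative flags, assuming PyYAML.

    import argparse

    import yaml


    def parse_args_with_yaml(argv=None):
        """Load defaults from a YAML file, then let CLI flags override them."""
        parser = argparse.ArgumentParser()
        parser.add_argument('--config', default='', help='path to YAML config')
        parser.add_argument('--lr', type=float, default=0.1)
        parser.add_argument('--epochs', type=int, default=200)

        args, _ = parser.parse_known_args(argv)
        if args.config:
            with open(args.config) as f:
                parser.set_defaults(**yaml.safe_load(f))
        args = parser.parse_args(argv)  # re-parse so YAML values act as defaults
        # The resolved args can be saved with the experiment as YAML text.
        return args, yaml.safe_dump(vars(args), default_flow_style=False)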
Ross Wightman b750b76f67 More AutoAugment work. Ready to roll...
5 years ago
Ross Wightman 3d9c8a6489 Add support for new AMP checkpointing support w/ amp.state_dict
5 years ago
Ross Wightman fac58f609a Add RAdam, NovoGrad, Lookahead, and AdamW optimizers, a few ResNet tweaks and scheduler factory tweak.
5 years ago
Ross Wightman 66634d2200 Add support to split random erasing blocks into randomly selected number with --recount arg. Fix random selection of aspect ratios.
5 years ago
Ross Wightman e7c8a37334 Make min-lr and cooldown-epochs cmdline args, change dash in color_jitter arg for consistency
5 years ago
Ross Wightman c6b32cbe73 A number of tweaks to arguments, epoch handling, config
5 years ago
Ross Wightman b20bb58284 Distributed tweaks
5 years ago
Ross Wightman 6fc886acaf Remove all prints, change most to logging calls, tweak alignment of batch logs, improve setup.py
5 years ago
Ross Wightman aa4354f466 Big re-org, working towards packaging the module as 'timm' for pip
5 years ago
Ross Wightman 7dab6d1ec7 Default to img_size in model default_cfg, defer output folder creation until later in the init sequence
5 years ago