Commit Graph

63 Commits (e62758cf4f935e849aebfc6ea0f2db1bcd54b63d)

Author SHA1 Message Date
Ross Wightman fa28067704 Add more augmentation arguments, including a no_aug disable flag. Fix #209
4 years ago
Ross Wightman 7995295968 Merge branch 'logger' into features. Change 'logger' to '_logger'.
4 years ago
Ross Wightman 1998bd3180 Merge branch 'feature/AB/logger' of https://github.com/antoinebrl/pytorch-image-models into logger
4 years ago
Ross Wightman 6c17d57a2c Fix some attributions, add copyrights to some file docstrings
4 years ago
Antoine Broyelle 78fa0772cc Leverage Python hierarchical logger
4 years ago
Ross Wightman 6441e9cc1b Fix memory_efficient mode for DenseNets. Add AntiAliasing (Blur) support for DenseNets and create one test model. Add lr cycle/mul params to train args.
5 years ago
AFLALO, Jonathan Isaac a7f570c9b7 added MultiEpochsDataLoader
5 years ago
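The MultiEpochsDataLoader avoids re-spawning DataLoader worker processes at every epoch by wrapping the batch sampler in an infinite repeater and reusing a single iterator. A minimal sketch of the idea (simplified from the version added here):

```python
import torch

class _RepeatSampler:
    """Wraps a batch sampler so it yields forever, keeping workers alive."""
    def __init__(self, sampler):
        self.sampler = sampler

    def __iter__(self):
        while True:
            yield from iter(self.sampler)

class MultiEpochsDataLoader(torch.utils.data.DataLoader):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self._DataLoader__initialized = False  # temporarily allow mutation
        self.batch_sampler = _RepeatSampler(self.batch_sampler)
        self._DataLoader__initialized = True
        self.iterator = super().__iter__()  # created once, reused every epoch

    def __len__(self):
        return len(self.batch_sampler.sampler)

    def __iter__(self):
        for _ in range(len(self)):
            yield next(self.iterator)
```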
Ross Wightman 13cf68850b Remove poorly named metrics inherited from the torch ImageNet example. Use top1/top5 in CSV output for consistency with existing validation results files, acc elsewhere. Fixes #111
5 years ago
Ross Wightman 27b3680d49 Revamp LR noise, move logic to scheduler base. Fixup PlateauLRScheduler and add it as an option.
5 years ago
Ross Wightman 514b0938c4 Experimenting with per-epoch learning rate noise w/ step scheduler
5 years ago
Ross Wightman 43225d110c Unify drop connect vs drop path under 'drop path' name, switch all EfficientNet/MobilenetV3 refs to 'drop_path'. Update factory to handle new drop args.
5 years ago
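'Drop path' is the usual name for stochastic depth: during training, entire residual branches are zeroed per sample and the survivors rescaled. A minimal sketch of the technique (not timm's exact code):

```python
import torch

def drop_path(x: torch.Tensor, drop_prob: float = 0., training: bool = False) -> torch.Tensor:
    """Stochastic depth: randomly zero whole residual branches per sample."""
    if drop_prob == 0. or not training:
        return x
    keep_prob = 1. - drop_prob
    # one mask value per sample, broadcast over the remaining dims
    shape = (x.shape[0],) + (1,) * (x.ndim - 1)
    mask = x.new_empty(shape).bernoulli_(keep_prob)
    return x * mask / keep_prob  # rescale so the expectation is unchanged
```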
Andrew Lavin b72013def8 Added command-line argument validation-batch-size-multiplier with default set to 1.
5 years ago
Ross Wightman 5b7cc16ac9 Add warning about using sync-bn with zero initialized BN layers. Fixes #54
5 years ago
Ross Wightman d9a6a9d0af Merge pull request #74 from rwightman/augmix-jsd
5 years ago
Ross Wightman 3eb4a96eda Update AugMix, JSD, etc comments and references
5 years ago
Ross Wightman 7547119891 Add SplitBatchNorm. AugMix, Rand/AutoAugment, Split (Aux) BatchNorm, Jensen-Shannon Divergence, RandomErasing all working together
5 years ago
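The Jensen-Shannon Divergence term follows the AugMix recipe: cross-entropy on the clean split of the batch plus a consistency penalty tying the clean and augmented splits together. A sketch of that loss (function and argument names are illustrative, not timm's exact API):

```python
import torch
import torch.nn.functional as F

def jsd_cross_entropy(logits_clean, logits_aug1, logits_aug2, target, alpha=12.):
    """Cross-entropy on the clean split plus JS consistency across all splits."""
    loss = F.cross_entropy(logits_clean, target)
    p_clean = F.softmax(logits_clean, dim=1)
    p_aug1 = F.softmax(logits_aug1, dim=1)
    p_aug2 = F.softmax(logits_aug2, dim=1)
    # log of the mixture distribution, clamped for numerical safety
    p_mix = ((p_clean + p_aug1 + p_aug2) / 3.).clamp(1e-7, 1.).log()
    loss += alpha * (F.kl_div(p_mix, p_clean, reduction='batchmean') +
                     F.kl_div(p_mix, p_aug1, reduction='batchmean') +
                     F.kl_div(p_mix, p_aug2, reduction='batchmean')) / 3.
    return loss
```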
Ross Wightman 40fea63ebe Add checkpoint averaging script. Add headers, shebangs, exec perms to all scripts
5 years ago
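Checkpoint averaging is just an element-wise mean over the weight tensors of several saved checkpoints. A hypothetical helper (the script's real interface may differ):

```python
import torch

def average_checkpoints(paths):
    """Average floating-point tensors across checkpoint files (illustrative)."""
    avg, n = None, len(paths)
    for path in paths:
        sd = torch.load(path, map_location='cpu')
        sd = sd.get('state_dict', sd)  # unwrap full training checkpoints
        if avg is None:
            avg = {k: v.clone() for k, v in sd.items()}
        else:
            for k, v in sd.items():
                if v.dtype.is_floating_point:
                    avg[k] += v
    for v in avg.values():
        if v.dtype.is_floating_point:
            v /= n  # integer buffers (e.g. num_batches_tracked) keep the first value
    return avg
```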
Ross Wightman 4666cc9aed Add --pin-mem arg to enable dataloader pin_memory (showing more benefit in some scenarios now), also add --torchscript arg to validate.py for testing models with jit.script
5 years ago
Ross Wightman 232ab7fb12 Working on an implementation of AugMix with JensenShannonDivergence loss that's compatible with my AutoAugment and RandAugment impl
5 years ago
Ross Wightman 5719b493ad Add missed dist-bn logic update for the EMA model
5 years ago
Ross Wightman a435ea1327 Change reduce_bn to distribute_bn, add ability to choose between broadcast and reduce (mean). Add crop_pct arg to allow selecting validation crop while training.
5 years ago
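distribute_bn synchronizes BatchNorm running stats across processes before eval/save, either by broadcasting rank 0's buffers or by all-reducing to a mean. A sketch of the idea, assuming an initialized torch.distributed process group:

```python
import torch
import torch.distributed as dist

def distribute_bn(model: torch.nn.Module, world_size: int, reduce: bool = False):
    """Sync BatchNorm running_mean/running_var across distributed processes."""
    for name, buf in model.named_buffers(recurse=True):
        if 'running_mean' in name or 'running_var' in name:
            if reduce:
                # average the stats over all processes
                dist.all_reduce(buf, op=dist.ReduceOp.SUM)
                buf /= float(world_size)
            else:
                # everyone adopts rank 0's stats
                dist.broadcast(buf, 0)
```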
Ross Wightman 3bff2b21dc Add support for keeping running bn stats the same across distributed training nodes before eval/save
5 years ago
Ross Wightman 1f39d15f15 Allow float decay epochs arg for training, works out with step lr math
5 years ago
Ross Wightman 7b83e67f77 Pass drop connect arg through to EfficientNet models
5 years ago
Ross Wightman 4748c6dff2 Fix non-prefetch variant of Mixup. Fixes #50
5 years ago
Ross Wightman 187ecbafbe Add support for loading args from yaml file (and saving them with each experiment)
5 years ago
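YAML config support like this can be built from a two-stage argparse pattern: parse only a --config flag first, inject the file's contents as defaults, then let the remaining command-line flags override them; dumping the resolved namespace back to YAML gives the per-experiment snapshot. A sketch assuming PyYAML, with an illustrative --lr flag standing in for the full argument set:

```python
import argparse
import yaml

# stage 1: grab just the config file path
config_parser = argparse.ArgumentParser(add_help=False)
config_parser.add_argument('-c', '--config', default='', metavar='FILE')

# stage 2: the real parser
parser = argparse.ArgumentParser(description='Training')
parser.add_argument('--lr', type=float, default=0.01)

def parse_args():
    args_config, remaining = config_parser.parse_known_args()
    if args_config.config:
        with open(args_config.config) as f:
            parser.set_defaults(**yaml.safe_load(f))
    args = parser.parse_args(remaining)  # CLI flags still win over the file
    # YAML text of the resolved config, ready to save with the experiment
    args_text = yaml.safe_dump(vars(args), default_flow_style=False)
    return args, args_text
```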
Ross Wightman b750b76f67 More AutoAugment work. Ready to roll...
5 years ago
Ross Wightman 3d9c8a6489 Add support for new AMP checkpointing support w/ amp.state_dict
5 years ago
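This refers to NVIDIA Apex AMP, whose amp.state_dict() exposes the loss-scaler state; saving it next to the model and optimizer lets a resumed run keep its scaling history. A sketch, assuming Apex is installed and a CUDA device is available:

```python
import torch
from apex import amp  # NVIDIA Apex, assumed installed

model = torch.nn.Linear(10, 2).cuda()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
model, optimizer = amp.initialize(model, optimizer, opt_level='O1')

# save: AMP loss-scaler state rides along with model/optimizer state
torch.save({
    'state_dict': model.state_dict(),
    'optimizer': optimizer.state_dict(),
    'amp': amp.state_dict(),
}, 'checkpoint.pth.tar')

# resume: restore AMP state only after amp.initialize has run again
checkpoint = torch.load('checkpoint.pth.tar', map_location='cpu')
model.load_state_dict(checkpoint['state_dict'])
optimizer.load_state_dict(checkpoint['optimizer'])
amp.load_state_dict(checkpoint['amp'])
```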
Ross Wightman fac58f609a Add RAdam, NovoGrad, Lookahead, and AdamW optimizers, a few ResNet tweaks and scheduler factory tweak.
5 years ago
Ross Wightman 66634d2200 Add support to split random erasing blocks into randomly selected number with --recount arg. Fix random selection of aspect ratios.
5 years ago
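Splitting random erasing into multiple blocks divides the target erase area among `count` independently placed rectangles. A simplified sketch (the real implementation also supports solid vs per-pixel fill and retries on bad aspect ratios):

```python
import random
import torch

def random_erase(img: torch.Tensor, count: int = 1, erase_frac: float = 0.1):
    """Erase `count` random rectangles, splitting the erase area between them."""
    _, h, w = img.shape
    for _ in range(count):
        area = erase_frac * h * w / count  # each block gets an equal share
        aspect = random.uniform(0.3, 3.3)
        eh = int(round((area * aspect) ** 0.5))
        ew = int(round((area / aspect) ** 0.5))
        if 0 < eh < h and 0 < ew < w:
            top = random.randint(0, h - eh)
            left = random.randint(0, w - ew)
            # per-pixel noise fill, as in the 'pixel' erasing mode
            img[:, top:top + eh, left:left + ew] = torch.randn(img.shape[0], eh, ew)
    return img
```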
Ross Wightman e7c8a37334 Make min-lr and cooldown-epochs cmdline args, change dash in color_jitter arg for consistency
5 years ago
Ross Wightman c6b32cbe73 A number of tweaks to arguments, epoch handling, config
5 years ago
Ross Wightman b20bb58284 Distributed tweaks
5 years ago
Ross Wightman 6fc886acaf Remove all prints, change most to logging calls, tweak alignment of batch logs, improve setup.py
5 years ago
Ross Wightman aa4354f466 Big re-org, working towards packaging the module for pip as 'timm'
5 years ago
Ross Wightman 7dab6d1ec7 Default to img_size in model default_cfg, defer output folder creation until later in the init sequence
5 years ago
Ross Wightman 9bcd65181b Add exponential moving average for model weights + few other additions and cleanup
5 years ago
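The weight EMA keeps a shadow copy of the network whose parameters decay toward the live weights after each optimizer step; the shadow copy is what gets evaluated and checkpointed. A minimal sketch of the idea:

```python
import copy
import torch

class ModelEma:
    """Maintain an exponential moving average of a model's weights."""
    def __init__(self, model: torch.nn.Module, decay: float = 0.9999):
        self.ema = copy.deepcopy(model).eval()  # shadow copy, never trained
        self.decay = decay
        for p in self.ema.parameters():
            p.requires_grad_(False)

    @torch.no_grad()
    def update(self, model: torch.nn.Module):
        ema_sd = self.ema.state_dict()
        for k, v in model.state_dict().items():
            e = ema_sd[k]
            if e.dtype.is_floating_point:
                e.mul_(self.decay).add_(v, alpha=1. - self.decay)
            else:
                e.copy_(v)  # integer buffers are copied verbatim
```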
Ross Wightman e6c14427c0 More appropriate/correct loss name
6 years ago
Zhun Zhong 127487369f Fix bug for prefetcher
6 years ago
Ross Wightman 4d2056722a Mixup and prefetcher improvements
6 years ago
Ross Wightman 780c0a96a4 Change args for RandomErasing so only one required for pixel/color mode
6 years ago
Ross Wightman 76539d905e Some transform/data/loader refactoring, hopefully didn't break things
6 years ago
Ross Wightman fee607edf6 Mixup implementation in progress
6 years ago
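Mixup blends each sample with a randomly chosen partner from the same batch and interpolates the loss between both targets. A minimal sketch of the batch-level variant:

```python
import numpy as np
import torch

def mixup_batch(x: torch.Tensor, target: torch.Tensor, alpha: float = 0.2):
    """Blend samples with shuffled partners; return mixed inputs and both targets."""
    lam = float(np.random.beta(alpha, alpha))
    perm = torch.randperm(x.size(0), device=x.device)
    x_mixed = lam * x + (1. - lam) * x[perm]
    return x_mixed, target, target[perm], lam

# usage: out = model(x_mixed)
# loss = lam * criterion(out, y_a) + (1 - lam) * criterion(out, y_b)
```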
Ross Wightman 8fbd62a169 Exclude batchnorm and bias params from weight_decay by default
6 years ago
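Excluding norm weights and biases from weight decay is done by splitting parameters into two optimizer groups; 1-d tensors are a convenient proxy for biases and BatchNorm affine parameters. A sketch of the pattern:

```python
import torch

def add_weight_decay(model: torch.nn.Module, weight_decay: float = 1e-4):
    """Build param groups so biases and norm-layer weights skip decay."""
    decay, no_decay = [], []
    for name, param in model.named_parameters():
        if not param.requires_grad:
            continue
        # 1-d params cover biases and BatchNorm/LayerNorm affine weights
        if param.ndim <= 1 or name.endswith('.bias'):
            no_decay.append(param)
        else:
            decay.append(param)
    return [
        {'params': no_decay, 'weight_decay': 0.},
        {'params': decay, 'weight_decay': weight_decay},
    ]

# usage: optimizer = torch.optim.SGD(add_weight_decay(model), lr=0.1, momentum=0.9)
```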
Ross Wightman bc264269c9 Morph MnasNet impl into a generic MobileNet that covers MnasNet, MobileNetV1/V2, ChamNet, FBNet, and related
6 years ago
Ross Wightman e9c7961efc Fix pooling in mnasnet, more sensible default for AMP opt level
6 years ago
Ross Wightman 0562b91c38 Add per model crop pct, interpolation defaults, tie it all together
6 years ago
Ross Wightman c328b155e9 Random erasing crash fix and args pass through
6 years ago
Ross Wightman 9c3859fb9c Uniform pretrained model handling.
6 years ago
Ross Wightman f1cd1a5ce3 Cleanup CheckpointSaver, add support for increasing or decreasing metric, switch to prec1 metric in train loop
6 years ago