Commit Graph

150 Commits (041709b470bdc993c9c5b4e926791a694119a632)

Author | SHA1 | Message | Date
Ross Wightman | d9a6a9d0af | Merge pull request #74 from rwightman/augmix-jsd | 5 years ago
Ross Wightman | 3eb4a96eda | Update AugMix, JSD, etc. comments and references | 5 years ago
Ross Wightman | 7547119891 | Add SplitBatchNorm. AugMix, Rand/AutoAugment, Split (Aux) BatchNorm, Jensen-Shannon Divergence, RandomErasing all working together | 5 years ago
Ross Wightman | 40fea63ebe | Add checkpoint averaging script. Add headers, shebangs, exec perms to all scripts | 5 years ago
Ross Wightman | 4666cc9aed | Add --pin-mem arg to enable dataloader pin_memory (showing more benefit in some scenarios now), also add --torchscript arg to validate.py for testing models with jit.script | 5 years ago
Ross Wightman | 232ab7fb12 | Working on an implementation of AugMix with JensenShannonDivergence loss that's compatible with my AutoAugment and RandAugment impl | 5 years ago
Ross Wightman | 5719b493ad | Missed update to dist-bn logic for EMA model | 5 years ago
Ross Wightman | a435ea1327 | Change reduce_bn to distribute_bn, add ability to choose between broadcast and reduce (mean). Add crop_pct arg to allow selecting validation crop while training. | 5 years ago
Ross Wightman | 3bff2b21dc | Add support for keeping running bn stats the same across distributed training nodes before eval/save | 5 years ago
Ross Wightman | 1f39d15f15 | Allow float decay epochs arg for training, works out with step lr math | 5 years ago
Ross Wightman | 7b83e67f77 | Pass drop connect arg through to EfficientNet models | 5 years ago
Ross Wightman | 4748c6dff2 | Fix non-prefetch variant of Mixup. Fixes #50 | 5 years ago
Ross Wightman | 187ecbafbe | Add support for loading args from yaml file (and saving them with each experiment) | 5 years ago
Ross Wightman | b750b76f67 | More AutoAugment work. Ready to roll... | 5 years ago
Ross Wightman | 3d9c8a6489 | Add support for new AMP checkpointing w/ amp.state_dict | 5 years ago
Ross Wightman | fac58f609a | Add RAdam, NovoGrad, Lookahead, and AdamW optimizers, a few ResNet tweaks, and a scheduler factory tweak | 5 years ago
Ross Wightman | 66634d2200 | Add support to split random erasing blocks into a randomly selected number with --recount arg. Fix random selection of aspect ratios. | 5 years ago
Ross Wightman | e7c8a37334 | Make min-lr and cooldown-epochs cmdline args, change dash in color_jitter arg for consistency | 5 years ago
Ross Wightman | c6b32cbe73 | A number of tweaks to arguments, epoch handling, config | 6 years ago
Ross Wightman | b20bb58284 | Distributed tweaks | 6 years ago
Ross Wightman | 6fc886acaf | Remove all prints, change most to logging calls, tweak alignment of batch logs, improve setup.py | 6 years ago
Ross Wightman | aa4354f466 | Big re-org, working towards making pip/module as 'timm' | 6 years ago
Ross Wightman | 7dab6d1ec7 | Default to img_size in model default_cfg, defer output folder creation until later in the init sequence | 6 years ago
Ross Wightman | 9bcd65181b | Add exponential moving average for model weights + a few other additions and cleanup | 6 years ago
Ross Wightman | e6c14427c0 | More appropriate/correct loss name | 6 years ago
Zhun Zhong | 127487369f | Fix bug for prefetcher | 6 years ago
Ross Wightman | 4d2056722a | Mixup and prefetcher improvements | 6 years ago
Ross Wightman | 780c0a96a4 | Change args for RandomErasing so only one is required for pixel/color mode | 6 years ago
Ross Wightman | 76539d905e | Some transform/data/loader refactoring, hopefully didn't break things | 6 years ago
Ross Wightman | fee607edf6 | Mixup implementation in progress | 6 years ago
Ross Wightman | 8fbd62a169 | Exclude batchnorm and bias params from weight_decay by default | 6 years ago
Ross Wightman | bc264269c9 | Morph mnasnet impl into a generic mobilenet that covers MnasNet, MobileNetV1/V2, ChamNet, FBNet, and related | 6 years ago
Ross Wightman | e9c7961efc | Fix pooling in mnasnet, more sensible default for AMP opt level | 6 years ago
Ross Wightman | 0562b91c38 | Add per-model crop pct, interpolation defaults, tie it all together | 6 years ago
Ross Wightman | c328b155e9 | Random erasing crash fix and args pass-through | 6 years ago
Ross Wightman | 9c3859fb9c | Uniform pretrained model handling | 6 years ago
Ross Wightman | f1cd1a5ce3 | Cleanup CheckpointSaver, add support for increasing or decreasing metric, switch to prec1 metric in train loop | 6 years ago
Ross Wightman | 5180f94c7e | Distributed (multi-process) train, multi-gpu single process train, and NVIDIA AMP support | 6 years ago
Ross Wightman | 45cde6f0c7 | Improve creation of data pipeline with prefetch enabled vs disabled, fixup inception_res_v2 and dpn models | 6 years ago
Ross Wightman | 2295cf56c2 | Add some Nvidia performance enhancements (prefetch loader, fast collate), and refactor some of training and model fact/transforms | 6 years ago
Ross Wightman | 9d927a389a | Add AdaBound, random erasing | 6 years ago
Ross Wightman | 1577c52976 | ResNeXt added, changes to bring it and seresnet in line with rest of models | 6 years ago
Ross Wightman | 31055466fc | Fixup validate/inference script args, fix senet init for better test accuracy | 6 years ago
Ross Wightman | b1a5a71151 | Update schedulers | 6 years ago
Ross Wightman | b5255960d9 | Tweaking tanh scheduler, senet weight init (for BN), transform defaults | 6 years ago
Ross Wightman | a336e5bff3 | Minor updates | 6 years ago
Ross Wightman | cf0c280e1b | Clean up transforms, add custom schedulers, tweak senet34 model | 6 years ago
Ross Wightman | c57717d325 | Fix tta train bug, improve logging | 6 years ago
Ross Wightman | 72b4d162a2 | Increase training performance | 6 years ago
Ross Wightman | 5855b07ae0 | Initial commit, putting some ol' pieces together | 6 years ago
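Several commits above reference an exponential moving average (EMA) of model weights (e.g. 9bcd65181b, and the dist-bn/EMA fix in 5719b493ad). As a minimal sketch of the idea only, the core update is `shadow = decay * shadow + (1 - decay) * current` applied to every parameter after each optimizer step; this scalar-dict version is illustrative and is not timm's actual tensor-based implementation (the `Ema` class and parameter names here are hypothetical):

```python
class Ema:
    """Minimal sketch of an exponential moving average over named parameters.

    Hypothetical illustration of the EMA technique referenced in the commit
    log; timm's real implementation operates on torch tensors/state_dicts
    and handles devices, buffers, and checkpoint resume.
    """

    def __init__(self, params, decay=0.999):
        self.decay = decay
        # Shadow copy starts equal to the current parameter values.
        self.shadow = dict(params)

    def update(self, params):
        # shadow <- decay * shadow + (1 - decay) * current, per parameter.
        for name, value in params.items():
            self.shadow[name] = (
                self.decay * self.shadow[name] + (1.0 - self.decay) * value
            )


# Usage: evaluate/save with ema.shadow instead of the live weights.
ema = Ema({"w": 0.0}, decay=0.9)
ema.update({"w": 1.0})  # shadow moves 10% of the way toward the new value
```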