Ross Wightman
38d8f67570
Fix potential issue with change to num_classes arg in train/validate.py defaulting to None (rely on model def / default_cfg)
4 years ago
Ross Wightman
587780e56b
Update README.md and bump version to 0.4.0
4 years ago
Ross Wightman
bb50ac4708
Add DeiT distilled weights and distilled model def. Remove some redundant ViT model args.
4 years ago
Ross Wightman
c16e965037
Add some ViT comments and fix a few minor issues.
4 years ago
Ross Wightman
22748f1a2d
Convert samples/targets in ParserImageInTar to numpy arrays, slightly less mem usage for massive datasets. Add a few more se/eca model defs to resnet.py
4 years ago
Ross Wightman
5d4c3d0af3
Add enhanced ParserImageInTar that can read images from tars within tars, folders with multiple tars, etc. Additional comment cleanup.
4 years ago
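The two commits above describe ParserImageInTar reading images from tars nested inside other tars. As a rough illustration of that indexing idea only (not timm's actual ParserImageInTar, which also handles folders of multiple tars and caches its index), a minimal sketch using the standard `tarfile` module might look like this; the `index_nested_tar` and `load_image` names are hypothetical:

```python
import io
import tarfile

from PIL import Image  # assumes Pillow is available

IMG_EXTS = ('.png', '.jpg', '.jpeg')


def index_nested_tar(root_tar_path):
    """Index image members in a tar, descending one level into child tars.

    Returns a list of (child_tar_name_or_None, image_member_name) pairs.
    """
    entries = []
    with tarfile.open(root_tar_path) as root_tf:
        for member in root_tf.getmembers():
            name = member.name.lower()
            if name.endswith(IMG_EXTS):
                entries.append((None, member.name))
            elif name.endswith('.tar'):
                # open the child tar directly from the parent's file object
                with tarfile.open(fileobj=root_tf.extractfile(member)) as child_tf:
                    entries += [(member.name, m.name) for m in child_tf.getmembers()
                                if m.name.lower().endswith(IMG_EXTS)]
    return entries


def load_image(root_tar_path, entry):
    """Load one indexed image (re-opens the tar; a real parser would cache handles)."""
    child_name, member_name = entry
    with tarfile.open(root_tar_path) as root_tf:
        tf = root_tf if child_name is None else tarfile.open(fileobj=root_tf.extractfile(child_name))
        data = tf.extractfile(member_name).read()
    return Image.open(io.BytesIO(data)).convert('RGB')
```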
Ross Wightman
55f7dfa9ea
Refactor vision_transformer entrypoint fns, add pos embedding resize support for fine tuning, add some deit models for testing
4 years ago
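For context on the pos embedding resize mentioned above: the usual trick is to treat the learned patch-position embeddings as a 2D grid and interpolate it to the new grid size, carrying the class-token embedding over unchanged. A generic sketch of that idea (illustrative only; timm's actual helper and its signature may differ):

```python
import math

import torch
import torch.nn.functional as F


def resize_pos_embed_sketch(posemb, new_num_tokens, num_prefix_tokens=1):
    """Interpolate ViT position embeddings to a new sequence length.

    posemb: (1, N + num_prefix_tokens, C) learned position embeddings.
    new_num_tokens: target number of patch tokens (assumed a perfect square here).
    """
    prefix, grid = posemb[:, :num_prefix_tokens], posemb[:, num_prefix_tokens:]
    old_size = int(math.sqrt(grid.shape[1]))
    new_size = int(math.sqrt(new_num_tokens))
    # reshape token sequence back into a 2D grid, interpolate, then flatten again
    grid = grid.reshape(1, old_size, old_size, -1).permute(0, 3, 1, 2)
    grid = F.interpolate(grid, size=(new_size, new_size), mode='bicubic', align_corners=False)
    grid = grid.permute(0, 2, 3, 1).reshape(1, new_size * new_size, -1)
    return torch.cat([prefix, grid], dim=1)
```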
Ross Wightman
9d5d4b8df6
Fix silly train.py typo during dataset work
4 years ago
Ross Wightman
d55bcc0fee
Finish adding stochastic depth support to BiT ResNetV2 models
4 years ago
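Stochastic depth, the feature being added above, randomly skips a block's residual branch for whole samples during training and rescales by the keep probability. A generic sketch of the mechanism, not the BiT/ResNetV2 implementation itself:

```python
import torch
import torch.nn as nn


def drop_path(x, drop_prob=0.0, training=False):
    """Randomly drop the residual branch for whole samples (stochastic depth)."""
    if drop_prob == 0.0 or not training:
        return x
    keep_prob = 1.0 - drop_prob
    # one Bernoulli draw per sample, broadcast over the remaining dims
    shape = (x.shape[0],) + (1,) * (x.ndim - 1)
    mask = x.new_empty(shape).bernoulli_(keep_prob)
    return x * mask / keep_prob


class ResidualBlockWithDropPath(nn.Module):
    """Toy residual block: identity plus a randomly dropped transform branch."""

    def __init__(self, dim, drop_prob=0.1):
        super().__init__()
        self.fn = nn.Sequential(nn.Conv2d(dim, dim, 3, padding=1), nn.ReLU())
        self.drop_prob = drop_prob

    def forward(self, x):
        return x + drop_path(self.fn(x), self.drop_prob, self.training)
```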
Ross Wightman
0a1668f63e
Update tests
4 years ago
Ross Wightman
58ccf43150
Add BiT references and knowledge distill links to readme/docs
4 years ago
Ross Wightman
855d6cc217
More dataset work including factories and a TensorFlow Datasets (TFDS) wrapper
* Add parser/dataset factory methods for more flexible dataset & parser creation
* Add dataset parser that wraps TFDS image classification datasets
* Fix a num_classes handling bug for 21k models
* Add initial deit models so they can be benchmarked in next csv results runs
4 years ago
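As an illustration of how the factory plus TFDS wrapper described above is typically used (the `create_dataset`/`create_loader` names and the `tfds/` name prefix reflect timm's later public API; exact signatures at the time of this commit may have differed):

```python
from timm.data import create_dataset, create_loader

# plain folder/tar dataset, parser chosen by the factory
train_ds = create_dataset('', root='/data/imagenet', split='train')

# TFDS-backed dataset via the TFDS wrapper parser (requires tensorflow-datasets)
tfds_ds = create_dataset('tfds/oxford_iiit_pet', root='/data/tfds', split='train',
                         is_training=True, batch_size=64)

# loader handles transforms, prefetching, etc.
loader = create_loader(train_ds, input_size=(3, 224, 224), batch_size=64, is_training=True)
```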
Ross Wightman
f8463b8fa9
Version 0.3.4. Tweak setup.cfg and update setup.py metadata
4 years ago
Ross Wightman
20516abc18
Fix some broken tests for ResNetV2 BiT models
4 years ago
Ross Wightman
fd9061dbf7
Remove debug print from train.py
4 years ago
Ross Wightman
59ec7e6a53
Merge branch 'master' into imagenet21k_datasets_more
4 years ago
Ross Wightman
fc3d9183e8
Merge pull request #335 from kecsap/new_option
Add --input-size option to scripts to specify full input dimensions from command-line
4 years ago
Ross Wightman
e7a9ddf982
Merge pull request #334 from kecsap/links
Follow symbolic links during dataset scanning
4 years ago
Ross Wightman
19816fe226
Add citation info
4 years ago
Csaba Kertesz
e42b140ade
Add --input-size option to scripts to specify full input dimensions from command-line
4 years ago
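The `--input-size` flag added above takes the full input dimensions (channels, height, width) rather than a single square size. A minimal argparse sketch of how such a flag is typically declared (illustrative; the exact help text and defaults in train.py/validate.py may differ):

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument('--input-size', default=None, nargs=3, type=int, metavar='N N N',
                    help='Full input dimensions (channels height width), '
                         'e.g. --input-size 3 224 224; overrides the model default')
args = parser.parse_args(['--input-size', '3', '224', '224'])
print(args.input_size)  # [3, 224, 224]
```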
Csaba Kertesz
7cae7e7035
Follow links during dataset scanning
4 years ago
Ross Wightman
1d01c2b68c
Update README.md
4 years ago
Ross Wightman
c96e9f99a0
Update version to 0.3.3
4 years ago
Ross Wightman
a7d0a8b5b2
Update results csv files with latest models, incl 101D, 152D, 200D, SE152D ResNets and yet to be merged BiT and ViT-R50 models.
4 years ago
Ross Wightman
4e2533db77
Add 320x320 model default cfgs for 101D and 152D ResNets. Add SEResNet-152D weights and 320x320 cfg.
4 years ago
Ross Wightman
0167f749d3
Remove some old __future__ imports
4 years ago
Ross Wightman
85bf4b8cd6
Add setup.cfg for conda / fastai integration
4 years ago
Ross Wightman
e553480b67
Add 21843 synset txt for Google 21k models like BiT/ViT
4 years ago
Ross Wightman
e35e9760a6
More work on dataset / parser split and imagenet21k (tar) support
4 years ago
Ross Wightman
ce69de70d3
Add 21k weight urls to vision_transformer. Clean up feature_info for preact ResNetV2 (BiT) models
4 years ago
Ross Wightman
231d04e91a
ResNetV2 pre-act and non-pre-act models, w/ BiT pretrained weights and support for ViT R50 model. Tweaks for in21k num_classes passing. More to do... tests failing.
4 years ago
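The "pre-act" in the commit above refers to pre-activation residual blocks, where normalization and activation run before each convolution and the skip connection bypasses them entirely. A generic sketch of the block structure (BiT additionally uses GroupNorm and weight-standardized convs; plain BN/ReLU is used here to keep the sketch short):

```python
import torch.nn as nn


class PreActBasicBlock(nn.Module):
    """Generic pre-activation residual block: norm -> act -> conv, twice,
    with the skip taken from the un-normalized input."""

    def __init__(self, channels):
        super().__init__()
        self.norm1 = nn.BatchNorm2d(channels)
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.norm2 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        out = self.conv1(self.act(self.norm1(x)))
        out = self.conv2(self.act(self.norm2(out)))
        return x + out
```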
Ross Wightman
de6046e213
Initial commit for dataset / parser reorg to support additional datasets / types
4 years ago
Ross Wightman
392595c7eb
Add pool_size to default cfgs for new models to prevent tests from failing. Add explicit 200D_320 model entrypoint for next benchmark run.
4 years ago
Ross Wightman
7a75b8d033
Update README.md
4 years ago
Ross Wightman
b1f1228a41
Add ResNet101D, 152D, and 200D weights, remove meh 66d model
4 years ago
Ross Wightman
198f6ea0f3
Merge pull request #302 from Jasha10/create_optimizer-opt_args
Configure create_optimizer with args.opt_args
4 years ago
Jasha
7c56c718f3
Configure create_optimizer with args.opt_args
Closes #301
4 years ago
Ross Wightman
51d74d91da
Update README.md
4 years ago
Ross Wightman
9a25fdf3ad
Merge pull request #297 from rwightman/ema_simplify
Simplified JIT compatible Ema module. Fixes for SiLU export and torchscript training w/ Linear layer.
4 years ago
Ross Wightman
c9ebe86d03
Merge pull request #300 from tmkkk/real-labels-fix
Fix a bug with accuracy retrieving from RealLabels
4 years ago
Tymoteusz Wiśniewski
de15b43865
Fix a bug with accuracy retrieving from RealLabels
4 years ago
Ross Wightman
cd72e66eff
Fix bug in last mod for features_only default_cfg
4 years ago
Ross Wightman
867a0e5a04
Add default_cfg back to models wrapped in feature extraction module as per discussion in #294.
4 years ago
Ross Wightman
4ca52d73d8
Add separate set and update method to ModelEmaV2
4 years ago
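On the separate set/update methods mentioned above: `update` folds the live weights into the moving average after each optimizer step, while `set` hard-copies them (e.g. after resuming from a checkpoint). A hedged sketch of that split, not timm's actual ModelEmaV2:

```python
from copy import deepcopy

import torch


class EmaSketch:
    """Toy EMA-of-weights helper with distinct set() and update() methods."""

    def __init__(self, model, decay=0.9999):
        self.module = deepcopy(model).eval()
        self.decay = decay
        for p in self.module.parameters():
            p.requires_grad_(False)

    @torch.no_grad()
    def _copy_from(self, model, fn):
        for e, m in zip(self.module.state_dict().values(), model.state_dict().values()):
            e.copy_(fn(e, m))

    def set(self, model):
        # hard copy, e.g. right after loading pretrained/checkpoint weights
        self._copy_from(model, lambda e, m: m)

    def update(self, model):
        # exponential moving average after an optimizer step (ints copied as-is)
        self._copy_from(model, lambda e, m: e * self.decay + m * (1.0 - self.decay)
                        if e.dtype.is_floating_point else m)
```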
Ross Wightman
2ed8f24715
A few more changes for 0.3.2 maint release. Linear layer change for mobilenetv3 and inception_v3, support no bias for linear wrapper.
4 years ago
Ross Wightman
6504a42832
Version 0.3.2
4 years ago
Ross Wightman
460eba7f24
Work around casting issue with combination of native torch AMP and torchscript for Linear layers
4 years ago
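The workaround referenced above stems from torchscript not applying autocast's automatic casts, so a Linear layer may see a float16 input against float32 weights. A hedged sketch of the manual-cast idea (illustrative only; timm's wrapper lives in its layers module and may differ in detail):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class CastingLinear(nn.Linear):
    """Linear that casts weight/bias to the input dtype when scripted,
    since autocast does not insert the casts under torchscript."""

    def forward(self, input: torch.Tensor) -> torch.Tensor:
        if torch.jit.is_scripting():
            bias = self.bias.to(dtype=input.dtype) if self.bias is not None else None
            return F.linear(input, self.weight.to(dtype=input.dtype), bias)
        return F.linear(input, self.weight, self.bias)
```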
Ross Wightman
5f4b6076d8
Fix inplace arg compat for GELU and PreLU via activation factory
4 years ago
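The compat issue above arises because the nn.GELU and nn.PReLU constructors don't accept an `inplace` argument, unlike nn.ReLU or nn.SiLU, so a factory has to drop the kwarg for them. A toy sketch of that behavior (not timm's actual activation factory):

```python
import torch.nn as nn


def create_act_sketch(name: str, inplace: bool = False) -> nn.Module:
    """Toy activation factory: only forward `inplace` to activations whose
    constructors accept it."""
    acts = {'relu': nn.ReLU, 'silu': nn.SiLU, 'gelu': nn.GELU, 'prelu': nn.PReLU}
    act_cls = acts[name]
    if act_cls in (nn.ReLU, nn.SiLU):
        return act_cls(inplace=inplace)
    return act_cls()  # GELU / PReLU: drop the unsupported kwarg
```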
Ross Wightman
fd962c4b4a
Native SiLU (Swish) op doesn't export to ONNX
4 years ago
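Because the fused native SiLU op didn't export to ONNX at the time, the usual fallback is to compose the activation from primitives that do. A generic sketch of such an export-friendly variant:

```python
import torch
import torch.nn as nn


class SwishExportable(nn.Module):
    """ONNX-friendly SiLU/Swish: built from primitive ops rather than the
    fused native op, for use when exporting."""

    def forward(self, x):
        return x * torch.sigmoid(x)
```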
Ross Wightman
27bbc70d71
Add back old ModelEma and rename new one to ModelEmaV2 to avoid compat breaks in dependent code. Shuffle train script, add a few comments, remove DataParallel support, support experimental torchscript training.
4 years ago