Commit Graph

285 Commits (9811e229f74c1a0e151a45041a35598025f7125d)

Author SHA1 Message Date
Ross Wightman 9811e229f7 Fix regression in models with 1001 class pretrained weights. Improve batchnorm arg and BatchNormAct layer handling in several models.
4 years ago
Ross Wightman a39c3ee216 Merge branch 'master' into eca-weights
4 years ago
Ross Wightman 666de85cf1 Move stride in EdgeResidual block to 3x3 expansion conv. Fix #414
4 years ago
Ross Wightman 3b57490a63 Fix some half-removed resnet model defs, pooling for ecaresnet269d
4 years ago
Ross Wightman 68a4144882 Add new weights for ecaresnet26t/50t/269d models. Remove distinction between 't' and 'tn' (tiered models), tn is now t. Add test time img size spec to default cfg.
4 years ago
Ross Wightman b9843f954b Merge pull request #282 from tigert1998/patch-1
4 years ago
hwangdeyu 7a4be5c035 Add HardSwishJitAutoFn operator export to ONNX
4 years ago
Ross Wightman f0e65e37b7 Fix NF-ResNet101 model defs
4 years ago
Ross Wightman 2de54d174a Fix pool size defs for NFNet models, add a comment.
4 years ago
Ross Wightman 90980de4a9 Fix up a few details in NFResNet models, achieved stable training. Add support for gamma gain to be applied in activation or ScaledStdConv. Some tweaks to ScaledStdConv.
4 years ago
Ross Wightman 5a8e1e643e Initial Normalizer-Free Reg/ResNet impl. A bit of related layer refactoring.
4 years ago
Ross Wightman 38d8f67570 Fix potential issue with change to num_classes arg in train/validate.py defaulting to None (rely on model def / default_cfg)
4 years ago
Ross Wightman bb50ac4708 Add DeiT distilled weights and distilled model def. Remove some redundant ViT model args.
4 years ago
Ross Wightman c16e965037 Add some ViT comments and fix a few minor issues.
4 years ago
Ross Wightman 22748f1a2d Convert samples/targets in ParserImageInTar to numpy arrays, slightly less mem usage for massive datasets. Add a few more se/eca model defs to resnet.py
4 years ago
Ross Wightman 55f7dfa9ea Refactor vision_transformer entrypoint fns, add pos embedding resize support for fine-tuning, add some deit models for testing
4 years ago
Ross Wightman d55bcc0fee Finish adding stochastic depth support to BiT ResNetV2 models
4 years ago
Ross Wightman 855d6cc217 More dataset work including factories and a TensorFlow Datasets (TFDS) wrapper
4 years ago
Ross Wightman 20516abc18 Fix some broken tests for ResNetV2 BiT models
4 years ago
Ross Wightman 59ec7e6a53 Merge branch 'master' into imagenet21k_datasets_more
4 years ago
Ross Wightman 4e2533db77 Add 320x320 model default cfgs for 101D and 152D ResNets. Add SEResNet-152D weights and 320x320 cfg.
4 years ago
Ross Wightman 0167f749d3 Remove some old __future__ imports
4 years ago
Ross Wightman ce69de70d3 Add 21k weight urls to vision_transformer. Cleanup feature_info for preact ResNetV2 (BiT) models
4 years ago
Ross Wightman 231d04e91a ResNetV2 pre-act and non-preact model, w/ BiT pretrained weights and support for ViT R50 model. Tweaks for in21k num_classes passing. More to do... tests failing.
4 years ago
Ross Wightman 392595c7eb Add pool_size to default cfgs for new models to prevent tests from failing. Add explicit 200D_320 model entrypoint for next benchmark run.
4 years ago
Ross Wightman b1f1228a41 Add ResNet101D, 152D, and 200D weights, remove meh 66d model
4 years ago
Ross Wightman cd72e66eff Bug in last mod for features_only default_cfg
4 years ago
Ross Wightman 867a0e5a04 Add default_cfg back to models wrapped in feature extraction module as per discussion in #294.
4 years ago
Ross Wightman 2ed8f24715 A few more changes for 0.3.2 maint release. Linear layer change for mobilenetv3 and inception_v3, support no bias for linear wrapper.
4 years ago
Ross Wightman 460eba7f24 Work around casting issue with combination of native torch AMP and torchscript for Linear layers
4 years ago
Ross Wightman 5f4b6076d8 Fix inplace arg compat for GELU and PReLU via activation factory
4 years ago
Ross Wightman fd962c4b4a Native SiLU (Swish) op doesn't export to ONNX
4 years ago
tigertang 43f2500c26 Add symbolic for SwishJitAutoFn to support ONNX
4 years ago
Ross Wightman b401952caf Add new vision transformer large/base 224x224 weights ported from the official JAX repo
4 years ago
Ross Wightman 61200db0ab in_chans=1 working w/ pretrained weights for vision_transformer
4 years ago
Ross Wightman e90edce438 Support native silu activation (aka swish). An optimized ver is available in PyTorch 1.7.
4 years ago
Ross Wightman da6cd2cc1f Fix regression for pretrained classifier loading when using entrypt functions directly
4 years ago
Ross Wightman f591e90b0d Make sure num_features attr is present in vit models as with others
4 years ago
Ross Wightman f944242cb0 Fix #262, num_classes arg mixup. Make vision_transformers a bit closer to other models wrt get/reset classifier/forward_features. Fix torchscript for ViT.
4 years ago
Ross Wightman 736f209e7d Update vision transformers to be compatible with official code. Port official ViT weights from jax impl.
4 years ago
Ross Wightman 27a93e9de7 Improve test crop for ViT models. Small now 77.85, added base weights at 79.35 top-1.
4 years ago
Ross Wightman d4db9e7977 Add small vision transformer weights. 77.42 top-1.
4 years ago
Ross Wightman f31933cb37 Initial Vision Transformer impl w/ patch and hybrid variants. Refactor tuple helpers.
4 years ago
Ross Wightman fcb6258877 Add missing leaky_relu layer factory defn, update Apex/Native loss scaler interfaces to support unscaled grad clipping. Bump ver to 0.2.2 for pending release.
4 years ago
Ross Wightman e8e2d9cabf Add DropPath (stochastic depth) to ReXNet and VoVNet. RegNet DropPath impl tweak and dedupe se args.
4 years ago
Ross Wightman e8ca45854c More models in sotabench, more control over sotabench run, dataset filename extraction consistency
4 years ago
Ross Wightman 9c406532bd Add EfficientNet-EdgeTPU-M (efficientnet_em) model trained natively in PyTorch. More sotabench fiddling.
4 years ago
Ross Wightman c40384f5bd Add ResNet weights. 80.5 (top-1) ResNet-50-D, 77.1 ResNet-34-D, 72.7 ResNet-18-D.
4 years ago
Ross Wightman 33f8a1bf36 Updated README, add wide_resnet50_2 and seresnext50_32x4d weights
4 years ago
Ross Wightman 751b0bba98 Add global_pool (--gp) arg changes to allow passing 'fast' easily for train/validate to avoid channels_last issue with AdaptiveAvgPool
4 years ago