231d04e91a | Ross Wightman | 4 years ago | ResNetV2 pre-act and non-preact model, w/ BiT pretrained weights and support for ViT R50 model. Tweaks for in21k num_classes passing. More to do... tests failing.
392595c7eb | Ross Wightman | 4 years ago | Add pool_size to default cfgs for new models to prevent tests from failing. Add explicit 200D_320 model entrypoint for next benchmark run.
b1f1228a41 | Ross Wightman | 4 years ago | Add ResNet101D, 152D, and 200D weights, remove meh 66d model
cd72e66eff | Ross Wightman | 4 years ago | Fix bug in last mod for features_only default_cfg
867a0e5a04 | Ross Wightman | 4 years ago | Add default_cfg back to models wrapped in feature extraction module as per discussion in #294.
2ed8f24715 | Ross Wightman | 4 years ago | A few more changes for 0.3.2 maint release. Linear layer change for mobilenetv3 and inception_v3, support no bias for linear wrapper.
460eba7f24 | Ross Wightman | 4 years ago | Work around casting issue with combination of native torch AMP and torchscript for Linear layers
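The AMP + torchscript workaround for Linear layers mentioned in 460eba7f24 can be sketched as follows. This is an illustrative reconstruction, not the repo's exact code: eager-mode autocast inserts dtype casts automatically, but a scripted module bypasses them, so the cast is made explicit when scripting.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Linear(nn.Linear):
    """nn.Linear that casts weight/bias to the input dtype when scripted.

    Under native AMP, a half-precision input can otherwise meet a float32
    weight inside a torchscripted module and fail with a dtype mismatch.
    """
    def forward(self, input: torch.Tensor) -> torch.Tensor:
        if torch.jit.is_scripting():
            bias = self.bias.to(dtype=input.dtype) if self.bias is not None else None
            return F.linear(input, self.weight.to(dtype=input.dtype), bias)
        return F.linear(input, self.weight, self.bias)
```

Outside of scripting the layer behaves exactly like `nn.Linear`, so it can be swapped in unconditionally.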
5f4b6076d8 | Ross Wightman | 4 years ago | Fix inplace arg compat for GELU and PReLU via activation factory
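The compatibility issue in 5f4b6076d8 arises because `nn.GELU` and `nn.PReLU` constructors take no `inplace` kwarg, unlike `nn.ReLU`. A minimal factory along these lines (names here are illustrative, not the repo's exact API) lets callers pass `inplace` unconditionally:

```python
import torch.nn as nn

# Activation constructors that do not accept an `inplace` kwarg.
_NO_INPLACE_ACTS = {nn.GELU, nn.PReLU, nn.Sigmoid, nn.Tanh}

def create_act_layer(act_layer, inplace: bool = False, **kwargs):
    """Instantiate an activation, forwarding `inplace` only when the
    constructor supports it; otherwise the flag is silently dropped."""
    if act_layer in _NO_INPLACE_ACTS:
        return act_layer(**kwargs)
    return act_layer(inplace=inplace, **kwargs)
```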
fd962c4b4a | Ross Wightman | 4 years ago | Native SiLU (Swish) op doesn't export to ONNX
b401952caf | Ross Wightman | 4 years ago | Add vision transformer large/base 224x224 weights ported from the official JAX repo
61200db0ab | Ross Wightman | 4 years ago | Make in_chans=1 work w/ pretrained weights for vision_transformer
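Loading RGB-pretrained weights with `in_chans=1`, as in 61200db0ab, typically means adapting the first conv (or patch-embed) weight tensor. A common approach, sketched here with an illustrative helper name, is to sum the RGB filters for grayscale and to tile-and-rescale for other channel counts:

```python
import torch

def adapt_input_conv(in_chans: int, conv_weight: torch.Tensor) -> torch.Tensor:
    """Adapt a pretrained (out, 3, kh, kw) conv weight to `in_chans` inputs.

    For in_chans=1 the RGB filters are summed: the response to a grayscale
    image then matches the original response to that image replicated
    across three channels.
    """
    out_c, in_c, kh, kw = conv_weight.shape
    if in_chans == in_c:
        return conv_weight
    if in_chans == 1:
        return conv_weight.sum(dim=1, keepdim=True)
    # Otherwise tile the filters and rescale to keep activation magnitude.
    repeats = -(-in_chans // in_c)  # ceil(in_chans / in_c)
    w = conv_weight.repeat(1, repeats, 1, 1)[:, :in_chans, :, :]
    return w * (in_c / in_chans)
```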
e90edce438 | Ross Wightman | 4 years ago | Support native silu activation (aka swish). An optimized ver is available in PyTorch 1.7.
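The native-silu support in e90edce438 amounts to preferring the fused op when it exists. A minimal version-tolerant wrapper (illustrative, not the repo's exact code):

```python
import torch
import torch.nn.functional as F

def swish(x: torch.Tensor) -> torch.Tensor:
    """SiLU / Swish: x * sigmoid(x).

    PyTorch 1.7 added a fused native op (F.silu); fall back to the
    composed form on older versions."""
    if hasattr(F, 'silu'):
        return F.silu(x)
    return x * torch.sigmoid(x)
```

Note the related commit fd962c4b4a: the native op did not export to ONNX at the time, so export paths may still want the composed `x * sigmoid(x)` form.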
da6cd2cc1f | Ross Wightman | 4 years ago | Fix regression for pretrained classifier loading when using entrypoint functions directly
f591e90b0d | Ross Wightman | 4 years ago | Make sure num_features attr is present in vit models as with others
f944242cb0 | Ross Wightman | 4 years ago | Fix #262, num_classes arg mixup. Make vision_transformers a bit closer to other models wrt get/reset classifier/forward_features. Fix torchscript for ViT.
736f209e7d | Ross Wightman | 4 years ago | Update vision transformers to be compatible with official code. Port official ViT weights from jax impl.
27a93e9de7 | Ross Wightman | 4 years ago | Improve test crop for ViT models. Small now 77.85, added base weights at 79.35 top-1.
d4db9e7977 | Ross Wightman | 4 years ago | Add small vision transformer weights. 77.42 top-1.
f31933cb37 | Ross Wightman | 4 years ago | Initial Vision Transformer impl w/ patch and hybrid variants. Refactor tuple helpers.
fcb6258877 | Ross Wightman | 4 years ago | Add missing leaky_relu layer factory defn, update Apex/Native loss scaler interfaces to support unscaled grad clipping. Bump ver to 0.2.2 for pending release.
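The unscaled grad clipping from fcb6258877 refers to a known native-AMP pattern: gradients must be unscaled before `clip_grad_norm_` or the threshold is applied to scaled magnitudes. A sketch of one training step (the function and argument names are illustrative; the scaler API calls are standard PyTorch):

```python
import torch

def amp_train_step(model, x, target, loss_fn, optimizer, scaler,
                   clip_norm: float = 1.0, device_type: str = 'cuda'):
    """One native-AMP step with gradient clipping applied to *unscaled*
    gradients: unscale_ first so clip_grad_norm_ sees real magnitudes."""
    optimizer.zero_grad()
    with torch.autocast(device_type=device_type):
        loss = loss_fn(model(x), target)
    scaler.scale(loss).backward()
    scaler.unscale_(optimizer)  # grads now in true (unscaled) units
    torch.nn.utils.clip_grad_norm_(model.parameters(), clip_norm)
    scaler.step(optimizer)      # skips the step if grads are inf/nan
    scaler.update()
    return loss.detach()
```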
e8e2d9cabf | Ross Wightman | 4 years ago | Add DropPath (stochastic depth) to ReXNet and VoVNet. RegNet DropPath impl tweak and dedupe se args.
e8ca45854c | Ross Wightman | 4 years ago | More models in sotabench, more control over sotabench run, dataset filename extraction consistency
9c406532bd | Ross Wightman | 4 years ago | Add EfficientNet-EdgeTPU-M (efficientnet_em) model trained natively in PyTorch. More sotabench fiddling.
c40384f5bd | Ross Wightman | 4 years ago | Add ResNet weights. 80.5 (top-1) ResNet-50-D, 77.1 ResNet-34-D, 72.7 ResNet-18-D.
33f8a1bf36 | Ross Wightman | 4 years ago | Updated README, add wide_resnet50_2 and seresnext50_32x4d weights
751b0bba98 | Ross Wightman | 4 years ago | Add global_pool (--gp) arg changes to allow passing 'fast' easily for train/validate to avoid channels_last issue with AdaptiveAvgPool
9c297ec67d | Ross Wightman | 4 years ago | Cleanup Apex vs native AMP scaler state save/load. Cleanup CheckpointSaver a bit.
80c9d9cc72 | Ross Wightman | 4 years ago | Add 'fast' global pool option, remove redundant SEModule from tresnet, normal one is now 'fast'
90a01f47d1 | Ross Wightman | 4 years ago | Fix hrnet features_only pretrained weight loading issue (#232).
110a7c4982 | Ross Wightman | 4 years ago | AdaptiveAvgPool2d -> mean((2,3)) for all SE/attn layers to avoid NaN with AMP + channels_last layout. See https://github.com/pytorch/pytorch/issues/43992
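The `AdaptiveAvgPool2d -> mean((2,3))` change in 110a7c4982 is a drop-in swap inside the squeeze step of SE-style layers: both compute a global average over the spatial dims, but the tensor `mean` avoided the NaN issue tracked in pytorch#43992 under AMP with channels_last. A simplified SE module using the fixed form (a sketch, not timm's exact implementation):

```python
import torch
import torch.nn as nn

class SEModule(nn.Module):
    """Squeeze-and-Excitation with x.mean((2, 3)) as the squeeze,
    in place of nn.AdaptiveAvgPool2d(1)."""
    def __init__(self, channels: int, rd_ratio: float = 1 / 16):
        super().__init__()
        rd_channels = max(1, int(channels * rd_ratio))
        self.fc1 = nn.Conv2d(channels, rd_channels, kernel_size=1)
        self.act = nn.ReLU(inplace=True)
        self.fc2 = nn.Conv2d(rd_channels, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        se = x.mean((2, 3), keepdim=True)  # global average pool, NCHW -> NC11
        se = self.fc2(self.act(self.fc1(se)))
        return x * se.sigmoid()
```

The same idea underlies the 'fast' global pool option in 751b0bba98 / 80c9d9cc72.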
470220b1f4 | Ross Wightman | 4 years ago | Fix MobileNetV3 crash with global_pool='', output consistent with other models but not equivalent due to efficient head.
fc8b8afb6f | Ross Wightman | 4 years ago | Fix a silly bug in the Sample version of EvoNorm missing the x* part of the swish; update EvoNormBatch to use accumulated unbiased variance.
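For context on the fc8b8afb6f fix: EvoNorm-S0's numerator is the full swish `x * sigmoid(v * x)`, and the bug was computing only `sigmoid(v * x)` without the leading `x`. A simplified sketch of the sample (S0) variant, under the assumption of grouped std normalization as in the EvoNorm paper; names and shapes are illustrative:

```python
import torch
import torch.nn as nn

class EvoNormSample2d(nn.Module):
    """Sketch of EvoNorm-S0: swish numerator over a grouped std denominator."""
    def __init__(self, channels: int, groups: int = 8, eps: float = 1e-5):
        super().__init__()
        assert channels % groups == 0
        self.groups, self.eps = groups, eps
        self.weight = nn.Parameter(torch.ones(1, channels, 1, 1))
        self.bias = nn.Parameter(torch.zeros(1, channels, 1, 1))
        self.v = nn.Parameter(torch.ones(1, channels, 1, 1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        B, C, H, W = x.shape
        num = x * torch.sigmoid(self.v * x)  # full swish, incl. the x* term
        grp = x.reshape(B, self.groups, -1)
        std = grp.var(dim=2, keepdim=True, unbiased=False).add(self.eps).sqrt()
        std = std.reshape(B, self.groups, 1, 1, 1).expand(
            B, self.groups, C // self.groups, H, W).reshape(B, C, H, W)
        return num / std * self.weight + self.bias
```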
0f5d9d8166 | Ross Wightman | 4 years ago | Add CSPResNet50 weights, 79.6 top-1 at 256x256
b1b6e7c361 | Ross Wightman | 4 years ago | Fix a few more issues related to #216 w/ TResNet (space2depth) and FP16 weights in wide resnets. Also don't completely dump pretrained weights in in_chans != 1 or 3 cases.
512b2dd645 | Ross Wightman | 4 years ago | Add new EfficientNet-B3 and RegNetY-3.2GF weights, both just over 82 top-1
6890300877 | Ross Wightman | 4 years ago | Add DropPath (stochastic depth) to RegNet
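The DropPath regularizer added to RegNet in 6890300877 (and to ReXNet/VoVNet in e8e2d9cabf) is stochastic depth applied per sample. A common implementation, sketched here rather than copied from the repo:

```python
import torch
import torch.nn as nn

class DropPath(nn.Module):
    """Stochastic depth: randomly zero whole residual branches per sample
    during training, scaling survivors by 1 / (1 - drop_prob) so the
    expected value is unchanged. Identity at eval time."""
    def __init__(self, drop_prob: float = 0.):
        super().__init__()
        self.drop_prob = drop_prob

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if self.drop_prob == 0. or not self.training:
            return x
        keep_prob = 1. - self.drop_prob
        # One mask value per sample, broadcast over remaining dims.
        shape = (x.shape[0],) + (1,) * (x.ndim - 1)
        mask = x.new_empty(shape).bernoulli_(keep_prob)
        return x * mask / keep_prob
```

It is typically applied to the residual branch, e.g. `x + drop_path(block(x))`.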
f6b56602f9 | Yusuke Uchida | 4 years ago | Fix test_model_default_cfgs
d5145fa4d5 | Ross Wightman | 4 years ago | Change default_cfg names for senet to include the legacy and match model names
b1f1a54de9 | Ross Wightman | 4 years ago | More uniform treatment of classifiers across all models, reduce code duplication.
d72ddafe56 | Ross Wightman | 4 years ago | Fix some checkpoint / model str regressions
ac18adb9c3 | Ross Wightman | 4 years ago | Remove debug print from RexNet
ec4976fdba | Ross Wightman | 4 years ago | Add EfficientNet-Lite0 weights trained with this code by @hal-314, 75.484 top-1
9ecd16bd7b | Ross Wightman | 4 years ago | Add new seresnet50 (non-legacy) model weights, 80.274 top-1
7995295968 | Ross Wightman | 4 years ago | Merge branch 'logger' into features. Change 'logger' to '_logger'.
1998bd3180 | Ross Wightman | 4 years ago | Merge branch 'feature/AB/logger' of https://github.com/antoinebrl/pytorch-image-models into logger
6c17d57a2c | Ross Wightman | 4 years ago | Fix some attributions, add copyrights to some file docstrings
a69c0e04f0 | Ross Wightman | 4 years ago | Fix pool size in cspnet
14ef7a0dd6 | Ross Wightman | 4 years ago | Rename csp.py -> cspnet.py
ec37008432 | Ross Wightman | 4 years ago | Add pretrained weight links to CSPNet for cspdarknet53, cspresnext50
08016e839d | Ross Wightman | 4 years ago | Cleanup FeatureInfo getters, add TF models sourced Xception41/65/71 weights