Commit Graph

47 Commits (2e83bba1422295f177c2681d5835012633cbae19)

Author SHA1 Message Date
Ross Wightman 9a51e4ea2e Add FlexiViT models and weights, refactoring, push more weights
2 years ago
Ross Wightman cda39b35bd Add a deprecation phase to module re-org
2 years ago
Ross Wightman 927f031293 Major module / path restructure, timm.models.layers -> timm.layers, add _ prefix to all non model modules in timm.models
2 years ago
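A minimal illustration of what this path change means for downstream imports; the deprecation commit listed just above keeps the old path importable for a transition period. `LayerNorm2d` is used here only as a representative layer.

```python
# Illustrative only: same layer, new canonical import path after the re-org.
from timm.layers import LayerNorm2d            # new location
# from timm.models.layers import LayerNorm2d   # old location, kept as a deprecated shim
```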
Ross Wightman ffaf97f813 MaxxVit! A very configurable MaxVit and CoAtNet impl with lots of goodies..
2 years ago
Ross Wightman 43aa84e861 Add 'fast' layer norm that doesn't cast to float32, support APEX LN impl for slight speed gain, update norm and act factories, tweak SE for ability to disable bias (needed by GCVit)
2 years ago
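To make the "fast" layer-norm change above concrete, here is a minimal sketch (not timm's actual implementation) of a 2D layer norm that stays in the input's native dtype instead of upcasting to float32 before normalizing.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FastLayerNorm2dSketch(nn.LayerNorm):
    """LayerNorm over the channel dim of an NCHW tensor, with no float32 cast."""
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = x.permute(0, 2, 3, 1)  # NCHW -> NHWC so the last dim is normalized
        # normalize in x's own dtype (e.g. float16/bfloat16 under AMP)
        x = F.layer_norm(x, self.normalized_shape, self.weight, self.bias, self.eps)
        return x.permute(0, 3, 1, 2)  # back to NCHW
```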
Ross Wightman eca09b8642 Add MobileVitV2 support. Fix #1332. Move GroupNorm1 to common layers (used in poolformer + mobilevitv2). Keep old custom ConvNeXt LayerNorm2d impl as LayerNormExp2d for reference.
2 years ago
Ross Wightman 7a9c6811c9 Add eps arg to LayerNorm2d, add 'tf' (tensorflow) variant of trunc_normal_ that applies scale/shift after sampling (instead of needing to move a/b)
2 years ago
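The parenthetical about scale/shift is easier to see in code. Below is a sketch of the idea (the function name is illustrative, not the exact timm API): sample a unit truncated normal within fixed bounds first, then apply std/mean afterwards, so `a`/`b` never need to be moved.

```python
import torch
from torch.nn.init import trunc_normal_

def trunc_normal_tf_sketch(tensor: torch.Tensor, mean=0., std=1., a=-2., b=2.):
    # 1) sample from a standard truncated normal within [a, b] ...
    trunc_normal_(tensor, mean=0., std=1., a=a, b=b)
    # 2) ... then scale and shift after sampling (TF/JAX-style behavior)
    with torch.no_grad():
        tensor.mul_(std).add_(mean)
    return tensor
```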
Ross Wightman 879df47c0a Support BatchNormAct2d for sync-bn use. Fix #1254
2 years ago
Ross Wightman 95cfc9b3e8 Merge remote-tracking branch 'origin/master' into norm_norm_norm
3 years ago
Ross Wightman 656757d26b Fix MobileNetV2 head conv size for multiplier < 1.0. Add some missing modification copyrights, fix starting date of some old ones.
3 years ago
Ross Wightman ab49d275de Significant norm update
3 years ago
Ross Wightman 78912b6375 Updated EvoNorm implementations with some experimentation. Add FilterResponseNorm. Updated RegnetZ and ResNetV2 model defs for trials.
3 years ago
Ross Wightman 4f0f9cb348 Fix #954 by bringing traceable _assert into timm to allow compat w/ PyTorch < 1.8
3 years ago
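A minimal sketch of what such a compatibility shim can look like (an assumption about the approach, not the exact timm code): prefer the traceable `torch._assert` when it exists, otherwise fall back to a plain assert.

```python
import torch

try:
    # available in newer PyTorch; safe to use inside traced / scripted code
    from torch import _assert
except ImportError:
    # fallback for older PyTorch versions
    def _assert(condition: bool, message: str):
        assert condition, message
```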
Ross Wightman 925e102982 Update attention / self-attn based models from a series of experiments:
3 years ago
Ross Wightman 307a935b79 Add non-local and BAT attention. Merge attn and self-attn factories into one. Add attention references to README. Add mlp 'mode' to ECA.
4 years ago
Ross Wightman 742c2d5247 Add Gather-Excite and Global Context attn modules. Refactor existing SE-like attn for consistency and refactor byob/byoanet for less redundancy.
4 years ago
Ross Wightman f45de37690 Merge branch 'master' into levit_visformer_rednet
4 years ago
Ross Wightman d5af752117 Add preliminary gMLP and ResMLP impl to Mlp-Mixer
4 years ago
Ross Wightman 165fb354b2 Add initial RedNet model / Involution layer impl for testing
4 years ago
Ross Wightman b2c305c2aa Move Mlp and PatchEmbed modules into layers. Being used in lots of models now...
4 years ago
Ross Wightman 0d87650fea Remove filter hack from BlurPool w/ non-persistent buffer. Use BlurPool2d instead of AntiAliasing.. for TResNet. Breaks PyTorch < 1.6.
4 years ago
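The "Breaks PyTorch < 1.6" note follows from `register_buffer(..., persistent=False)`, which only exists from 1.6 on. A rough sketch of the pattern (simplified, not the exact timm module):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BlurPool2dSketch(nn.Module):
    """Anti-aliased downsampling with a fixed binomial filter."""
    def __init__(self, channels: int, stride: int = 2):
        super().__init__()
        self.channels = channels
        self.stride = stride
        coeffs = torch.tensor([1., 2., 1.])
        filt = coeffs[:, None] * coeffs[None, :]
        filt = (filt / filt.sum()).expand(channels, 1, 3, 3)
        # persistent=False keeps the filter out of the state_dict (needs PyTorch >= 1.6)
        self.register_buffer('filt', filt.contiguous(), persistent=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = F.pad(x, (1, 1, 1, 1), mode='reflect')
        return F.conv2d(x, self.filt, stride=self.stride, groups=self.channels)
```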
Ross Wightman ce62f96d4d ByoaNet with bottleneck transformer, lambda resnet, and halo net experiments
4 years ago
Ross Wightman cf5fec5047 Cleanup experimental vit weight init a bit
4 years ago
Ross Wightman 678ba4e0a2 Add NFNet-F model weights ported from DeepMind Haiku impl and new set of models w/ compatible config.
4 years ago
Ross Wightman 9811e229f7 Fix regression in models with 1001 class pretrained weights. Improve batchnorm arg and BatchNormAct layer handling in several models.
4 years ago
Ross Wightman 5a8e1e643e Initial Normalizer-Free Reg/ResNet impl. A bit of related layer refactoring.
4 years ago
Ross Wightman 231d04e91a ResNetV2 pre-act and non-preact model, w/ BiT pretrained weights and support for ViT R50 model. Tweaks for in21k num_classes passing. More to do... tests failing.
4 years ago
Ross Wightman 2ed8f24715 A few more changes for 0.3.2 maint release. Linear layer change for mobilenetv3 and inception_v3, support no bias for linear wrapper.
4 years ago
Ross Wightman f31933cb37 Initial Vision Transformer impl w/ patch and hybrid variants. Refactor tuple helpers.
4 years ago
Ross Wightman b1f1a54de9 More uniform treatment of classifiers across all models, reduce code duplication.
4 years ago
Ross Wightman 3b9004bef9 Lots of changes to model creation helpers, close to finalizing feature extraction / interfaces
4 years ago
Ross Wightman 88129b2569 Add set_layer_config contextmgr to adjust all layer configs at once, use in create_module with new args. Remove a few old warning-causing constant annotations for jit.
5 years ago
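A sketch of the context-manager pattern being described; the flag names below are illustrative stand-ins, not necessarily timm's exact config keys.

```python
from contextlib import contextmanager

# illustrative global layer-config flags
_layer_config = {'scriptable': False, 'exportable': False, 'no_jit': False}

@contextmanager
def set_layer_config_sketch(**overrides):
    """Temporarily override all layer-config flags at once, restoring them on exit."""
    previous = dict(_layer_config)
    _layer_config.update({k: v for k, v in overrides.items() if v is not None})
    try:
        yield
    finally:
        _layer_config.clear()
        _layer_config.update(previous)

# usage: build a model with scriptable-friendly layers, then revert
# with set_layer_config_sketch(scriptable=True):
#     model = create_model(...)   # hypothetical call site
```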
Ross Wightman eb7653614f Monster commit, activation refactor, VoVNet, norm_act improvements, more
5 years ago
Ross Wightman 7df83258c9 Merge branch 'master' into densenet_update_and_more
5 years ago
Ross Wightman 17270c69b9 Remove annoying InceptionV3 dependency on scipy and insanely slow trunc_norm init. Bring InceptionV3 code into this codebase and use the upcoming torch trunc_normal_ init.
5 years ago
Ross Wightman 780860d140 Add norm_act factory method, move JIT of norm layers to factory
5 years ago
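To make the factory idea concrete, here is a minimal sketch (names illustrative): a fused norm + activation module plus a lookup that maps a string to the module class, so model code can stay agnostic about which norm/act pairing is used.

```python
import torch
import torch.nn as nn

class BatchNormAct2dSketch(nn.BatchNorm2d):
    """BatchNorm2d with a fused activation applied in forward()."""
    def __init__(self, num_features: int, act_layer=nn.ReLU, **kwargs):
        super().__init__(num_features, **kwargs)
        self.act = act_layer(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.act(super().forward(x))

_NORM_ACT_LAYERS = {'batchnorm': BatchNormAct2dSketch}

def get_norm_act_layer_sketch(name: str):
    """Factory: resolve a fused norm+act layer class from its name."""
    return _NORM_ACT_LAYERS[name.lower()]
```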
Ross Wightman 14edacdf9a DenseNet converted to support ABN (norm + act) modules. Experimenting with EvoNorm, IABN
5 years ago
Ross Wightman 2681a8d618 Final blurpool2d cleanup and add resnetblur50 weights, match tresnet Downsample arg order to BlurPool2d for interop
5 years ago
Ross Wightman 9590f301a9 Merge branch 'blur' of https://github.com/VRandme/pytorch-image-models into VRandme-blur
5 years ago
talrid 6209146738 TResNet models
5 years ago
Ross Wightman 1a8f5900ab Update EfficientNet feature extraction for EfficientDet. Add needed MaxPoolSame as well.
5 years ago
Chris Ha acd1b6cccd Implement Functional Blur on resnet.py
5 years ago
Ross Wightman f1d5f8a6c4 Update comments for Selective Kernel and DropBlock/Path impl, add skresnet34 weights
5 years ago
Ross Wightman f902bcd54c Layer refactoring continues, ResNet downsample rewrite for proper dilation in 3x3 and avg_pool cases
5 years ago
Ross Wightman a99ec4e7d1 A bunch more layer reorg, splitting many layers into their own files. Improve torchscript compatibility.
5 years ago
Ross Wightman 13746a33fc Big move, layer modules and fn to timm/models/layers
5 years ago
Ross Wightman 4defbbbaa8 Fix module name mistake, start layers sub-package
5 years ago