Commit Graph

26 Commits (79640fcc1f72241431b534420f2f7c9868157e93)

Author SHA1 Message Date
Ross Wightman  ce62f96d4d  ByoaNet with bottleneck transformer, lambda resnet, and halo net experiments  4 years ago
Ross Wightman  cf5fec5047  Cleanup experimental vit weight init a bit  4 years ago
Ross Wightman  678ba4e0a2  Add NFNet-F model weights ported from DeepMind Haiku impl and new set of models w/ compatible config.  4 years ago
Ross Wightman  9811e229f7  Fix regression in models with 1001 class pretrained weights. Improve batchnorm arg and BatchNormAct layer handling in several models.  4 years ago
Ross Wightman  5a8e1e643e  Initial Normalizer-Free Reg/ResNet impl. A bit of related layer refactoring.  4 years ago
Ross Wightman  231d04e91a  ResNetV2 pre-act and non-preact model, w/ BiT pretrained weights and support for ViT R50 model. Tweaks for in21k num_classes passing. More to do... tests failing.  4 years ago
Ross Wightman  2ed8f24715  A few more changes for 0.3.2 maint release. Linear layer change for mobilenetv3 and inception_v3, support no bias for linear wrapper.  4 years ago
Ross Wightman  f31933cb37  Initial Vision Transformer impl w/ patch and hybrid variants. Refactor tuple helpers.  4 years ago
Ross Wightman  b1f1a54de9  More uniform treatment of classifiers across all models, reduce code duplication.  4 years ago
Ross Wightman  3b9004bef9  Lots of changes to model creation helpers, close to finalizing feature extraction / interfaces  4 years ago
Ross Wightman  88129b2569  Add set_layer_config contextmgr to adjust all layer configs at once, use in create_model with new args. Remove a few old warning-causing constant annotations for jit.  5 years ago
Ross Wightman  eb7653614f  Monster commit, activation refactor, VoVNet, norm_act improvements, more  5 years ago
Ross Wightman  7df83258c9  Merge branch 'master' into densenet_update_and_more  5 years ago
Ross Wightman  17270c69b9  Remove annoying InceptionV3 dependency on scipy and insanely slow trunc_norm init. Bring InceptionV3 code into this codebase and use upcoming torch trunc_normal_ init.  5 years ago
Ross Wightman  780860d140  Add norm_act factory method, move JIT of norm layers to factory  5 years ago
Ross Wightman  14edacdf9a  DenseNet converted to support ABN (norm + act) modules. Experimenting with EvoNorm, IABN  5 years ago
Ross Wightman  2681a8d618  Final blurpool2d cleanup and add resnetblur50 weights, match tresnet Downsample arg order to BlurPool2d for interop  5 years ago
Ross Wightman  9590f301a9  Merge branch 'blur' of https://github.com/VRandme/pytorch-image-models into VRandme-blur  5 years ago
talrid  6209146738  TResNet models  5 years ago
Ross Wightman  1a8f5900ab  Update EfficientNet feature extraction for EfficientDet. Add needed MaxPoolSame as well.  5 years ago
Chris Ha  acd1b6cccd  Implement Functional Blur on resnet.py  5 years ago
Ross Wightman  f1d5f8a6c4  Update comments for Selective Kernel and DropBlock/Path impl, add skresnet34 weights  5 years ago
Ross Wightman  f902bcd54c  Layer refactoring continues, ResNet downsample rewrite for proper dilation in 3x3 and avg_pool cases  5 years ago
Ross Wightman  a99ec4e7d1  A bunch more layer reorg, splitting many layers into own files. Improve torchscript compatibility.  5 years ago
Ross Wightman  13746a33fc  Big move, layer modules and fn to timm/models/layers  5 years ago
Ross Wightman  4defbbbaa8  Fix module name mistake, start layers sub-package  5 years ago