Ross Wightman
656757d26b
Fix MobileNetV2 head conv size for multiplier < 1.0. Add some missing modification copyrights, fix starting date of some old ones.
3 years ago
Ross Wightman
4f0f9cb348
Fix #954 by bringing traceable _assert into timm to allow compat w/ PyTorch < 1.8
3 years ago
Ross Wightman
925e102982
Update attention / self-attn based models from a series of experiments:
...
* remove dud attention: involution + my swin attention adaptation don't seem worth keeping
* add or update several new 26/50 layer ResNe(X)t variants that were used in experiments
* remove models associated with dead-end or uninteresting experiment results
* weights coming soon...
3 years ago
Ross Wightman
307a935b79
Add non-local and BAT attention. Merge attn and self-attn factories into one. Add attention references to README. Add mlp 'mode' to ECA.
3 years ago
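The merged factory referenced in the commit above is exposed in released timm as create_attn; a minimal usage sketch, assuming the 'se' and 'eca' layer names:

```python
import torch
from timm.models.layers import create_attn

# Build channel-attention modules from a string name plus the channel count.
se = create_attn('se', 64)    # Squeeze-and-Excitation
eca = create_attn('eca', 64)  # Efficient Channel Attention

x = torch.randn(2, 64, 56, 56)
print(se(x).shape, eca(x).shape)  # both preserve (2, 64, 56, 56)
```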
Ross Wightman
742c2d5247
Add Gather-Excite and Global Context attn modules. Refactor existing SE-like attn for consistency and refactor byob/byoanet for less redundancy.
3 years ago
Ross Wightman
f45de37690
Merge branch 'master' into levit_visformer_rednet
4 years ago
Ross Wightman
d5af752117
Add preliminary gMLP and ResMLP impl to Mlp-Mixer
4 years ago
Ross Wightman
165fb354b2
Add initial RedNet model / Involution layer impl for testing
4 years ago
Ross Wightman
b2c305c2aa
Move Mlp and PatchEmbed modules into layers. Being used in lots of models now...
4 years ago
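With Mlp and PatchEmbed promoted to timm.models.layers, they can be reused outside the ViT code; a rough sketch, with argument names taken from the current signatures (treat them as assumptions for this revision):

```python
import torch
from timm.models.layers import Mlp, PatchEmbed

patch_embed = PatchEmbed(img_size=224, patch_size=16, in_chans=3, embed_dim=768)
mlp = Mlp(in_features=768, hidden_features=3072)

x = torch.randn(1, 3, 224, 224)
tokens = patch_embed(x)   # (1, 196, 768): 14x14 patches flattened to a token sequence
out = mlp(tokens)         # position-wise MLP, shape preserved
print(tokens.shape, out.shape)
```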
Ross Wightman
0d87650fea
Remove filter hack from BlurPool w/ non-persistent buffer. Use BlurPool2d instead of AntiAliasing.. for TResNet. Breaks PyTorch < 1.6.
4 years ago
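For context, BlurPool2d is the anti-aliased downsampling layer that replaces the older module here; a small standalone sketch (the constructor arguments are my assumption about the signature):

```python
import torch
from timm.models.layers import BlurPool2d

# Low-pass blur filter applied before stride-2 subsampling
# (Zhang, "Making Convolutional Networks Shift-Invariant Again").
pool = BlurPool2d(channels=64, filt_size=3, stride=2)

x = torch.randn(2, 64, 56, 56)
print(pool(x).shape)  # (2, 64, 28, 28)
```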
Ross Wightman
ce62f96d4d
ByoaNet with bottleneck transformer, lambda resnet, and halo net experiments
4 years ago
Ross Wightman
cf5fec5047
Cleanup experimental vit weight init a bit
4 years ago
Ross Wightman
678ba4e0a2
Add NFNet-F model weights ported from DeepMind Haiku impl and new set of models w/ compatible config.
4 years ago
Ross Wightman
9811e229f7
Fix regression in models with 1001 class pretrained weights. Improve batchnorm arg and BatchNormAct layer handling in several models.
4 years ago
Ross Wightman
5a8e1e643e
Initial Normalizer-Free Reg/ResNet impl. A bit of related layer refactoring.
4 years ago
Ross Wightman
231d04e91a
ResNetV2 pre-act and non-preact model, w/ BiT pretrained weights and support for ViT R50 model. Tweaks for in21k num_classes passing. More to do... tests failing.
4 years ago
Ross Wightman
2ed8f24715
A few more changes for 0.3.2 maint release. Linear layer change for mobilenetv3 and inception_v3, support no bias for linear wrapper.
4 years ago
Ross Wightman
f31933cb37
Initial Vision Transformer impl w/ patch and hybrid variants. Refactor tuple helpers.
4 years ago
Ross Wightman
b1f1a54de9
More uniform treatment of classifiers across all models, reduce code duplication.
4 years ago
Ross Wightman
3b9004bef9
Lots of changes to model creation helpers, close to finalizing feature extraction / interfaces
4 years ago
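The feature extraction interface being finalized here surfaces in released timm as the features_only flag on create_model; a usage sketch:

```python
import torch
import timm

# Backbone that returns intermediate feature maps instead of classification logits.
model = timm.create_model('resnet50', features_only=True, out_indices=(1, 2, 3, 4))

x = torch.randn(1, 3, 224, 224)
for fmap in model(x):                 # feature maps at strides 4, 8, 16, 32
    print(fmap.shape)
print(model.feature_info.channels())  # channel counts for each returned map
```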
Ross Wightman
88129b2569
Add set_layer_config contextmgr to adjust all layer configs at once, use in create_model with new args. Remove a few old warning-causing constant annotations for jit.
4 years ago
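A hedged sketch of the set_layer_config context manager described above; keyword names follow the current timm helpers, so the exact args at this commit may differ:

```python
import timm
from timm.models.layers import set_layer_config

# Adjust all layer-factory defaults at once while a model is constructed,
# e.g. to prefer torchscript-friendly layer implementations.
with set_layer_config(scriptable=True):
    model = timm.create_model('efficientnet_b0', pretrained=False)
```

As I read the plumbing, the new create_model args mentioned in the message (scriptable / exportable / no_jit) route through this same context manager.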
Ross Wightman
eb7653614f
Monster commit, activation refactor, VoVNet, norm_act improvements, more
...
* refactor activations into basic PyTorch, jit scripted, and memory-efficient custom autograd variants
* implement hard-mish, better grad for hard-swish
* add initial VoVNet V1/V2 impl, fix #151
* VoVNet and DenseNet are the first models to use NormAct layers (support BatchNormAct2d, EvoNorm, InplaceABN)
* Wrap IABN for any models that use it
* make more models torchscript compatible (DPN, PNasNet, Res2Net, SelecSLS) and add tests
4 years ago
Ross Wightman
7df83258c9
Merge branch 'master' into densenet_update_and_more
5 years ago
Ross Wightman
17270c69b9
Remove annoying InceptionV3 dependency on scipy and insanely slow trunc_norm init. Bring InceptionV3 code into this codebase and use the upcoming torch trunc_normal_ init.
5 years ago
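The torch init referenced above ships as torch.nn.init.trunc_normal_ (timm also bundles a copy in its layers); a quick sketch:

```python
import torch
from torch import nn

head = nn.Linear(768, 1000)
# Truncated normal init: values are kept within the [a, b] cutoff interval
# (defaults a=-2.0, b=2.0), replacing the slow scipy-based equivalent.
nn.init.trunc_normal_(head.weight, std=0.02)
nn.init.zeros_(head.bias)
```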
Ross Wightman
780860d140
Add norm_act factory method, move JIT of norm layers to factory
5 years ago
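As an illustration of the norm + act modules the factory builds, a minimal sketch using BatchNormAct2d directly (the factory's own name and signature at this point aren't shown here, so treat the details as assumptions):

```python
import torch
from torch import nn
from timm.models.layers import BatchNormAct2d

# BatchNorm2d with the activation folded into the same module, so models that
# take a single norm_act layer can swap in EvoNorm or InplaceABN variants
# without touching the model code.
norm_act = BatchNormAct2d(64, act_layer=nn.ReLU)

x = torch.randn(2, 64, 32, 32)
print(norm_act(x).shape)  # (2, 64, 32, 32)
```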
Ross Wightman
14edacdf9a
DenseNet converted to support ABN (norm + act) modules. Experimenting with EvoNorm, IABN
5 years ago
Ross Wightman
2681a8d618
Final blurpool2d cleanup and add resnetblur50 weights, match tresnet Downsample arg order to BlurPool2d for interop
5 years ago
Ross Wightman
9590f301a9
Merge branch 'blur' of https://github.com/VRandme/pytorch-image-models into VRandme-blur
5 years ago
talrid
6209146738
TResNet models
5 years ago
Ross Wightman
1a8f5900ab
Update EfficientNet feature extraction for EfficientDet. Add needed MaxPoolSame as well.
5 years ago
Chris Ha
acd1b6cccd
Implement Functional Blur in resnet.py
...
1. add ResNet argument blur=''
2. implement blur for maxpool and strided convs in downsampling blocks
5 years ago
Ross Wightman
f1d5f8a6c4
Update comments for Selective Kernel and DropBlock/Path impl, add skresnet34 weights
5 years ago
Ross Wightman
f902bcd54c
Layer refactoring continues, ResNet downsample rewrite for proper dilation in 3x3 and avg_pool cases
...
* select_conv2d -> create_conv2d (see the sketch after this entry)
* added create_attn to create attention module from string/bool/module
* factor padding helpers into own file, use in both conv2d_same and avg_pool2d_same
* add some more test eca resnet variants
* minor tweaks, naming, comments, consistency
5 years ago
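A hedged sketch of the create_conv2d factory renamed in the entry above; depending on the kwargs it returns a plain nn.Conv2d, a dynamic 'same'-padding variant, or mixed/cond conv implementations:

```python
import torch
from timm.models.layers import create_conv2d

# TF-style 'same' padding with stride 2 requires dynamic padding, so a
# Conv2dSame wrapper is selected instead of a plain nn.Conv2d.
conv = create_conv2d(32, 64, kernel_size=3, stride=2, padding='same')

x = torch.randn(1, 32, 224, 224)
print(conv(x).shape)  # (1, 64, 112, 112)
```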
Ross Wightman
a99ec4e7d1
A bunch more layer reorg, splitting many layers into their own files. Improve torchscript compatibility.
5 years ago
Ross Wightman
13746a33fc
Big move, layer modules and fn to timm/models/layers
5 years ago
Ross Wightman
4defbbbaa8
Fix module name mistake, start layers sub-package
5 years ago