Ross Wightman
b2c305c2aa
Move Mlp and PatchEmbed modules into layers. Being used in lots of models now...
4 years ago
Ross Wightman
d5473c17f7
Fix incorrect names of shortcut/identity paths in many residual nets. Inherited from naming in old torchvision, long since fixed there.
4 years ago
Ross Wightman
ddc743fdf8
Update ResNet-RS models to EMA weights
4 years ago
Ross Wightman
08d60f4a9a
Fix wrong pool sizing for resnetrs50
4 years ago
Ross Wightman
67d0665b46
Post ResNet-RS merge cleanup. Add weight urls, adjust train/test/crop pct.
4 years ago
Aman Arora
560eae38f5
[WIP] Add ResNet-RS models (#554)
...
* Add ResNet-RS models
* Only include resnet-rs changes
* remove whitespace diff
* EOF newline
* Update time
* increase time
* Add first conv
* Try running only resnetv2_101x1_bitm on Linux runner
* Add to exclude filter
* Run test_model_forward_features for all
* Add to exclude ftrs
* back to defaults
* only run test_forward_features
* run all tests
* Run all tests
* Add bigger resnetrs to model filters to fix GitHub CI
* Remove resnetv2_101x1_bitm from the feature-extraction exclude filters
* Remove hardcoded values
* Make sure reduction ratio in resnetrs is 0.25
* There is no bias in the replaced maxpool, so remove it
4 years ago
Ross Wightman
e15c3886ba
Default lambda r=7. Define '26t' stage 4/5 256x256 variants for all of the bot/halo/lambda nets for experiments. Add resnet50t for experiments. Fix a few comments.
4 years ago
Ross Wightman
d584e7f617
Support for huggingface hub via create_model and default_cfgs.
...
* improve consistency of model creation helper fns
* add comments to some of the model helpers
* support passing external default_cfgs so they can be sourced from hub
4 years ago
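The commit above wires model creation to hub-hosted configs. A minimal usage sketch of what that looks like from the caller's side, assuming the `hf-hub:` model-name prefix and the `timm/resnet50.a1_in1k` repo tag, both of which are illustrative here rather than taken from this commit:

import timm

# Create a model whose default_cfg (weight URL, input size, crop pct, ...) is
# sourced from a Hugging Face Hub repo instead of the built-in cfg table.
model = timm.create_model('hf-hub:timm/resnet50.a1_in1k', pretrained=True)
model.eval()

print(model.default_cfg)  # cfg dict resolved from the hub repo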
Ross Wightman
3b57490a63
Fix some half-removed resnet model defs and pooling for ecaresnet269d
4 years ago
Ross Wightman
68a4144882
Add new weights for ecaresnet26t/50t/269d models. Remove the distinction between 't' and 'tn' (tiered models); 'tn' is now 't'. Add test-time img size spec to default cfg.
4 years ago
Ross Wightman
22748f1a2d
Convert samples/targets in ParserImageInTar to numpy arrays, slightly less mem usage for massive datasets. Add a few more se/eca model defs to resnet.py
4 years ago
Ross Wightman
4e2533db77
Add 320x320 model default cfgs for 101D and 152D ResNets. Add SEResNet-152D weights and 320x320 cfg.
4 years ago
Ross Wightman
392595c7eb
Add pool_size to default cfgs for new models to prevent tests from failing. Add explicit 200D_320 model entrypoint for next benchmark run.
4 years ago
Ross Wightman
b1f1228a41
Add ResNet101D, 152D, and 200D weights, remove meh 66d model
4 years ago
Ross Wightman
c40384f5bd
Add ResNet weights. 80.5 (top-1) ResNet-50-D, 77.1 ResNet-34-D, 72.7 ResNet-18-D.
4 years ago
Ross Wightman
33f8a1bf36
Updated README, add wide_resnet50_2 and seresnext50_32x4d weights
4 years ago
Yusuke Uchida
f6b56602f9
fix test_model_default_cfgs
4 years ago
Ross Wightman
b1f1a54de9
More uniform treatment of classifiers across all models, reduce code duplication.
4 years ago
Ross Wightman
d72ddafe56
Fix some checkpoint / model str regressions
4 years ago
Ross Wightman
9ecd16bd7b
Add new seresnet50 (non-legacy) model weights, 80.274 top-1
4 years ago
Ross Wightman
6c17d57a2c
Fix some attributions, add copyrights to some file docstrings
4 years ago
Ross Wightman
3b9004bef9
Lots of changes to model creation helpers, close to finalizing feature extraction / interfaces
4 years ago
Ross Wightman
f122f0274b
Significant ResNet refactor:
...
* stage creation + make_layer moved to separate fn with more sensible dilation/output_stride calc
* drop path rate decay is easy to implement with the refactored block creation loops (see the sketch after this entry)
* fix dilation + blur pool combo
4 years ago
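The drop path bullet above refers to stochastic depth with a per-block drop rate that ramps up linearly over depth. A rough sketch of that idea in plain PyTorch; the helper names are illustrative, not timm's actual ResNet internals:

import torch

def drop_path(x, drop_prob: float = 0., training: bool = False):
    # Randomly zero the whole residual branch per sample (stochastic depth).
    if drop_prob == 0. or not training:
        return x
    keep_prob = 1. - drop_prob
    # one mask value per sample, broadcast over the remaining dims
    mask = x.new_empty((x.shape[0],) + (1,) * (x.ndim - 1)).bernoulli_(keep_prob)
    return x * mask / keep_prob

def block_drop_rates(num_blocks: int, drop_path_rate: float = 0.1):
    # later blocks get a larger drop rate, ramping linearly from 0 to drop_path_rate
    return [drop_path_rate * i / max(num_blocks - 1, 1) for i in range(num_blocks)]

print(block_drop_rates(6, 0.1))                       # 0.0 ... 0.1 across 6 blocks
print(drop_path(torch.randn(4, 8), 0.2, True).shape)  # torch.Size([4, 8])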
Ross Wightman
a66df5fb91
More model feature extraction support, start to deprecate senet.py, dilations added to regnet, add proper aligned xception
4 years ago
Ross Wightman
7729f40dca
Fix another bug, update all gluon resnet models to use new creation method (feature support)
4 years ago
Ross Wightman
d0113f9cdb
Fix a few issues that came up in tests
4 years ago
Ross Wightman
d23a2697d0
Working on feature extraction, interfaces refined, a number of models working, some in progress.
4 years ago
Ross Wightman
eb7653614f
Monster commit, activation refactor, VoVNet, norm_act improvements, more
...
* refactor activations into basic PyTorch, jit-scripted, and memory-efficient custom-autograd variants
* implement hard-mish, better grad for hard-swish
* add initial VovNet V1/V2 impl, fix #151
* VovNet and DenseNet first models to use NormAct layers (support BatchNormAct2d, EvoNorm, InplaceIABN)
* Wrap IABN for any models that use it
* make more models torchscript compatible (DPN, PNasNet, Res2Net, SelecSLS) and add tests
5 years ago
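A plain-PyTorch sketch of the hard activations mentioned in the commit above; the actual timm versions also ship jit-scripted and memory-efficient custom-autograd variants that are not reproduced here:

import torch
import torch.nn.functional as F

def hard_swish(x):
    # x * hard_sigmoid(x), where relu6(x + 3) / 6 approximates sigmoid(x)
    return x * F.relu6(x + 3.) / 6.

def hard_mish(x):
    # piecewise-quadratic approximation of mish: 0.5 * x * clamp(x + 2, 0, 2)
    return 0.5 * x * (x + 2.).clamp(min=0., max=2.)

x = torch.linspace(-4, 4, 9)
print(hard_swish(x))
print(hard_mish(x))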
Vyacheslav Shults
a7ebe09029
Replace None with nn.Identity() in every model's reset_classifier when a falsy num_classes is given.
...
Minor code refactoring.
5 years ago
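A minimal sketch of the pattern described above, assuming the usual timm-style reset_classifier signature; the surrounding model class is hypothetical:

import torch.nn as nn

class TinyNet(nn.Module):
    def __init__(self, num_features=512, num_classes=1000):
        super().__init__()
        self.num_features = num_features
        self.fc = nn.Linear(num_features, num_classes)

    def reset_classifier(self, num_classes):
        self.num_classes = num_classes
        # A falsy num_classes (0 or None) means a headless model; use nn.Identity()
        # instead of None so forward() never needs to special-case the head.
        self.fc = nn.Linear(self.num_features, num_classes) if num_classes else nn.Identity()

model = TinyNet()
model.reset_classifier(0)  # features now pass straight through the (identity) head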
Ross Wightman
2681a8d618
Final blurpool2d cleanup and add resnetblur50 weights, match tresnet Downsample arg order to BlurPool2d for interop
5 years ago
Ross Wightman
9590f301a9
Merge branch 'blur' of https://github.com/VRandme/pytorch-image-models into VRandme-blur
5 years ago
Ross Wightman
0834fbc01c
Move pruned model adapt strings to separate txt files. A few minor formatting alignment tweaks.
5 years ago
AFLALO, Jonathan Isaac
07f19dd699
added eca resnet
5 years ago
Chris Ha
06a50a94a8
Fix minor typos in create_attn.py and resnet.py
...
'eca' -> 'ceca'
and
'doest not' -> 'does not'
5 years ago
Ross Wightman
6cdeca24a3
Some cleanup and fixes for initial BlurPool impl. Still some testing and tweaks to go...
5 years ago
Chris Ha
acd1b6cccd
Implement Functional Blur on resnet.py
...
1. add ResNet argument blur=''
2. implement blur for maxpool and strided convs in downsampling blocks
5 years ago
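The two bullets above amount to anti-aliased downsampling: blur with a fixed binomial kernel before striding (Zhang's BlurPool). A rough sketch of that idea, not the exact timm BlurPool2d implementation:

import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleBlurPool2d(nn.Module):
    # Fixed 3x3 binomial blur followed by strided subsampling (anti-aliased pooling).
    def __init__(self, channels, stride=2):
        super().__init__()
        self.stride = stride
        k = torch.tensor([1., 2., 1.])
        k = k[:, None] * k[None, :]
        k = k / k.sum()
        # the same kernel for every channel, applied depthwise
        self.register_buffer('kernel', k.expand(channels, 1, 3, 3).contiguous())

    def forward(self, x):
        x = F.pad(x, (1, 1, 1, 1), mode='reflect')
        return F.conv2d(x, self.kernel, stride=self.stride, groups=x.shape[1])

# e.g. replace maxpool(stride 2) with maxpool(stride 1) + blur-pool(stride 2)
pool = nn.Sequential(nn.MaxPool2d(3, stride=1, padding=1), SimpleBlurPool2d(64))
print(pool(torch.randn(1, 64, 56, 56)).shape)  # torch.Size([1, 64, 28, 28])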
Ross Wightman
5a16c533ff
Add better resnext50_32x4d weights trained by andravin
5 years ago
Ross Wightman
f1d5f8a6c4
Update comments for Selective Kernel and DropBlock/Path impl, add skresnet34 weights
5 years ago
Ross Wightman
f902bcd54c
Layer refactoring continues, ResNet downsample rewrite for proper dilation in 3x3 and avg_pool cases
...
* select_conv2d -> create_conv2d
* added create_attn to create attention module from string/bool/module
* factor padding helpers into own file, use in both conv2d_same and avg_pool2d_same
* add some more test eca resnet variants
* minor tweaks, naming, comments, consistency
5 years ago
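A hedged sketch of the create_attn idea from the bullets above: resolve an attention layer from a string, bool, or module class. The real timm helper knows about more attention types; the names below are simplified:

import torch.nn as nn

class SqueezeExcite(nn.Module):
    def __init__(self, channels, rd_ratio=0.25):
        super().__init__()
        rd = max(1, int(channels * rd_ratio))
        self.fc1 = nn.Conv2d(channels, rd, 1)
        self.act = nn.ReLU(inplace=True)
        self.fc2 = nn.Conv2d(rd, channels, 1)

    def forward(self, x):
        s = x.mean((2, 3), keepdim=True)            # global average pool
        return x * self.fc2(self.act(self.fc1(s))).sigmoid()

def create_attn(attn, channels):
    # string -> look up by name, True -> default SE, class -> instantiate, falsy -> None
    if not attn:
        return None
    if isinstance(attn, str):
        return {'se': SqueezeExcite}[attn.lower()](channels)
    if attn is True:
        return SqueezeExcite(channels)
    return attn(channels)  # assume attn is an nn.Module class taking a channel count

print(create_attn('se', 64))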
Ross Wightman
13746a33fc
Big move, layer modules and fn to timm/models/layers
5 years ago
Ross Wightman
f54612f648
Merge branch 'select_kernel' into attention
5 years ago
Ross Wightman
4defbbbaa8
Fix module name mistake, start layers sub-package
5 years ago
Chris Ha
db91ba053b
EcaModule(CamelCase)
...
CamelCased EcaModule.
Renamed all instances of ecalayer to EcaModule.
eca_module.py->EcaModule.py
5 years ago
Chris Ha
f87fcd7e88
Implement Eca modules
...
implement ECA module by
1. adopting original eca_module.py into models folder
2. adding a use_eca layer beside every instance of the SE layer
5 years ago
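A short sketch of the ECA mechanism being added here: global average pool, a cheap 1D conv across the channel dimension, then a sigmoid gate. The adaptive kernel-size selection from the channel count is omitted for brevity:

import torch
import torch.nn as nn

class EcaSketch(nn.Module):
    # Efficient Channel Attention: a per-channel gate from a single 1D conv.
    def __init__(self, kernel_size=3):
        super().__init__()
        self.conv = nn.Conv1d(1, 1, kernel_size, padding=kernel_size // 2, bias=False)

    def forward(self, x):                  # x: (B, C, H, W)
        y = x.mean((2, 3))                 # (B, C) global average pool
        y = self.conv(y.unsqueeze(1))      # 1D conv across channels: (B, 1, C)
        return x * y.squeeze(1).sigmoid()[:, :, None, None]

x = torch.randn(2, 64, 8, 8)
print(EcaSketch()(x).shape)                # torch.Size([2, 64, 8, 8])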
Ross Wightman
a9d2424fd1
Add separate zero_init_last_bn function to support more block variety without a mess
5 years ago
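A hedged sketch of what a zero_init_last_bn hook typically does: zero the final BatchNorm weight (gamma) in each residual block so the residual branch contributes nothing at init and the block starts as a near identity. The block and attribute names below are illustrative, not timm's actual definitions:

import torch.nn as nn

class BasicBlockSketch(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.act = nn.ReLU(inplace=True)

    def zero_init_last_bn(self):
        # residual branch outputs zero at init => block ~ identity at the start of training
        nn.init.zeros_(self.bn2.weight)

    def forward(self, x):
        out = self.act(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return self.act(out + x)

block = BasicBlockSketch(64)
block.zero_init_last_bn()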
Ross Wightman
9f11b4e8a2
Add ConvBnAct layer to parallel integrated SelectKernelConv, add support for DropPath and DropBlock to ResNet base and SK blocks
5 years ago
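A minimal sketch of the conv + norm + activation composite referenced above; the argument set is trimmed relative to timm's ConvBnAct and is only meant to show the pattern:

import torch.nn as nn

class ConvBnAct(nn.Module):
    # kxk conv followed by BatchNorm and an activation, the basic convnet building unit
    def __init__(self, in_chs, out_chs, kernel_size=3, stride=1, dilation=1, act_layer=nn.ReLU):
        super().__init__()
        padding = dilation * (kernel_size - 1) // 2
        self.conv = nn.Conv2d(in_chs, out_chs, kernel_size, stride=stride,
                              padding=padding, dilation=dilation, bias=False)
        self.bn = nn.BatchNorm2d(out_chs)
        self.act = act_layer(inplace=True)

    def forward(self, x):
        return self.act(self.bn(self.conv(x)))

layer = ConvBnAct(64, 128, stride=2)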
Ross Wightman
cefc9b7761
Move SelectKernelConv to conv2d_layers and more
...
* always apply attention in SelectKernelConv, leave MixedConv as the no-attention alternative
* make MixedConv torchscript compatible
* refactor first/previous dilation name to make more sense in ResNet* networks
5 years ago
Ross Wightman
58e28dc7e7
Move Selective Kernel blocks/convs to their own sknet.py file
5 years ago
Ross Wightman
a93bae6dc5
A SelectiveKernelBasicBlock for more experiments
5 years ago
Ross Wightman
ad087b4b17
Missed bias=False in selection conv
5 years ago