Commit Graph

748 Commits (3a429d04ee6790322496c56511cabbff91f49a3d)

Author SHA1 Message Date
Ross Wightman dc376e3676 Ensure all model entrypoint fn default to `pretrained=False` (a few didn't)
2 years ago
Ross Wightman 23b102064a Add cs3sedarknet_x weights w/ 82.65 @ 288 top1. Add 2 cs3 edgenet models (w/ 3x3-1x1 block), remove aa from cspnet blocks (not needed)
2 years ago
Ross Wightman 05313940e2 Add cs3darknet_x, cs3sedarknet_l, and darknetaa53 weights from TPU sessions. Move SE btwn conv1 & conv2 in DarkBlock. Improve SE/attn handling in Csp/DarkNet. Fix leaky_relu bug on older csp models.
2 years ago
nateraw 51cca82aa1 👽 use hf_hub_download instead of cached_download
2 years ago
Ross Wightman a45b4bce9a x and xx small edgenext models do benefit from larger test input size
2 years ago
Ross Wightman a8e34051c1 Unbreak gamma remap impacting beit checkpoint load, version bump to 0.6.4
2 years ago
Ross Wightman a1cb25066e Add edgenext_small_rw weights trained with swin like recipe. Better than original 'small' but not the recent 'USI' distilled weights.
2 years ago
Ross Wightman 7c7ecd2492 Add --use-train-size flag to force use of train input_size (over test input size) for validation. Default test-time pooling to use train input size (fixes issues).
2 years ago
Ross Wightman ce65a7b29f Update vit_relpos w/ some additional weights, some cleanup to match recent vit updates, more MLP log coord experiments.
2 years ago
Ross Wightman 58621723bd Add CrossStage3 DarkNet (cs3) weights
2 years ago
Ross Wightman db0cee9910 Refactor cspnet configuration using dataclasses, update feature extraction for new cs3 variants.
2 years ago
Ross Wightman eca09b8642 Add MobileVitV2 support. Fix #1332. Move GroupNorm1 to common layers (used in poolformer + mobilevitv2). Keep old custom ConvNeXt LayerNorm2d impl as LayerNormExp2d for reference.
2 years ago
Ross Wightman 06307b8b41 Remove experimental in-block downsample support in ConvNeXt. Experiment further before keeping it in.
2 years ago
Ross Wightman 7d4b3807d5 Support DeiT-3 (Revenge of the ViT) checkpoints. Add non-overlapping (w/ class token) pos-embed support to vit.
2 years ago
Ross Wightman d0c5bd5722 Rename cs2->cs3 for darknets. Fix features_only for cs3 darknets.
2 years ago
Ross Wightman d765305821 Remove first_conv for resnetaa50 def
2 years ago
Ross Wightman dd9b8f57c4 Add feature_info to edgenext for features_only support, hopefully fix some fx / test errors
2 years ago
Ross Wightman 377e9bfa21 Add TPU trained darknet53 weights. Add missing pretrain_cfg for some csp/darknet models.
2 years ago
Ross Wightman c170ba3173 Add weights for resnet10t, resnet14t, and resnetaa50 models. Fix #1314
2 years ago
Ross Wightman 188c194b0f Left some experimental stem code in convnext by mistake
2 years ago
Ross Wightman 6064d16a2d Add initial EdgeNeXt import. Significant cleanup / reorg (like ConvNeXt). Fix #1320
2 years ago
Ross Wightman 7a9c6811c9 Add eps arg to LayerNorm2d, add 'tf' (tensorflow) variant of trunc_normal_ that applies scale/shift after sampling (instead of needing to move a/b)
2 years ago
Ross Wightman 82c311d082 Add more experimental darknet and 'cs2' darknet variants (different cross stage setup, closer to newer YOLO backbones) for train trials.
2 years ago
Ross Wightman a050fde5cd Add resnet10t (basic block) and resnet14t (bottleneck) with 1,1,1,1 repeats
2 years ago
Ross Wightman e6d7df40ec no longer any point using kwargs for pretrain_cfg resolve, just pass explicit arg
2 years ago
Ross Wightman 07d0c4ae96 Improve repr for DropPath module
2 years ago
Ross Wightman e27c16b8a0 Remove unnecessary code for sync-bn guard
2 years ago
Ross Wightman 0da3c9ebbf Remove SiLU layer in default args that breaks import on older PyTorch
2 years ago
Ross Wightman 7d657d2ef4 Improve resolve_pretrained_cfg behaviour when no cfg exists, warn instead of crash. Improve usability ex #1311
2 years ago
Ross Wightman 879df47c0a Support BatchNormAct2d for sync-bn use. Fix #1254
2 years ago
Ross Wightman 4b30bae67b Add updated vit_relpos weights, and impl w/ support for official swin-v2 differences for relpos. Add bias control support for MLP layers
3 years ago
Ross Wightman d4c0588012 Remove persistent buffers from Swin-V2. Change SwinV2Cr cos attn + tau/logit_scale to match official, add ckpt convert, init_value zeros resid LN weight by default
3 years ago
Ross Wightman 27c42f0830 Fix torchscript use for official Swin-V2, add support for non-square window/shift to WindowAttn/Block
3 years ago
Ross Wightman c0211b0bf7 Swin-V2 test fixes, typo
3 years ago
Ross Wightman 9a86b900fa Official SwinV2 models
3 years ago
Ross Wightman d07d015173 Merge pull request #1249 from okojoalg/sequencer
3 years ago
Ross Wightman 39b725e1c9 Fix tests for rank-4 output where feature channels dim is -1 (3) and not 1
3 years ago
Ross Wightman 78a32655fa Fix poolformer group_matcher to merge proj downsample with previous block, support coarse
3 years ago
Ross Wightman d79f3d9d1e Fix torchscript use for sequencer, add group_matcher, forward_head support, minor formatting
3 years ago
Ross Wightman 37b6920df3 Fix group_matcher regex for regnet.py
3 years ago
okojoalg 93a79a3dd9 Fix num_features in Sequencer
3 years ago
okojoalg 578d52e752 Add Sequencer
3 years ago
Ross Wightman f5ca4141f7 Adjust arg order for recent vit model args, add a few comments
3 years ago
Ross Wightman 41dc49a337 Vision Transformer refactoring and Rel Pos impl
3 years ago
Ross Wightman b7cb8d0337 Add Swin-V2 Small-NS weights (83.5 @ 224). Add layer scale like 'init_values' via post-norm LN weight scaling
3 years ago
jjsjann123 f88c606fcf fixing channels_last on cond_conv2d; update nvfuser debug env variable
3 years ago
Li Dong 09e9f3defb migrate azure blob for beit checkpoints
3 years ago
Ross Wightman 52ac881402 Missed first_conv in latest seresnext 'D' default_cfgs
3 years ago
Ross Wightman 7629d8264d Add two new SE-ResNeXt101-D 32x8d weights, one anti-aliased and one not. Reshuffle default_cfgs vs model entrypoints for resnet.py so they are better aligned.
3 years ago
SeeFun 8f0bc0591e fix convnext args
3 years ago
Ross Wightman c5a8e929fb Add initial swinv2 tiny / small weights
3 years ago
Ross Wightman f670d98cb8 Make a few more layers symbolically traceable (remove from FX leaf modules)
3 years ago
SeeFun ec4e9aa5a0 Add ConvNeXt tiny and small pretrain in22k
3 years ago
Ross Wightman 575924ed60 Update test crop for new RegNet-V weights to match Y
3 years ago
Ross Wightman 1618527098 Add layer scale and parallel blocks to vision_transformer
3 years ago
Ross Wightman c42be74621 Add attrib / comments about Swin-S3 (AutoFormerV2) weights
3 years ago
Ross Wightman 474ac906a2 Add 'head norm first' convnext_tiny_hnf weights
3 years ago
Ross Wightman dc51334cdc Fix pruned adapt for EfficientNet models that are now using BatchNormAct layers
3 years ago
Ross Wightman 024fc4d9ab version 0.6.1 for master
3 years ago
Ross Wightman e1e037ba52 Fix bad tuple typing fix that was on XLA branch but missed on master merge
3 years ago
Ross Wightman fe457c1996 Update SwinTransformerV2Cr post-merge, update with grad checkpointing / grad matcher
3 years ago
Ross Wightman b049a5c5c6 Merge remote-tracking branch 'origin/master' into norm_norm_norm
3 years ago
Ross Wightman 9440a50c95 Merge branch 'mrT23-master'
3 years ago
Ross Wightman d98aa47d12 Revert ml-decoder changes to model factory and train script
3 years ago
Ross Wightman b20665d379 Merge pull request #1007 from qwertyforce/patch-1
3 years ago
Ross Wightman 61d3493f87 Fix hf-hub handling when hf-hub is config source
3 years ago
Ross Wightman 5f47518f27 Fix pit implementation to be closer to deit/levit re distillation head handling
3 years ago
Ross Wightman 0862e6ebae Fix correctness of some group matching regex (no impact on result), some formatting, missed forward_head for resnet
3 years ago
Ross Wightman 94bcdebd73 Add latest weights trained on TPU-v3 VM instances
3 years ago
Ross Wightman 0557c8257d Fix bug introduced in non layer_decay weight_decay application. Remove debug print, fix arg desc.
3 years ago
Ross Wightman 372ad5fa0d Significant model refactor and additions:
3 years ago
Ross Wightman 1420c118df Missed committing outstanding changes to default_cfg keys and test exclusions for swin v2
3 years ago
Ross Wightman c6e4b7895a Swin V2 CR impl refactor.
3 years ago
Christoph Reich 67d140446b Fix bug in classification head
3 years ago
Christoph Reich 29add820ac Refactor (back to relative imports)
3 years ago
Christoph Reich 74a04e0016 Add parameter to change normalization type
3 years ago
Christoph Reich 2a4f6c13dd Create model functions
3 years ago
Christoph Reich 87b4d7a29a Add get and reset classifier method
3 years ago
Christoph Reich ff5f6bcd6c Check input resolution
3 years ago
Christoph Reich 81bf0b4033 Change parameter names to match Swin V1
3 years ago
Christoph Reich f227b88831 Add initials (CR) to model and file
3 years ago
Christoph Reich 90dc74c450 Add code from https://github.com/ChristophReich1996/Swin-Transformer-V2 and change docstring style to match timm
3 years ago
Ross Wightman 2c3870e107 semobilevit_s for good measure
3 years ago
Ross Wightman 58ba49c8ef Add MobileViT models (w/ ByobNet base). Close #1038.
3 years ago
Ross Wightman 5f81d4de23 Move DeiT to own file, vit getting crowded. Working towards fixing #1029, make pooling interface for transformers and mlp closer to convnets. Still working through some details...
3 years ago
Ross Wightman 95cfc9b3e8 Merge remote-tracking branch 'origin/master' into norm_norm_norm
3 years ago
Ross Wightman abc9ba2544 Transitioning default_cfg -> pretrained_cfg. Improving handling of pretrained_cfg source (HF-Hub, files, timm config, etc). Checkpoint handling tweaks.
3 years ago
Ross Wightman 07379c6d5d Add vit_base2_patch32_256 for a model between base_patch16 and patch32 with a slightly larger img size and width
3 years ago
Ross Wightman 83b40c5a58 Last batch of small model weights (for now). mobilenetv3_small 050/075/100 and updated mnasnet_small with lambc/lamb optimizer.
3 years ago
Ross Wightman 1aa617cb3b Add AvgPool2d anti-aliasing support to ResNet arch (as per OpenAI CLIP models), add a few blur aa models as well
3 years ago
Ross Wightman 010b486590 Add Dino pretrained weights (no head) for vit models. Add support to tests and helpers for models w/ no classifier (num_classes=0 in pretrained cfg)
3 years ago
Ross Wightman 738a9cd635 unbiased=False for torch.var_mean path of ConvNeXt LN. Fix #1090
3 years ago
Ross Wightman e0c4eec4b6 Default conv_mlp to False across the board for ConvNeXt, causing issues on more setups than it's improving right now...
3 years ago
Ross Wightman b669f4a588 Add ConvNeXt 22k->1k fine-tuned and 384 22k-1k fine-tuned weights after testing
3 years ago
Ross Wightman e967c72875 Update README.md. Sneak in g/G (giant / gigantic?) ViT defs from scaling paper
3 years ago
Ross Wightman 9ca3437178 Add some more small model weights lcnet, mnas, mnv2
3 years ago
Ross Wightman fa81164378 Fix stem width for really small mobilenetv3 arch defs
3 years ago
Ross Wightman edd3d73695 Add missing dropout for head reset in ConvNeXt default head
3 years ago
Ross Wightman b093dcb46d Some convnext cleanup, remove in place mul_ for gamma, breaking symbolic trace, cleanup head a bit...
3 years ago
Ross Wightman 18934debc5 Add initial ConvNeXt impl (mods of official code)
3 years ago