Commit Graph

799 Commits (8ab573cd2637a0a18e7f4cd799d1ff790e03989e)

Author SHA1 Message Date
Ross Wightman 8ab573cd26 Add convnext_tiny and convnext_small 384x384 fine-tunes of in12k weights, fix pool size for laion CLIP convnext weights
2 years ago
Ross Wightman e9aac412de Correct mean/std for CLIP convnexts
2 years ago
Ross Wightman 42bd8f7bcb Add convnext_base CLIP image tower weights for fine-tuning / features
2 years ago
Ross Wightman a2c14c2064 Add tiny/small in12k pretrained and fine-tuned ConvNeXt models
2 years ago
Ross Wightman 2e83bba142 Revert head norm changes to ConvNeXt as it broke some downstream use, alternate workaround for fcmae weights
2 years ago
Ross Wightman 1825b5e314 maxxvit type
2 years ago
Ross Wightman 5078b28f8a More kwarg handling tweaks, maxvit_base_rw def added
2 years ago
Ross Wightman c0d7388a1b Improving kwarg merging in more models
2 years ago
Ross Wightman 60ebb6cefa Re-order vit pretrained entries for more sensible default weights (no .tag specified)
2 years ago
Ross Wightman e861b74cf8 Pass through --model-kwargs (and --opt-kwargs for train) from command line through to model __init__. Update some models to improve arg overlay. Cleanup along the way.
2 years ago
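The commit above wires --model-kwargs (and --opt-kwargs for training) from the command line into model construction. A minimal sketch of how such key=value pass-through can work, assuming timm is installed; the ParseKwargs class and flag syntax here are illustrative, not necessarily the exact implementation behind this commit:

import argparse
import ast

import timm  # assumes the timm package is installed


class ParseKwargs(argparse.Action):
    # Turn space-separated key=value pairs into a dict (illustrative sketch only).
    def __call__(self, parser, namespace, values, option_string=None):
        kwargs = {}
        for pair in values:
            key, _, value = pair.partition('=')
            try:
                kwargs[key] = ast.literal_eval(value)  # numbers, bools, tuples, ...
            except (ValueError, SyntaxError):
                kwargs[key] = value  # fall back to the raw string
        setattr(namespace, self.dest, kwargs)


parser = argparse.ArgumentParser()
parser.add_argument('--model', default='resnet50')
parser.add_argument('--model-kwargs', nargs='*', default={}, action=ParseKwargs)
args = parser.parse_args(['--model', 'convnext_tiny', '--model-kwargs', 'drop_path_rate=0.1'])

# The parsed dict is overlaid onto the model constructor via create_model(**kwargs).
model = timm.create_model(args.model, pretrained=False, **args.model_kwargs)

Parsing values with ast.literal_eval lets numeric and boolean overrides keep their types instead of arriving at __init__ as strings.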
Ross Wightman add3fb864e Working on improved model card template for push_to_hf_hub
2 years ago
Ross Wightman 6e5553da5f Add ConvNeXt-V2 support (model additions and weights) (#1614)
2 years ago
Ross Wightman 6902c48a5f Fix ResNet based models to work w/ norm layers w/o affine params. Reformat long arg lists into vertical form.
2 years ago
Ross Wightman 8ece53e194 Switch BEiT to HF hub weights
2 years ago
Ross Wightman 9a51e4ea2e Add FlexiViT models and weights, refactoring, push more weights
2 years ago
Ross Wightman 656e1776de Convert mobilenetv3 to multi-weight, tweak PretrainedCfg metadata
2 years ago
Ross Wightman 6a01101905 Update efficientnet.py and convnext.py to multi-weight, add ImageNet-12k pretrained EfficientNet-B5 and ConvNeXt-Nano.
2 years ago
Ross Wightman d5e7d6b27e Merge remote-tracking branch 'origin/main' into refactor-imports
2 years ago
Ross Wightman cda39b35bd Add a deprecation phase to module re-org
2 years ago
Ross Wightman 7c4ed4d5a4 Add EVA-large models
2 years ago
Ross Wightman 98047ef5e3 Add EVA FT results, hopefully fix BEiT test failures
2 years ago
Ross Wightman 3cc4d7a894 Fix missing register for 224 eva model
2 years ago
Ross Wightman eba07b0de7 Add eva models to beit.py
2 years ago
Ross Wightman 927f031293 Major module / path restructure, timm.models.layers -> timm.layers, add _ prefix to all non model modules in timm.models
2 years ago
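The restructure above moved the layer modules from timm.models.layers to timm.layers (with the deprecation phase noted a few entries earlier in this log keeping the old path importable for a while). A minimal sketch of the resulting import change, assuming a timm version that includes the move and has torch available:

import torch

# Old location (kept importable during the deprecation phase):
#   from timm.models.layers import DropPath, trunc_normal_
# New location after the timm.models.layers -> timm.layers move:
from timm.layers import DropPath, trunc_normal_

drop = DropPath(drop_prob=0.1)        # stochastic-depth layer
x = trunc_normal_(torch.empty(4, 8))  # truncated-normal init helper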
Ross Wightman 3785c234d7 Remove clip vit models that won't be ft and comment two that aren't uploaded yet
2 years ago
Ross Wightman 755570e2d6 Rename _pretrained.py -> pretrained.py, not feasible to change the other files to the same scheme without breaking uses
2 years ago
Ross Wightman 72cfa57761 Add ported Tensorflow MaxVit weights. Add a few more CLIP ViT fine-tunes. Tweak some model tag names. Improve model tag name sorting. Update HF hub push config layout.
2 years ago
Ross Wightman 4d5c395160 MaxVit, ViT, ConvNeXt, and EfficientNet-v2 updates
2 years ago
Ross Wightman 9da7e3a799 Add crop_mode for pretrained config / image transforms. Add support for dynamo compilation to benchmark/train/validate
2 years ago
Ross Wightman b2b6285af7 Add two more FT clip weights
2 years ago
Ross Wightman 5895056dc4 Add openai b32 ft
2 years ago
Ross Wightman 9dea5143d5 Adding more clip ft variants
2 years ago
Ross Wightman 444dcba4ad CLIP B16 12k weights added
2 years ago
Ross Wightman dff4717cbf Add clip b16 384x384 finetunes
2 years ago
Ross Wightman 883fa2eeaa Add fine-tuned B/16 224x224 in1k clip models
2 years ago
Ross Wightman 9a3d2ac2d5 Add latest CLIP ViT fine-tune pretrained configs / model entrypoint updates
2 years ago
Ross Wightman 42bbbddee9 Add missing model config
2 years ago
Ross Wightman def68befa7 Updating vit model defs for multi-weight support trial (vit first). Prepping for CLIP (laion2b and openai) fine-tuned weights.
2 years ago
Ross Wightman 0dadb4a6e9 Initial multi-weight support, handled so that the old pretrained config handling co-exists with new tags.
2 years ago
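The two commits above introduce multi-weight support, where one architecture name can carry several pretrained weight sets selected by a '.tag' suffix. A minimal usage sketch, assuming an installed timm version with tagged weights; the specific tag shown is modeled on the in12k ConvNeXt fine-tunes near the top of this log and may differ from the exact tags available:

import timm

# Without a tag, the default pretrained entry for the architecture is used.
model_default = timm.create_model('convnext_tiny', pretrained=True)

# A '.tag' suffix selects one specific weight set among several for the same architecture.
model_tagged = timm.create_model('convnext_tiny.in12k_ft_in1k', pretrained=True)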
Wauplin 9b114754db refactor push_to_hub helper
2 years ago
Wauplin ae0a0db7de Create repo before cloning with Repository.clone_from
2 years ago
Ross Wightman 803254bb40 Fix spacing misalignment for fast norm path in LayerNorm modules
2 years ago
Ross Wightman 6635bc3f7d Merge pull request #1479 from rwightman/script_cleanup
2 years ago
Ross Wightman 0e6023f032 Merge pull request #1381 from ChristophReich1996/master
2 years ago
Ross Wightman 66f4af7090 Merge remote-tracking branch 'origin/master' into script_cleanup
2 years ago
Ross Wightman 9914f744dc Add more maxxvit weights, including ConvNeXt conv block based experiments.
2 years ago
Mohamed Rashad 8fda68aff6 Fix repo id bug
2 years ago
Ross Wightman 1199c5a1a4 clip_laion2b models need 1e-5 eps for LayerNorm
2 years ago
Ross Wightman e858912e0c Add brute-force checkpoint remapping option
2 years ago
Ross Wightman b293dfa595 Add CL SE module
2 years ago