Commit Graph

776 Commits (16d2db7e4b85b1574cc03694d9c12561d618a0f9)

Author  SHA1  Message  Date
Ross Wightman  16d2db7e4b  Remove clip vit models that won't be ft and comment two that aren't uploaded yet  (2 years ago)
Ross Wightman  c3be79a8b7  Rename _pretrained.py -> pretrained.py, not feasible to change the other files to same scheme without breaking uses  (2 years ago)
Ross Wightman  23b357f1df  Add ported Tensorflow MaxVit weights. Add a few more CLIP ViT fine-tunes. Tweak some model tag names. Improve model tag name sorting. Update HF hub push config layout.  (2 years ago)
Ross Wightman  c59d88339b  Merge remote-tracking branch 'origin/main' into multi-weight  (2 years ago)
Ross Wightman  ec6921fcb0  MaxVit, ViT, ConvNeXt, and EfficientNet-v2 updates  (2 years ago)
Ross Wightman  0ed0cc7eba  Add crop_mode for pretraind config / image transforms. Add support for dynamo compilation to benchmark/train/validate  (2 years ago)
Wauplin  9b114754db  refactor push_to_hub helper  (2 years ago)
Wauplin  ae0a0db7de  Create repo before cloning with Repository.clone_from  (2 years ago)
Ross Wightman  f6e0a848d0  Add two more FT clip weights  (2 years ago)
Ross Wightman  884a0f1a12  Add openai b32 ft  (2 years ago)
Ross Wightman  2c80da3b9a  Adding more clip ft variants  (2 years ago)
Ross Wightman  da2de0de95  CLIP B16 12k weights added  (2 years ago)
Ross Wightman  b2897f5ea6  Add clip b16 384x384 finetunes  (2 years ago)
Ross Wightman  092287436e  Add fine-tuned B/16 224x224 in1k clip models  (2 years ago)
Ross Wightman  d3415e3134  Add latest CLIP ViT fine-tune pretrained configs / model entrypt updates  (2 years ago)
Ross Wightman  2eb825c014  Add missing model config  (2 years ago)
Ross Wightman  0761ce7a1b  Updating vit model defs for mult-weight support trial (vit first). Prepping for CLIP (laion2b and openai) fine-tuned weights.  (2 years ago)
Ross Wightman  ebb99a1f8d  Initial multi-weight support, handled so old pretraing config handling co-exists with new tags.  (2 years ago)
Ross Wightman  803254bb40  Fix spacing misalignment for fast norm path in LayerNorm modules  (2 years ago)
Ross Wightman  6635bc3f7d  Merge pull request #1479 from rwightman/script_cleanup  (2 years ago)
Ross Wightman  0e6023f032  Merge pull request #1381 from ChristophReich1996/master  (2 years ago)
Ross Wightman  66f4af7090  Merge remote-tracking branch 'origin/master' into script_cleanup  (2 years ago)
Ross Wightman  9914f744dc  Add more maxxvit weights includ ConvNeXt conv block based experiments.  (2 years ago)
Mohamed Rashad  8fda68aff6  Fix repo id bug  (2 years ago)
Ross Wightman  1199c5a1a4  clip_laion2b models need 1e-5 eps for LayerNorm  (2 years ago)
Ross Wightman  e858912e0c  Add brute-force checkpoint remapping option  (2 years ago)
Ross Wightman  b293dfa595  Add CL SE module  (2 years ago)
Ross Wightman  a383ef99f5  Make huggingface_hub necessary if it's the only source for a pretrained weight  (2 years ago)
Ross Wightman  e069249a2d  Add hf hub entries for laion2b clip models, add huggingface_hub dependency, update some setup/reqs, torch >= 1.7  (2 years ago)
Ross Wightman  9d65557be3  Fix errant import  (2 years ago)
Ross Wightman  9709dbaaa9  Adding support for fine-tune CLIP LAION-2B image tower weights for B/32, L/14, H/14 and g/14. Still WIP  (2 years ago)
Ross Wightman  a520da9b49  Update tresnet features_info for v2  (2 years ago)
Ross Wightman  c8ab747bf4  BEiT-V2 checkpoints didn't remove 'module' from weights, adapt checkpoint filter  (2 years ago)
Ross Wightman  73049dc2aa  Fix type in dla weight update  (2 years ago)
Ross Wightman  e11efa872d  Update a bunch of weights with external links to timm release assets. Fixes issue with *aliyuncs.com returning forbidden. Did pickle scan / verify and re-hash. Add TresNet-V2-L weights.  (2 years ago)
Ross Wightman  fa8c84eede  Update maxvit_tiny_256 weight to better iter, add coatnet / maxvit / maxxvit model defs for future runs  (2 years ago)
Ross Wightman  c1b3cea19d  Add maxvit_rmlp_tiny_rw_256 model def and weights w/ 84.2 top-1 @ 256, 84.8 @ 320  (2 years ago)
Ross Wightman  914544fc81  Add beitv2 224x224 checkpoints from https://github.com/microsoft/unilm/tree/master/beit2  (2 years ago)
Ross Wightman  dc90816f26  Add `maxvit_tiny_rw_224` weights 83.5 @ 224 and `maxvit_rmlp_pico_rw_256` relpos weights, 80.5 @ 256, 81.3 @ 320  (2 years ago)
Ross Wightman  f489f02ad1  Make gcvit window size ratio based to improve resolution changing support #1449. Change default init to original.  (2 years ago)
Ross Wightman  7f1b223c02  Add maxvit_rmlp_nano_rw_256 model def & weights, make window/grid size dynamic wrt img_size by default  (2 years ago)
Ross Wightman  e6a4361306  pretrained_cfg entry for mvitv2_small_cls  (2 years ago)
Ross Wightman  f66e5f0e35  Fix class token support in MViT-V2, add small_class variant to ensure it's tested. Fix #1443  (2 years ago)
Ross Wightman  f1d2160d85  Update a few maxxvit comments, rename PartitionAttention -> PartitionAttenionCl for consistency with other blocks  (2 years ago)
Ross Wightman  eca6f0a25c  Fix syntax error (extra dataclass comma) in maxxvit.py  (2 years ago)
Ross Wightman  ff6a919cf5  Add --fast-norm arg to benchmark.py, train.py, validate.py  (2 years ago)
Ross Wightman  769ab4b98a  Clean up no_grad for trunc normal weight inits  (2 years ago)
Ross Wightman  48e1df8b37  Add norm/norm_act header comments  (2 years ago)
Ross Wightman  7c2660576d  Tweak init for convnext block using maxxvit/coatnext.  (2 years ago)
Ross Wightman  1d8d6f6072  Fix two default args in DenseNet blocks... fix #1427  (2 years ago)
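Commit e858912e0c above adds a "brute-force checkpoint remapping option". The usual idea behind such a remap is to pair entries from an old checkpoint with a new model's state dict purely by iteration order, keeping a source value only when its shape matches the destination slot. A minimal, framework-free sketch of that idea (plain dicts of shape tuples stand in for tensor state dicts; `remap_checkpoint` is an assumed name for illustration, not timm's actual API):

```python
def remap_checkpoint(src_state, dst_state):
    """Brute-force remap: walk src and dst in iteration order, copying a
    source entry into a destination key only when the shapes line up.
    Values here are shape tuples standing in for tensors."""
    out = {}
    src_items = list(src_state.items())
    i = 0
    for dst_key, dst_shape in dst_state.items():
        # advance through source entries until one matches the target shape
        while i < len(src_items) and src_items[i][1] != dst_shape:
            i += 1
        if i < len(src_items):
            out[dst_key] = src_items[i][1]  # take the matching source value
            i += 1
        else:
            out[dst_key] = dst_shape  # no match left: keep destination init
    return out


# e.g. stripping a 'module.' prefix falls out naturally, since only the
# key names differ while order and shapes agree
old = {'module.w1': (3, 3), 'module.b1': (3,)}
new = {'w1': (3, 3), 'b1': (3,)}
print(remap_checkpoint(old, new))  # {'w1': (3, 3), 'b1': (3,)}
```

This order-based matching only works when old and new models define parameters in the same sequence, which is why it is a brute-force fallback rather than a general converter.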