Commit Graph

23 Commits (b3042081b4ecd7d5c6b006c855cecf9475bce17b)

Author SHA1 Message Date
Ross Wightman 6f28b562c6 Factor NormMlpClassifierHead from MaxxViT and use across MaxxViT / ConvNeXt / DaViT, refactor some type hints & comments (2 years ago)
Ross Wightman bed350f5e5 Push all MaxxViT weights to HF hub, cleanup impl, add feature map extraction support and prompote to 'std' architecture. Fix norm head for proper embedding / feat map output. Add new in12k + ft 1k weights. (2 years ago)
Ross Wightman 1825b5e314 maxxvit type (2 years ago)
Ross Wightman 5078b28f8a More kwarg handling tweaks, maxvit_base_rw def added (2 years ago)
Ross Wightman c0d7388a1b Improving kwarg merging in more models (2 years ago)
Ross Wightman 9a51e4ea2e Add FlexiViT models and weights, refactoring, push more weights (2 years ago)
Ross Wightman 927f031293 Major module / path restructure, timm.models.layers -> timm.layers, add _ prefix to all non model modules in timm.models (2 years ago)
Ross Wightman 755570e2d6 Rename _pretrained.py -> pretrained.py, not feasible to change the other files to same scheme without breaking uses (2 years ago)
Ross Wightman 72cfa57761 Add ported Tensorflow MaxVit weights. Add a few more CLIP ViT fine-tunes. Tweak some model tag names. Improve model tag name sorting. Update HF hub push config layout. (2 years ago)
Ross Wightman 4d5c395160 MaxVit, ViT, ConvNeXt, and EfficientNet-v2 updates (2 years ago)
Ross Wightman 9914f744dc Add more maxxvit weights includ ConvNeXt conv block based experiments. (2 years ago)
Ross Wightman fa8c84eede Update maxvit_tiny_256 weight to better iter, add coatnet / maxvit / maxxvit model defs for future runs (2 years ago)
Ross Wightman c1b3cea19d Add maxvit_rmlp_tiny_rw_256 model def and weights w/ 84.2 top-1 @ 256, 84.8 @ 320 (2 years ago)
Ross Wightman dc90816f26 Add `maxvit_tiny_rw_224` weights 83.5 @ 224 and `maxvit_rmlp_pico_rw_256` relpos weights, 80.5 @ 256, 81.3 @ 320 (2 years ago)
Ross Wightman 7f1b223c02 Add maxvit_rmlp_nano_rw_256 model def & weights, make window/grid size dynamic wrt img_size by default (2 years ago)
Ross Wightman f1d2160d85 Update a few maxxvit comments, rename PartitionAttention -> PartitionAttenionCl for consistency with other blocks (2 years ago)
Ross Wightman eca6f0a25c Fix syntax error (extra dataclass comma) in maxxvit.py (2 years ago)
Ross Wightman 7c2660576d Tweak init for convnext block using maxxvit/coatnext. (2 years ago)
Ross Wightman 527f9a4cb2 Updated to correct maxvit_nano weights... (2 years ago)
Ross Wightman b2e8426fca Make k=stride=2 ('avg2') pooling default for coatnet/maxvit. Add weight links. Rename 'combined' partition to 'parallel'. (2 years ago)
Ross Wightman cac0a4570a More test fixes, pool size for 256x256 maxvit models (2 years ago)
Ross Wightman e939ed19b9 Rename internal creation fn for maxvit, has not been just coatnet for a while... (2 years ago)
Ross Wightman ffaf97f813 MaxxVit! A very configurable MaxVit and CoAtNet impl with lots of goodies.. (2 years ago)