Commit Graph

832 Commits (e04400dc780299ec34adbb94d4f26beb5e2c382d)

Author SHA1 Message Date
Ross Wightman e04400dc78 Remove dead line
1 year ago
Ross Wightman c53cf76fa3 Torchscript fixes/hacks for rms_norm, refactor ParallelScalingBlock with manual combination of input projections, closer paper match
1 year ago
Ross Wightman b6eb652924 Add Final annotation to attn_fas to avoid symbol lookup of new scaled_dot_product_attn fn on old PyTorch in jit
1 year ago
Ross Wightman a9739258f4 Add ideas from 'Scaling ViT to 22-B Params', testing PyTorch 2.0 fused F.scaled_dot_product_attention impl in vit, vit_relpos, maxxvit / coatnet.
1 year ago
Ross Wightman 624266148d Remove unused imports from _hub helpers
1 year ago
Ross Wightman 2cfff0581b Add grad_checkpointing support to features_only, test in EfficientDet.
1 year ago
Ross Wightman 9c14654a0d Improve support for custom dataset label name/description through HF hub export, via pretrained_cfg
1 year ago
Ross Wightman 0d33127df2 Add 384x384 convnext_large_mlp laion2b fine-tune on in1k
1 year ago
Ross Wightman 7a0bd095cb Update model prune loader to use pkgutil
1 year ago
Ross Wightman 13acac8c5e Update head metadata for effformerv2
1 year ago
Ross Wightman 8682528096 Add first conv metadata for efficientformer_v2
1 year ago
Ross Wightman 72fba669a8 is_scripting() guard on checkpoint_seq
1 year ago
Ross Wightman 95ec255f7f Finish timm model api for efficientformer_v2, add grad checkpointing support to both efficientformers
1 year ago
Ross Wightman 9d03c6f526 Merge remote-tracking branch 'origin/main' into levit_efficientformer_redux
1 year ago
Ross Wightman 086bd55a94 Add EfficientFormer-V2, refactor EfficientFormer and Levit for more uniformity across the 3 related arch. Add features_out support to levit conv models and efficientformer_v2. All weights on hub.
1 year ago
Ross Wightman 2cb2699dc8 Apply fix from #1649 to main
1 year ago
Ross Wightman b3042081b4 Add laion -> in1k fine-tuned base and large_mlp weights for convnext
1 year ago
Ross Wightman 316bdf8955 Add mlp head support for convnext_large, add laion2b CLIP weights, prep fine-tuned weight tags
1 year ago
Ross Wightman 6f28b562c6 Factor NormMlpClassifierHead from MaxxViT and use across MaxxViT / ConvNeXt / DaViT, refactor some type hints & comments
1 year ago
Ross Wightman 9a53c3f727 Finalize DaViT, some formatting and modelling simplifications (separate PatchEmbed into Stem + Downsample), weights on HF hub.
1 year ago
Fredo Guan fb717056da Merge remote-tracking branch 'upstream/main'
1 year ago
Ross Wightman 64667bfa0e Add 'gigantic' vit clip variant for feature extraction and future fine-tuning
1 year ago
Ross Wightman 36989cfae4 Factor out readme generation in hub helper, add more readme fields
1 year ago
Ross Wightman 32f252381d Change order of checkpoint filtering fn application in builder, try dict, model variant first
1 year ago
Ross Wightman bed350f5e5 Push all MaxxViT weights to HF hub, cleanup impl, add feature map extraction support and promote to 'std' architecture. Fix norm head for proper embedding / feat map output. Add new in12k + ft 1k weights.
1 year ago
Ross Wightman ca38e1e73f Update ClassifierHead module, add reset() method, update in_chs -> in_features for consistency
1 year ago
Ross Wightman 8ab573cd26 Add convnext_tiny and convnext_small 384x384 fine-tunes of in12k weights, fix pool size for laion CLIP convnext weights
1 year ago
Fredo Guan 81ca323751 Davit update formatting and fix grad checkpointing (#7)
1 year ago
Ross Wightman e9aac412de Correct mean/std for CLIP convnexts
1 year ago
Ross Wightman 42bd8f7bcb Add convnext_base CLIP image tower weights for fine-tuning / features
1 year ago
Ross Wightman a2c14c2064 Add tiny/small in12k pretrained and fine-tuned ConvNeXt models
1 year ago
Ross Wightman 2e83bba142 Revert head norm changes to ConvNeXt as it broke some downstream use, alternate workaround for fcmae weights
1 year ago
Ross Wightman 1825b5e314 maxxvit type
1 year ago
Ross Wightman 5078b28f8a More kwarg handling tweaks, maxvit_base_rw def added
1 year ago
Ross Wightman c0d7388a1b Improving kwarg merging in more models
1 year ago
Ross Wightman 60ebb6cefa Re-order vit pretrained entries for more sensible default weights (no .tag specified)
1 year ago
Ross Wightman e861b74cf8 Pass through --model-kwargs (and --opt-kwargs for train) from command line through to model __init__. Update some models to improve arg overlay. Cleanup along the way.
1 year ago
Ross Wightman add3fb864e Working on improved model card template for push_to_hf_hub
1 year ago
Ross Wightman 6e5553da5f Add ConvNeXt-V2 support (model additions and weights) (#1614)
1 year ago
Ross Wightman 6902c48a5f Fix ResNet based models to work w/ norm layers w/o affine params. Reformat long arg lists into vertical form.
1 year ago
Ross Wightman 8ece53e194 Switch BEiT to HF hub weights
1 year ago
Ross Wightman 9a51e4ea2e Add FlexiViT models and weights, refactoring, push more weights
1 year ago
Fredo Guan 10b3f696b4 Davit std (#6)
1 year ago
Ross Wightman 656e1776de Convert mobilenetv3 to multi-weight, tweak PretrainedCfg metadata
1 year ago
Ross Wightman 6a01101905 Update efficientnet.py and convnext.py to multi-weight, add ImageNet-12k pretrained EfficientNet-B5 and ConvNeXt-Nano.
1 year ago
Fredo Guan 84178fca60 Merge branch 'rwightman:main' into main
1 year ago
Fredo Guan c43340ddd4 Davit std (#5)
1 year ago
Ross Wightman d5e7d6b27e Merge remote-tracking branch 'origin/main' into refactor-imports
2 years ago
Ross Wightman cda39b35bd Add a deprecation phase to module re-org
2 years ago
Fredo Guan edea013dd1 Davit std (#3)
2 years ago