Ross Wightman | b3042081b4 | Add laion -> in1k fine-tuned base and large_mlp weights for convnext | 2 years ago
Ross Wightman | 316bdf8955 | Add mlp head support for convnext_large, add laion2b CLIP weights, prep fine-tuned weight tags | 2 years ago
Ross Wightman | 6f28b562c6 | Factor NormMlpClassifierHead from MaxxViT and use across MaxxViT / ConvNeXt / DaViT, refactor some type hints & comments | 2 years ago
Ross Wightman | 29fda20e6d | Merge branch 'fffffgggg54-main' | 2 years ago
Ross Wightman | 9a53c3f727 | Finalize DaViT, some formatting and modelling simplifications (separate PatchEmbed into Stem + Downsample), weights on HF hub | 2 years ago
Fredo Guan | fb717056da | Merge remote-tracking branch 'upstream/main' | 2 years ago
Ross Wightman | 2bbc26dd82 | version 0.8.8dev0 | 2 years ago
Ross Wightman | 64667bfa0e | Add 'gigantic' vit clip variant for feature extraction and future fine-tuning | 2 years ago
Ross Wightman | 3aa31f537d | Merge pull request #1641 from rwightman/maxxvit_hub: MaxxViT weights on hub, new 12k FT 1k weights, convnext 384x384 12k FT 1k, and more | 2 years ago
Ross Wightman | 9983ed7721 | xlarge maxvit killing the tests | 2 years ago
Ross Wightman | c2822568ec | Update version to 0.8.7dev0 | 2 years ago
Ross Wightman | 0417a9dd81 | Update README | 2 years ago
Ross Wightman | 36989cfae4 | Factor out readme generation in hub helper, add more readme fields | 2 years ago
Ross Wightman | 32f252381d | Change order of checkpoint filtering fn application in builder, try dict, model variant first | 2 years ago
Ross Wightman | e9f1376cde | Cleanup resolve data config fns, add 'model' variant that takes model as first arg, make 'args' arg optional in original fn | 2 years ago
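The commit above adds a model-first variant of the data config resolver. A minimal sketch of how the resulting API is typically used; the model name is illustrative and not taken from the commit:

```python
import timm
from timm.data import resolve_model_data_config, create_transform

# Any timm model works here; convnext_tiny is just an example.
model = timm.create_model('convnext_tiny', pretrained=False)

# 'model' variant: pass the model itself as the first argument and the resolver
# pulls input size, interpolation, mean/std etc. from its pretrained config.
data_config = resolve_model_data_config(model)

# Build a matching eval transform from the resolved config.
transform = create_transform(**data_config, is_training=False)
print(data_config)
```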
Ross Wightman | bed350f5e5 | Push all MaxxViT weights to HF hub, cleanup impl, add feature map extraction support and promote to 'std' architecture. Fix norm head for proper embedding / feat map output. Add new in12k + ft 1k weights. | 2 years ago
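With feature map extraction support added above, MaxxViT models can be used as backbones through the standard `features_only` path. A hedged sketch; the specific variant name is an example:

```python
import timm
import torch

# features_only=True returns a backbone that outputs a list of feature maps,
# one per selected stage, instead of classification logits.
backbone = timm.create_model(
    'maxvit_tiny_rw_224',   # example MaxxViT variant; any maxvit/maxxvit model follows the same pattern
    pretrained=False,
    features_only=True,
    out_indices=(1, 2, 3),  # which stages to return
)

x = torch.randn(1, 3, 224, 224)
feature_maps = backbone(x)
for fm, ch in zip(feature_maps, backbone.feature_info.channels()):
    print(fm.shape, 'channels:', ch)
```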
Ross Wightman | ca38e1e73f | Update ClassifierHead module, add reset() method, update in_chs -> in_features for consistency | 2 years ago
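The head-level reset() above sits behind the model-level reset_classifier() call. A small sketch of the common usage pattern; the model name is an example:

```python
import timm
import torch

model = timm.create_model('convnext_tiny', pretrained=False)

# Swap the classifier for a new number of classes (the new head is randomly initialized).
model.reset_classifier(num_classes=10)

# Or remove the classifier entirely and use the model as an embedding extractor.
model.reset_classifier(num_classes=0)
embeddings = model(torch.randn(1, 3, 224, 224))
print(embeddings.shape)  # pooled features when num_classes == 0
```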
Ross Wightman | 8ab573cd26 | Add convnext_tiny and convnext_small 384x384 fine-tunes of in12k weights, fix pool size for laion CLIP convnext weights | 2 years ago
Fredo Guan | e58a884c1c | Merge remote-tracking branch 'upstream/main' | 2 years ago
Fredo Guan | 81ca323751 | Davit update formatting and fix grad checkpointing (#7): fixed head to gap->norm->fc as per convnext, along with option for norm->gap->fc; failed tests due to clip convnext models, davit tests passed | 2 years ago
Ross Wightman | e9aac412de | Correct mean/std for CLIP convnexts | 2 years ago
Ross Wightman | 42bd8f7bcb | Add convnext_base CLIP image tower weights for fine-tuning / features | 2 years ago
Ross Wightman | 65aea97067 | Update tests.yml: attempt to work around flaky azure ubuntu mirrors | 2 years ago
Ross Wightman | dd60c45044 | Merge pull request #1633 from rwightman/freeze_norm_revisit: update batchnorm freezing to handle NormAct variants | 2 years ago
Ross Wightman | e520553e3e | Update batchnorm freezing to handle NormAct variants, Add GroupNorm1Act, update BatchNormAct2d tracing change from PyTorch | 2 years ago
Ross Wightman | a2c14c2064 | Add tiny/small in12k pretrained and fine-tuned ConvNeXt models | 2 years ago
Ross Wightman | 01aea8c1bf | Version 0.8.6dev0 | 2 years ago
Ross Wightman | 2e83bba142 | Revert head norm changes to ConvNeXt as it broke some downstream use, alternate workaround for fcmae weights | 2 years ago
Ikko Eltociear Ashimine | 2c24cb98f1 | Fix typo in results/README.md (occuring -> occurring) | 2 years ago
Ross Wightman | 1825b5e314 | maxxvit type | 2 years ago
Ross Wightman | 5078b28f8a | More kwarg handling tweaks, maxvit_base_rw def added | 2 years ago
Ross Wightman | c0d7388a1b | Improving kwarg merging in more models | 2 years ago
Ross Wightman | 94a91598c3 | Update README.md | 2 years ago
Ross Wightman | d2ef5a3a94 | Update README.md | 2 years ago
Ross Wightman | ae9153052f | Update version.py | 2 years ago
Ross Wightman | 60ebb6cefa | Re-order vit pretrained entries for more sensible default weights (no .tag specified) | 2 years ago
Ross Wightman | e861b74cf8 | Pass through --model-kwargs (and --opt-kwargs for train) from command line through to model __init__. Update some models to improve arg overlay. Cleanup along the way. | 2 years ago
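The pass-through that --model-kwargs exposes on the command line is the same kwarg overlay available programmatically: extra keyword args given to create_model are forwarded to the model entrypoint / __init__. A rough sketch; the architecture and override values are illustrative, and which kwargs are accepted depends on the architecture's __init__ signature:

```python
import timm

# Keyword args that create_model does not consume itself are overlaid onto the
# model __init__ defaults; the training script's --model-kwargs feeds this same path.
model = timm.create_model(
    'resnet50',           # example architecture
    pretrained=False,
    drop_rate=0.1,        # illustrative __init__ overrides; validity depends on
    drop_path_rate=0.05,  # the chosen architecture's signature
)
print(type(model).__name__)
```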
Ross Wightman | add3fb864e | Working on improved model card template for push_to_hf_hub | 2 years ago
Xa9aX ツ | 13c7183c52 | Update installation.mdx | 2 years ago
Ross Wightman | eb83eb3bd1 | Rotate changelogs, add redirects to mkdocs -> equivalent HF docs pages | 2 years ago
Ross Wightman | dd0bb327e9 | Update version.py (ver 0.8.4dev0) | 2 years ago
Ross Wightman | 6e5553da5f | Add ConvNeXt-V2 support (model additions and weights) (#1614) | 2 years ago
    * Add ConvNeXt-V2 support (model additions and weights)
    * ConvNeXt-V2 weights on HF Hub, tweaking some tests
    * Update README, fixing convnextv2 tests
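Once ConvNeXt-V2 support landed, the new variants are created like any other timm model. A small sketch; the specific variant is an example and other sizes follow the same pattern:

```python
import timm
import torch

# Example ConvNeXt-V2 variant added by the commit above.
model = timm.create_model('convnextv2_tiny', pretrained=False, num_classes=1000)
model.eval()

with torch.no_grad():
    logits = model(torch.randn(1, 3, 224, 224))
print(logits.shape)  # (1, 1000)
```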
nateraw | 3698e79ac5 | 🐛 fix github source links in hf docs | 2 years ago
Nathan Raw | 9f5bba9ef9 | Structure Hugging Face Docs (#1575) | 2 years ago
    * 🎨 structure docs
    * 🚧 wip docs
    * 📝 add installation doc
    * 📝 wip docs
    * 📝 wip docs
    * 📝 wip docs
    * 📝 wip docs
    * 📝 wip docs
    * 📝 add basic reference docs
    * 📝 remove augmentation from toctree
    * 👷 update pr doc builder to bugfix branch
    * 📝 wip docs
    * 🚧 wip
    * 👷 bump CI
    * 🚧 wip
    * 🚧 bump CI
    * 🚧 wip
    * 🚧 wip
    * 🚧 wip
    * 📝 add hf hub tutorial doc
    * 🔥 remove inference tut
    * 🚧 wip
    * 📝 wip docs
    * 📝 wip docs
    * 📝 update docs
    * 📝 move validation script doc up in order
    * 🎨 restructure to remove legacy docs
    * 📝 update index doc
    * 📝 update number of pretrained models
    * Update hfdocs/README.md
    * Update .github/workflows/build_pr_documentation.yml
    * Update build_pr_documentation.yml
    * bump
    * 📌 update gh action to use main branch
    * 🔥 remove comment
Ross Wightman | 960f5f92e6 | Update results csv with latest val/test set runs | 2 years ago
Ross Wightman | 6902c48a5f | Fix ResNet based models to work w/ norm layers w/o affine params. Reformat long arg lists into vertical form. | 2 years ago
Ross Wightman | d5aa17e415 | Remove print from auto_augment | 2 years ago
Ross Wightman | 7c846d9970 | Better vmap compat across recent torch versions | 2 years ago
Ross Wightman | 130458988a | Update README.md | 2 years ago
Ross Wightman | d96538f1d2 | Update README | 2 years ago