# Commit Graph

1537 Commits (9343f6e431b2da08f271c7fb3d2ea831d799b88a)
 

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| nateraw | 9343f6e431 | 👷 update ci | 2 years ago |
| nateraw | 5f9eef712c | 👷 bump ci | 2 years ago |
| nateraw | 09a77700f6 | 👷 bump back. it was actions outage not me | 2 years ago |
| nateraw | b8e3ffd498 | 👷 revert ci workflow to main | 2 years ago |
| nateraw | d56f9f7a23 | 🐛 fix link | 2 years ago |
| nateraw | 6153816250 | 🚧 test doc-builder branch to fix build here | 2 years ago |
| nateraw | 2f1abc2cf9 | 👷 add build_documentation workflow | 2 years ago |
| nateraw | 3a429d04ee | 🚑 supply --not_python_module for now | 2 years ago |
| nateraw | d4d915caf3 | 🚧 update path_to_docs in pr doc builder workflow | 2 years ago |
| nateraw | 96109a909b | 🚧 update docs path | 2 years ago |
| nateraw | 6c3d02a7e5 | 🚧 update inputs to build_pr_documentation workflow | 2 years ago |
| nateraw | 25059001a9 | 🚧 add repo_owner | 2 years ago |
| nateraw | e4c99d2bd6 | 📝 add hfdocs documentation | 2 years ago |
| Ross Wightman | 5dc4343308 | version 0.6.11 | 2 years ago |
| Ross Wightman | a383ef99f5 | Make huggingface_hub necessary if it's the only source for a pretrained weight | 2 years ago |
| Ross Wightman | d199f6651d | Merge pull request #1467 from rwightman/clip_laion2b | 2 years ago |
| Ross Wightman | 33e30f8c8b | Remove layer-decay print | 2 years ago |
| Ross Wightman | e069249a2d | Add hf hub entries for laion2b clip models, add huggingface_hub dependency, update some setup/reqs, torch >= 1.7 | 2 years ago |
| Ross Wightman | 9d65557be3 | Fix errant import | 2 years ago |
| Ross Wightman | 9709dbaaa9 | Adding support for fine-tune CLIP LAION-2B image tower weights for B/32, L/14, H/14 and g/14. Still WIP | 2 years ago |
| Ross Wightman | a520da9b49 | Update tresnet features_info for v2 | 2 years ago |
| Ross Wightman | c8ab747bf4 | BEiT-V2 checkpoints didn't remove 'module' from weights, adapt checkpoint filter | 2 years ago |
| Ross Wightman | 73049dc2aa | Fix type in dla weight update | 2 years ago |
| Ross Wightman | 3599c7e6a4 | version 0.6.10 | 2 years ago |
| Ross Wightman | e11efa872d | Update a bunch of weights with external links to timm release assets. Fixes issue with *aliyuncs.com returning forbidden. Did pickle scan / verify and re-hash. Add TresNet-V2-L weights. | 2 years ago |
| Ross Wightman | fa8c84eede | Update maxvit_tiny_256 weight to better iter, add coatnet / maxvit / maxxvit model defs for future runs | 2 years ago |
| Ross Wightman | de40f66536 | Update README.md | 2 years ago |
| Ross Wightman | c1b3cea19d | Add maxvit_rmlp_tiny_rw_256 model def and weights w/ 84.2 top-1 @ 256, 84.8 @ 320 | 2 years ago |
| Ross Wightman | da6f8f5a40 | Fix beitv2 tests | 2 years ago |
| Ross Wightman | 914544fc81 | Add beitv2 224x224 checkpoints from https://github.com/microsoft/unilm/tree/master/beit2 | 2 years ago |
| Ross Wightman | dc90816f26 | Add `maxvit_tiny_rw_224` weights 83.5 @ 224 and `maxvit_rmlp_pico_rw_256` relpos weights, 80.5 @ 256, 81.3 @ 320 | 2 years ago |
| Ross Wightman | f489f02ad1 | Make gcvit window size ratio based to improve resolution changing support #1449. Change default init to original. | 2 years ago |
| Ross Wightman | c45c6ee8e4 | Update README.md | 2 years ago |
| Ross Wightman | 7f1b223c02 | Add maxvit_rmlp_nano_rw_256 model def & weights, make window/grid size dynamic wrt img_size by default | 2 years ago |
| Ross Wightman | e6a4361306 | pretrained_cfg entry for mvitv2_small_cls | 2 years ago |
| Ross Wightman | f66e5f0e35 | Fix class token support in MViT-V2, add small_class variant to ensure it's tested. Fix #1443 | 2 years ago |
| Ross Wightman | b94b7cea65 | Missed GCVit in README paper links | 2 years ago |
| Ross Wightman | f1d2160d85 | Update a few maxxvit comments, rename PartitionAttention -> PartitionAttenionCl for consistency with other blocks | 2 years ago |
| Ross Wightman | eca6f0a25c | Fix syntax error (extra dataclass comma) in maxxvit.py | 2 years ago |
| Ross Wightman | 4f72bae43b | Merge pull request #1415 from rwightman/more_vit | 2 years ago |
| Ross Wightman | ff6a919cf5 | Add --fast-norm arg to benchmark.py, train.py, validate.py | 2 years ago |
| Ross Wightman | 769ab4b98a | Clean up no_grad for trunc normal weight inits | 2 years ago |
| Ross Wightman | 48e1df8b37 | Add norm/norm_act header comments | 2 years ago |
| Ross Wightman | 99ee61e245 | Add T/G legend to README.md maxvit list | 2 years ago |
| Ross Wightman | a54008bd97 | Update README.md for merge | 2 years ago |
| Ross Wightman | 7c2660576d | Tweak init for convnext block using maxxvit/coatnext. | 2 years ago |
| Ross Wightman | 1d8d6f6072 | Fix two default args in DenseNet blocks... fix #1427 | 2 years ago |
| Ross Wightman | 527f9a4cb2 | Updated to correct maxvit_nano weights... | 2 years ago |
| Ross Wightman | 2a5b5b2a7b | Update feature_request.md | 2 years ago |
| Ross Wightman | e018253acc | Update config.yml | 2 years ago |
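Commit e11efa872d above mentions re-hashing released weight files after a pickle scan. timm's released weights conventionally carry the first 8 hex characters of the file's SHA-256 digest in the filename, which lets `torch.hub.load_state_dict_from_url(..., check_hash=True)` verify downloads. A minimal sketch of that tagging step (function names here are hypothetical, not from the repo):

```python
import hashlib
from pathlib import Path

def checkpoint_hash(path, chunk_size=1 << 20):
    """Return the first 8 hex chars of the file's SHA-256 digest,
    reading in chunks so large checkpoints don't load into memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()[:8]

def tagged_name(path):
    """Append the short hash to the weight filename,
    e.g. tresnet_v2_l.pth -> tresnet_v2_l-2cf24dba.pth."""
    p = Path(path)
    return f"{p.stem}-{checkpoint_hash(path)}{p.suffix}"
```

`torch.hub` compares this embedded prefix against the digest of the downloaded file, so a corrupted or tampered asset fails the check rather than loading silently.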