Commit Graph

24 Commits (81ca323751505c7e4ee9b7561020fd3c6008d116)

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| Fredo Guan | 81ca323751 | Davit update formatting and fix grad checkpointing (#7) | 1 year ago |
| Ross Wightman | 927f031293 | Major module / path restructure, timm.models.layers -> timm.layers, add _ prefix to all non model modules in timm.models | 1 year ago |
| Ross Wightman | 755570e2d6 | Rename _pretrained.py -> pretrained.py, not feasible to change the other files to same scheme without breaking uses | 1 year ago |
| Ross Wightman | 72cfa57761 | Add ported Tensorflow MaxVit weights. Add a few more CLIP ViT fine-tunes. Tweak some model tag names. Improve model tag name sorting. Update HF hub push config layout. | 1 year ago |
| Ross Wightman | 4d5c395160 | MaxVit, ViT, ConvNeXt, and EfficientNet-v2 updates | 1 year ago |
| Ross Wightman | 837c68263b | For ConvNeXt, use timm internal LayerNorm for fast_norm in non conv_mlp mode | 2 years ago |
| Ross Wightman | 1d8ada359a | Add timm ConvNeXt 'atto' weights, change test resolution for FB ConvNeXt 224x224 weights, add support for different dw kernel_size | 2 years ago |
| Ross Wightman | 2544d3b80f | ConvNeXt pico, femto, and nano, pico, femto ols (overlapping stem) weights and model defs | 2 years ago |
| Ross Wightman | 6f103a442b | Add convnext_nano weights, 80.8 @ 224, 81.5 @ 288 | 2 years ago |
| Ross Wightman | c5e0d1c700 | Add dilation support to convnext, allows output_stride=8 and 16 use. Fix #1341 | 2 years ago |
| Ross Wightman | 06307b8b41 | Remove experimental downsample in block support in ConvNeXt. Experiment further before keeping it in. | 2 years ago |
| Ross Wightman | 188c194b0f | Left some experiment stem code in convnext by mistake | 2 years ago |
| Ross Wightman | 6064d16a2d | Add initial EdgeNeXt import. Significant cleanup / reorg (like ConvNeXt). Fix #1320 | 2 years ago |
| SeeFun | 8f0bc0591e | fix convnext args | 2 years ago |
| SeeFun | ec4e9aa5a0 | Add ConvNeXt tiny and small pretrain in22k | 2 years ago |
| Ross Wightman | 474ac906a2 | Add 'head norm first' convnext_tiny_hnf weights | 2 years ago |
| Ross Wightman | 372ad5fa0d | Significant model refactor and additions: | 2 years ago |
| Ross Wightman | 5f81d4de23 | Move DeiT to own file, vit getting crowded. Working towards fixing #1029, make pooling interface for transformers and mlp closer to convnets. Still working through some details... | 2 years ago |
| Ross Wightman | 738a9cd635 | unbiased=False for torch.var_mean path of ConvNeXt LN. Fix #1090 | 2 years ago |
| Ross Wightman | e0c4eec4b6 | Default conv_mlp to False across the board for ConvNeXt, causing issues on more setups than it's improving right now... | 2 years ago |
| Ross Wightman | b669f4a588 | Add ConvNeXt 22k->1k fine-tuned and 384 22k-1k fine-tuned weights after testing | 2 years ago |
| Ross Wightman | edd3d73695 | Add missing dropout for head reset in ConvNeXt default head | 2 years ago |
| Ross Wightman | b093dcb46d | Some convnext cleanup, remove in place mul_ for gamma, breaking symbolic trace, cleanup head a bit... | 2 years ago |
| Ross Wightman | 18934debc5 | Add initial ConvNeXt impl (mods of official code) | 2 years ago |