Commit Graph

20 Commits (909705e7ffd8ac69ca9088dea90f4d09d0578006)

| Author | SHA1 | Message | Date |
|---|---|---|---|
| Ross Wightman | 0862e6ebae | Fix correctness of some group matching regex (no impact on result), some formatting, missed forward_head for resnet | 3 years ago |
| Ross Wightman | 372ad5fa0d | Significant model refactor and additions: | 3 years ago |
| Ross Wightman | 5f81d4de23 | Move DeiT to own file, vit getting crowded. Working towards fixing #1029, make pooling interface for transformers and mlp closer to convnets. Still working through some details... | 3 years ago |
| Ross Wightman | abc9ba2544 | Transitioning default_cfg -> pretrained_cfg. Improving handling of pretrained_cfg source (HF-Hub, files, timm config, etc). Checkpoint handling tweaks. | 3 years ago |
| Martins Bruveris | 85c5ff26d7 | Added DINO pretrained ResMLP models. | 3 years ago |
| Ross Wightman | 20a2be14c3 | Add gMLP-S weights, 79.6 top-1 | 3 years ago |
| Ross Wightman | b41cffaa93 | Fix a few issues loading pretrained vit/bit npz weights w/ num_classes=0 __init__ arg. Missed a few other small classifier handling detail on Mlp, GhostNet, Levit. Should fix #713 | 3 years ago |
| Ross Wightman | 8f4a0222ed | Add GMixer-24 MLP model weights, trained w/ TPU + PyTorch XLA | 3 years ago |
| Ross Wightman | 511a8e8c96 | Add official ResMLP weights. | 3 years ago |
| Ross Wightman | 4d96165989 | Merge branch 'master' into cleanup_xla_model_fixes | 3 years ago |
| Ross Wightman | 8880f696b6 | Refactoring, cleanup, improved test coverage. | 3 years ago |
| Ross Wightman | d413eef1bf | Add ResMLP-24 model weights that I trained in PyTorch XLA on TPU-VM. 79.2 top-1. | 3 years ago |
| Ross Wightman | 2f5ed2dec1 | Update `init_values` const for 24 and 36 layer ResMLP models | 3 years ago |
| Ross Wightman | bfc72f75d3 | Expand scope of testing for non-std vision transformer / mlp models. Some related cleanup and create fn cleanup for all vision transformer and mlp models. More CoaT weights. | 4 years ago |
| talrid | dc1a4efd28 | mixer_b16_224_miil, mixer_b16_224_miil_in21k models | 4 years ago |
| Ross Wightman | d5af752117 | Add preliminary gMLP and ResMLP impl to Mlp-Mixer | 4 years ago |
| Ross Wightman | e7f0db8664 | Fix drop/drop_path arg on MLP-Mixer model. Fix #641 | 4 years ago |
| Ross Wightman | b2c305c2aa | Move Mlp and PatchEmbed modules into layers. Being used in lots of models now... | 4 years ago |
| Ross Wightman | 2d8b09fe8b | Add official pretrained weights to MLP-Mixer, complete model cfgs. | 4 years ago |
| Ross Wightman | 12efffa6b1 | Initial MLP-Mixer attempt... | 4 years ago |