Commit Graph

67 Commits (a5b01ec04e7ba78d0b5ab5c3f2f43a356562a130)

Author SHA1 Message Date
Ross Wightman cda39b35bd Add a deprecation phase to module re-org
2 years ago
Ross Wightman 927f031293 Major module / path restructure, timm.models.layers -> timm.layers, add _ prefix to all non model modules in timm.models
2 years ago
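For context on the restructure in 927f031293: downstream code that imported from the old `timm.models.layers` path is expected to switch to `timm.layers`, with the old path kept as a deprecated alias during the deprecation phase (cda39b35bd). A minimal sketch of defensive import code, assuming a layer name such as `DropPath` that exists under both paths (the specific name is an illustrative choice, not taken from the commit):

```python
# Illustrative only: handle the timm.models.layers -> timm.layers path change.
try:
    from timm.layers import DropPath  # new location after the restructure
except ImportError:
    from timm.models.layers import DropPath  # old location, deprecated alias
```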
Ross Wightman 755570e2d6 Rename _pretrained.py -> pretrained.py, not feasible to change the other files to same scheme without breaking uses
2 years ago
Ross Wightman 72cfa57761 Add ported Tensorflow MaxVit weights. Add a few more CLIP ViT fine-tunes. Tweak some model tag names. Improve model tag name sorting. Update HF hub push config layout.
2 years ago
Ross Wightman def68befa7 Updating vit model defs for multi-weight support trial (vit first). Prepping for CLIP (laion2b and openai) fine-tuned weights.
2 years ago
Ross Wightman 0dadb4a6e9 Initial multi-weight support, handled so old pretraining config handling co-exists with new tags.
2 years ago
Ross Wightman e858912e0c Add brute-force checkpoint remapping option
2 years ago
Ross Wightman a383ef99f5 Make huggingface_hub necessary if it's the only source for a pretrained weight
2 years ago
Ross Wightman 9709dbaaa9 Adding support for fine-tuned CLIP LAION-2B image tower weights for B/32, L/14, H/14 and g/14. Still WIP
2 years ago
Ross Wightman e6d7df40ec No longer any point in using kwargs for pretrain_cfg resolve, just pass an explicit arg
2 years ago
Ross Wightman 7d657d2ef4 Improve resolve_pretrained_cfg behaviour when no cfg exists, warn instead of crash. Improve usability ex #1311
2 years ago
Ross Wightman 9a86b900fa Official SwinV2 models
3 years ago
Ross Wightman c5a8e929fb Add initial swinv2 tiny / small weights
3 years ago
Ross Wightman dc51334cdc Fix pruned adapt for EfficientNet models that are now using BatchNormAct layers
3 years ago
Ross Wightman 61d3493f87 Fix hf-hub handling when hf-hub is config source
3 years ago
Ross Wightman 94bcdebd73 Add latest weights trained on TPU-v3 VM instances
3 years ago
Ross Wightman 0557c8257d Fix bug introduced in non layer_decay weight_decay application. Remove debug print, fix arg desc.
3 years ago
Ross Wightman 372ad5fa0d Significant model refactor and additions:
3 years ago
Ross Wightman 95cfc9b3e8 Merge remote-tracking branch 'origin/master' into norm_norm_norm
3 years ago
Ross Wightman abc9ba2544 Transitioning default_cfg -> pretrained_cfg. Improving handling of pretrained_cfg source (HF-Hub, files, timm config, etc). Checkpoint handling tweaks.
3 years ago
Ross Wightman 010b486590 Add Dino pretrained weights (no head) for vit models. Add support to tests and helpers for models w/ no classifier (num_classes=0 in pretrained cfg)
3 years ago
Ross Wightman d633a014e6 Post merge cleanup. Fix potential security issue passing kwargs directly through to serialized web data.
3 years ago
Nathan Raw b18c9e323b Update helpers.py
3 years ago
Nathan Raw 308d0b9554 Merge branch 'master' into hf-save-and-push
3 years ago
Alexander Soare ab3ac3f25b Add FX based FeatureGraphNet capability
3 years ago
Ross Wightman b2094f4ee8 support bits checkpoints in avg/load
3 years ago
nateraw adcb74f87f 🎨 Import load_state_dict_from_url directly
3 years ago
Ross Wightman 8880f696b6 Refactoring, cleanup, improved test coverage.
3 years ago
Ross Wightman d7bab8a6c5 Fix strict flag change for checkpoint load.
4 years ago
Ross Wightman bfc72f75d3 Expand scope of testing for non-std vision transformer / mlp models. Some related cleanup and create fn cleanup for all vision transformer and mlp models. More CoaT weights.
4 years ago
Ross Wightman 7953e5d11a Fix pos_embed scaling for ViT and num_classes != 1000 for pretrained distilled deit and pit models. Fix #426 and fix #433
4 years ago
Ross Wightman 45c048ba13 A few minor fixes and bit more cleanup on the huggingface hub integration.
4 years ago
Ross Wightman ead80d33c5 Fix typo, naming consistency
4 years ago
Ross Wightman d584e7f617 Support for huggingface hub via create_model and default_cfgs.
4 years ago
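Commit d584e7f617 adds Hugging Face Hub support through `create_model` and `default_cfgs`. A hedged usage sketch, assuming a recent timm release where Hub-hosted weights are addressed with an `hf-hub:` prefix; the model id below is just an example, not one referenced by the commit:

```python
import timm

# Load a model whose pretrained config and weights are resolved via huggingface_hub.
model = timm.create_model('hf-hub:timm/resnet50.a1_in1k', pretrained=True)
model.eval()
```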
Ross Wightman 4f49b94311 Initial AGC impl. Still testing.
4 years ago
Ross Wightman 0356e773f5 Default to native PyTorch AMP instead of APEX amp. Too many APEX issues cropping up lately.
4 years ago
Ross Wightman b4e216e377 Fix a few small things.
4 years ago
Ross Wightman 9811e229f7 Fix regression in models with 1001 class pretrained weights. Improve batchnorm arg and BatchNormAct layer handling in several models.
4 years ago
Ross Wightman 38d8f67570 Fix potential issue with change to num_classes arg in train/validate.py defaulting to None (rely on model def / default_cfg)
4 years ago
Ross Wightman 855d6cc217 More dataset work including factories and a tensorflow datasets (TFDS) wrapper
4 years ago
Ross Wightman 231d04e91a ResNetV2 pre-act and non-preact model, w/ BiT pretrained weights and support for ViT R50 model. Tweaks for in21k num_classes passing. More to do... tests failing.
4 years ago
Ross Wightman 867a0e5a04 Add default_cfg back to models wrapped in feature extraction module as per discussion in #294.
4 years ago
Ross Wightman 2ed8f24715 A few more changes for 0.3.2 maint release. Linear layer change for mobilenetv3 and inception_v3, support no bias for linear wrapper.
4 years ago
Ross Wightman da6cd2cc1f Fix regression for pretrained classifier loading when using entrypoint functions directly
4 years ago
Ross Wightman 9c297ec67d Cleanup Apex vs native AMP scaler state save/load. Cleanup CheckpointSaver a bit.
4 years ago
Ross Wightman b1b6e7c361 Fix a few more issues related to #216 w/ TResNet (space2depth) and FP16 weights in wide resnets. Also don't completely dump pretrained weights in in_chans != 1 or 3 cases.
4 years ago
Ross Wightman b1f1a54de9 More uniform treatment of classifiers across all models, reduce code duplication.
4 years ago
Ross Wightman 7995295968 Merge branch 'logger' into features. Change 'logger' to '_logger'.
4 years ago
Ross Wightman 4e61c6a12d Cleanup, refactoring of Feature extraction code, add tests, fix tests, non hook feature extraction working with torchscript
4 years ago
Ross Wightman 9eba134d79 More models supporting feature extraction, xception, gluon_xception, inception_v3, inception_v4, pnasnet, nasnet, dla. Fix DLA unused projection params.
4 years ago