Commit Graph

9 Commits (b995ca3c314598b2b91ca1755232652c6df07335)

SHA1 (Author, Date): Message

- 122621daef (Ross Wightman, 1 year ago): Add Final annotation to attn_fas to avoid symbol lookup of new scaled_dot_product_attn fn on old PyTorch in jit
- 621e1b2182 (Ross Wightman, 1 year ago): Add ideas from 'Scaling ViT to 22-B Params', testing PyTorch 2.0 fused F.scaled_dot_product_attention impl in vit, vit_relpos, maxxvit / coatnet.
- 9a51e4ea2e (Ross Wightman, 2 years ago): Add FlexiViT models and weights, refactoring, push more weights
- 927f031293 (Ross Wightman, 2 years ago): Major module / path restructure, timm.models.layers -> timm.layers, add _ prefix to all non model modules in timm.models
- ce65a7b29f (Ross Wightman, 2 years ago): Update vit_relpos w/ some additional weights, some cleanup to match recent vit updates, more MLP log coord experiments.
- 7d657d2ef4 (Ross Wightman, 2 years ago): Improve resolve_pretrained_cfg behaviour when no cfg exists, warn instead of crash. Improve usability ex #1311
- 4b30bae67b (Ross Wightman, 2 years ago): Add updated vit_relpos weights, and impl w/ support for official swin-v2 differences for relpos. Add bias control support for MLP layers
- f5ca4141f7 (Ross Wightman, 2 years ago): Adjust arg order for recent vit model args, add a few comments
- 41dc49a337 (Ross Wightman, 2 years ago): Vision Transformer refactoring and Rel Pos impl
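The two most recent commits above concern routing attention through PyTorch 2.0's fused F.scaled_dot_product_attention when available, with a Final-annotated flag so TorchScript on older PyTorch folds the branch away and never looks up the new symbol. Below is a minimal sketch of that pattern, not timm's actual implementation; the class name, layer layout, and dimensions are illustrative assumptions:

```python
import torch
import torch.nn.functional as F
from typing import Final


class Attention(torch.nn.Module):
    # Final tells TorchScript this attribute is a constant, so the
    # `if self.fused_attn:` branch is resolved at script time and the
    # F.scaled_dot_product_attention symbol is never looked up on old PyTorch.
    fused_attn: Final[bool]

    def __init__(self, dim: int, num_heads: int = 8):
        super().__init__()
        assert dim % num_heads == 0
        self.num_heads = num_heads
        self.head_dim = dim // num_heads
        self.scale = self.head_dim ** -0.5
        self.qkv = torch.nn.Linear(dim, dim * 3)
        self.proj = torch.nn.Linear(dim, dim)
        # Use the fused kernel only if this PyTorch build provides it.
        self.fused_attn = hasattr(F, 'scaled_dot_product_attention')

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        B, N, C = x.shape
        # (B, N, 3*C) -> (3, B, heads, N, head_dim)
        qkv = self.qkv(x).reshape(B, N, 3, self.num_heads, self.head_dim)
        qkv = qkv.permute(2, 0, 3, 1, 4)
        q, k, v = qkv.unbind(0)
        if self.fused_attn:
            x = F.scaled_dot_product_attention(q, k, v)
        else:
            # Manual fallback: scaled dot-product + softmax.
            attn = (q @ k.transpose(-2, -1)) * self.scale
            attn = attn.softmax(dim=-1)
            x = attn @ v
        x = x.transpose(1, 2).reshape(B, N, C)
        return self.proj(x)
```

Both paths produce the same result up to numerical precision; the flag only selects between the fused kernel and the explicit softmax formulation.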