Commit Graph

851 Commits (fixes-syncbn_pretrain_cfg_resolve)

Author SHA1 Message Date
Ross Wightman e6d7df40ec no longer any point in using kwargs for pretrain_cfg resolve, just pass explicit arg
3 years ago
Ross Wightman 07d0c4ae96 Improve repr for DropPath module
3 years ago
Ross Wightman e27c16b8a0 Remove unnecessary code for syncbn guard
3 years ago
Ross Wightman 0da3c9ebbf Remove SiLU layer in default args that breaks import on old old PyTorch
3 years ago
Ross Wightman 7d657d2ef4 Improve resolve_pretrained_cfg behaviour when no cfg exists, warn instead of crash. Improve usability ex #1311
3 years ago
Ross Wightman 879df47c0a Support BatchNormAct2d for sync-bn use. Fix #1254
3 years ago
Ross Wightman 7cedc8d474 Follow up to #1256, fix interpolation warning in auto_augment as well
3 years ago
Jakub Kaczmarzyk db64393c0d use `Image.Resampling` namespace for PIL mapping (#1256)
3 years ago
Ross Wightman 20a1fa63f8 Make dev version 0.6.2.dev0 for pypi pre
3 years ago
Ross Wightman 347308faad Update README.md, version to 0.6.2
3 years ago
Ross Wightman 4b30bae67b Add updated vit_relpos weights, and impl w/ support for official swin-v2 differences for relpos. Add bias control support for MLP layers
3 years ago
Ross Wightman d4c0588012 Remove persistent buffers from Swin-V2. Change SwinV2Cr cos attn + tau/logit_scale to match official, add ckpt convert, init_value zeros resid LN weight by default
3 years ago
Ross Wightman 27c42f0830 Fix torchscript use for official Swin-V2, add support for non-square window/shift to WindowAttn/Block
3 years ago
Ross Wightman 2f2b22d8c7 Disable nvfuser fma / opt level overrides per #1244
3 years ago
Ross Wightman c0211b0bf7 Swin-V2 test fixes, typo
3 years ago
Ross Wightman 9a86b900fa Official SwinV2 models
3 years ago
Ross Wightman d07d015173 Merge pull request #1249 from okojoalg/sequencer
3 years ago
Ross Wightman d30685c283 Merge pull request #1251 from hankyul2/fix-multistep-scheduler
3 years ago
han a16171335b fix: change milestones to decay-milestones
3 years ago
Ross Wightman 39b725e1c9 Fix tests for rank-4 output where feature channels dim is -1 (3) and not 1
3 years ago
Ross Wightman 78a32655fa Fix poolformer group_matcher to merge proj downsample with previous block, support coarse
3 years ago
Ross Wightman d79f3d9d1e Fix torchscript use for sequencer, add group_matcher, forward_head support, minor formatting
3 years ago
Ross Wightman 37b6920df3 Fix group_matcher regex for regnet.py
3 years ago
okojoalg 93a79a3dd9 Fix num_features in Sequencer
3 years ago
han 57a988df30 fix: multistep lr decay epoch bugs
3 years ago
okojoalg 578d52e752 Add Sequencer
3 years ago
Ross Wightman f5ca4141f7 Adjust arg order for recent vit model args, add a few comments
3 years ago
Ross Wightman 41dc49a337 Vision Transformer refactoring and Rel Pos impl
3 years ago
Ross Wightman b7cb8d0337 Add Swin-V2 Small-NS weights (83.5 @ 224). Add layer scale like 'init_values' via post-norm LN weight scaling
3 years ago
jjsjann123 f88c606fcf fixing channels_last on cond_conv2d; update nvfuser debug env variable
3 years ago
Li Dong 09e9f3defb migrate azure blob for beit checkpoints
3 years ago
Ross Wightman 52ac881402 Missed first_conv in latest seresnext 'D' default_cfgs
3 years ago
Ross Wightman 7629d8264d Add two new SE-ResNeXt101-D 32x8d weights, one anti-aliased and one not. Reshuffle default_cfgs vs model entrypoints for resnet.py so they are better aligned.
3 years ago
SeeFun 8f0bc0591e fix convnext args
3 years ago
Ross Wightman c5a8e929fb Add initial swinv2 tiny / small weights
3 years ago
Ross Wightman f670d98cb8 Make a few more layers symbolically traceable (remove from FX leaf modules)
3 years ago
SeeFun ec4e9aa5a0 Add ConvNeXt tiny and small pretrain in22k
3 years ago
Ross Wightman 575924ed60 Update test crop for new RegNet-V weights to match Y
3 years ago
Ross Wightman 1618527098 Add layer scale and parallel blocks to vision_transformer
3 years ago
Ross Wightman c42be74621 Add attrib / comments about Swin-S3 (AutoFormerV2) weights
3 years ago
Ross Wightman 474ac906a2 Add 'head norm first' convnext_tiny_hnf weights
3 years ago
Ross Wightman dc51334cdc Fix pruned adapt for EfficientNet models that are now using BatchNormAct layers
3 years ago
Ross Wightman 024fc4d9ab version 0.6.1 for master
3 years ago
Ross Wightman e1e037ba52 Fix bad tuple typing fix that was on XLA branch but missed on master merge
3 years ago
Ross Wightman 341b464a5a Remove redundant noise attr from Plateau scheduler (use parent)
3 years ago
Ross Wightman fe457c1996 Update SwinTransformerV2Cr post-merge, update with grad checkpointing / grad matcher
3 years ago
Ross Wightman b049a5c5c6 Merge remote-tracking branch 'origin/master' into norm_norm_norm
3 years ago
Ross Wightman 7cdd164d77 Fix #1184, scheduler noise bug during merge madness
3 years ago
Ross Wightman 9440a50c95 Merge branch 'mrT23-master'
3 years ago
Ross Wightman d98aa47d12 Revert ml-decoder changes to model factory and train script
3 years ago