Commit Graph

1433 Commits (2898cf6e41357b6e79229547ad191ca74299f5d2)

Author SHA1 Message Date
Ross Wightman 2898cf6e41 version 0.6.5 for pypi release
2 years ago
Ross Wightman 66393d472f Update README.md
2 years ago
Ross Wightman a45b4bce9a x-small and xx-small edgenext models do benefit from larger test input size
2 years ago
Ross Wightman a8e34051c1 Unbreak gamma remap impacting beit checkpoint load, version bump to 0.6.4
2 years ago
Ross Wightman 1ccce50d48 Merge pull request #1327 from rwightman/edgenext_csp_and_more
2 years ago
Ross Wightman 1c5cb819f9 bump version to 0.6.3 before merge
2 years ago
Ross Wightman a1cb25066e Add edgenext_small_rw weights trained with a Swin-like recipe. Better than the original 'small' weights but not the recent 'USI' distilled weights.
2 years ago
Ross Wightman 7c7ecd2492 Add --use-train-size flag to force use of train input_size (over test input size) for validation. Default test-time pooling to use train input size (fixes issues).
2 years ago
Ross Wightman ce65a7b29f Update vit_relpos w/ some additional weights, some cleanup to match recent vit updates, more MLP log coord experiments.
2 years ago
Ross Wightman 58621723bd Add CrossStage3 DarkNet (cs3) weights
2 years ago
Ross Wightman 9be0c84715 Change set -> dict w/ None keys for dataset split synonym search, so results are always consistent if more than one synonym exists. Fix #1224
2 years ago
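The reason this fixes the inconsistency: Python sets have no guaranteed iteration order, while dicts preserve insertion order (guaranteed since Python 3.7), so a dict with None values acts as an ordered set. A minimal sketch of the pattern; the helper name and directory names here are illustrative, not the exact timm code:

```python
import os
from typing import Optional

# A dict with None values behaves as an ordered set: membership tests work,
# and iteration follows insertion order (guaranteed since Python 3.7).
# A plain set could yield a different synonym first on each run.
_TRAIN_SYNONYM = dict.fromkeys(('train', 'training'))

def find_train_dir(root: str) -> Optional[str]:
    for syn in _TRAIN_SYNONYM:  # deterministic order: 'train', then 'training'
        candidate = os.path.join(root, syn)
        if os.path.isdir(candidate):
            return candidate
    return None
```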
Ross Wightman 4670d375c6 Reorg benchmark.py import
2 years ago
Ross Wightman 2456223052 Merge pull request #1336 from xwang233/add-local-rank
2 years ago
Ross Wightman 500c190860 Add --aot-autograd (functorch efficient mem fusion) support to validate.py
2 years ago
Ross Wightman 28e0152043 Add --no-retry flag to benchmark.py to skip batch_size decay and retry on error. Fix #1226. Update deepspeed profile usage for latest DS releases. Fix #1333
2 years ago
Xiao Wang 11060f84c5 make train.py compatible with torchrun
2 years ago
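For context: torchrun passes the local rank via the LOCAL_RANK environment variable, whereas the older torch.distributed.launch passed a --local_rank command-line argument. A hedged sketch of the compatibility pattern, not the exact train.py code:

```python
import argparse
import os

# torchrun exports LOCAL_RANK (plus RANK/WORLD_SIZE) as environment variables,
# while torch.distributed.launch passed --local_rank on the command line;
# defaulting the arg from the env var supports both launchers.
parser = argparse.ArgumentParser()
parser.add_argument('--local_rank', type=int,
                    default=int(os.environ.get('LOCAL_RANK', 0)))
args = parser.parse_args()
print(f'local rank: {args.local_rank}')
```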
Ross Wightman db0cee9910 Refactor cspnet configuration using dataclasses, update feature extraction for new cs3 variants.
2 years ago
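As a rough illustration of the dataclass-config style (field names below are hypothetical, not the actual cspnet ones): typed, defaulted fields replace loose nested dicts, so each model variant overrides only the fields that differ.

```python
from dataclasses import dataclass, field
from typing import Tuple

# Illustrative only: these field names are made up, not the real cspnet config.
@dataclass
class StageCfg:
    depth: int = 1
    out_chs: int = 64
    stride: int = 2
    cross_stage: bool = True

@dataclass
class ModelCfg:
    stem_chs: int = 32
    stages: Tuple[StageCfg, ...] = field(default_factory=tuple)

# a variant differs only in the fields it overrides
cs3_cfg = ModelCfg(stages=(StageCfg(depth=2), StageCfg(depth=4, out_chs=128)))
```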
Ross Wightman eca09b8642 Add MobileVitV2 support. Fix #1332. Move GroupNorm1 to common layers (used in poolformer + mobilevitv2). Keep old custom ConvNeXt LayerNorm2d impl as LayerNormExp2d for reference.
2 years ago
Ross Wightman 06307b8b41 Remove experimental in-block downsample support in ConvNeXt. Needs further experimentation before keeping it in.
2 years ago
Ross Wightman bfc0dccb0e Improve image extension handling, add methods to modify / get defaults. Fix #1335 fix #1274.
2 years ago
Ross Wightman 7d4b3807d5 Support DeiT-3 (Revenge of the ViT) checkpoints. Add non-overlapping (w/ class token) pos-embed support to vit.
2 years ago
Ross Wightman d0c5bd5722 Rename cs2->cs3 for darknets. Fix features_only for cs3 darknets.
2 years ago
Ross Wightman d765305821 Remove first_conv for resnetaa50 def
2 years ago
Ross Wightman dd9b8f57c4 Add feature_info to edgenext for features_only support, hopefully fix some fx / test errors
2 years ago
Ross Wightman 377e9bfa21 Add TPU trained darknet53 weights. Add missing pretrain_cfg for some csp/darknet models.
2 years ago
Ross Wightman c170ba3173 Add weights for resnet10t, resnet14t, and resnetaa50 models. Fix #1314
2 years ago
Ross Wightman 188c194b0f Left some experimental stem code in convnext by mistake
2 years ago
Ross Wightman 70d6d2c484 support test_crop_size in data config resolve
2 years ago
Ross Wightman 6064d16a2d Add initial EdgeNeXt import. Significant cleanup / reorg (like ConvNeXt). Fix #1320
2 years ago
Ross Wightman 7a9c6811c9 Add eps arg to LayerNorm2d, add 'tf' (tensorflow) variant of trunc_normal_ that applies scale/shift after sampling (instead of needing to move a/b)
2 years ago
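The distinction behind the 'tf' variant: PyTorch's nn.init.trunc_normal_ interprets the bounds a/b on the final (scaled and shifted) distribution, while the TensorFlow convention samples a standard truncated normal first and applies scale/shift afterward, so a/b can stay at ±2 standard units. A sketch of the idea; the body is a reconstruction, not necessarily the exact commit:

```python
import torch

def trunc_normal_tf_(tensor, mean=0., std=1., a=-2., b=2.):
    # Sample a standard truncated normal on [a, b] first, then scale and
    # shift. The bounds are thus in units of the standard distribution,
    # matching TensorFlow, so callers never need to move a/b themselves.
    with torch.no_grad():
        torch.nn.init.trunc_normal_(tensor, 0., 1., a, b)
        tensor.mul_(std).add_(mean)
    return tensor
```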
Ross Wightman 82c311d082 Add more experimental darknet and 'cs2' darknet variants (different cross stage setup, closer to newer YOLO backbones) for train trials.
2 years ago
Ross Wightman a050fde5cd Add resnet10t (basic block) and resnet14t (bottleneck) with 1,1,1,1 repeats
2 years ago
Ross Wightman 34f382f8f6 move dataconfig before scripting; scripting is killing metadata now (PyTorch 1.12? just nvfuser?)
2 years ago
Ross Wightman beef62e7ab Merge pull request #1317 from rwightman/fixes-syncbn_pretrain_cfg_resolve
2 years ago
Ross Wightman e6d7df40ec No longer any point in using kwargs for pretrain_cfg resolve, just pass an explicit arg
2 years ago
Ross Wightman a29fba307d disable dist_bn when sync_bn active
2 years ago
Ross Wightman 07d0c4ae96 Improve repr for DropPath module
2 years ago
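For nn.Module subclasses, the idiomatic way to improve the repr is overriding extra_repr(), which nn.Module.__repr__ picks up automatically. A minimal sketch of a DropPath with that hook; this is a reconstruction, not the exact timm code:

```python
import torch.nn as nn

class DropPath(nn.Module):
    """Per-sample stochastic depth (a sketch, not the exact timm code)."""
    def __init__(self, drop_prob: float = 0.):
        super().__init__()
        self.drop_prob = drop_prob

    def forward(self, x):
        if self.drop_prob == 0. or not self.training:
            return x
        keep_prob = 1. - self.drop_prob
        # one Bernoulli mask value per sample, broadcast across all other dims
        mask = x.new_empty((x.shape[0],) + (1,) * (x.ndim - 1)).bernoulli_(keep_prob)
        return x * mask / keep_prob

    def extra_repr(self) -> str:
        # picked up by nn.Module.__repr__, so print(model) shows the drop rate
        return f'drop_prob={round(self.drop_prob, 3):0.3f}'
```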
Ross Wightman e27c16b8a0 Remove unnecessary code for sync-bn guard
2 years ago
Ross Wightman 0da3c9ebbf Remove SiLU layer in default args that breaks import on old old PyTorch
2 years ago
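The underlying gotcha: default argument values are evaluated at import time, so naming nn.SiLU in a signature makes the whole module fail to import on PyTorch builds that predate it. A hedged sketch of the safer pattern (the function and its parameters are made up for illustration):

```python
import torch.nn as nn

# `def f(act_layer=nn.SiLU)` evaluates nn.SiLU when the module is imported,
# raising AttributeError on old PyTorch; defaulting to None and resolving
# inside the function defers the lookup until the code is actually called.
def make_block(in_chs: int, out_chs: int, act_layer=None):
    act_layer = act_layer or getattr(nn, 'SiLU', nn.ReLU)
    return nn.Sequential(
        nn.Conv2d(in_chs, out_chs, 3, padding=1),
        act_layer(inplace=True),
    )
```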
Ross Wightman 7d657d2ef4 Improve resolve_pretrained_cfg behaviour when no cfg exists, warn instead of crash. Improve usability ex #1311
2 years ago
Ross Wightman 879df47c0a Support BatchNormAct2d for sync-bn use. Fix #1254
2 years ago
Ross Wightman 7cedc8d474 Follow up to #1256, fix interpolation warning in auto_augment as well
2 years ago
Ross Wightman 037e5e6c09 Fix #1309, move wandb init after distributed init, only init on rank == 0 process
2 years ago
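The failure mode being fixed, roughly: each distributed worker was opening its own wandb run. Moving the init to after torch.distributed setup lets a rank check gate it. A sketch, assuming a hypothetical project name:

```python
import torch.distributed as dist

dist.init_process_group(backend='nccl')  # distributed init must come first

if dist.get_rank() == 0:  # only the rank-0 process opens a logging run
    import wandb
    wandb.init(project='timm-train')  # project name is hypothetical
```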
Jakub Kaczmarzyk 9e12530433 use utils namespace instead of function/class names
2 years ago
Jakub Kaczmarzyk db64393c0d use `Image.Resampling` namespace for PIL mapping (#1256)
2 years ago
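Background on #1256: Pillow 9.1 moved the resampling constants into the Image.Resampling enum and deprecated the old top-level names (Image.BILINEAR and friends), which triggered DeprecationWarnings. A version-tolerant mapping might look like this; a sketch, not the exact change:

```python
from PIL import Image

# Pillow >= 9.1 exposes resampling filters via the Image.Resampling enum and
# deprecates the old module-level constants (Image.BILINEAR, ...).
if hasattr(Image, 'Resampling'):
    _PIL_INTERP = {
        'nearest': Image.Resampling.NEAREST,
        'bilinear': Image.Resampling.BILINEAR,
        'bicubic': Image.Resampling.BICUBIC,
    }
else:  # older Pillow keeps the constants on the Image module itself
    _PIL_INTERP = {
        'nearest': Image.NEAREST,
        'bilinear': Image.BILINEAR,
        'bicubic': Image.BICUBIC,
    }
```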
Ross Wightman db8e33c69f Merge pull request #1294 from xwang233/add-aot-autograd
2 years ago
Ross Wightman 2d7ab06503 Move aot-autograd opt to after model metadata is used to set up data config in benchmark.py
2 years ago
Xiao Wang ca991c1fa5 add --aot-autograd
2 years ago
Ross Wightman e4360e6125 Merge pull request #1270 from developer0hye/patch-1
2 years ago
Yonghye Kwon 57f8361a01 fix a function parameter typo (cropt_pct -> crop_pct)
2 years ago