40f4745366  Merge branch 'norm_norm_norm' into bits_and_tpu  (Ross Wightman, 3 years ago)
a52a614475  Remove layer experiment which should not have been added  (Ross Wightman, 3 years ago)
1c21cac8f9  Add drop args to benchmark.py  (Ross Wightman, 3 years ago)
d829858550  Significant norm update  (Ross Wightman, 3 years ago)
    * ConvBnAct layer renamed -> ConvNormAct and ConvNormActAa for anti-aliased
    * Significant update to EfficientNet and MobileNetV3 arch to support NormAct layers and grouped conv (as alternative to depthwise)
    * Update RegNet to add Z variant
    * Add Pre variant of XceptionAligned that works with NormAct layers
    * EvoNorm matches bits_and_tpu branch for merge
683fba7686  Add drop args to benchmark.py  (Ross Wightman, 3 years ago)
ab49d275de  Significant norm update  (Ross Wightman, 3 years ago)
    * ConvBnAct layer renamed -> ConvNormAct and ConvNormActAa for anti-aliased
    * Significant update to EfficientNet and MobileNetV3 arch to support NormAct layers and grouped conv (as alternative to depthwise)
    * Update RegNet to add Z variant
    * Add Pre variant of XceptionAligned that works with NormAct layers
    * EvoNorm matches bits_and_tpu branch for merge
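The ConvBnAct -> ConvNormAct rename above reflects a common fused conv/norm/activation building block. The sketch below shows the pattern only, under plain PyTorch; it is not timm's implementation (which also offers anti-aliasing via ConvNormActAa and pluggable NormAct layers), and the class and argument names here are illustrative.

```python
import torch
import torch.nn as nn

class ConvNormAct(nn.Sequential):
    """Minimal conv -> norm -> activation block (illustrative sketch)."""
    def __init__(self, in_chs, out_chs, kernel_size=3, stride=1, groups=1,
                 norm_layer=nn.BatchNorm2d, act_layer=nn.ReLU):
        super().__init__(
            # bias=False: the following norm layer has its own affine bias
            nn.Conv2d(in_chs, out_chs, kernel_size, stride,
                      padding=kernel_size // 2, groups=groups, bias=False),
            norm_layer(out_chs),
            act_layer(inplace=True),
        )

# Grouped conv as an alternative to depthwise: 1 < groups < in_chs
block = ConvNormAct(32, 64, kernel_size=3, stride=2, groups=8)
y = block(torch.randn(1, 32, 56, 56))
print(y.shape)  # torch.Size([1, 64, 28, 28])
```

With `groups=in_chs` this degenerates to a depthwise conv; intermediate group counts trade parameter count against cross-channel mixing, which is the "grouped conv (as alternative to depthwise)" option the commit mentions.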
57fca2b5b2  Fix c16_evos stem / first conv setup  (Ross Wightman, 3 years ago)
1f54a1fff7  Add C16 and E8 EvoNormS0 configs for RegNetZ BYOB nets  (Ross Wightman, 3 years ago)
4d7a5544f7  Remove inplace sigmoid for consistency with other impl  (Ross Wightman, 3 years ago)
88a5b54802  A few small evonorm tweaks for convergence comparisons  (Ross Wightman, 3 years ago)
66daee4f31  Last change wasn't complete, missed adding full evo_norm changeset  (Ross Wightman, 3 years ago)
7bbbd5ef1b  EvoNorm and GroupNormAct options for debugging TPU / XLA concerns  (Ross Wightman, 3 years ago)
ff0f709c20  Testing TFDS shuffle across epochs  (Ross Wightman, 3 years ago)
69e90dcd8c  Merge branch 'norm_norm_norm' into bits_and_tpu  (Ross Wightman, 3 years ago)
d04f2f1377  Update drop_path and drop_block (fast impl) to be symbolically traceable, slightly faster  (Ross Wightman, 3 years ago)
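The "fast", traceable drop_path above refers to stochastic depth: randomly zeroing whole samples of a residual branch using tensor ops only, so symbolic tracers see no Python-side control flow over data. A minimal sketch of that formulation (not necessarily byte-for-byte what timm ships):

```python
import torch

def drop_path(x, drop_prob=0.1, training=True, scale_by_keep=True):
    """Stochastic depth: drop entire residual-branch samples (sketch)."""
    if drop_prob == 0.0 or not training:
        return x
    keep_prob = 1.0 - drop_prob
    # One Bernoulli mask value per sample, broadcast over remaining dims.
    shape = (x.shape[0],) + (1,) * (x.ndim - 1)
    mask = x.new_empty(shape).bernoulli_(keep_prob)
    if keep_prob > 0.0 and scale_by_keep:
        mask.div_(keep_prob)  # rescale survivors so E[output] == input
    return x * mask

x = torch.ones(8, 16, 4, 4)
out = drop_path(x, drop_prob=0.5, training=True)  # each sample is all-0 or all-2
```

Because the mask shape depends only on the batch dimension and everything is expressed as tensor ops, the function traces cleanly and avoids materializing a full-sized random tensor.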
820ae9925e  Fix load_state_dict to handle None ema entries  (Ross Wightman, 3 years ago)
0e212e8fe5  Merge remote-tracking branch 'origin/norm_norm_norm' into bits_and_tpu  (Ross Wightman, 3 years ago)
cd059cbe9c  Add FX backward tests back  (Ross Wightman, 3 years ago)
834a9ec721  Disable use of timm nn.Linear wrapper since AMP autocast + torchscript use appears fixed  (Ross Wightman, 3 years ago)
cad170e494  Merge remote-tracking branch 'origin/norm_norm_norm' into bits_and_tpu  (Ross Wightman, 3 years ago)
58ffa2bfb7  Update pytest for GitHub runner to use --forked with xdist, hopefully eliminate memory buildup  (Ross Wightman, 3 years ago)
78912b6375  Updated EvoNorm implementations with some experimentation. Add FilterResponseNorm. Updated RegnetZ and ResNetV2 model defs for trials.  (Ross Wightman, 3 years ago)
55adfbeb8d  Add commented code to increase open file limit via Python (for TFDS dataset building)  (Ross Wightman, 3 years ago)
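Raising the open-file limit from Python, as the commit above describes, can be done with the stdlib `resource` module by lifting the process's soft limit up to its hard limit. A sketch of that idea (Unix-only; the exact code timm commented in may differ):

```python
import resource

# TFDS dataset building can hold many shard files open at once; lift the
# soft RLIMIT_NOFILE up to the hard cap to avoid "Too many open files".
# Unix-only: the resource module does not exist on Windows.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
resource.setrlimit(resource.RLIMIT_NOFILE, (hard, hard))
```

An unprivileged process may raise its soft limit freely up to the hard limit, so no elevated permissions are needed for this adjustment.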
f7d210d759  Remove evonorm models from FX tests  (Ross Wightman, 3 years ago)
3dc71695bf  Merge pull request #989 from martinsbruveris/feat/resmlp-dino  (Ross Wightman, 3 years ago)
    Added DINO pretrained ResMLP models.
480c676ffa  Fix FX breaking assert in evonorm  (Ross Wightman, 3 years ago)
85c5ff26d7  Added DINO pretrained ResMLP models.  (Martins Bruveris, 3 years ago)
f83b0b01e3  Would like to pass GitHub tests again disabling both FX feature extract backward and torchscript tests  (Ross Wightman, 3 years ago)
a22b85c1b9  Merge branch 'nateraw-hf-save-and-push'  (Ross Wightman, 3 years ago)
d633a014e6  Post merge cleanup. Fix potential security issue passing kwargs directly through to serialized web data.  (Ross Wightman, 3 years ago)
8a83c41d7b  Merge branch 'hf-save-and-push' of https://github.com/nateraw/pytorch-image-models into nateraw-hf-save-and-push  (Ross Wightman, 3 years ago)
147e1059a8  Remove FX backward test from GitHub actions runs for now.  (Ross Wightman, 3 years ago)
b18c9e323b  Update helpers.py  (Nathan Raw, 3 years ago)
308d0b9554  Merge branch 'master' into hf-save-and-push  (Nathan Raw, 3 years ago)
878bee1d5e  Add patch8 vit model to FX exclusion filter  (Ross Wightman, 3 years ago)
9bb4c80d2a  Update README.md, missed recent regnetz results  (Ross Wightman, 3 years ago)
79bf4f163f  Add mention of ResNet50 weights w/ RSB recipe ingredients and lower train res for better @ 224 results (but worse res scaling beyond). Not changing default.  (Ross Wightman, 3 years ago)
ce76a810c2  New FX test strategy, filter based on param count  (Ross Wightman, 3 years ago)
1e51c2d02e  More FX test tweaks  (Ross Wightman, 3 years ago)
f0507f6da6  Fix k_decay default arg != 1.0 in poly scheduler  (Ross Wightman, 3 years ago)
947e1df3ef  Update README.md  (Ross Wightman, 3 years ago)
90448031ea  Filter more large models from FX tests  (Ross Wightman, 3 years ago)
1f53db2ece  Updated lamhalobotnet weights, 81.5 top-1  (Ross Wightman, 3 years ago)
8dc269c303  Filter more models for FX tests  (Ross Wightman, 3 years ago)
15ef108eb4  Add better halo2botnet50ts weights, 82 top-1 @ 256  (Ross Wightman, 3 years ago)
2482652027  Add nfnet_f2 to FX test exclusion  (Ross Wightman, 3 years ago)
809c7bb1ec  Merge remote-tracking branch 'origin/master' into bits_and_tpu  (Ross Wightman, 3 years ago)
734b2244fe  Add RegNetZ-D8 (83.5 @ 256, 84 @ 320) and RegNetZ-E8 (84.5 @ 256, 85 @ 320) weights. Update names of existing RegZ models to include group size.  (Ross Wightman, 3 years ago)
05092e2fbe  Add more models to FX filter  (Ross Wightman, 3 years ago)
93cc08fdc5  Make evonorm variables 1d to match other PyTorch norm layers, will break weight compat for any existing use (likely minimal, easy to fix).  (Ross Wightman, 3 years ago)
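The last commit's switch to 1d EvoNorm variables means the learnable parameters become `(C,)` vectors, matching the convention of `nn.BatchNorm2d` / `nn.GroupNorm`, instead of `(1, C, 1, 1)` tensors; older checkpoints then need a reshape on load. The sketch below illustrates the idea with an EvoNorm-S0-style layer; it is illustrative only, not timm's exact implementation.

```python
import torch
import torch.nn as nn

class EvoNormS0(nn.Module):
    """EvoNorm-S0-style layer with 1d (C,) parameters (illustrative sketch)."""
    def __init__(self, num_features, groups=8, eps=1e-5):
        super().__init__()
        self.groups = groups
        self.eps = eps
        # 1d parameters, same storage convention as BatchNorm/GroupNorm;
        # older (1, C, 1, 1) checkpoint tensors would need .reshape(-1) on load.
        self.weight = nn.Parameter(torch.ones(num_features))
        self.bias = nn.Parameter(torch.zeros(num_features))
        self.v = nn.Parameter(torch.ones(num_features))

    def forward(self, x):
        B, C, H, W = x.shape
        # Grouped std: the "S0" denominator, computed per (sample, group).
        xg = x.reshape(B, self.groups, -1)
        std = xg.var(dim=-1, unbiased=False, keepdim=True).add(self.eps).sqrt()
        std = std.expand(-1, -1, xg.shape[-1]).reshape(B, C, H, W)
        v = self.v.view(1, -1, 1, 1)
        x = x * torch.sigmoid(x * v) / std  # x * sigmoid(v*x) / group_std(x)
        return x * self.weight.view(1, -1, 1, 1) + self.bias.view(1, -1, 1, 1)

norm = EvoNormS0(16, groups=4)
y = norm(torch.randn(2, 16, 8, 8))
```

The 1d layout is what makes utilities written against the standard norm layers (weight-decay filters, per-parameter groups, state-dict tooling) treat EvoNorm parameters uniformly.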