Commit Graph

178 Commits (11704cc721197be84ed8a0a816dc54c1468f7cc0)

Author SHA1 Message Date
Ross Wightman c59d88339b Merge remote-tracking branch 'origin/main' into multi-weight
2 years ago
Ross Wightman ec6921fcb0 MaxVit, ViT, ConvNeXt, and EfficientNet-v2 updates
2 years ago
Ross Wightman 803254bb40 Fix spacing misalignment for fast norm path in LayerNorm modules
2 years ago
Ross Wightman b293dfa595 Add CL SE module
2 years ago
Ross Wightman 9709dbaaa9 Adding support for fine-tuned CLIP LAION-2B image tower weights for B/32, L/14, H/14 and g/14. Still WIP
2 years ago
Ross Wightman 769ab4b98a Clean up no_grad for trunc normal weight inits
2 years ago
Ross Wightman 48e1df8b37 Add norm/norm_act header comments
2 years ago
Ross Wightman ffaf97f813 MaxxVit! A very configurable MaxVit and CoAtNet impl with lots of goodies..
2 years ago
Ross Wightman 8c9696c9df More model and test fixes
2 years ago
Ross Wightman 43aa84e861 Add 'fast' layer norm that doesn't cast to float32, support APEX LN impl for slight speed gain, update norm and act factories, tweak SE for ability to disable bias (needed by GCVit)
2 years ago
Ross Wightman 8ad4bdfa06 Allow ntuple to be used with string values
2 years ago
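The ntuple change above addresses a classic Python pitfall: strings are iterable, so a naive pass-through would split `'same'` into characters instead of repeating it. A minimal sketch of such a helper, in the spirit of timm's `_ntuple`/`to_2tuple` factory (exact code may differ):

```python
import collections.abc
from itertools import repeat

def _ntuple(n):
    """Return a parser that repeats scalars n times but passes iterables through."""
    def parse(x):
        # Strings are iterable, but should be treated as single values
        # (e.g. padding='same'), so exclude them from the pass-through path.
        if isinstance(x, collections.abc.Iterable) and not isinstance(x, str):
            return tuple(x)
        return tuple(repeat(x, n))
    return parse

to_2tuple = _ntuple(2)
```

With the string guard, `to_2tuple('same')` yields `('same', 'same')` rather than `('s', 'a', 'm', 'e')`, while lists and tuples still pass through unchanged.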
Ross Wightman 7c7ecd2492 Add --use-train-size flag to force use of train input_size (over test input size) for validation. Default test-time pooling to use train input size (fixes issues).
2 years ago
Ross Wightman eca09b8642 Add MobileVitV2 support. Fix #1332. Move GroupNorm1 to common layers (used in poolformer + mobilevitv2). Keep old custom ConvNeXt LayerNorm2d impl as LayerNormExp2d for reference.
2 years ago
Ross Wightman 7a9c6811c9 Add eps arg to LayerNorm2d, add 'tf' (tensorflow) variant of trunc_normal_ that applies scale/shift after sampling (instead of needing to move a/b)
2 years ago
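The 'tf' trunc_normal_ variant described above truncates a standard normal to fixed bounds [a, b] first and applies the scale/shift afterwards, so the bounds need not be recomputed per mean/std. A rough pure-Python sketch of the idea (rejection sampling stands in for the real inverse-CDF implementation; the function name and list-based interface are illustrative, not timm's actual tensor API):

```python
import random

def trunc_normal_tf_(n, mean=0.0, std=1.0, a=-2.0, b=2.0):
    """Draw n samples: truncate a *standard* normal to [a, b] first,
    then scale by std and shift by mean afterwards ('tf' ordering)."""
    out = []
    for _ in range(n):
        x = random.gauss(0.0, 1.0)
        while not a <= x <= b:  # simple rejection sampling on the standard normal
            x = random.gauss(0.0, 1.0)
        out.append(x * std + mean)
    return out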
Ross Wightman 82c311d082 Add more experimental darknet and 'cs2' darknet variants (different cross stage setup, closer to newer YOLO backbones) for train trials.
2 years ago
Ross Wightman 07d0c4ae96 Improve repr for DropPath module
2 years ago
Ross Wightman e27c16b8a0 Remove unnecessary code for syncbn guard
2 years ago
Ross Wightman 0da3c9ebbf Remove SiLU layer in default args that breaks import on very old PyTorch
2 years ago
Ross Wightman 879df47c0a Support BatchNormAct2d for sync-bn use. Fix #1254
2 years ago
Ross Wightman 4b30bae67b Add updated vit_relpos weights, and impl w/ support for official swin-v2 differences for relpos. Add bias control support for MLP layers
2 years ago
jjsjann123 f88c606fcf fixing channels_last on cond_conv2d; update nvfuser debug env variable
2 years ago
Ross Wightman f670d98cb8 Make a few more layers symbolically traceable (remove from FX leaf modules)
2 years ago
Ross Wightman b049a5c5c6 Merge remote-tracking branch 'origin/master' into norm_norm_norm
2 years ago
Ross Wightman 9440a50c95 Merge branch 'mrT23-master'
2 years ago
Ross Wightman 372ad5fa0d Significant model refactor and additions:
2 years ago
Ross Wightman 95cfc9b3e8 Merge remote-tracking branch 'origin/master' into norm_norm_norm
2 years ago
Ross Wightman 656757d26b Fix MobileNetV2 head conv size for multiplier < 1.0. Add some missing modification copyrights, fix starting date of some old ones.
2 years ago
Ross Wightman b27c21b09a Update drop_path and drop_block (fast impl) to be symbolically traceable, slightly faster
2 years ago
Ross Wightman 214c84a235 Disable use of timm nn.Linear wrapper since AMP autocast + torchscript use appears fixed
2 years ago
Ross Wightman a52a614475 Remove layer experiment which should not have been added
3 years ago
Ross Wightman ab49d275de Significant norm update
3 years ago
Ross Wightman d04f2f1377 Update drop_path and drop_block (fast impl) to be symbolically traceable, slightly faster
3 years ago
Ross Wightman 834a9ec721 Disable use of timm nn.Linear wrapper since AMP autocast + torchscript use appears fixed
3 years ago
Ross Wightman 78912b6375 Updated EvoNorm implementations with some experimentation. Add FilterResponseNorm. Updated RegnetZ and ResNetV2 model defs for trials.
3 years ago
talrid c11f4c3218 support CNNs
3 years ago
mrT23 d6701d8a81 Merge branch 'rwightman:master' into master
3 years ago
Ross Wightman 480c676ffa Fix FX breaking assert in evonorm
3 years ago
talrid 41559247e9 use_ml_decoder_head
3 years ago
Ross Wightman 93cc08fdc5 Make evonorm variables 1d to match other PyTorch norm layers, will break weight compat for any existing use (likely minimal, easy to fix).
3 years ago
Ross Wightman af607b75cc Prep a set of ResNetV2 models with GroupNorm, EvoNormB0, EvoNormS0 for BN free model experiments on TPU and IPU
3 years ago
Ross Wightman c976a410d9 Add ResNet-50 w/ GN (resnet50_gn) and SEBotNet-33-TS (sebotnet33ts_256) model defs and weights. Update halonet50ts weights w/ slightly better variant in1k val, more robust to test sets.
3 years ago
Alexander Soare b25ff96768 wip - pre-rebase
3 years ago
Alexander Soare e051dce354 Make all models FX traceable
3 years ago
Alexander Soare 0149ec30d7 wip - attempting to rebase
3 years ago
Alexander Soare bc3d4eb403 wip -rebase
3 years ago
Ross Wightman 2ddef942b9 Better fix for #954 that doesn't break torchscript, pull torch._assert into timm namespace when it exists
3 years ago
Ross Wightman 4f0f9cb348 Fix #954 by bringing traceable _assert into timm to allow compat w/ PyTorch < 1.8
3 years ago
Ross Wightman b745d30a3e Fix formatting of last commit
3 years ago
Ross Wightman 3478f1d7f1 Traceability fix for vit models for some experiments
3 years ago
Ross Wightman f658a72e72 Cleanup re-use of Dropout modules in Mlp modules after some twitter feedback :p
3 years ago