Commit Graph

36 Commits (9f5bba9ef9db8a32a5a04325c8eb181c9f13a9b2)

Author SHA1 Message Date
Ross Wightman 927f031293 Major module / path restructure, timm.models.layers -> timm.layers, add _ prefix to all non-model modules in timm.models (import sketch below the log)
2 years ago
Ross Wightman abc9ba2544 Transitioning default_cfg -> pretrained_cfg. Improving handling of pretrained_cfg source (HF-Hub, files, timm config, etc.). Checkpoint handling tweaks. (cfg sketch below the log)
3 years ago
Ross Wightman 1f53db2ece Updated lamhalobotnet weights, 81.5 top-1
3 years ago
Ross Wightman 15ef108eb4 Add better halo2botnet50ts weights, 82 top-1 @ 256 (usage sketch below the log)
3 years ago
Ross Wightman c976a410d9 Add ResNet-50 w/ GN (resnet50_gn) and SEBotNet-33-TS (sebotnet33ts_256) model defs and weights. Update halonet50ts weights w/ a slightly better variant on in1k val, more robust to test sets.
3 years ago
Ross Wightman b328e56f49 Update eca_halonext26ts weights to a better set
3 years ago
Ross Wightman ae72d009fa Add weights for lambda_resnet50ts, halo2botnet50ts, lamhalobotnet50ts, updated halonet50ts
3 years ago
Ross Wightman b6caa356d2 Add fixed eca_botnext26ts_256 weights, 79.27 top-1
3 years ago
Ross Wightman c02334d9fa Add weights for regnetz_d and haloregnetz_c, update regnetz_c weights. Add commented PyTorch XLA code for halo attention
3 years ago
Ross Wightman cd34913278 Remove some outdated comments, botnet networks working great now.
3 years ago
Ross Wightman 6ed4cdccca Update lambda_resnet26t weights with better set
3 years ago
Ross Wightman a85df34993 Update lambda_resnet26rpt weights to 78.9, add better halonet26t weights at 79.1 with tweak to attention dim
3 years ago
Ross Wightman b544ad4d3f regnetz model default cfg tweaks
3 years ago
Ross Wightman e2b8d44ff0 Halo, bottleneck attn, lambda layer additions and cleanup along w/ experimental model defs
3 years ago
Ross Wightman da0d39bedd Update default crop_pct for byoanet
3 years ago
Ross Wightman 64495505b7 Add updated lambda resnet26 and botnet26 checkpoints with fixes applied
3 years ago
Ross Wightman 007bc39323 Some halo and bottleneck attn code cleanup, add halonet50ts weights, use optimal crop ratios
3 years ago
Ross Wightman b49630a138 Add relative pos embed option to LambdaLayer, fix last transpose/reshape.
3 years ago
Ross Wightman 0ca687f224 Make 'regnetz' model experiments closer to actual RegNetZ, bottleneck expansion, expand from in_chs, no shortcut on stride 2, tweak model sizes
3 years ago
Ross Wightman cf5ac2800c BotNet models were still off, remove weights for bad configs. Add good SE-HaloNet33-TS weights.
3 years ago
Ross Wightman 8642401e88 Swap botnet 26/50 weights/models after realizing a mistake in arch def, now figuring out why they were so low...
3 years ago
Ross Wightman 5f12de4875 Add initial AttentionPool2d that's being trialed. Fix comment; still trying to improve reliability of sgd test.
3 years ago
Ross Wightman 76881d207b Add baseline resnet26t @ 256x256 weights. Add 33ts variant of halonet with at least one halo in stage 2,3,4
3 years ago
Ross Wightman 484e61648d Adding the attn series weights, tweaking model names, comments...
3 years ago
Ross Wightman 8449ba210c Improve performance of HaloAttn, change default dim calc. Some cleanup / fixes for byoanet. Rename resnet26ts to tfs to distinguish (extra fc).
3 years ago
Ross Wightman 925e102982 Update attention / self-attn based models from a series of experiments:
3 years ago
Ross Wightman 742c2d5247 Add Gather-Excite and Global Context attn modules. Refactor existing SE-like attn for consistency and refactor byob/byoanet for less redundancy.
4 years ago
Ross Wightman 9a3ae97311 Another set of byoanet models w/ ECA channel + SA + groups
4 years ago
Ross Wightman 165fb354b2 Add initial RedNet model / Involution layer impl for testing
4 years ago
Ross Wightman 3ba6b55cb2 More adjustments to ByoaNet models for further experiments.
4 years ago
Ross Wightman 0721559511 Improved (hopefully) init for SA/SA-like layers used in ByoaNets
4 years ago
Ross Wightman 9cc7dda6e5 Fixup byoanet configs to pass unit tests. Add swin_attn and swinnet26t model for testing.
4 years ago
Ross Wightman e15c3886ba Default lambda r=7. Define '26t' stage 4/5 256x256 variants for all of bot/halo/lambda nets for experiment. Add resnet50t for exp. Fix a few comments.
4 years ago
Ross Wightman b3d7580df1 Update ByoaNet comments. Fix first stem feat chs for ByobNet.
4 years ago
Ross Wightman 16f7aa9f54 Add default_cfg options for min_input_size / fixed_input_size, queries in model registry, and use for testing self-attn models (query sketch below the log)
4 years ago
Ross Wightman ce62f96d4d ByoaNet with bottleneck transformer, lambda resnet, and halo net experiments
4 years ago
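
For the timm.models.layers -> timm.layers restructure in commit 927f031293, downstream imports need a one-line change. A minimal sketch, assuming a timm release where the old path survives only as a deprecation shim (DropPath and trunc_normal_ are just example symbols):

    # old path, pre-927f031293 (kept afterwards as a deprecation shim):
    # from timm.models.layers import DropPath, trunc_normal_

    # new path after the restructure:
    from timm.layers import DropPath, trunc_normal_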
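
Commit abc9ba2544 transitions the per-model default_cfg attribute to pretrained_cfg. A minimal sketch of reading it, assuming a timm version where create_model attaches the cfg as a plain dict ('halonet26t' is just an example model from this log):

    import timm

    model = timm.create_model('halonet26t', pretrained=False)
    # pretrained_cfg replaces the old model.default_cfg attribute
    print(model.pretrained_cfg['input_size'])
    print(model.pretrained_cfg.get('url', ''))  # weight source, if any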
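
Several entries above add pretrained weights for the self-attn hybrids (e.g. halo2botnet50ts at 82 top-1 @ 256). A minimal usage sketch, not tied to any one commit; it assumes the weights are fetchable and uses the model name 'halo2botnet50ts_256' as registered in timm:

    import timm
    import torch

    model = timm.create_model('halo2botnet50ts_256', pretrained=True).eval()
    x = torch.randn(1, 3, 256, 256)  # 256x256, matching the pretrained cfg
    with torch.no_grad():
        logits = model(x)
    print(logits.shape)  # torch.Size([1, 1000])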
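
Commit 16f7aa9f54 adds min_input_size / fixed_input_size to the default cfgs so self-attn models can be tested at valid input sizes. A minimal query sketch; the .get defaults are guards, not guaranteed cfg contents:

    import timm

    model = timm.create_model('botnet26t_256', pretrained=False)
    cfg = model.pretrained_cfg
    # fixed_input_size: model only runs at its trained resolution
    # min_input_size: smallest input the arch supports (e.g. blocked attn)
    print(cfg.get('fixed_input_size', False))
    print(cfg.get('min_input_size', cfg.get('input_size')))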