Commit Graph

13 Commits (484e61648d70780c39301a755a65bfd30fb3ed87)

| Author | SHA1 | Message | Date |
|---|---|---|---|
| Ross Wightman | 484e61648d | Adding the attn series weights, tweaking model names, comments... | 3 years ago |
| Ross Wightman | 8449ba210c | Improve performance of HaloAttn, change default dim calc. Some cleanup / fixes for byoanet. Rename resnet26ts to tfs to distinguish (extra fc). | 3 years ago |
| Ross Wightman | 925e102982 | Update attention / self-attn based models from a series of experiments: | 3 years ago |
| Ross Wightman | 742c2d5247 | Add Gather-Excite and Global Context attn modules. Refactor existing SE-like attn for consistency and refactor byob/byoanet for less redundancy. | 4 years ago |
| Ross Wightman | 9a3ae97311 | Another set of byoanet models w/ ECA channel + SA + groups | 4 years ago |
| Ross Wightman | 165fb354b2 | Add initial RedNet model / Involution layer impl for testing | 4 years ago |
| Ross Wightman | 3ba6b55cb2 | More adjustments to ByoaNet models for further experiments. | 4 years ago |
| Ross Wightman | 0721559511 | Improved (hopefully) init for SA/SA-like layers used in ByoaNets | 4 years ago |
| Ross Wightman | 9cc7dda6e5 | Fixup byoanet configs to pass unit tests. Add swin_attn and swinnet26t model for testing. | 4 years ago |
| Ross Wightman | e15c3886ba | Default lambda r=7. Define '26t' stage 4/5 256x256 variants for all of bot/halo/lambda nets for experiment. Add resnet50t for exp. Fix a few comments. | 4 years ago |
| Ross Wightman | b3d7580df1 | Update ByoaNet comments. Fix first stem feat chs for ByobNet. | 4 years ago |
| Ross Wightman | 16f7aa9f54 | Add default_cfg options for min_input_size / fixed_input_size, queries in model registry, and use for testing self-attn models | 4 years ago |
| Ross Wightman | ce62f96d4d | ByoaNet with bottleneck transformer, lambda resnet, and halo net experiments | 4 years ago |