Commit Graph

16 Commits (c865028c34fc05cc7deea8e753b30d8cd1952591)

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| Ross Wightman | 6f103a442b | Add convnext_nano weights, 80.8 @ 224, 81.5 @ 288 | 2 years ago |
| Ross Wightman | c5e0d1c700 | Add dilation support to convnext, allows output_stride=8 and 16 use. Fix #1341 | 2 years ago |
| Ross Wightman | 06307b8b41 | Remove experimental downsample in block support in ConvNeXt. Experiment further before keeping it in. | 2 years ago |
| Ross Wightman | 188c194b0f | Left some experiment stem code in convnext by mistake | 2 years ago |
| Ross Wightman | 6064d16a2d | Add initial EdgeNeXt import. Significant cleanup / reorg (like ConvNeXt). Fix #1320 | 2 years ago |
| SeeFun | 8f0bc0591e | fix convnext args | 3 years ago |
| SeeFun | ec4e9aa5a0 | Add ConvNeXt tiny and small pretrain in22k | 3 years ago |
| Ross Wightman | 474ac906a2 | Add 'head norm first' convnext_tiny_hnf weights | 3 years ago |
| Ross Wightman | 372ad5fa0d | Significant model refactor and additions: | 3 years ago |
| Ross Wightman | 5f81d4de23 | Move DeiT to own file, vit getting crowded. Working towards fixing #1029, make pooling interface for transformers and mlp closer to convnets. Still working through some details... | 3 years ago |
| Ross Wightman | 738a9cd635 | unbiased=False for torch.var_mean path of ConvNeXt LN. Fix #1090 | 3 years ago |
| Ross Wightman | e0c4eec4b6 | Default conv_mlp to False across the board for ConvNeXt, causing issues on more setups than it's improving right now... | 3 years ago |
| Ross Wightman | b669f4a588 | Add ConvNeXt 22k->1k fine-tuned and 384 22k-1k fine-tuned weights after testing | 3 years ago |
| Ross Wightman | edd3d73695 | Add missing dropout for head reset in ConvNeXt default head | 3 years ago |
| Ross Wightman | b093dcb46d | Some convnext cleanup, remove in place mul_ for gamma, breaking symbolic trace, cleanup head a bit... | 3 years ago |
| Ross Wightman | 18934debc5 | Add initial ConvNeXt impl (mods of official code) | 3 years ago |