Add BiT references and knowledge distillation links to README/docs

Branch: pull/323/head
Author: Ross Wightman
Commit: 58ccf43150 (parent: 855d6cc217)

@@ -130,6 +130,7 @@ All model architecture families include variants with pretrained weights. The ar
 A full version of the list below with source links can be found in the [documentation](https://rwightman.github.io/pytorch-image-models/models/).
+* Big Transfer ResNetV2 (BiT) - https://arxiv.org/abs/1912.11370
 * CspNet (Cross-Stage Partial Networks) - https://arxiv.org/abs/1911.11929
 * DenseNet - https://arxiv.org/abs/1608.06993
 * DLA - https://arxiv.org/abs/1707.06484
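All of the architecture families in this list are reachable through the same factory interface. A minimal sketch using `timm.create_model`; `densenet121` is a registered name in current `timm`, but `list_models` is the authoritative source:

```python
import timm
import torch

# Enumerate registered architectures, optionally filtered by wildcard.
print(timm.list_models('densenet*'))

# Instantiate a pretrained model and run a dummy batch through it.
model = timm.create_model('densenet121', pretrained=True)
model.eval()
with torch.no_grad():
    logits = model(torch.randn(1, 3, 224, 224))
print(logits.shape)  # torch.Size([1, 1000]) for ImageNet weights
```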
@@ -242,6 +243,10 @@ One of the greatest assets of PyTorch is the community and their contributions.
 * Albumentations - https://github.com/albumentations-team/albumentations
 * Kornia - https://github.com/kornia/kornia
+### Knowledge Distillation
+* RepDistiller - https://github.com/HobbitLong/RepDistiller
+* torchdistill - https://github.com/yoshitomo-matsubara/torchdistill
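For context on what these libraries implement: the classic soft-target objective from the original knowledge distillation paper (Hinton et al., 2015) fits in a few lines of PyTorch. This is a generic sketch of that loss, not code from RepDistiller or torchdistill:

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.9):
    """Soft-target KD: blend a KL term between temperature-softened
    teacher/student distributions with cross-entropy on hard labels."""
    # Student must be log-probabilities, teacher plain probabilities
    # for F.kl_div; T^2 rescales gradients to match the hard-label term.
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction='batchmean',
    ) * temperature ** 2
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```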
 ### Metric Learning
 * PyTorch Metric Learning - https://github.com/KevinMusgrave/pytorch-metric-learning
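As a rough illustration of the embedding-loss workflow this library builds on, here is a sketch using PyTorch's built-in triplet loss; pytorch-metric-learning provides many more losses plus miners, samplers, and trainers around the same idea:

```python
import torch
import torch.nn as nn

# Stock PyTorch triplet loss; a, p, n are anchor / positive / negative
# embeddings (random placeholders here, normally produced by an encoder).
triplet = nn.TripletMarginLoss(margin=0.2)
a, p, n = (torch.randn(8, 128, requires_grad=True) for _ in range(3))
loss = triplet(a, p, n)
loss.backward()
print(loss.item())
```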

@@ -10,6 +10,10 @@ Most included models have pretrained weights. The weights are either:
 The validation results for the pretrained weights can be found [here](results.md)
+## Big Transfer ResNetV2 (BiT) [[resnetv2.py](https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/resnetv2.py)]
+* Paper: `Big Transfer (BiT): General Visual Representation Learning` - https://arxiv.org/abs/1912.11370
+* Reference code: https://github.com/google-research/big_transfer
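Since BiT is aimed at transfer learning, a typical use is creating the pretrained backbone with a fresh classifier head. A minimal sketch; the variant name `resnetv2_50x1_bitm` is an assumption here, so check `timm.list_models('resnetv2*')` for the names actually registered:

```python
import timm
import torch

# Hypothetical BiT variant name; verify against list_models output.
# num_classes replaces the pretrained head with a new 10-way classifier.
model = timm.create_model('resnetv2_50x1_bitm', pretrained=True,
                          num_classes=10)
model.eval()
with torch.no_grad():
    out = model(torch.randn(1, 3, 224, 224))
print(out.shape)  # torch.Size([1, 10])
```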
 ## Cross-Stage Partial Networks [[cspnet.py](https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/cspnet.py)]
 * Paper: `CSPNet: A New Backbone that can Enhance Learning Capability of CNN` - https://arxiv.org/abs/1911.11929
 * Reference impl: https://github.com/WongKinYiu/CrossStagePartialNetworks
