# Summary

**Wide Residual Networks** are a variant on [ResNets](https://paperswithcode.com/method/resnet) where we decrease depth and increase the width of residual networks. This is achieved through the use of [wide residual blocks](https://paperswithcode.com/method/wide-residual-block).
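
To make the "decrease depth, increase width" idea concrete, here is a minimal PyTorch sketch of a wide residual block (illustrative only, not the timm implementation; the module and argument names are made up for this example). The only change from a standard basic block is that the 3x3 convolutions carry `base_channels * k` channels, where `k` is the widening factor:

```python
import torch
import torch.nn as nn


class WideBasicBlock(nn.Module):
    """Pre-activation residual block widened by a factor k (k=1 is a plain ResNet basic block)."""

    def __init__(self, in_channels: int, base_channels: int, k: int = 2, stride: int = 1):
        super().__init__()
        width = base_channels * k  # the widening happens here
        self.bn1 = nn.BatchNorm2d(in_channels)
        self.conv1 = nn.Conv2d(in_channels, width, kernel_size=3, stride=stride, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(width)
        self.conv2 = nn.Conv2d(width, width, kernel_size=3, padding=1, bias=False)
        self.relu = nn.ReLU(inplace=True)
        # 1x1 projection so the skip connection matches the widened output shape
        self.shortcut = (
            nn.Identity()
            if stride == 1 and in_channels == width
            else nn.Conv2d(in_channels, width, kernel_size=1, stride=stride, bias=False)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = self.conv1(self.relu(self.bn1(x)))
        out = self.conv2(self.relu(self.bn2(out)))
        return out + self.shortcut(x)


block = WideBasicBlock(in_channels=64, base_channels=64, k=2)
print(block(torch.randn(1, 64, 56, 56)).shape)  # torch.Size([1, 128, 56, 56])
```

In `wide_resnet50_2` and `wide_resnet101_2` the widening is applied to the inner 3x3 convolution of the bottleneck block (its width is doubled) rather than to a basic block as above, but the principle is the same.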

{% include 'code_snippets.md' %}

## How do I train this model?

You can follow the [timm recipe scripts](https://rwightman.github.io/pytorch-image-models/scripts/) for training a new model from scratch.
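
If you only need a quick starting point to adapt, a minimal self-contained fine-tuning loop with `timm` looks roughly like the sketch below. The toy tensors, class count, and hyperparameters are placeholders, not the recipe's settings; set `pretrained=True` to start from the ImageNet weights instead.

```python
import timm
import torch
from torch.utils.data import DataLoader, TensorDataset

# Toy stand-in data: replace with your own dataset / DataLoader.
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, 10, (8,))
loader = DataLoader(TensorDataset(images, labels), batch_size=4)

# num_classes swaps the ImageNet head for one sized to your task.
model = timm.create_model('wide_resnet50_2', pretrained=False, num_classes=10)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9, weight_decay=1e-4)
criterion = torch.nn.CrossEntropyLoss()

model.train()
for imgs, targets in loader:
    optimizer.zero_grad()
    loss = criterion(model(imgs), targets)
    loss.backward()
    optimizer.step()
    print(f'loss: {loss.item():.3f}')
```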

## Citation

```BibTeX
@article{DBLP:journals/corr/ZagoruykoK16,
  author        = {Sergey Zagoruyko and
                   Nikos Komodakis},
  title         = {Wide Residual Networks},
  journal       = {CoRR},
  volume        = {abs/1605.07146},
  year          = {2016},
  url           = {http://arxiv.org/abs/1605.07146},
  archivePrefix = {arXiv},
  eprint        = {1605.07146},
  timestamp     = {Mon, 13 Aug 2018 16:46:42 +0200},
  biburl        = {https://dblp.org/rec/journals/corr/ZagoruykoK16.bib},
  bibsource     = {dblp computer science bibliography, https://dblp.org}
}
```

<!--
Models:
- Name: wide_resnet101_2
  Metadata:
    FLOPs: 29304929280
    Training Data:
    - ImageNet
    Architecture:
    - 1x1 Convolution
    - Batch Normalization
    - Convolution
    - Global Average Pooling
    - Max Pooling
    - ReLU
    - Residual Connection
    - Softmax
    - Wide Residual Block
    File Size: 254695146
    Tasks:
    - Image Classification
    ID: wide_resnet101_2
    Crop Pct: '0.875'
    Image Size: '224'
    Interpolation: bilinear
  Code: https://github.com/rwightman/pytorch-image-models/blob/5f9aff395c224492e9e44248b15f44b5cc095d9c/timm/models/resnet.py#L802
  In Collection: Wide ResNet
- Name: wide_resnet50_2
  Metadata:
    FLOPs: 14688058368
    Training Data:
    - ImageNet
    Architecture:
    - 1x1 Convolution
    - Batch Normalization
    - Convolution
    - Global Average Pooling
    - Max Pooling
    - ReLU
    - Residual Connection
    - Softmax
    - Wide Residual Block
    File Size: 275853271
    Tasks:
    - Image Classification
    ID: wide_resnet50_2
    Crop Pct: '0.875'
    Image Size: '224'
    Interpolation: bicubic
  Code: https://github.com/rwightman/pytorch-image-models/blob/5f9aff395c224492e9e44248b15f44b5cc095d9c/timm/models/resnet.py#L790
  In Collection: Wide ResNet
Collections:
- Name: Wide ResNet
  Paper:
    title: Wide Residual Networks
    url: https://paperswithcode.com/paper/wide-residual-networks
  type: model-index
Type: model-index
-->