added training script note

The behavior is not obvious to me. Perhaps it's useful to mention this here to avoid confusion.
Branch: pull/1420/head
Author: Florian (committed via GitHub, 3 years ago)
Parent: 1d8ada359a
Commit: e710cc041f

@@ -13,6 +13,8 @@ To train an SE-ResNet34 on ImageNet, locally distributed, 4 GPUs, one process pe
`./distributed_train.sh 4 /data/imagenet --model seresnet34 --sched cosine --epochs 150 --warmup-epochs 5 --lr 0.4 --reprob 0.5 --remode pixel --batch-size 256 --amp -j 4`
NOTE: It is recommended to use PyTorch 1.9+ w/ PyTorch native AMP and DDP instead of APEX AMP. `--amp` defaults to native AMP as of timm ver 0.4.3. `--apex-amp` will force use of APEX components if they are installed.
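As an illustrative sketch only (APEX must actually be installed for the flag to take effect), forcing the same run onto APEX AMP would simply swap the flag:
`./distributed_train.sh 4 /data/imagenet --model seresnet34 --sched cosine --epochs 150 --warmup-epochs 5 --lr 0.4 --reprob 0.5 --remode pixel --batch-size 256 --apex-amp -j 4`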
NOTE: For training and validation, the same input size is used. Some models specify a different test input size, which is used only by the validation script below.
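As a hedged example (the `--img-size` override in `validate.py` exists in recent timm versions, and 288 is an arbitrary value chosen here for illustration), evaluating a checkpoint at an explicit test resolution could look like:
`python validate.py /imagenet/validation/ --model seresnet34 --checkpoint ./output/train/model_best.pth.tar --img-size 288`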
## Validation / Inference Scripts
@@ -24,4 +26,4 @@ To validate with the model's pretrained weights (if they exist):
To run inference from a checkpoint:
`python inference.py /imagenet/validation/ --model mobilenetv3_large_100 --checkpoint ./output/train/model_best.pth.tar`
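For the pretrained-weights case mentioned above, a minimal sketch (model name chosen for illustration; `--pretrained` only resolves where hub weights exist):
`python validate.py /imagenet/validation/ --model mobilenetv3_large_100 --pretrained`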
