Downloading: "https://download.pytorch.org/models/resnet34-b627a593.pth" to /root/.cache/torch/hub/checkpoints/resnet34-b627a593.pth
| epoch | train_loss | valid_loss | error_rate | time  |
|-------|------------|------------|------------|-------|
| 0     | 0.129521   | 0.022127   | 0.007442   | 01:10 |

| epoch | train_loss | valid_loss | error_rate | time  |
|-------|------------|------------|------------|-------|
| 0     | 0.056711   | 0.023975   | 0.010149   | 01:18 |
We end up with an error rate of \(0.010149\).
Let’s do another round where we recreate the dataloaders and the learner, and fine-tune again for a single epoch. Since we used the same seed, we will get the same final result, right?
Use the `set_seed` function (passing `reproducible=True`), and remember that any steps that consume random numbers from the pseudo-random generators (such as running the learning rate finder) must be present in both rounds; otherwise you will end up with a different result.
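Putting this together, here is a minimal sketch of what one reproducible round could look like. The dataset, label function, and dataloader arguments below are assumptions borrowed from the standard fastai pets example; they are not shown in this section.

```python
from fastai.vision.all import *

# Assumption: the standard pets example from the fastai docs; the actual
# dataset and dataloader setup are not shown in this section.
def is_cat(f): return f[0].isupper()  # hypothetical label function

def train_once(seed=42):
    # Seed every pseudo-random generator fastai touches and request
    # deterministic CUDA kernels via reproducible=True.
    set_seed(seed, reproducible=True)
    path = untar_data(URLs.PETS)
    dls = ImageDataLoaders.from_name_func(
        path, get_image_files(path/'images'), valid_pct=0.2, seed=seed,
        label_func=is_cat, item_tfms=Resize(224))
    learn = cnn_learner(dls, resnet34, metrics=error_rate)
    # If the first round ran learn.lr_find(), run it here too, or the
    # generators drift out of sync and the results diverge.
    learn.fine_tune(1)  # one frozen epoch + one unfrozen epoch
    return learn

# Two full rounds: because the dataloaders, the learner, and the seed are
# all recreated from scratch, both rounds should report identical metrics.
learn1 = train_once()
learn2 = train_once()
```

Because every random choice (the validation split, the augmentations, the initialisation of the new head) is re-seeded inside `train_once`, running it twice should print two identical pairs of tables.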