Mention small discrepancy due to Dropout non-reproducibility in PyTorch (#519)

* Mention small discrepancy due to Dropout non-reproducibility in PyTorch

* bump pytorch version
Sebastian Raschka
2025-02-06 14:59:52 -06:00
committed by GitHub
parent bd8f7522cb
commit 68e2efe1c9
3 changed files with 13 additions and 2 deletions

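The behavior this commit documents can be illustrated with a minimal sketch (my own example, not part of the commit): even with a fixed seed, the exact zero pattern drawn by `nn.Dropout` may differ across operating systems and PyTorch builds, as discussed in pytorch/pytorch#121595. What stays deterministic is the rescaling of surviving activations by 1/(1-p).

```python
import torch

# Fix the seed; this makes the run reproducible on a given
# machine/build, but not necessarily across OSes or PyTorch builds.
torch.manual_seed(123)

drop = torch.nn.Dropout(p=0.5)  # modules default to training mode
x = torch.ones(5)
out = drop(x)

# Surviving elements are scaled by 1/(1-p) = 2.0; dropped ones are 0.0.
# The *positions* of the zeros are what may vary across platforms.
print(out)
```

So two machines can both be "correct" yet produce slightly different losses, which is why the new notebook cell asks only that the values be roughly similar.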

@@ -1348,6 +1348,16 @@
"# print(f\"Training completed in {execution_time_minutes:.2f} minutes.\")"
]
},
{
"cell_type": "markdown",
"id": "2e8b86f0-b07d-40d7-b9d3-a9218917f204",
"metadata": {},
"source": [
"- Note that you might get slightly different loss values on your computer, which is not a reason for concern if they are roughly similar (a training loss below 1 and a validation loss below 7)\n",
"- Small differences can often be due to different GPU hardware and CUDA versions or small changes in newer PyTorch versions\n",
"- Even if you are running the example on a CPU, you may observe slight differences; a possible reason for a discrepancy is the differing behavior of `nn.Dropout` across operating systems, depending on how PyTorch was compiled, as discussed [here on the PyTorch issue tracker](https://github.com/pytorch/pytorch/issues/121595)"
]
},
{
"cell_type": "code",
"execution_count": 28,