diff --git a/ch04/01_main-chapter-code/ch04.ipynb b/ch04/01_main-chapter-code/ch04.ipynb
index 14bb2a8..876d269 100644
--- a/ch04/01_main-chapter-code/ch04.ipynb
+++ b/ch04/01_main-chapter-code/ch04.ipynb
@@ -667,7 +667,7 @@
    "metadata": {},
    "source": [
     "- As we can see, ReLU is a piecewise linear function that outputs the input directly if it is positive; otherwise, it outputs zero\n",
-    "- GELU is a smooth, non-linear function that approximates ReLU but with a non-zero gradient for negative values\n",
+    "- GELU is a smooth, non-linear function that approximates ReLU but with a non-zero gradient for negative values (except at approximately -0.75)\n",
     "\n",
     "- Next, let's implement the small neural network module, `FeedForward`, that we will be using in the LLM's transformer block later:"
    ]
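The caveat added by this diff can be checked numerically: GELU(x) = x · Φ(x) has a minimum near x ≈ -0.7518, where its derivative crosses zero. A minimal sketch using PyTorch autograd (this snippet is an illustration of the claim, not part of the notebook diff):

```python
import torch

# GELU'(x) = Phi(x) + x * phi(x); at the function's minimum (x ~ -0.7518)
# this derivative is zero, which is the exception the diff points out.
x = torch.tensor([-0.7518], requires_grad=True)
y = torch.nn.functional.gelu(x)  # exact (erf-based) GELU
y.backward()
print(f"GELU'(-0.7518) = {x.grad.item():.5f}")  # very close to 0
```

For any other negative input, the same check yields a small but non-zero gradient, which is what distinguishes GELU from ReLU in the bullet above.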