add cyfi445 labs

This commit is contained in:
Frank Xu
2025-09-21 16:35:23 -04:00
parent e1a0b81c65
commit 69a0aa5308
5 changed files with 152 additions and 0 deletions


@@ -0,0 +1,152 @@
{
"cells": [
{
"cell_type": "markdown",
"id": "10ee0fc5",
"metadata": {},
"source": [
"# Mini Project: Autograd with PyTorch\n",
"Fill in the code cells below to complete the project.\n",
"\n",
"Use comments (`# ...`) to record your observations and reflection as requested."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "276c8f5b",
"metadata": {},
"outputs": [],
"source": [
"# --- Imports ---\n",
"import torch\n",
"import matplotlib.pyplot as plt\n",
"\n",
"torch.manual_seed(0) # for reproducibility"
]
},
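{
"cell_type": "markdown",
"id": "9e4b7a10",
"metadata": {},
"source": [
"### Warm-up (optional)\n",
"Before the project steps, here is a minimal sketch of what autograd does: build a scalar computation, call `backward()`, and read the gradient. Not required for the graded steps."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "9e4b7a11",
"metadata": {},
"outputs": [],
"source": [
"# Optional warm-up: autograd on a single scalar.\n",
"x = torch.tensor(3.0, requires_grad=True)\n",
"f = x**2           # f = x^2, so df/dx = 2x\n",
"f.backward()       # populates x.grad\n",
"print(x.grad)      # 2 * 3.0 = 6.0"
]
},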
{
"cell_type": "markdown",
"id": "66360d15",
"metadata": {},
"source": [
"## Step 1: Define Data and Parameters"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "bd3741d7",
"metadata": {},
"outputs": [],
"source": [
"# Define the dataset\n",
"X = torch.tensor([[1., 2.], [2., 3.], [3., 4.], [4., 5.]])\n",
"y = torch.tensor([5., 7., 9., 11.])\n",
"\n",
"# Initialize parameters with requires_grad=True\n",
"w = torch.randn(2, requires_grad=True)\n",
"print('Initial weights:', w)"
]
},
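{
"cell_type": "markdown",
"id": "9e4b7a12",
"metadata": {},
"source": [
"A quick optional sanity check: this dataset is exactly fit by the no-bias linear model with weights `(-1, 3)`, so the best achievable MSE loss is 0."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "9e4b7a13",
"metadata": {},
"outputs": [],
"source": [
"# Optional sanity check: w = (-1., 3.) reproduces y exactly,\n",
"# so training can in principle drive the MSE loss to 0.\n",
"w_true = torch.tensor([-1., 3.])\n",
"print(X @ w_true)  # matches y: 5., 7., 9., 11."
]
},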
{
"cell_type": "markdown",
"id": "6af9174b",
"metadata": {},
"source": [
"## Step 2: Training Loop with Manual Updates\n",
"Implement MSE loss, call `loss.backward()`, and manually update weights."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "3804230e",
"metadata": {},
"outputs": [],
"source": [
"# Hyperparameters\n",
"lr = 0.01\n",
"epochs = 50\n",
"losses = []\n",
"\n",
"for epoch in range(epochs):\n",
"    # Forward pass: prediction\n",
"    y_pred = X @ w  # linear model (no bias)\n",
"\n",
"    # Compute MSE loss\n",
"    loss = ((y_pred - y)**2).mean()\n",
"    losses.append(loss.item())\n",
"\n",
"    # Zero gradients from previous step\n",
"    if w.grad is not None:\n",
"        w.grad.zero_()\n",
"\n",
"    # Backward pass\n",
"    loss.backward()\n",
"\n",
"    # Manual gradient descent update\n",
"    with torch.no_grad():\n",
"        w -= lr * w.grad\n",
"\n",
"    if (epoch+1) % 10 == 0:\n",
"        print(f\"Epoch {epoch+1}, Loss: {loss.item():.6f}\")\n",
"\n",
"# Comment here: Does the loss decrease as expected?"
]
},
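{
"cell_type": "markdown",
"id": "9e4b7a14",
"metadata": {},
"source": [
"For comparison (optional, not required): the manual `zero_()`/`backward()`/`no_grad()` update above is what `torch.optim.SGD` does for you. A sketch with a fresh weight tensor `w2`:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "9e4b7a15",
"metadata": {},
"outputs": [],
"source": [
"# Optional comparison: the same loop using torch.optim.SGD.\n",
"# Assumes X, y, lr, and epochs from the cells above.\n",
"w2 = torch.randn(2, requires_grad=True)\n",
"opt = torch.optim.SGD([w2], lr=lr)\n",
"for _ in range(epochs):\n",
"    y_pred2 = X @ w2\n",
"    loss2 = ((y_pred2 - y)**2).mean()\n",
"    opt.zero_grad()   # replaces the manual w.grad.zero_()\n",
"    loss2.backward()\n",
"    opt.step()        # replaces the torch.no_grad() update\n",
"print('Final loss with optimizer:', loss2.item())"
]
},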
{
"cell_type": "markdown",
"id": "5c1922d7",
"metadata": {},
"source": [
"## Step 3: Plot Loss Curve"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "522965a8",
"metadata": {},
"outputs": [],
"source": [
"plt.plot(losses, '-o')\n",
"plt.xlabel('Epoch')\n",
"plt.ylabel('MSE Loss')\n",
"plt.title('Loss vs. Epoch')\n",
"plt.grid(True)\n",
"plt.show()\n",
"\n",
"# Comment: Briefly describe the shape of the curve (e.g., decreasing smoothly)."
]
},
{
"cell_type": "markdown",
"id": "75a2aa1a",
"metadata": {},
"source": [
"## Reflection"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "b0051b8a",
"metadata": {},
"outputs": [],
"source": [
"# Write 3-4 lines of reflection as comments:\n",
"# - What did you learn about autograd?\n",
"# - What step was confusing or new?\n",
"# - What would you try next (optimizer, dataset, etc.)?"
]
}
],
"metadata": {
"language_info": {
"name": "python"
}
},
"nbformat": 4,
"nbformat_minor": 5
}
