Commit f28f62d

remove snapshot logic
1 parent 6ef6485 commit f28f62d

1 file changed: mnist_sequential.ipynb

Lines changed: 2 additions & 13 deletions
@@ -183,23 +183,12 @@
     "execution_count": null,
     "metadata": {},
     "outputs": [],
-    "source": [
-     "epochs = 10\n",
-     "history = {\"loss\": [], \"accuracy\": [], \"val_loss\": [], \"val_accuracy\": []}\n",
-     "\n",
-     "# Epochs at which to capture prediction snapshots for visualization\n",
-     "snapshot_epochs = {1, 3, 5, 10}\n",
-     "epoch_snapshots = []"
-    ]
+    "source": "epochs = 10\nhistory = {\"loss\": [], \"accuracy\": [], \"val_loss\": [], \"val_accuracy\": []}"
    },
    {
     "cell_type": "markdown",
     "metadata": {},
-    "source": [
-     "## Training Phase: Forward and Backward Pass\n",
-     "\n",
-     "For each epoch, we iterate through batches of training data. The forward pass computes predictions, the loss measures error, and backpropagation (loss.backward()) calculates gradients. The optimizer then updates weights to reduce the error. We capture predictions at certain epochs to visualize learning progress."
-    ]
+    "source": "## Training Phase: Forward and Backward Pass\n\nFor each epoch, we iterate through batches of training data. The forward pass computes predictions, the loss measures error, and backpropagation (loss.backward()) calculates gradients. The optimizer then updates weights to reduce the error."
    },
    {
     "cell_type": "code",
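The markdown cell changed above describes the training loop that follows in the notebook: forward pass, loss, `loss.backward()`, optimizer step, and appending metrics to the `history` dict kept by this commit. A minimal sketch of such a loop is shown below; the model architecture, learning rate, and synthetic batch are assumptions for illustration, not taken from the notebook itself.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Placeholder MNIST classifier; the notebook's actual model may differ.
model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))
loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Tiny synthetic batch standing in for one DataLoader batch.
images = torch.randn(32, 1, 28, 28)
labels = torch.randint(0, 10, (32,))

epochs = 2
history = {"loss": [], "accuracy": [], "val_loss": [], "val_accuracy": []}

for epoch in range(epochs):
    logits = model(images)          # forward pass: compute predictions
    loss = loss_fn(logits, labels)  # loss measures error
    optimizer.zero_grad()
    loss.backward()                 # backpropagation: compute gradients
    optimizer.step()                # optimizer updates weights

    accuracy = (logits.argmax(dim=1) == labels).float().mean().item()
    history["loss"].append(loss.item())
    history["accuracy"].append(accuracy)
```

With the snapshot logic removed, the loop only accumulates scalar metrics into `history`; nothing else is retained across epochs.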
