Update CM20315_Gradients_II.ipynb
pitmonticone committed Nov 30, 2023
1 parent ef28d84 commit 6b2f251
Showing 1 changed file with 3 additions and 3 deletions.
6 changes: 3 additions & 3 deletions CM20315/CM20315_Gradients_II.ipynb
@@ -32,7 +32,7 @@
 "source": [
 "# Gradients II: Backpropagation algorithm\n",
 "\n",
-"In this practical, we'll investigate the backpropagation algoritithm. This computes the gradients of the loss with respect to all of the parameters (weights and biases) in the network. We'll use these gradients when we run stochastic gradient descent."
+"In this practical, we'll investigate the backpropagation algorithm. This computes the gradients of the loss with respect to all of the parameters (weights and biases) in the network. We'll use these gradients when we run stochastic gradient descent."
 ],
 "metadata": {
 "id": "L6chybAVFJW2"
@@ -53,7 +53,7 @@
 {
 "cell_type": "markdown",
 "source": [
-"First let's define a neural network. We'll just choose the weights and biaes randomly for now"
+"First let's define a neural network. We'll just choose the weights and biases randomly for now"
 ],
 "metadata": {
 "id": "nnUoI0m6GyjC"
@@ -178,7 +178,7 @@
 {
 "cell_type": "markdown",
 "source": [
-"Now let's define a loss function. We'll just use the least squaures loss function. We'll also write a function to compute dloss_doutpu"
+"Now let's define a loss function. We'll just use the least squares loss function. We'll also write a function to compute dloss_doutpu"
 ],
 "metadata": {
 "id": "SxVTKp3IcoBF"
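For context, the markdown cells touched by this commit introduce two steps in the notebook: choosing the network's weights and biases randomly, and defining a least squares loss together with its derivative with respect to the network output (the "dloss_doutpu" function named in the last cell). Below is a minimal sketch of what that code might look like; the layer sizes, seed, and function names are illustrative assumptions, not the notebook's actual implementation.

    import numpy as np

    np.random.seed(0)

    # Choose weights and biases randomly for an assumed 1 -> 3 -> 1 network.
    weights = [np.random.normal(size=(3, 1)), np.random.normal(size=(1, 3))]
    biases = [np.random.normal(size=(3, 1)), np.random.normal(size=(1, 1))]

    def least_squares_loss(net_output, y):
        # Sum of squared differences between network output and target.
        return np.sum((net_output - y) ** 2)

    def d_loss_d_output(net_output, y):
        # Derivative of the least squares loss with respect to the network
        # output; backpropagation starts from this gradient and works backward
        # through the layers to get gradients for every weight and bias.
        return 2 * (net_output - y)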
