Commit

Update CM20315_Loss.ipynb
pitmonticone committed Nov 30, 2023
1 parent a5d98bb commit 6b76bbc
Showing 1 changed file with 7 additions and 7 deletions.
14 changes: 7 additions & 7 deletions CM20315/CM20315_Loss.ipynb
Original file line number Diff line number Diff line change
Expand Up @@ -36,7 +36,7 @@
"\n",
"We'll compute loss functions for maximum likelihood, minimum negative log likelihood, and least squares, and show that they all imply that we should use the same parameter values.\n",
"\n",
"In part II, we'll investigate binary classification (where the output data is 0 or 1). This will be based on the Bernouilli distribution\n",
"In part II, we'll investigate binary classification (where the output data is 0 or 1). This will be based on the Bernoulli distribution\n",
"\n",
"In part III, we'll investigate multiclass classification (where the output data is 0, 1, or 2). This will be based on the categorical distribution."
],
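The claim that maximum likelihood, minimum negative log likelihood, and least squares all pick the same parameters can be checked numerically. The snippet below is a hypothetical sketch (toy data and a grid of candidate means, not the notebook's shallow network): for a Gaussian with a fixed standard deviation, all three criteria select the same mean.

```python
import numpy as np

# Toy 1D data and a grid of candidate means (hypothetical stand-ins)
y = np.array([0.1, 0.4, 0.35, 0.9])
mu_candidates = np.linspace(0.0, 1.0, 101)
sigma = 0.2

def gaussian_pdf(y, mu, sigma):
    # Density of a normal distribution with mean mu and std dev sigma
    return np.exp(-0.5 * ((y - mu) / sigma) ** 2) / (np.sqrt(2 * np.pi) * sigma)

# Evaluate the three criteria at every candidate mean
likelihood = np.array([np.prod(gaussian_pdf(y, mu, sigma)) for mu in mu_candidates])
nll = np.array([-np.sum(np.log(gaussian_pdf(y, mu, sigma))) for mu in mu_candidates])
least_sq = np.array([np.sum((y - mu) ** 2) for mu in mu_candidates])

# With sigma fixed, all three pick the same parameter: the grid point
# closest to the sample mean of y
assert np.argmax(likelihood) == np.argmin(nll) == np.argmin(least_sq)
```

This works because, for a fixed-sigma Gaussian, the negative log likelihood equals the sum of squares divided by 2 sigma^2 plus a constant, so the three objectives are monotone transforms of one another.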
Expand Down Expand Up @@ -178,7 +178,7 @@
{
"cell_type": "markdown",
"source": [
"The blue line i sthe mean prediction of the model and the gray area represents plus/minus two standardard deviations. This model fits okay, but could be improved. Let's compute the loss. We'll compute the the least squares error, the likelihood, the negative log likelihood."
"The blue line is the mean prediction of the model and the gray area represents plus/minus two standard deviations. This model fits okay, but could be improved. Let's compute the loss. We'll compute the least squares error, the likelihood, and the negative log likelihood."
],
"metadata": {
"id": "MvVX6tl9AEXF"
Expand Down Expand Up @@ -276,7 +276,7 @@
"beta_0, omega_0, beta_1, omega_1 = get_parameters()\n",
"# Use our neural network to predict the mean of the Gaussian\n",
"mu_pred = shallow_nn(x_train, beta_0, omega_0, beta_1, omega_1)\n",
"# Set the standard devation to something reasonable\n",
"# Set the standard deviation to something reasonable\n",
"sigma = 0.2\n",
"# Compute the likelihood\n",
"likelihood = compute_likelihood(y_train, mu_pred, sigma)\n",
Expand All @@ -292,7 +292,7 @@
{
"cell_type": "markdown",
"source": [
"You can see that this gives a very small answer, even for this small 1D dataset, and with the model fitting quite well. This is because it is the product of sveral probabilities, which are all quite small themselves.\n",
"You can see that this gives a very small answer, even for this small 1D dataset, and with the model fitting quite well. This is because it is the product of several probabilities, which are all quite small themselves.\n",
"This will get out of hand pretty quickly with real datasets -- the likelihood will get so small that we can't represent it with normal finite-precision math.\n",
"\n",
"This is why we use the negative log likelihood."
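The underflow problem can be demonstrated directly. This is a hypothetical illustration, not code from the notebook: with a few thousand data points, the product of per-point densities underflows to zero, while the sum of log densities stays perfectly representable.

```python
import numpy as np

# Synthetic dataset of a few thousand points (hypothetical illustration)
rng = np.random.default_rng(0)
y = rng.normal(0.0, 1.0, size=5000)

# Per-point log density under a standard normal model
log_densities = -0.5 * y ** 2 - 0.5 * np.log(2.0 * np.pi)

# The product of the densities underflows to exactly 0.0 ...
likelihood = np.prod(np.exp(log_densities))
# ... but the negative log likelihood is a perfectly ordinary finite number
nll = -np.sum(log_densities)
```

Since the log is monotone, minimizing the negative log likelihood finds the same parameters that would maximize the (unrepresentable) likelihood.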
Expand Down Expand Up @@ -326,7 +326,7 @@
"beta_0, omega_0, beta_1, omega_1 = get_parameters()\n",
"# Use our neural network to predict the mean of the Gaussian\n",
"mu_pred = shallow_nn(x_train, beta_0, omega_0, beta_1, omega_1)\n",
"# Set the standard devation to something reasonable\n",
"# Set the standard deviation to something reasonable\n",
"sigma = 0.2\n",
"# Compute the log likelihood\n",
"nll = compute_negative_log_likelihood(y_train, mu_pred, sigma)\n",
Expand Down Expand Up @@ -397,7 +397,7 @@
"source": [
"# Define a range of values for the parameter\n",
"beta_1_vals = np.arange(0,1.0,0.01)\n",
"# Create some arrays to store the likelihoods, negative log likehoos and sum of squares\n",
"# Create some arrays to store the likelihoods, negative log likelihoods and sum of squares\n",
"likelihoods = np.zeros_like(beta_1_vals)\n",
"nlls = np.zeros_like(beta_1_vals)\n",
"sum_squares = np.zeros_like(beta_1_vals)\n",
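The parameter sweep above can be sketched end to end with a toy model standing in for the shallow network (the linear model `y = beta_1 * x` below is a hypothetical substitute, not the notebook's `shallow_nn`):

```python
import numpy as np

# Toy training data generated by y = 0.6 * x (hypothetical stand-in)
x_train = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
y_train = 0.6 * x_train
sigma = 0.2

# Same sweep pattern as the notebook: one slot per candidate parameter
beta_1_vals = np.arange(0, 1.0, 0.01)
likelihoods = np.zeros_like(beta_1_vals)
nlls = np.zeros_like(beta_1_vals)
sum_squares = np.zeros_like(beta_1_vals)

for i, beta_1 in enumerate(beta_1_vals):
    mu_pred = beta_1 * x_train
    densities = np.exp(-0.5 * ((y_train - mu_pred) / sigma) ** 2) \
        / (np.sqrt(2.0 * np.pi) * sigma)
    likelihoods[i] = np.prod(densities)
    nlls[i] = -np.sum(np.log(densities))
    sum_squares[i] = np.sum((y_train - mu_pred) ** 2)

# All three criteria agree on the best parameter (~0.6 here)
best = beta_1_vals[np.argmin(nlls)]
```

Plotting `likelihoods`, `nlls`, and `sum_squares` against `beta_1_vals` would show a peak and two valleys at the same parameter value.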
Expand Down Expand Up @@ -482,7 +482,7 @@
"source": [
"# Define a range of values for the parameter\n",
"sigma_vals = np.arange(0.1,0.5,0.005)\n",
"# Create some arrays to store the likelihoods, negative log likehoos and sum of squares\n",
"# Create some arrays to store the likelihoods, negative log likelihoods and sum of squares\n",
"likelihoods = np.zeros_like(sigma_vals)\n",
"nlls = np.zeros_like(sigma_vals)\n",
"sum_squares = np.zeros_like(sigma_vals)\n",
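The sigma sweep can likewise be sketched with hypothetical residuals in place of the notebook's data: the least-squares error ignores sigma entirely, while the negative log likelihood is minimized when sigma matches the root-mean-square residual.

```python
import numpy as np

# Hypothetical residuals (y_train - mu_pred), standing in for real data
residuals = np.array([0.1, -0.2, 0.15, -0.05, 0.3])
sigma_vals = np.arange(0.1, 0.5, 0.005)

# Gaussian negative log likelihood as a function of sigma (mean held fixed)
nlls = np.array([
    np.sum(0.5 * (residuals / s) ** 2 + np.log(s) + 0.5 * np.log(2.0 * np.pi))
    for s in sigma_vals
])

# The sum of squares does not depend on sigma at all
sum_squares = np.full_like(sigma_vals, np.sum(residuals ** 2))

# The NLL-optimal sigma matches the RMS residual, up to the grid spacing
rms = np.sqrt(np.mean(residuals ** 2))
best_sigma = sigma_vals[np.argmin(nlls)]
```

This is why least squares alone cannot estimate the noise level: only the likelihood-based criteria are sensitive to sigma.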
Expand Down
