correct std dev
pkmital committed Nov 2, 2016
1 parent 5511b31 commit b20977d
Showing 1 changed file with 4 additions and 4 deletions: session-1/session-1.ipynb
@@ -353,9 +353,9 @@
"<a name=\"instructions-2\"></a>\n",
"## Instructions\n",
"\n",
-"Now use tensorflow to calculate the standard deviation and upload the standard deviation image averaged across color channels as a \"jet\" heatmap of the 100 images. This will be a little more involved as there is no operation in tensorflow to do this for you. However, you can do this by calculating the mean image of your dataset as a 4-D array. To do this, you could write e.g. `mean_img_4d = tf.reduce_mean(imgs, reduction_indices=0, keep_dims=True)` to give you a `1 x H x W x C` dimension array calculated on the `N x H x W x C` images variable. The reduction_indices parameter is saying to calculate the mean over the 0th dimension, meaning for every possible `H`, `W`, `C`, or for every pixel, you will have a mean composed over the `N` possible values it could have had, or what that pixel was for every possible image. This way, you can write `images - mean_img_4d` to give you a `N x H x W x C` dimension variable, with every image in your images array having been subtracted by the `mean_img_4d`. If you calculate the square root of the sum of the squared differences of this resulting operation, you have your standard deviation!\n",
+"Now use tensorflow to calculate the standard deviation and upload the standard deviation image averaged across color channels as a \"jet\" heatmap of the 100 images. This will be a little more involved as there is no operation in tensorflow to do this for you. However, you can do this by calculating the mean image of your dataset as a 4-D array. To do this, you could write e.g. `mean_img_4d = tf.reduce_mean(imgs, reduction_indices=0, keep_dims=True)` to give you a `1 x H x W x C` dimension array calculated on the `N x H x W x C` images variable. The reduction_indices parameter is saying to calculate the mean over the 0th dimension, meaning for every possible `H`, `W`, `C`, or for every pixel, you will have a mean composed over the `N` possible values it could have had, or what that pixel was for every possible image. This way, you can write `images - mean_img_4d` to give you a `N x H x W x C` dimension variable, with every image in your images array having been subtracted by the `mean_img_4d`. If you calculate the square root of the expected squared differences of this resulting operation, you have your standard deviation!\n",
"\n",
-"In summary, you'll need to write something like: `subtraction = imgs - tf.reduce_mean(imgs, reduction_indices=0, keep_dims=True)`, then reduce this operation using `tf.sqrt(tf.reduce_sum(subtraction * subtraction, reduction_indices=0))` to get your standard deviation then include this image in your zip file as <b>std.png</b>\n",
+"In summary, you'll need to write something like: `subtraction = imgs - tf.reduce_mean(imgs, reduction_indices=0, keep_dims=True)`, then reduce this operation using `tf.sqrt(tf.reduce_mean(subtraction * subtraction, reduction_indices=0))` to get your standard deviation then include this image in your zip file as <b>std.png</b>\n",
"\n",
"<a name=\"code-2\"></a>\n",
"## Code\n",
@@ -389,8 +389,8 @@
"subtraction = imgs - mean_img_4d\n",
"\n",
"# Now compute the standard deviation by calculating the\n",
-"# square root of the sum of squared differences\n",
-"std_img_op = tf.sqrt(tf.reduce_sum(subtraction * subtraction, reduction_indices=0))\n",
+"# square root of the expected squared differences\n",
+"std_img_op = tf.sqrt(tf.reduce_mean(subtraction * subtraction, reduction_indices=0))\n",
"\n",
"# Now calculate the standard deviation using your session\n",
"std_img = sess.run(std_img_op)"
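The fix this commit makes (`reduce_sum` → `reduce_mean` inside the square root) can be sanity-checked outside TensorFlow, since the corrected formula is exactly the population standard deviation. A minimal NumPy sketch, using a hypothetical random batch in place of the notebook's 100 dataset images:

```python
import numpy as np

# Hypothetical stand-in for the dataset: 100 small RGB images, N x H x W x C.
imgs = np.random.rand(100, 8, 8, 3)

# Mean image with the batch axis kept, shape 1 x H x W x C --
# the NumPy analogue of tf.reduce_mean(imgs, reduction_indices=0, keep_dims=True).
mean_img_4d = imgs.mean(axis=0, keepdims=True)

# Broadcasts back to N x H x W x C: each image minus the mean image.
subtraction = imgs - mean_img_4d

# Corrected formula: square root of the MEAN (not sum) of squared differences.
std_img = np.sqrt((subtraction * subtraction).mean(axis=0))

# Matches NumPy's built-in population standard deviation per pixel.
assert np.allclose(std_img, imgs.std(axis=0))
```

Using `reduce_sum` as in the old code would scale the result by a factor of `sqrt(N)`, which is why the pre-fix heatmap was wrong only in magnitude, not in shape.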
