
Merge pull request jax-ml#2235 from gnecula/documentation
Removed a couple of slow notebooks from RTD auto-rendering.
gnecula authored Feb 15, 2020
2 parents 1842093 + 370558d commit a76bf33
Showing 4 changed files with 6 additions and 2 deletions.
2 changes: 2 additions & 0 deletions docs/conf.py
@@ -94,6 +94,8 @@
     'notebooks/neural_network_with_tfds_data.ipynb',
     # Slow notebook
     'notebooks/Neural_Network_and_Data_Loading.ipynb',
+    'notebooks/score_matching.ipynb',
+    'notebooks/maml.ipynb',
 ]

# The name of the Pygments (syntax highlighting) style to use.
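The docs/conf.py hunk above appends two notebooks to what appears to be a Sphinx exclusion list, so Read the Docs skips rendering them. A minimal sketch of how such a list looks in a Sphinx conf.py (the variable name and surrounding entries here are illustrative, not copied from JAX's actual config):

```python
# Sketch of a Sphinx conf.py exclusion list (assumed context for the hunk above).
# Files matching these patterns are skipped when Sphinx builds the docs tree,
# which is how slow notebooks are kept out of RTD auto-rendering.
exclude_patterns = [
    'notebooks/Neural_Network_and_Data_Loading.ipynb',
    # Slow notebooks added by this commit:
    'notebooks/score_matching.ipynb',
    'notebooks/maml.ipynb',
]
```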
3 changes: 2 additions & 1 deletion docs/developer.rst
@@ -157,7 +157,8 @@ You can then see the generated documentation in
 Update notebooks
 ----------------

-Open the notebook with http://colab.research.google.com, update it, ``Run all cells`` then
+Open the notebook with http://colab.research.google.com (then `Upload` from your
+local repo), update it as needed, ``Run all cells`` then
 ``Download ipynb``. You may want to test that it executes properly, using ``sphinx-build`` as
 explained above.

2 changes: 1 addition & 1 deletion docs/jaxpr.rst
@@ -443,7 +443,7 @@ with 3 input parameters:

 * ``c`` is a constvar and stands for the ``ones`` constant,
 * ``b`` corresponds to the free variable ``arg`` captured in the ``inner`` function,
-* ``a`` corresponds to the ``inner`` parameter ``x`.
+* ``a`` corresponds to the ``inner`` parameter ``x``.

 The primitive takes three arguments ``b a c``.

1 change: 1 addition & 0 deletions docs/notebooks/autodiff_cookbook.ipynb
@@ -672,6 +672,7 @@
 },
 "source": [
 "This shape makes sense: if we start with a function $f : \\mathbb{R}^n \\to \\mathbb{R}^m$, then at a point $x \\in \\mathbb{R}^n$ we expect to get the shapes\n",
+"\n",
 "* $f(x) \\in \\mathbb{R}^m$, the value of $f$ at $x$,\n",
 "* $\\partial f(x) \\in \\mathbb{R}^{m \\times n}$, the Jacobian matrix at $x$,\n",
 "* $\\partial^2 f(x) \\in \\mathbb{R}^{m \\times n \\times n}$, the Hessian at $x$,\n",
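The notebook text in the hunk above states that for $f : \mathbb{R}^n \to \mathbb{R}^m$, the Jacobian has shape $m \times n$ and the Hessian $m \times n \times n$. A quick sketch checking those shapes with JAX (the function `f` here is a made-up example, not one from the notebook):

```python
import jax.numpy as jnp
from jax import jacfwd

# Illustrative f : R^3 -> R^2, so n = 3 and m = 2.
def f(x):
    return jnp.array([jnp.sum(x ** 2), jnp.prod(x)])

x = jnp.ones(3)
J = jacfwd(f)(x)          # Jacobian: shape (m, n) = (2, 3)
H = jacfwd(jacfwd(f))(x)  # Hessian: shape (m, n, n) = (2, 3, 3)
```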
