
Commit

slight update lesson 12
bertdv committed Jan 26, 2019
1 parent 3de6e1a commit 1bcb9e9
Showing 2 changed files with 36 additions and 11 deletions.
2 changes: 1 addition & 1 deletion lessons/notebooks/03_Bayesian-Machine-Learning.ipynb

Large diffs are not rendered by default.

45 changes: 35 additions & 10 deletions lessons/notebooks/12_Dynamic-Latent-Variable-Models.ipynb
@@ -143,7 +143,7 @@
}
},
"source": [
"- Similar to our work on Gaussian Mixture models and latent Factor models, we can create a flexible dynamic system by introducing _latent_ (unobserved) variables $z^T \\triangleq \\left(z_1,z_2,\\dots,z_T\\right)$ (one $z_t$ for each observation $x_t$). In dynamic systems, the latent variables $z_t$ are usually called _state variables_."
"- Similar to our work on Gaussian Mixture models, we can create a flexible dynamic system by introducing _latent_ (unobserved) variables $z^T \\triangleq \\left(z_1,z_2,\\dots,z_T\\right)$ (one $z_t$ for each observation $x_t$). In dynamic systems, $z_t$ are called _state variables_."
]
},
{
@@ -168,7 +168,7 @@
}
},
"source": [
"- A very common computational assumption is to let state transitions be ruled by a _first-order Markov chain_ as\n",
"- A common assumption is to let state transitions be ruled by a _first-order Markov chain_ as\n",
"$$\n",
" p(z_t\\,|\\,z^{t-1}) = p(z_t\\,|\\,z_{t-1})\n",
"$$"
@@ -179,7 +179,7 @@
"metadata": {
"collapsed": true,
"slideshow": {
"slide_type": "fragment"
"slide_type": "subslide"
}
},
"source": [
@@ -390,6 +390,21 @@
"INsert a picture here similar to lesson 13 that demostrates the result of Kalman filter-based estimation of the position after 10 time steps"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"### Extensions of Generative Gaussian Models\n",
"\n",
"- Using the methods of the previous lessons, it is possible to create your own new models based on stacking Gaussian and categorical distributions in new ways: \n",
"\n",
"<img src=\"./figures/fig-generative-Gaussian-models.png\" width=\"600px\">"
]
},
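
To make the stacking idea concrete, a minimal generative sketch that stacks a categorical component choice on top of a set of Gaussians, i.e. a Gaussian mixture sampler; the weights, means, and standard deviations below are made-up illustrative values, not taken from the figure:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical mixture parameters (illustrative only)
weights = np.array([0.5, 0.3, 0.2])   # categorical layer: p(component)
means = np.array([-2.0, 0.0, 3.0])    # Gaussian layer: per-component means
stds = np.array([0.5, 1.0, 0.8])      # Gaussian layer: per-component std devs

def sample_mixture(n):
    """Draw n samples by first picking a component, then emitting a Gaussian draw."""
    k = rng.choice(len(weights), size=n, p=weights)  # categorical choice
    x = rng.normal(means[k], stds[k])                # Gaussian emission
    return k, x

components, samples = sample_mixture(5)
print(components, samples)
```

Varying which distribution sits on top of which is one way to read the "new ways" mentioned in the bullet above.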
{
"cell_type": "markdown",
"metadata": {
@@ -400,18 +415,29 @@
"source": [
"### Recap Dynamical Models \n",
"\n",
"- In short, it is possible to analytically derive the Kalman filter for a linear dynamical system with Gaussian state and observation noise (eventhough it is not a fun exercise). "
"- Dynamical systems do not obey the sample-by-sample independence assumption, but still can be specified, and state and parameter estimation equations can be solved by similar tools as for static models."
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "fragment"
"slide_type": "slide"
}
},
"source": [
"- If anything changes in the model, e.g., the state noise is not Gaussian, then you have to re-derive the inference equations again from scratch and it may not lead to an analytically pleasing answer. "
"- Two of the more famous and powerful models with latent states include the hidden Markov model (with discrete states) and the Linear Gaussian dynamical system (with continuous states)."
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"- For the LGDS, the Kalman filter is a well-known recursive state estimation procedure. The Kalman filter can be derived through Bayesian update rules on Gaussian distributions. "
]
},
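
As an illustration of that Bayesian-update view, a minimal one-dimensional Kalman filter sketch; the scalar model parameters `a`, `c`, `q`, `r` and the observation sequence below are hypothetical, not the lesson's example:

```python
import numpy as np

# Hypothetical 1-D LGDS: z_t = a*z_{t-1} + process noise,  x_t = c*z_t + measurement noise
a, c = 1.0, 1.0
q, r = 0.1, 0.5  # process and measurement noise variances

def kalman_filter(xs, m0=0.0, v0=1.0):
    """Return the filtered means and variances of p(z_t | x_1, ..., x_t)."""
    m, v = m0, v0
    means, variances = [], []
    for x in xs:
        # Prediction: push the Gaussian belief through the state transition
        m_pred = a * m
        v_pred = a * a * v + q
        # Update: Bayesian update of the predicted Gaussian with the new observation
        K = v_pred * c / (c * c * v_pred + r)  # Kalman gain
        m = m_pred + K * (x - c * m_pred)
        v = (1.0 - K * c) * v_pred
        means.append(m)
        variances.append(v)
    return np.array(means), np.array(variances)

observations = np.array([0.9, 1.2, 1.0, 1.4, 1.1])
print(kalman_filter(observations))
```

Each iteration is a Gaussian prior (the prediction) combined with a Gaussian likelihood (the observation), which is why the recursion stays in closed form.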
{
@@ -422,19 +448,18 @@
}
},
"source": [
"- $\\Rightarrow$ Generally, we will want to automate the inference process. This issue is discussed in the next lesson on inference by message passing in factor graphs."
"- If anything changes in the model, e.g., the state noise is not Gaussian, then you have to re-derive the inference equations again from scratch and it may not lead to an analytically pleasing answer. "
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
"slide_type": "fragment"
}
},
"source": [
"### Extensions of Generative Gaussian Models\n",
"<img src=\"./figures/fig-generative-Gaussian-models.png\" width=\"600px\">"
"- $\\Rightarrow$ Generally, we will want to automate the inference process. This issue is discussed in the next lesson on inference by message passing in factor graphs."
]
},
{

