
Commit 41ac62a
Minor rephrase of readme.
PiperOrigin-RevId: 206643365
lxuechen authored and tensorflower-gardener committed Jul 30, 2018
1 parent 2993989 commit 41ac62a
Showing 1 changed file with 5 additions and 6 deletions: `tensorflow/contrib/eager/python/examples/l2hmc/README.md`
@@ -4,16 +4,15 @@ This folder contains an implementation of [L2HMC](https://arxiv.org/pdf/1711.092
With eager execution enabled, longer sample chains can be handled compared to graph mode, since no graph is explicitly stored. Moreover, with eager execution enabled, there is no need to use a `tf.while_loop`.

## What is L2HMC?
-L2HMC is an algorithm that learns a non-volume preserving transformation
-for an HMC-like sampling algorithm. More specifically, the non-volume preserving
+L2HMC is an adaptive Markov Chain Monte Carlo (MCMC) algorithm that learns a non-volume preserving transformation
+for a Hamiltonian Monte Carlo (HMC) sampling algorithm. More specifically, the non-volume preserving
 transformation is learned with neural nets instantiated within Normalizing Flows
-(more precisely, real-NVPs).
+(real-NVPs).
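As background for the paragraph above, plain HMC (the non-learned baseline that L2HMC generalizes with its trained transformation) can be sketched in a few lines. This is an illustrative NumPy sketch of standard HMC with leapfrog integration on a 2D Gaussian target; all names are invented for this example and are not taken from the repository's code.

```python
import numpy as np

def hmc_step(x, log_p, grad_log_p, rng, step_size=0.1, n_leapfrog=10):
    """One HMC transition: resample momentum, leapfrog, then MH accept/reject."""
    p = rng.standard_normal(x.shape)                  # fresh momentum
    x_new, p_new = x.copy(), p.copy()
    p_new += 0.5 * step_size * grad_log_p(x_new)      # initial half step
    for _ in range(n_leapfrog - 1):
        x_new += step_size * p_new
        p_new += step_size * grad_log_p(x_new)
    x_new += step_size * p_new
    p_new += 0.5 * step_size * grad_log_p(x_new)      # final half step
    # Hamiltonian H(x, p) = -log p(x) + |p|^2 / 2; accept with prob exp(H - H')
    current_h = -log_p(x) + 0.5 * (p @ p)
    proposed_h = -log_p(x_new) + 0.5 * (p_new @ p_new)
    if rng.random() < np.exp(current_h - proposed_h):
        return x_new
    return x

# Standard 2D Gaussian target density (up to a constant)
log_p = lambda x: -0.5 * (x @ x)
grad_log_p = lambda x: -x

rng = np.random.default_rng(0)
x = np.zeros(2)
samples = []
for _ in range(1000):
    x = hmc_step(x, log_p, grad_log_p, rng)
    samples.append(x.copy())
samples = np.asarray(samples)   # shape (1000, 2); empirical variance near 1
```

L2HMC replaces the hand-specified leapfrog updates above with learned, non-volume-preserving updates (and corrects the acceptance ratio by the learned Jacobian), which is what the neural nets in this folder parameterize.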

## Content

- `l2hmc.py`: Dynamics definitions and example energy functions,
-including the 2D strongly correlated Gaussian, the rough well energy function,
-and a Gaussian mixture model.
+including the 2D strongly correlated Gaussian and the rough well energy function.
- `l2hmc_test.py`: Unit tests and benchmarks for training a sampler on the energy functions in both eager and graph mode.
- `neural_nets.py`: The neural net for learning the kernel on the 2D strongly correlated example.
- `main.py`: Run to train a sampler on 2D energy landscapes.
@@ -32,7 +31,7 @@ tensorboard and a plot of sampled chain from the trained sampler.
Specifying the optional argument `use_defun` will let the program use compiled
graphs when running specific sections and improve the overall speed.

-## Boosting Performance with `defun`
+## Boosting Performance with `tfe.defun`
Currently, some models may experience increased overhead with eager execution enabled.
To improve performance, certain functions can be wrapped with the decorator `@tfe.defun`,
which traces them into callable graphs. For example, the function that performs the sampling step can be wrapped:
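The README's own code block for this is collapsed in the diff above. As an illustrative sketch only: the repository targets TF 1.x eager mode, where the decorator is `tfe.defun` (`tfe = tf.contrib.eager`); in TF 2.x the equivalent is `tf.function`, used here. `sample_step` is a hypothetical stand-in for the repository's sampling-step function, not its actual code.

```python
import tensorflow as tf

@tf.function  # traces the Python function into a reusable compiled graph
def sample_step(x):
    # toy dynamics update standing in for one L2HMC sampling step
    return x + 0.1 * tf.tanh(-x)

x = tf.zeros([4, 2])
for _ in range(3):
    x = sample_step(x)  # first call traces; subsequent calls reuse the graph
```

The speedup comes from amortizing Python overhead: the traced graph is built once on the first call and replayed on later calls with the same input signature.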
