
Comparison of uncertainty estimation characteristics between a Bayesian neural network based on dropout, Gaussian process regression, and a density network.

taewankim1/uncertainty_deeplearning

Uncertainty in deep learning

This repository compares the uncertainty estimates of Gaussian process regression with those of methods developed for deep learning, such as ensembles and a Bayesian approach based on dropout. It runs 1D regression simulations in which the training data is corrupted by both epistemic and aleatoric uncertainty. You can start with Comparison.ipynb in the notebooks folder.
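A minimal sketch of such a 1D dataset is shown below. The exact generating function and noise model used in Comparison.ipynb are assumptions here: a sinusoid with a gap in the inputs (a source of epistemic uncertainty) and heteroscedastic noise whose scale grows with |x| (a source of aleatoric uncertainty).

```python
import numpy as np

def make_data(n=200, seed=0):
    """Synthetic 1D regression data with both kinds of uncertainty."""
    rng = np.random.default_rng(seed)
    # Epistemic uncertainty: leave a gap in the inputs (no data in [-1, 1]),
    # so a model must extrapolate there.
    x = np.concatenate([rng.uniform(-4, -1, n // 2),
                        rng.uniform(1, 4, n // 2)])
    # Aleatoric uncertainty: heteroscedastic observation noise.
    noise_std = 0.1 + 0.1 * np.abs(x)
    y = np.sin(x) + rng.normal(0.0, noise_std)
    return x, y
```

A good epistemic-uncertainty estimate should widen inside the gap; a good aleatoric estimate should track the growing noise level.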

Summary of results

  • Training data

The training data contains both types of uncertainty: epistemic and aleatoric.
  • GP regression

Captures epistemic uncertainty well, but cannot model aleatoric uncertainty. Note that there are variants of GP regression for heteroscedastic data; see the references.
  • Neural net

Produces point predictions only, with no uncertainty estimate.
  • Ensemble

Captures epistemic uncertainty well, though it somewhat underestimates it. Not valid for aleatoric uncertainty.
  • Bayesian neural net with dropout

Valid for epistemic uncertainty, but the estimates are of lower quality than those of GP regression or the ensemble.
  • Single density network

Valid only for aleatoric uncertainty. Training is unstable due to the negative log-likelihood loss.
  • Density + dropout
  • Density + ensemble
  • Mixture density network
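The two deep-learning ingredients above can be sketched in plain numpy. This is an illustrative sketch, not the notebook's implementation: `mc_dropout_predict` keeps dropout active at test time and treats the spread across stochastic forward passes as epistemic uncertainty, and `gaussian_nll` is the per-sample negative log-likelihood loss a density network minimizes (predicting log-variance, which helps with the instability noted above). The one-hidden-layer network and its weights are hypothetical.

```python
import numpy as np

def mc_dropout_predict(x, W1, b1, W2, b2, p=0.5, T=100, rng=None):
    """Monte Carlo dropout for a one-hidden-layer ReLU net.

    Averages T stochastic forward passes with dropout on the hidden
    units; the std across passes is the epistemic uncertainty estimate.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    h = np.maximum(0.0, x[:, None] @ W1 + b1)          # hidden layer, ReLU
    preds = []
    for _ in range(T):
        mask = rng.random(h.shape) > p                 # random dropout mask
        preds.append((h * mask / (1 - p)) @ W2 + b2)   # inverted dropout
    preds = np.stack(preds)                            # (T, n, 1)
    return preds.mean(axis=0), preds.std(axis=0)       # mean, epistemic std

def gaussian_nll(y, mu, log_var):
    """Gaussian negative log-likelihood (up to a constant).

    A density network outputs (mu, log_var) and minimizes this loss;
    the exp(log_var) in the denominator is what makes training touchy.
    """
    return 0.5 * (log_var + (y - mu) ** 2 / np.exp(log_var))
```

Combining the two, as in the "Density + dropout" variant, means running MC dropout over a network that outputs (mu, log_var), so epistemic and aleatoric uncertainty are estimated separately.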

References
