feature request: add support for accessing log normalizer #29
Computing the log partition function for a conjugate family is a very useful quantity, e.g., for computing the marginal likelihood p(D), which is needed for empirical Bayes and model selection. (See the screenshot from my book below, which shows p(D) = Z(post)/Z(prior).)

Also, it would be nice to have a worked example of a likelihood + prior = posterior computation for some simple familiar families, like binomial + beta = beta, or Gauss + Gauss = Gauss.
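For reference, the identity in the screenshot can be written out for the beta–Bernoulli case; this is the standard conjugate-family result, not anything efax-specific:

```latex
% For an exponential-family likelihood
%   p(x | \theta) = h(x) \exp(\theta^\top T(x) - A(\theta))
% with conjugate prior, the marginal likelihood of D = {x_1, ..., x_N} is
p(\mathcal{D}) = \Bigl(\prod_{i=1}^{N} h(x_i)\Bigr)
                 \frac{Z(\text{posterior})}{Z(\text{prior})}.
% Beta--Bernoulli instance: h(x) = 1 and Z is the Beta function, so with
% N_1 successes and N_0 failures out of N,
p(\mathcal{D}) = \frac{B(\alpha + N_1, \beta + N_0)}{B(\alpha, \beta)}.
```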
Nice to hear from you again, Kevin!

> Computing the log partition function

I agree. That's NaturalParametrization.log_normalizer in efax. All distributions implement that.

> for computing the marginal likelihood p(D)

Ah, that's a cool use that I hadn't considered! Could we provide some functions to make this calculation more convenient? It looks like the quotient of log-normalizers (prior and likelihood) times the carrier measure of the data?

> Also it would be nice to have a worked example of a likelihood + prior = posterior computation for some simple familiar families, like binomial + beta = beta, or Gauss + Gauss = Gauss.

That's a good idea. I'd like to add something like that this week. Essentially, these examples (done for Gaussian; see the sketch below) should take the form:

- Convert the prior and likelihood to natural parameters, prior and likelihood.
- Add them: posterior = efax.parameter_map(operator.add, prior, likelihood).
- Convert them back to whatever source parametrization you want.

Do you think that would add clarity? If you have an idea, pull requests are always welcome!
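A minimal sketch of those three steps for Gauss × Gauss = Gauss might look like this. The NormalEP field names and the to_nat()/to_exp() conversions are assumptions about efax's univariate-normal API; only parameter_map and operator.add come from the recipe above:

```python
import operator

import jax.numpy as jnp

import efax
from efax import NormalEP

# Step 1: express both Gaussian factors in natural parameters.
# NormalEP(mean, second_moment) and .to_nat() are assumed API names.
prior_nat = NormalEP(mean=jnp.asarray(0.0),
                     second_moment=jnp.asarray(1.0)).to_nat()   # N(0, 1)
like_nat = NormalEP(mean=jnp.asarray(1.0),
                    second_moment=jnp.asarray(1.5)).to_nat()    # N(1, 0.5)

# Step 2: multiplying two densities over the same variable adds their
# natural parameters.
post_nat = efax.parameter_map(operator.add, prior_nat, like_nat)

# Step 3: convert back to whatever source parametrization you want.
print(post_nat.to_exp())
```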
Yes, sounds great. You could even have a unit test, like reproducing some of the demos below:

https://github.com/probml/pyprobml/blob/master/notebooks/book1/04/beta_binom_post_plot.ipynb
https://github.com/probml/pyprobml/blob/master/notebooks/book1/04/beta_binom_post_pred_plot.ipynb
https://github.com/probml/pyprobml/blob/master/notebooks/book2/03/gauss_seq_update_sigma_1d.ipynb
https://github.com/probml/pyprobml/blob/master/notebooks/book1/03/gauss_infer_2d.ipynb

[image: Screenshot 2024-11-18 at 10.25.44 AM.png]
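As one concrete shape such a test could take, a beta–Bernoulli update can be checked against the closed form. This is a sketch: BetaNP and its alpha_minus_one field are taken from the example later in this thread, and the closed form Beta(a + N1, b + N0) is the standard conjugacy result:

```python
import operator

import jax.numpy as jnp

import efax
from efax import BetaNP

def test_beta_bernoulli_posterior() -> None:
    # Prior Beta(a, b) in natural parameters (alpha_minus_one convention).
    a, b = 2.0, 2.0
    prior = BetaNP(alpha_minus_one=jnp.array([a - 1.0, b - 1.0]))
    # N1 heads and N0 tails enter as an unnormalized Beta-shaped factor
    # with natural parameters (N1, N0).
    n1, n0 = 7.0, 3.0
    evidence = BetaNP(alpha_minus_one=jnp.array([n1, n0]))
    posterior = efax.parameter_map(operator.add, prior, evidence)
    # Closed form: the posterior is Beta(a + N1, b + N0).
    expected = jnp.array([a + n1 - 1.0, b + n0 - 1.0])
    assert jnp.allclose(posterior.alpha_minus_one, expected)
```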
Cool! I think the closest thing I have is this test. It samples from scipy and then does maximum-likelihood estimation (which is essentially the prior-likelihood combination in the conjugate prior). What do you think? I'll take a closer look at your examples this week.
It seems that getting the conjugate prior distribution from the data (the likelihood) and adding it to an existing prior should work. Maybe define a function for doing this?
Here's a simple example for Bernoulli. It would be great if some of the methods could be designed to make this smoother; e.g., I don't quite get the sufficient_statistics method.

```python
import jax.numpy as jnp
from jax.random import PRNGKey

import efax
from efax import BernoulliEP, BernoulliNP, BetaNP, parameter_mean

# Beta prior.
prior = BetaNP(alpha_minus_one=jnp.array([1.0, 1.0]))

# Bernoulli likelihood: draw 100 samples from Bernoulli(0.4).
n = (100,)
dist = BernoulliEP(probability=jnp.array([0.4]))
samples = dist.sample(PRNGKey(0), n)

# Sufficient statistics of each sample (for Bernoulli, T(x) = x),
# averaged over the sample axis.
ss = BernoulliNP.sufficient_statistics(samples)
ss_mean = parameter_mean(ss, axis=0)
print(ss_mean)

# Turn the averaged statistics into a Beta-shaped evidence factor.
likelihood = ss_mean.conjugate_prior_distribution(jnp.array(n))
print("prior", prior)
print("likelihood", likelihood)

# Posterior: add natural parameters.
posterior = efax.parameter_map(jnp.add, prior, likelihood)
print("posterior", posterior)
```
I think I know what you're getting at, but let's make sure we're on the same page. There are two major ways of combining evidence:

- Multiplying two distributions over the same variable, which corresponds to adding their natural parameters (e.g., Gauss × Gauss = Gauss).
- Conjugate Bayesian updating, where the sufficient statistics of observed data are mapped into the conjugate prior family and added to the prior's natural parameters (e.g., beta + Bernoulli observations = beta).

Does this all make sense? What do you think?
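In standard exponential-family notation (not efax-specific), the two combinations are:

```latex
% Way 1: multiply two densities over the same variable x;
% their natural parameters add.
\eta_{\text{post}} = \eta_1 + \eta_2
% Way 2: conjugate update of a prior over \theta given i.i.d. data
% x_1, ..., x_N with sufficient statistic T; the prior's natural
% parameters (\tau, n_0) shift by the data's statistics and count.
\tau_{\text{post}} = \tau_{\text{prior}} + \sum_{i=1}^{N} T(x_i),
\qquad n_{\text{post}} = n_0 + N
```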
I think your example looks perfect. Every part of it shows a clear intent, with code that corresponds to that intent. I think it could be possible to wrap this up in a convenience function. Why don't we try coding that up, and then think about whether it belongs in your code or in efax?
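One possible shape for that wrapper, lifted almost directly from the Bernoulli example above; the name conjugate_update is hypothetical and not part of efax:

```python
import jax.numpy as jnp

import efax
from efax import parameter_mean

def conjugate_update(prior, likelihood_family, samples):
    """Combine a conjugate prior with i.i.d. observations.

    prior: natural parameters of the conjugate family (e.g. a BetaNP).
    likelihood_family: the natural-parametrization class of the
        likelihood (e.g. BernoulliNP).
    samples: observations whose leading axis indexes the samples.
    """
    n = samples.shape[0]
    # Average the sufficient statistics over the sample axis.
    ss = likelihood_family.sufficient_statistics(samples)
    ss_mean = parameter_mean(ss, axis=0)
    # Map the evidence into the conjugate family and add it to the prior.
    evidence = ss_mean.conjugate_prior_distribution(jnp.asarray(n))
    return efax.parameter_map(jnp.add, prior, evidence)
```

With this, the example above would reduce to posterior = conjugate_update(prior, BernoulliNP, samples).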
I think the examples make sense, and I guess in a real scenario we'll have a prior and some observations, which will require us to get the likelihood and add it to the prior?
I don't quite understand the current design of sufficient_statistics; it seems that for beta, ss is just saving the samples?
Did you try reading expfam.pdf?