Maximum likelihood estimation (MLE) is a method for determining values for the parameters of a model. The parameter values are chosen so that they maximize the likelihood that the process described by the model produced the data that were observed. For a discrete distribution we maximize the PMF (Probability Mass Function), whereas for a continuous distribution we maximize the PDF (Probability Density Function).
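As a quick illustration (a coin-flip example added here for concreteness, not taken from the original text): if three tosses of a coin with unknown head-probability $\theta$ come up heads, heads, tails, the likelihood of that outcome is

$$L(\theta) = \theta \cdot \theta \cdot (1 - \theta) = \theta^2 (1 - \theta),$$

which is maximized where its derivative $2\theta - 3\theta^2$ vanishes, i.e. at $\hat{\theta} = 2/3$, the observed fraction of heads.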
- Likelihood Function: Given a set of observations $x_1, x_2, \ldots, x_n$ and a set of parameters $\theta$, the likelihood function is defined as:

  $$L(\theta) = \prod_{i=1}^{n} p(x_i \mid \theta)$$

  where $p(x_i \mid \theta)$ is the probability (or probability density) of observing $x_i$ given parameters $\theta$ (both the likelihood and the log-likelihood are evaluated in the first sketch after this list).
Log-Likelihood Function: To simplify computations, it's common to work with the log-likelihood function, which transforms products into sums:
The log-likelihood function is typically easier to work with when finding the parameter values that maximize the likelihood.
- MLE Estimation: The goal of MLE is to find the parameter values that maximize the likelihood function (or, equivalently, the log-likelihood):

  $$\hat{\theta} = \arg\max_{\theta} L(\theta) = \arg\max_{\theta} \ell(\theta)$$

  In practice this is often done by minimizing the negative log-likelihood with a numerical optimizer, as in the second sketch below.
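To make the likelihood and log-likelihood definitions concrete, here is a minimal sketch (in Python, assuming NumPy and SciPy are available; the Gaussian model, the data, and the variable names are illustrative choices, not from the original text) that evaluates both quantities for a small dataset:

```python
import numpy as np
from scipy.stats import norm

# Illustrative data and candidate parameters (mean mu, standard deviation sigma).
x = np.array([2.1, 1.9, 2.4, 2.0, 1.8])
mu, sigma = 2.0, 0.3

# Likelihood: product of the density evaluated at each observation.
likelihood = np.prod(norm.pdf(x, loc=mu, scale=sigma))

# Log-likelihood: sum of log-densities (numerically far more stable than the product).
log_likelihood = np.sum(norm.logpdf(x, loc=mu, scale=sigma))

print(f"L(theta)     = {likelihood:.6g}")
print(f"log L(theta) = {log_likelihood:.6g}")
```

Working with the sum of log-densities rather than the raw product avoids the underflow that occurs when many small probabilities are multiplied together.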
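And here is a sketch of the maximization step itself, again under the same illustrative Gaussian assumption: it minimizes the negative log-likelihood numerically with SciPy and compares the result to the closed-form Gaussian MLE (the sample mean and the 1/n sample standard deviation).

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

x = np.array([2.1, 1.9, 2.4, 2.0, 1.8])

def neg_log_likelihood(params):
    """Negative log-likelihood of a Gaussian; minimizing it maximizes the likelihood."""
    mu, log_sigma = params          # optimize log(sigma) so sigma stays positive
    sigma = np.exp(log_sigma)
    return -np.sum(norm.logpdf(x, loc=mu, scale=sigma))

result = minimize(neg_log_likelihood, x0=np.array([0.0, 0.0]))
mu_hat, sigma_hat = result.x[0], np.exp(result.x[1])

# Closed-form Gaussian MLE for comparison.
print("numerical MLE :", mu_hat, sigma_hat)
print("closed form   :", x.mean(), x.std())  # np.std defaults to the 1/n (MLE) form
```

The same pattern (write down the negative log-likelihood, hand it to a generic optimizer) carries over to models that have no closed-form solution.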