Parameter-Estimation

Prisha Sawhney

102116052

3CS10

Maximum Likelihood Estimation (MLE)

Introduction

Maximum likelihood estimation (MLE) is a method for estimating the parameters of a model. The parameter values are chosen so that they maximize the likelihood that the process described by the model produced the data that were observed. For a discrete distribution we maximize the PMF (probability mass function), whereas for a continuous distribution we maximize the PDF (probability density function).
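
As a small worked example (not part of the original write-up, but a standard illustration): suppose $n$ independent coin flips yield $k$ heads and the model is Bernoulli with unknown bias $p$. Then

$$L(p \mid x) = p^{k}(1-p)^{n-k}, \qquad \ell(p \mid x) = k \log p + (n-k)\log(1-p),$$

and setting $\mathrm{d}\ell/\mathrm{d}p = k/p - (n-k)/(1-p) = 0$ gives the maximum likelihood estimate $\hat{p} = k/n$, the observed fraction of heads.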

Formulae

  1. Likelihood Function: Given a set of observations $x = (x_1, \ldots, x_n)$ and a set of parameters $\theta$, the likelihood function is defined as:

     $$L(\theta \mid x) = \prod_{i=1}^{n} P(x_i \mid \theta)$$

     where $P(x_i \mid \theta)$ is the probability (or probability density) of observing $x_i$ given parameters $\theta$.

  2. Log-Likelihood Function: To simplify computations, it is common to work with the log-likelihood function, which turns the product into a sum:

     $$\ell(\theta \mid x) = \sum_{i=1}^{n} \log P(x_i \mid \theta)$$

     The log-likelihood function is typically easier to work with when finding the parameter values that maximize the likelihood.

  3. MLE Estimate: The goal of MLE is to find the parameter values that maximize the likelihood, or equivalently the log-likelihood (the logarithm is monotonic, so both have the same maximizer):

     $$\hat{\theta} = \underset{\theta}{\arg\max} \; \ell(\theta \mid x)$$

     A short code sketch illustrating these steps follows this list.
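
The sketch below is not the repository's own code; it is a minimal illustration of how the formulas above translate into practice, assuming the observations come from a normal distribution with unknown mean and standard deviation. The data, function names, and use of SciPy's general-purpose optimizer are illustrative choices.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative data: 500 draws from a normal distribution (assumed, not from the repo)
rng = np.random.default_rng(0)
data = rng.normal(loc=5.0, scale=2.0, size=500)

def neg_log_likelihood(params, x):
    """Negative log-likelihood -l(theta | x) for a normal model with theta = (mu, sigma)."""
    mu, sigma = params
    if sigma <= 0:
        return np.inf  # keep the optimizer inside the valid parameter space
    # log P(x_i | theta) for a normal density, summed over all observations
    log_pdf = -0.5 * np.log(2 * np.pi * sigma**2) - (x - mu) ** 2 / (2 * sigma**2)
    return -np.sum(log_pdf)

# theta_hat = argmax l(theta | x), computed by minimizing the negative log-likelihood
result = minimize(neg_log_likelihood, x0=[0.0, 1.0], args=(data,), method="Nelder-Mead")
mu_hat, sigma_hat = result.x
print(f"MLE estimates: mu = {mu_hat:.3f}, sigma = {sigma_hat:.3f}")
```

Because most numerical optimizers minimize rather than maximize, the usual trick is to minimize the negative log-likelihood. For the normal model the result should agree closely with the closed-form estimates $\hat{\mu} = \bar{x}$ and $\hat{\sigma}^2 = \frac{1}{n}\sum_i (x_i - \bar{x})^2$.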
