Maximum Likelihood Estimation
==General==
* Obtain an estimate for an unknown parameter <math>\theta</math> using the data that we obtained from our sample.
* Choose a value of <math>\theta</math> that maximizes the likelihood of getting the data we observed.
* Joint probability mass function: if the observations are independent, the likelihood is the product of the PDFs of the individual observations.
** <math>L(\theta)=\prod_{i=1}^n f(x_i;\theta)</math> (general formulation)
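As a concrete illustration of maximizing <math>L(\theta)=\prod_{i=1}^n f(x_i;\theta)</math>, the sketch below grid-searches the likelihood of a small made-up sample under a normal density with known <math>\sigma=1</math> and unknown mean <math>\theta</math>; the data, model, and grid are assumptions for illustration, not from the article.

```python
import math

# Sketch under assumptions (not from the article): hypothetical data and a
# normal model with known sigma = 1, to illustrate L(theta) = prod f(x_i; theta).
def density(x, theta):
    # Normal density with mean theta and sigma = 1.
    return math.exp(-0.5 * (x - theta) ** 2) / math.sqrt(2 * math.pi)

def likelihood(theta, data):
    # By independence, multiply the individual densities.
    L = 1.0
    for x in data:
        L *= density(x, theta)
    return L

data = [1.2, 0.8, 1.5, 0.9, 1.1]  # hypothetical sample
grid = [i / 1000 for i in range(-3000, 3001)]
theta_hat = max(grid, key=lambda t: likelihood(t, data))
print(theta_hat)  # 1.1, the sample mean
```

For the normal mean with known variance, the maximizer of the likelihood is the sample mean, which the grid search recovers here.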
==Bernoulli Distribution==
* <math>f(x_i;p)=p^{x_i}(1-p)^{1-x_i}</math> for <math>x_i = 0</math> or <math>1</math>, and <math>0 < p < 1</math>.
* If the <math>X_i</math> are independent Bernoulli random variables with unknown parameter <math>p</math>, substitute the Bernoulli PMF into the general likelihood:
* <math>L(p)=p^{\sum x_i}(1-p)^{n-\sum x_i}</math>
* <math>\log L(p) = \left(\sum x_i\right)\log(p) + \left(n - \sum x_i\right)\log(1-p)</math>
* Setting the derivative with respect to <math>p</math> to zero gives <math>\hat{p} = \frac{1}{n}\sum x_i</math>, the sample proportion.
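The Bernoulli log-likelihood above can be checked numerically. In this sketch the coin-flip data are hypothetical (an assumption for illustration), and a grid search over <math>p</math> confirms the closed-form answer <math>\hat{p} = \sum x_i / n</math>.

```python
import math

# Hypothetical coin-flip data (an assumption, not from the article); 1 = success.
data = [1, 0, 1, 1, 0, 1, 1, 0, 1, 1]
n, s = len(data), sum(data)  # n = 10 trials, s = 7 successes

def log_likelihood(p):
    # log L(p) = (sum x_i) log(p) + (n - sum x_i) log(1 - p)
    return s * math.log(p) + (n - s) * math.log(1 - p)

# Grid search over (0, 1); calculus gives the closed form p_hat = s / n.
grid = [i / 1000 for i in range(1, 1000)]
p_hat = max(grid, key=log_likelihood)
print(p_hat)  # 0.7, the sample proportion s / n
```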
==Exponential Distribution==
* Suppose we have samples from an exponential distribution with parameter lambda:
** <math>X_i \sim \textrm{Exp}(\lambda)</math>, assuming i.i.d.
* Recall that, by independence, the joint density is the product of the individual densities: <math>f(\mathbf{x} \mid \lambda) = \prod_{i=1}^n \lambda e^{-\lambda x_i} = \lambda^n e^{-\lambda \sum x_i}</math>
* <math>L(\lambda \mid \mathbf{x}) = \lambda^n e^{-\lambda \sum x_i}</math>
* Maximizing <math>\log L(\lambda) = n \log \lambda - \lambda \sum x_i</math> gives <math>\hat{\lambda} = n / \sum x_i = 1/\bar{x}</math>.
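The exponential likelihood has a closed-form maximizer, <math>\hat{\lambda} = n / \sum x_i</math>, which the sketch below computes for a hypothetical waiting-time sample (the data are an assumption for illustration) and sanity-checks against nearby values of lambda.

```python
import math

# Hypothetical waiting-time sample (an assumption, not from the article).
data = [0.5, 1.2, 0.3, 0.8, 0.7]
n, total = len(data), sum(data)  # total = sum x_i

def log_likelihood(lam):
    # log L(lambda) = n log(lambda) - lambda * sum x_i
    return n * math.log(lam) - lam * total

# Setting d/d(lambda) log L = n/lambda - sum x_i = 0 gives the MLE:
lam_hat = n / total  # = 1 / sample mean

# Sanity check: nearby values of lambda have lower log-likelihood.
assert log_likelihood(lam_hat) > log_likelihood(0.9 * lam_hat)
assert log_likelihood(lam_hat) > log_likelihood(1.1 * lam_hat)
print(round(lam_hat, 4))  # 1.4286
```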
Revision as of 15:44, 11 May 2020