The probability $P(x \mid \theta)$ is the probability of the outcome $x$ given the parameter $\theta$. The likelihood $L(\theta \mid x)$ is the likelihood of $\theta$ given $x$, i.e. the likelihood that parameter $\theta$ produced the observed outcome $x$; numerically, $L(\theta \mid x) = P(x \mid \theta)$. In the former, we view $\theta$ as fixed and $x$ as variable, while in the latter we view $x$ as fixed and $\theta$ as variable. The use of likelihood is to infer a model given some observed outcome.
Definition.
The Maximum Likelihood Estimate of $\theta$ is defined as follows:
$$\hat{\theta}_{\mathrm{MLE}} = \arg\max_{\theta} L(\theta \mid x).$$
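As a concrete sketch of this definition (a hypothetical coin-flip example; the data values are assumptions, not from these notes), the MLE of a Bernoulli parameter can be found numerically by a grid search over candidate values of $\theta$:

```python
import numpy as np

# Hypothetical data: 10 coin flips with 7 heads (values are assumptions).
data = np.array([1, 0, 1, 1, 0, 1, 1, 0, 1, 1])

# Candidate parameter values and the likelihood L(theta | data)
# for i.i.d. Bernoulli observations.
theta = np.linspace(0.001, 0.999, 999)
k = data.sum()
likelihood = theta**k * (1 - theta)**(len(data) - k)

# The MLE is the argmax of the likelihood over the grid of thetas.
theta_mle = theta[np.argmax(likelihood)]
print(round(theta_mle, 3))  # 0.7, the sample proportion of heads
```

As expected, the numerical argmax recovers the sample proportion, which is the closed-form Bernoulli MLE.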
- The MLE of a parameter may be biased or unbiased.
- A convenient trick is that maximizing the likelihood is equivalent to maximizing the log-likelihood, since $\log$ is strictly increasing and therefore preserves the location of the maximum.
- For $n$ i.i.d. random variables drawn from a normal distribution, the maximum likelihood estimators for the mean and variance are the sample mean $\bar{x}$ and the naive sample variance $\frac{1}{n}\sum_{i}(x_i - \bar{x})^2$, which is biased (the unbiased version divides by $n-1$).
- A weakness of MLE is that it can be unstable: small changes in the data may cause large changes in the estimate, particularly for small samples.
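The log-likelihood trick above can be checked numerically: because $\log$ is strictly increasing, the likelihood and the log-likelihood attain their maxima at the same parameter value. A Bernoulli sketch with assumed counts:

```python
import numpy as np

# Assumed Bernoulli data summary: 7 successes out of 10 trials.
k, n = 7, 10
theta = np.linspace(0.01, 0.99, 99)

likelihood = theta**k * (1 - theta)**(n - k)
log_likelihood = k * np.log(theta) + (n - k) * np.log(1 - theta)

# log is strictly increasing, so both are maximized at the same theta.
assert np.argmax(likelihood) == np.argmax(log_likelihood)
print(theta[np.argmax(log_likelihood)])  # close to k/n = 0.7
```

In practice the log-likelihood is also numerically safer, since products of many small probabilities underflow while sums of their logs do not.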
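A minimal simulation of the normal-distribution estimators above (the true parameters, sample size, and seed are assumptions): the MLE of the variance divides by $n$, and averaging it over many repeated samples approaches $\sigma^2 (n-1)/n$ rather than $\sigma^2$, which is exactly its bias:

```python
import numpy as np

rng = np.random.default_rng(0)
true_mu, true_var, n = 3.0, 4.0, 5  # assumed values for the simulation

def normal_mle(x):
    mu_hat = x.mean()                     # MLE of the mean: sample mean
    var_hat = ((x - mu_hat) ** 2).mean()  # MLE of the variance: divide by n
    return mu_hat, var_hat

# Average the naive variance estimate over many repeated samples;
# its expectation is true_var * (n - 1) / n = 3.2, not 4.0.
estimates = [normal_mle(rng.normal(true_mu, np.sqrt(true_var), n))[1]
             for _ in range(100_000)]
print(round(float(np.mean(estimates)), 1))  # ≈ 3.2
```

Multiplying the naive estimator by $n/(n-1)$ removes the bias, giving the usual unbiased sample variance.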