What is the purpose of optimization?
To find the minimum of a scalar cost function
What are the requirements for parameter estimation?
What are properties of estimators and what do they mean?
Bias: Deviation from real value
Covariance: Dispersion of estimator around expected value
Mean Square Error: Captures the effects of both bias and covariance
Consistency
Asymp. Normality
Asymp. Efficiency
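The bias/covariance/MSE relationship can be sketched numerically. This is a minimal Monte-Carlo illustration, assuming Gaussian samples and a deliberately biased (shrunk) mean estimator; the shrink factor 0.9 and all sample sizes are illustrative choices, not from the source:

```python
import numpy as np

rng = np.random.default_rng(0)
true_mean = 5.0

# Deliberately biased estimator: shrink the sample mean toward zero.
def shrunk_mean(x, c=0.9):
    return c * np.mean(x)

# Repeat the experiment many times to approximate the estimator's distribution.
estimates = np.array([shrunk_mean(rng.normal(true_mean, 1.0, size=50))
                      for _ in range(20000)])

bias = estimates.mean() - true_mean
variance = estimates.var()
mse = np.mean((estimates - true_mean) ** 2)

# MSE decomposes exactly into squared bias plus variance.
assert abs(mse - (bias**2 + variance)) < 1e-9
```

The identity MSE = bias² + variance holds exactly here (up to floating-point error), which is why the MSE captures both effects at once.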
What is consistency in estimators?
If an estimator converges in probability to the true parameter value as the number of samples increases, it is called consistent. A consistent estimator is asymptotically unbiased, but not necessarily unbiased for a finite number of samples
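Convergence in probability can be demonstrated with a small simulation. This sketch, assuming the sample mean as estimator and Gaussian data (illustrative choices), estimates the probability that the estimate deviates from the true value by more than a tolerance eps, for small and large sample sizes:

```python
import numpy as np

rng = np.random.default_rng(1)
true_mean = 2.0

def prob_far(n, eps=0.1, trials=5000):
    """Fraction of trials where the sample mean of n samples deviates
    from the true mean by more than eps."""
    means = rng.normal(true_mean, 1.0, size=(trials, n)).mean(axis=1)
    return np.mean(np.abs(means - true_mean) > eps)

# Consistency: the probability of a large deviation shrinks as n grows.
p_small_n = prob_far(10)
p_large_n = prob_far(1000)
assert p_large_n < p_small_n
```

For any fixed eps this deviation probability tends to zero as n grows, which is exactly the definition of convergence in probability.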
What is Asymptotic normality?
“The more samples we use, the smaller the confidence intervals on the parameters will be, i.e. the ‘surer’ one can be about them”
What is asymptotic efficiency?
“An asymptotically efficient estimator makes the most of the available data, i.e. achieves the lowest possible parameter variances of all estimators”
What is the basic idea of the maximum likelihood methods?
Adjust the parameters so that the probability of obtaining the observed set of measurements is maximized
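As a sketch of this idea, assuming Gaussian measurement noise with known unit variance (an illustrative choice), the likelihood of the measurements can be maximized over a grid of candidate parameter values; for this model the maximizer coincides with the sample mean:

```python
import numpy as np

rng = np.random.default_rng(2)
z = rng.normal(3.0, 1.0, size=200)  # measurements

def neg_log_likelihood(mu, z, sigma=1.0):
    # Negative Gaussian log-likelihood of the measurements given mu
    # (constant terms dropped).
    return 0.5 * np.sum((z - mu) ** 2) / sigma**2

# Evaluate on a grid and pick the mu that maximizes the likelihood
# (i.e. minimizes the negative log-likelihood).
grid = np.linspace(0, 6, 6001)
nll = [neg_log_likelihood(mu, z) for mu in grid]
mu_ml = grid[int(np.argmin(nll))]

# For Gaussian noise the ML estimate is the sample mean.
assert abs(mu_ml - z.mean()) < 1e-3
```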
What are the asymptotic properties of the maximum likelihood estimates?
Under regularity conditions, maximum likelihood estimates are consistent, asymptotically normal, and asymptotically efficient
What are the Three Estimation Models?
The Fisher model, the Bayesian model, and the Least-Squares model
For a multivariate cost function, what are the requirements for a vector to be a minimum?
The gradient of the cost function must be zero at that vector (necessary condition), and the Hessian matrix must be positive definite there (sufficient condition)
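These two conditions can be checked numerically. A minimal sketch, assuming a simple quadratic cost J(x) = (x1 − 1)² + 2(x2 + 3)² with its minimum at (1, −3) (an illustrative function, not from the source):

```python
import numpy as np

# Gradient of J(x) = (x1 - 1)^2 + 2*(x2 + 3)^2.
def grad(x):
    return np.array([2 * (x[0] - 1), 4 * (x[1] + 3)])

# Hessian is constant for this quadratic cost.
hessian = np.array([[2.0, 0.0],
                    [0.0, 4.0]])

x_star = np.array([1.0, -3.0])  # candidate minimum

# Necessary condition: the gradient vanishes at the candidate.
assert np.allclose(grad(x_star), 0)

# Sufficient condition: the Hessian is positive definite
# (all eigenvalues strictly positive).
assert np.all(np.linalg.eigvalsh(hessian) > 0)
```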
What is an estimator?
An estimator is a rule or method for calculating an estimate of a given quantity based on observed data.
What are the parameters and residuals of the Fisher Model?
The parameters θ are unknown but deterministic; the residuals r are random with a known probability density p(r)
What are the parameters and residuals of the Bayesian Model?
Both the parameters θ and the residuals r are random variables with a-priori known probability densities p(θ) and p(r)
What are the parameters and residuals of the Least-Square Model?
The parameters θ are deterministic; no assumptions are made about the probability densities of the parameters or the residuals
What is a good way of comparing two estimators?
Comparing the mean square error (MSE)
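A Monte-Carlo MSE comparison can be sketched for two concrete estimators. Here, assuming Gaussian data (an illustrative choice), the unbiased variance estimator (divide by n−1) is compared against the biased ML one (divide by n):

```python
import numpy as np

rng = np.random.default_rng(3)
true_var = 4.0
n = 10

# Many repeated experiments, each with n samples.
samples = rng.normal(0.0, np.sqrt(true_var), size=(50000, n))
est_unbiased = samples.var(axis=1, ddof=1)  # divide by n-1
est_ml = samples.var(axis=1, ddof=0)        # divide by n

mse_unbiased = np.mean((est_unbiased - true_var) ** 2)
mse_ml = np.mean((est_ml - true_var) ** 2)

# For Gaussian data the biased ML estimator has the smaller MSE:
# its extra squared bias is outweighed by its reduced variance.
assert mse_ml < mse_unbiased
```

This shows why the MSE, not the bias alone, is the right basis for comparison: the biased estimator wins here.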
How does the Bayesian Estimator work?
The probability densities of the parameters p(θ) and of the residuals p(r) are assumed to be known a priori; the conditional (posterior) probability density p(θ|z) is maximized
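A minimal sketch of this maximization, assuming a Gaussian prior on θ and Gaussian residuals (illustrative choices); the grid-based maximizer of p(θ|z) is checked against the known closed-form posterior mean for this Gaussian-Gaussian case:

```python
import numpy as np

rng = np.random.default_rng(4)

# Prior on the parameter: theta ~ N(mu0, s0^2); residuals: r ~ N(0, sr^2).
mu0, s0 = 0.0, 2.0
sr = 1.0
theta_true = 1.5
z = theta_true + rng.normal(0, sr, size=20)  # measurements

# Maximize log p(theta|z) ∝ log p(z|theta) + log p(theta) on a grid.
grid = np.linspace(-5, 5, 10001)
log_post = (-0.5 * ((grid - mu0) / s0) ** 2                               # log prior
            - 0.5 * np.sum((z[:, None] - grid) ** 2, axis=0) / sr**2)     # log likelihood
theta_map = grid[int(np.argmax(log_post))]

# Closed-form posterior mean for the Gaussian-Gaussian case.
n = len(z)
theta_closed = (mu0 / s0**2 + z.sum() / sr**2) / (1 / s0**2 + n / sr**2)
assert abs(theta_map - theta_closed) < 1e-3
```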
How does the Fisher Estimator work?
Estimator based on the concept of the likelihood function; the probability of obtaining the measurements z, given a set of parameters θ, is maximized
How does the Least Squares Estimator work?
The (best) estimate for the parameters θ is obtained by minimizing a (weighted) sum of squared residuals; no assumptions about the probability densities of the parameters p(θ) or the residuals p(r) are made
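A weighted least-squares sketch for a linear measurement model z = Hθ + r (the model, dimensions, and noise levels are illustrative assumptions); each measurement is weighted by its inverse noise variance and the normal equations are solved directly:

```python
import numpy as np

rng = np.random.default_rng(5)

# Linear model z = H @ theta + r with measurement-dependent noise.
theta_true = np.array([2.0, -1.0])
H = rng.normal(size=(100, 2))
noise_std = rng.uniform(0.1, 2.0, size=100)
z = H @ theta_true + rng.normal(0, noise_std)

# Weighted least squares: minimize (z - H theta)^T W (z - H theta),
# weighting each measurement by its inverse noise variance.
W = np.diag(1.0 / noise_std**2)
theta_wls = np.linalg.solve(H.T @ W @ H, H.T @ W @ z)

# The estimate recovers the true parameters to within the noise level.
assert np.allclose(theta_wls, theta_true, atol=0.5)
```

Note that no probability density was assumed anywhere; the weights only encode relative measurement trust.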
What is the Bayesian Rule?
p(θ|z) = p(z|θ) p(θ) / p(z)
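A small discrete sketch of the rule, with hypothetical numbers (a test with 95% detection rate, 10% false-positive rate, and a 1% prior); the evidence p(z) comes from the law of total probability:

```python
# Hypothetical numbers for a binary hypothesis theta and measurement z.
p_theta = 0.01            # prior p(theta)
p_z_given_theta = 0.95    # likelihood p(z|theta)
p_z_given_not = 0.10      # p(z|not theta), the false-positive rate

# Evidence p(z) by total probability.
p_z = p_z_given_theta * p_theta + p_z_given_not * (1 - p_theta)

# Bayes rule: p(theta|z) = p(z|theta) p(theta) / p(z).
p_theta_given_z = p_z_given_theta * p_theta / p_z
```

Despite the strong likelihood, the posterior stays below 10% here because the prior is so small; this is exactly the prior-weighting that distinguishes the Bayesian estimator from the Fisher one.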