Explain properties of least-squares estimators: the Gauss–Markov theorem (linear, unbiased, and efficient).
Properties of Least-Squares Estimators: The Gauss–Markov Theorem
In linear regression analysis, the ordinary least squares (OLS) estimator is a common method to estimate the unknown parameters (coefficients) of a linear model. The Gauss–Markov theorem states that under certain conditions, the OLS estimator has some very desirable properties.
1. Linearity
The OLS estimator is a linear estimator because it can be expressed as a linear function of the observed dependent variable values.
- If we denote the vector of observed dependent variables as $y$, the OLS estimator $\hat{\beta}$ for the parameter vector $\beta$ can be written as: $\hat{\beta} = (X^\top X)^{-1} X^\top y$
- Here, $X$ is the matrix of independent variables (regressors), so $\hat{\beta}$ is a linear combination of the observations $y$.
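A minimal numerical sketch of this linearity property, using simulated data (the sample size, coefficients, and noise level below are arbitrary assumptions for illustration): the closed-form OLS estimate equals a fixed matrix $A = (X^\top X)^{-1} X^\top$ applied to $y$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
# Design matrix with an intercept column and one random regressor (illustrative data)
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 2.0]) + rng.normal(scale=0.5, size=n)

# Closed-form OLS: beta_hat = (X'X)^{-1} X'y
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# Linearity: beta_hat = A @ y, where A = (X'X)^{-1} X' depends only on X, not on y
A = np.linalg.solve(X.T @ X, X.T)
print(np.allclose(beta_hat, A @ y))  # the two computations agree
```

Because $A$ is fixed once the regressors are fixed, each coefficient estimate is literally a weighted sum of the observed $y$ values.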
2. Unbiasedness
An estimator is unbiased if its expected value equals the true parameter value it estimates.
- Under the assumptions that:
- The linear regression model is correctly specified,
- The error terms have zero mean,
- The regressors $X$ are fixed (non-random) or independent of the errors,
- the expected value of the OLS estimator equals the true parameter: $E[\hat{\beta}] = \beta$. This means that on average, the OLS estimator hits the true parameter value, making it an unbiased estimator.
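Unbiasedness can be illustrated by Monte Carlo simulation (the true coefficients, error distribution, and replication count below are assumptions chosen for the sketch): averaging $\hat{\beta}$ over many samples drawn with zero-mean errors recovers the true $\beta$.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
beta_true = np.array([1.0, 2.0])  # hypothetical true parameters
# Regressors held fixed across replications, as the theorem assumes
X = np.column_stack([np.ones(n), rng.normal(size=n)])
A = np.linalg.solve(X.T @ X, X.T)  # OLS as a fixed linear map of y

# Draw many samples with zero-mean errors and average the estimates
estimates = []
for _ in range(5000):
    y = X @ beta_true + rng.normal(scale=1.0, size=n)
    estimates.append(A @ y)
mean_est = np.mean(estimates, axis=0)
print(mean_est)  # close to beta_true, illustrating E[beta_hat] = beta
```

Any single $\hat{\beta}$ differs from $\beta$ due to noise; unbiasedness is a statement about the average over repeated sampling.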
3. Efficiency (Best Linear Unbiased Estimator — BLUE)
The Gauss–Markov theorem says the OLS estimator is the Best Linear Unbiased Estimator (BLUE), which means:
- Among all linear and unbiased estimators,
- The OLS estimator has the smallest variance.
- Formally, for any other linear unbiased estimator $\tilde{\beta}$, the difference $\operatorname{Var}(\tilde{\beta}) - \operatorname{Var}(\hat{\beta})$ is a positive semidefinite matrix, so no coefficient can be estimated with smaller variance.
- This makes OLS efficient in the class of linear unbiased estimators, producing estimates with the least uncertainty.
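The BLUE property can be checked numerically (the weighted estimator and the weights below are arbitrary choices for the sketch): a weighted estimator $(X^\top W X)^{-1} X^\top W y$ is still linear and unbiased for any positive weights, but under homoscedastic errors its covariance exceeds that of OLS in the positive semidefinite sense.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 80
X = np.column_stack([np.ones(n), rng.normal(size=n)])

# For a linear estimator b = A @ y with Var(y) = sigma^2 * I,
# the covariance of b is sigma^2 * A @ A.T (take sigma = 1 here).
A_ols = np.linalg.solve(X.T @ X, X.T)
W = np.diag(rng.uniform(0.5, 2.0, size=n))  # arbitrary "wrong" positive weights
A_alt = np.linalg.solve(X.T @ W @ X, X.T @ W)  # still linear and unbiased

cov_ols = A_ols @ A_ols.T   # equals (X'X)^{-1}
cov_alt = A_alt @ A_alt.T

# Gauss–Markov: cov_alt - cov_ols is positive semidefinite,
# so every coefficient variance under the alternative is at least as large.
eigs = np.linalg.eigvalsh(cov_alt - cov_ols)
print(eigs.min() >= -1e-10)
```

Note that if the errors were in fact heteroscedastic with known variances, the correctly weighted estimator would beat OLS; the theorem's guarantee holds only under its stated assumptions.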
Summary
The Gauss–Markov theorem guarantees that, provided the classical linear regression assumptions hold (linearity, zero-mean errors, no perfect multicollinearity, and uncorrelated, homoscedastic errors), the least-squares estimator:
- Is linear in observations,
- Is unbiased (expected value equals true parameters),
- Is efficient (has minimum variance among linear unbiased estimators).
Thus, OLS is a powerful and reliable method for parameter estimation in linear models.