3.2 Limiting Distributions
3.2.2 Convergence in distribution
Theorem 3.1 Continuous Mapping Theorem
Let \(X_n\) be a sequence of random variables such that \(X_n \xrightarrow{d} X\), where \(X\) is some random variable (or constant). Let \(g(\cdot)\) be a continuous function. Then
\[g(X_n) \xrightarrow{d} g(X).\]
If \(X\) is a constant \(c\), then \(g(X_n) \xrightarrow{p} g(c)\). (Convergence in distribution to a constant implies convergence in probability to that constant).
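As a quick illustration of the Continuous Mapping Theorem, if \(Z_n \xrightarrow{d} N(0,1)\) (e.g., a standardized sample mean via the CLT) and \(g(x) = x^2\), then \(g(Z_n) \xrightarrow{d} \chi^2_1\). Below is a minimal Monte Carlo sketch of this, assuming numpy and scipy are available; the sample size, number of replications, and the Exponential(1) data-generating process are illustrative choices, not part of the theorem.

```python
import numpy as np
from scipy import stats

# Illustrative Monte Carlo sketch of the Continuous Mapping Theorem.
# For iid Exponential(1) draws (mu = sigma = 1), the CLT gives
#   Z_n = sqrt(n) * (X_bar - mu) / sigma  -->d  N(0, 1).
# With the continuous map g(x) = x^2, the CMT then gives g(Z_n) -->d chi-square(1).

rng = np.random.default_rng(0)
n, reps = 500, 10_000                         # sample size, Monte Carlo replications

x = rng.exponential(scale=1.0, size=(reps, n))
z_n = np.sqrt(n) * (x.mean(axis=1) - 1.0)     # standardized sample means (sigma = 1)
g_z = z_n ** 2                                # apply g(x) = x^2

# Compare the empirical CDF of g(Z_n) with the chi-square(1) CDF at a few points.
for q in (0.5, 1.0, 2.0, 4.0):
    emp = np.mean(g_z <= q)
    theo = stats.chi2.cdf(q, df=1)
    print(f"P(g(Z_n) <= {q:.1f}): empirical {emp:.3f} vs chi2(1) {theo:.3f}")
```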
Slutsky’s Theorem is a collection of results concerning the asymptotic behavior of random variables. It is extremely useful in establishing the limiting distributions of estimators and test statistics.
Theorem 3.2 Slutsky's Theorem
If \(X_n \xrightarrow{d} X\) and \(Y_n \xrightarrow{p} c\), where \(c\) is a constant, then
\[X_n + Y_n \xrightarrow{d} X + c, \qquad Y_n X_n \xrightarrow{d} cX, \qquad X_n / Y_n \xrightarrow{d} X / c \quad (\text{provided } c \neq 0).\]
If \(X_n \xrightarrow{p} X\) and \(Y_n \xrightarrow{p} Y\), then
\[X_n + Y_n \xrightarrow{p} X + Y, \qquad X_n Y_n \xrightarrow{p} XY.\]
Slutsky's theorem essentially says that convergence in distribution and convergence in probability behave "nicely" with continuous functions and arithmetic operations, especially when one of the sequences converges to a constant. You can often treat limits of random variables much like limits of ordinary sequences, provided you are careful about the mode of convergence.
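A standard application is the studentized mean. Writing
\[\frac{\sqrt{n}(\bar{X}_n - \mu)}{S_n} = \frac{\sigma}{S_n} \cdot \frac{\sqrt{n}(\bar{X}_n - \mu)}{\sigma},\]
the second factor converges in distribution to \(N(0,1)\) by the CLT, while \(\sigma / S_n \xrightarrow{p} 1\) by the law of large numbers and the Continuous Mapping Theorem, so Slutsky's theorem gives \(\sqrt{n}(\bar{X}_n - \mu)/S_n \xrightarrow{d} N(0,1)\). The following is a minimal simulation sketch of this, again assuming numpy and scipy; the Exponential(1) design and the sample sizes are illustrative assumptions.

```python
import numpy as np
from scipy import stats

# Illustrative Monte Carlo sketch of Slutsky's theorem via the studentized mean.
# For iid Exponential(1) data (mu = sigma = 1),
#   sqrt(n)*(X_bar - mu)/S_n = (sigma/S_n) * [sqrt(n)*(X_bar - mu)/sigma].
# The bracketed term -->d N(0,1) by the CLT and sigma/S_n -->p 1 by the LLN
# and the CMT, so Slutsky's theorem gives the product -->d N(0,1).

rng = np.random.default_rng(1)
n, reps = 500, 10_000
mu = 1.0                                      # Exponential(1): mu = sigma = 1

x = rng.exponential(scale=1.0, size=(reps, n))
xbar = x.mean(axis=1)
s_n = x.std(axis=1, ddof=1)                   # sample standard deviation
t_n = np.sqrt(n) * (xbar - mu) / s_n          # studentized mean

# Compare the empirical CDF of the studentized mean with the N(0,1) CDF.
for q in (-1.645, 0.0, 1.645, 1.96):
    emp = np.mean(t_n <= q)
    theo = stats.norm.cdf(q)
    print(f"P(T_n <= {q:+.3f}): empirical {emp:.3f} vs N(0,1) {theo:.3f}")
```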