Definition
Let $\{X_n\}$ be a Sequence of random variables where $X_n$ has CDF $F_{X_n}$, let $X$ be a Random Variable with CDF $F_X$, and let $C(F_X)$ be the set of every point at which $F_X$ is continuous. Then $\{X_n\}$ converges in distribution to $X$ if $\lim_{n \to \infty} F_{X_n}(x) = F_X(x)$ for all $x \in C(F_X)$, denoted by $X_n \xrightarrow{d} X$. $F_X$ is called the limiting distribution (or asymptotic distribution) of $\{X_n\}$.
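A minimal numerical sketch of this definition (not from the source; it assumes NumPy and SciPy are available): by the CLT, the standardized mean of Exponential(1) samples converges in distribution to $N(0,1)$, so the gap between its empirical CDF and the normal CDF should shrink as $n$ grows.

```python
# Hypothetical demo: CLT as convergence in distribution.
# The standardized mean of Exponential(1) samples has a CDF that approaches
# the N(0,1) CDF pointwise; we track the largest gap on a grid of points.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
grid = np.linspace(-3, 3, 61)        # continuity points of the limit CDF
gaps = {}
for n in (5, 50, 500):
    # Exponential(1): mu = 1, sigma = 1
    x = rng.exponential(scale=1.0, size=(20000, n))
    z = np.sqrt(n) * (x.mean(axis=1) - 1.0)      # standardized mean Z_n
    ecdf = (z[:, None] <= grid).mean(axis=0)     # empirical CDF of Z_n
    gaps[n] = float(np.abs(ecdf - norm.cdf(grid)).max())
print(gaps)
```

The largest CDF gap drops as $n$ increases, which is exactly the pointwise CDF convergence the definition asks for.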
Facts
Convergence in Probability implies Convergence in Distribution: if $X_n \xrightarrow{p} X$, then $X_n \xrightarrow{d} X$.
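A small illustration of this fact for a constant limit (a hypothetical example, not from the source): $X_n = c + Z/\sqrt{n}$ with $Z \sim N(0,1)$ converges in probability to $c$, and its CDF $F_{X_n}(x) = \Phi(\sqrt{n}(x - c))$ converges to the step CDF of the constant at every continuity point $x \neq c$.

```python
# Hypothetical illustration: X_n = c + Z/sqrt(n) converges in probability to
# the constant c = 1; its CDF F_{X_n}(x) = Phi(sqrt(n)(x - c)) converges to
# the step CDF of c at every continuity point x != c.
import numpy as np
from scipy.stats import norm

c = 1.0
cdf_vals = {x: [float(norm.cdf(np.sqrt(n) * (x - c))) for n in (10, 1000, 100000)]
            for x in (0.5, 1.5)}
print(cdf_vals)   # values at x < c head to 0; values at x > c head to 1
```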
Continuous Mapping Theorem
Definition
Continuous functions preserve convergence (in probability, Almost Surely, or in distribution) of a Sequence of random variables to its limit.
Consider a Sequence of random variables $\{X_n\}$ defined on the same Probability Space, and a Continuous Function $g$ on that space. Then,
- Convergence in Probability to a Constant: $X_n \xrightarrow{p} c \implies g(X_n) \xrightarrow{p} g(c)$
- Convergence in Probability: $X_n \xrightarrow{p} X \implies g(X_n) \xrightarrow{p} g(X)$
- Almost Sure Convergence: $X_n \xrightarrow{a.s.} X \implies g(X_n) \xrightarrow{a.s.} g(X)$
- Convergence in Distribution: $X_n \xrightarrow{d} X \implies g(X_n) \xrightarrow{d} g(X)$
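A hypothetical Monte Carlo check of the convergence-in-probability case (not from the source): $\bar{X}_n \xrightarrow{p} \mu$, so with the continuous map $g(x) = x^2$ we should see $P(|g(\bar{X}_n) - g(\mu)| > \epsilon)$ shrink toward zero.

```python
# Hypothetical check of the continuous-mapping theorem for convergence in
# probability: Xbar_n ->p mu implies g(Xbar_n) ->p g(mu) for continuous g.
import numpy as np

rng = np.random.default_rng(1)
g, mu, eps = np.square, 2.0, 0.5
probs = {}
for n in (10, 100, 1000):
    # sample means of Exponential(scale=2), whose mean is mu = 2
    means = rng.exponential(scale=mu, size=(20000, n)).mean(axis=1)
    # estimate P(|g(Xbar_n) - g(mu)| > eps), which should shrink to 0
    probs[n] = float(np.mean(np.abs(g(means) - g(mu)) > eps))
print(probs)
```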
Slutsky's Theorem
Definition
Let $\{X_n\}, \{A_n\}, \{B_n\}$ be Sequences of random variables with $X_n \xrightarrow{d} X$, $A_n \xrightarrow{p} a$, and $B_n \xrightarrow{p} b$, where $X$ is a Random Variable and $a, b$ are constants. Then $A_n + B_n X_n \xrightarrow{d} a + bX$.
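A classic application, sketched as a hypothetical simulation (not from the source): the t-statistic factors as $\sqrt{n}(\bar{X}_n - \mu)/S_n = B_n Z_n$ with $Z_n = \sqrt{n}(\bar{X}_n - \mu)/\sigma \xrightarrow{d} N(0,1)$ and $B_n = \sigma/S_n \xrightarrow{p} 1$, so by Slutsky the product converges in distribution to $N(0,1)$.

```python
# Hypothetical application of Slutsky's theorem: the t-statistic with the
# sample standard deviation S_n in place of sigma still converges in
# distribution to N(0,1), here checked for Exponential(1) data (mu = sigma = 1).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
n = 400
x = rng.exponential(scale=1.0, size=(20000, n))
t = np.sqrt(n) * (x.mean(axis=1) - 1.0) / x.std(axis=1, ddof=1)
grid = np.linspace(-3, 3, 61)
ecdf = (t[:, None] <= grid).mean(axis=0)
gap = float(np.abs(ecdf - norm.cdf(grid)).max())  # distance from N(0,1) CDF
print(gap)
```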
Let $\{P_n\}$ be a sequence of distributions and $P$ a target distribution. Then:
- $\mathrm{KL}(P_n \| P) \to 0 \implies \mathrm{JS}(P_n, P) \to 0$
- $\mathrm{JS}(P_n, P) \to 0 \iff \delta(P_n, P) \to 0$, where $\delta$ is the Total Variation Distance
- $\delta(P_n, P) \to 0 \implies W(P_n, P) \to 0$, where $W$ is the Wasserstein Distance
- $W(P_n, P) \to 0 \iff P_n \xrightarrow{d} P$, i.e. Convergence in Distribution of the sequence
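The chain is strict, which a hypothetical point-mass example makes concrete (not from the source; it assumes SciPy's `wasserstein_distance`): for $P_\theta = \delta_\theta$ and $P = \delta_0$, the Wasserstein distance is $|\theta| \to 0$, yet TV stays $1$ and JS stays $\log 2$ for every $\theta \neq 0$, so convergence in distribution does not pull the stronger divergences to zero.

```python
# Hypothetical illustration that the chain is strict: for point masses
# P_theta = delta_theta vs P = delta_0 on a grid, W -> 0 as theta -> 0,
# while TV stays 1 and JS stays log 2.
import numpy as np
from scipy.stats import wasserstein_distance

grid = np.round(np.arange(0.0, 1.001, 0.01), 2)

def point_mass(x):
    p = np.zeros_like(grid)
    p[int(np.argmin(np.abs(grid - x)))] = 1.0
    return p

def tv(p, q):
    return 0.5 * float(np.abs(p - q).sum())

def js(p, q):
    m = 0.5 * (p + q)
    kl = lambda a, b: float(np.sum(a[a > 0] * np.log(a[a > 0] / b[a > 0])))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

q = point_mass(0.0)
results = {th: (tv(point_mass(th), q), js(point_mass(th), q),
                float(wasserstein_distance(grid, grid, point_mass(th), q)))
           for th in (0.5, 0.1, 0.01)}
print(results)  # (TV, JS, W) per theta: TV and JS constant, W shrinking
```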