Definition

Convergence in Probability

Let $\{X_n\}$ be a sequence of random vectors and $X$ be a random vector. Then $X_n$ converges in probability to $X$ if, for every $\epsilon > 0$, $\lim_{n \to \infty} P(\|X_n - X\| \geq \epsilon) = 0$, denoted by $X_n \xrightarrow{p} X$.

Let $\{X_n\}$ be a sequence of random vectors and $X$ be a random vector. Then $X_n \xrightarrow{p} X$ if and only if each component converges, i.e. $X_{nj} \xrightarrow{p} X_j$ for every $j = 1, \dots, p$.
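As a quick numerical sanity check (a sketch added to these notes, not part of the original statements), the snippet below estimates $P(|\bar{X}_n - \mu| \geq \epsilon)$ by Monte Carlo for Bernoulli sample means; Bernoulli($0.3$), $\epsilon = 0.05$, and the values of $n$ are arbitrary illustration choices.

```python
import numpy as np

# Estimate P(|mean - mu| >= eps) by Monte Carlo for growing n; the
# probability should shrink toward 0 as n grows.
rng = np.random.default_rng(0)
mu, eps, reps = 0.3, 0.05, 10_000

for n in [10, 100, 1_000, 10_000]:
    means = rng.binomial(n, mu, size=reps) / n   # reps sample means of size n
    prob = np.mean(np.abs(means - mu) >= eps)    # empirical tail probability
    print(f"n={n:>6}: P(|mean - mu| >= {eps}) ~ {prob:.4f}")
```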

Convergence in Distribution

Let $\{X_n\}$ be a sequence of random vectors with CDFs $F_{X_n}$, let $X$ be a random vector with CDF $F_X$, and let $C(F_X)$ be the set of all points at which $F_X$ is continuous. Then $X_n$ converges in distribution to $X$ if $\lim_{n \to \infty} F_{X_n}(x) = F_X(x)$ for all $x \in C(F_X)$, denoted by $X_n \xrightarrow{d} X$. $F_X$ is also called the limiting distribution, or asymptotic distribution, of $X_n$.
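For illustration (added here, not in the original notes), the sketch below simulates the classical example $n(1 - \max_i U_i) \xrightarrow{d} \text{Exponential}(1)$ for i.i.d. $U_i \sim \text{Uniform}(0, 1)$, comparing the empirical CDF with the limit $1 - e^{-t}$ at a few points; $n$ and the evaluation points are arbitrary.

```python
import numpy as np

# F_{X_n}(t) = 1 - (1 - t/n)^n -> 1 - e^{-t}, so X_n = n*(1 - max U_i)
# converges in distribution to Exponential(1).
rng = np.random.default_rng(1)
n, reps = 200, 20_000

u = rng.uniform(size=(reps, n))
x = n * (1.0 - u.max(axis=1))                  # reps draws of X_n

for t in [0.5, 1.0, 2.0]:
    print(f"t={t}: F_n(t) ~ {np.mean(x <= t):.4f}, limit = {1 - np.exp(-t):.4f}")
```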

Continuous Mapping Theorem

Let $g$ be a continuous function. Then $X_n \xrightarrow{p} X$ implies $g(X_n) \xrightarrow{p} g(X)$, and $X_n \xrightarrow{d} X$ implies $g(X_n) \xrightarrow{d} g(X)$.
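A minimal sketch of the in-probability half of the theorem (an added illustration; the Exponential(1) data and $g(x) = x^2$ are arbitrary choices): since $\bar{X}_n \xrightarrow{p} 1$, continuity of $g$ gives $\bar{X}_n^2 \xrightarrow{p} 1$.

```python
import numpy as np

# Continuous mapping: X_n ->p mu implies g(X_n) ->p g(mu) for continuous g.
rng = np.random.default_rng(2)
mu = 1.0
g = lambda x: x**2

for n in [10, 100, 10_000]:
    xbar = rng.exponential(scale=mu, size=n).mean()   # sample mean ->p mu
    print(f"n={n:>6}: g(xbar) = {g(xbar):.4f}   (g(mu) = {g(mu):.4f})")
```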

MGF Technique

Let $\{X_n\}$ be a sequence of random vectors with MGFs $M_{X_n}(t)$ and $X$ be a random vector with MGF $M_X(t)$. If $\lim_{n \to \infty} M_{X_n}(t) = M_X(t)$ for all $t$ in an open neighborhood of $0$, then $X_n \xrightarrow{d} X$.
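A classical worked example of the technique (added for illustration): if $X_n \sim \text{Binomial}(n, \lambda/n)$, then

$$M_{X_n}(t) = \left(1 + \frac{\lambda(e^t - 1)}{n}\right)^n \longrightarrow e^{\lambda(e^t - 1)} \quad \text{as } n \to \infty,$$

which is the MGF of a $\text{Poisson}(\lambda)$ distribution, so $X_n \xrightarrow{d} \text{Poisson}(\lambda)$.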

Central Limit Theorem

Let $\{X_n\}$ be an i.i.d. sequence of random vectors from a distribution with mean $\mu$ and variance-covariance matrix $\Sigma$. Then $\sqrt{n}(\bar{X}_n - \mu) \xrightarrow{d} N(0, \Sigma)$, where $\bar{X}_n = \frac{1}{n} \sum_{i=1}^{n} X_i$.

In words: for i.i.d. random variables with finite variance, the standardized sample mean converges in distribution to a normal distribution, whatever the underlying distribution.
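A minimal simulation sketch of the one-dimensional case (an added illustration, assuming Uniform(0, 1) data so that $\mu = 1/2$ and $\sigma^2 = 1/12$):

```python
import numpy as np

# CLT check: sqrt(n)*(mean - mu) should be approximately N(0, sigma^2).
rng = np.random.default_rng(3)
n, reps = 200, 20_000
mu, var = 0.5, 1.0 / 12.0

z = np.sqrt(n) * (rng.uniform(size=(reps, n)).mean(axis=1) - mu)
print(f"sample mean of z:     {z.mean():+.4f}   (theory: 0)")
print(f"sample variance of z: {z.var():.4f}   (theory: {var:.4f})")
```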

Delta Method

Let $\{X_n\}$ be a sequence of $p$-dimensional random vectors with $\sqrt{n}(X_n - \mu) \xrightarrow{d} N_p(0, \Sigma)$, let $g: \mathbb{R}^p \to \mathbb{R}^k$ have continuous partial derivatives at $\mu$, and let $B = \partial g(\mu) / \partial \mu$ be the corresponding $k \times p$ Jacobian matrix with $B \neq 0$. Then $\sqrt{n}\left(g(X_n) - g(\mu)\right) \xrightarrow{d} N_k(0, B \Sigma B^\top)$.
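A minimal one-dimensional sketch (an added illustration, assuming Exponential data with $\mu = 2$, $\sigma^2 = 4$ and the arbitrary choice $g(x) = x^2$): the theorem predicts $\sqrt{n}(g(\bar{X}_n) - g(\mu)) \xrightarrow{d} N(0, g'(\mu)^2 \sigma^2)$.

```python
import numpy as np

# Delta method check: the variance of sqrt(n)*(g(mean) - g(mu)) should be
# close to g'(mu)^2 * sigma^2 = (2*2)^2 * 4 = 64.
rng = np.random.default_rng(4)
n, reps = 500, 10_000
mu, sigma2 = 2.0, 4.0
g = lambda x: x**2
dg = lambda x: 2 * x

xbar = rng.exponential(scale=mu, size=(reps, n)).mean(axis=1)
z = np.sqrt(n) * (g(xbar) - g(mu))
print(f"sample variance: {z.var():.2f}   (theory: {dg(mu)**2 * sigma2:.2f})")
```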

Facts

Let $X_n \xrightarrow{p} X$; then $X_n \xrightarrow{d} X$. That is, convergence in probability implies convergence in distribution.

Let $\{X_n\}$ be a sequence of $p$-dimensional random vectors with $X_n \xrightarrow{d} X$, let $A$ be an $m \times p$ constant matrix, and let $b$ be an $m$-dimensional constant vector. Then $AX_n + b \xrightarrow{d} AX + b$.
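As a sanity check on the second fact (an added sketch; the two-dimensional Uniform data, $A$, and $b$ are arbitrary choices): by the Central Limit Theorem, $Z_n = \sqrt{n}(\bar{X}_n - \mu) \xrightarrow{d} N(0, \Sigma)$ with $\Sigma = I/12$, so $AZ_n + b$ should have mean near $b$ and covariance near $A \Sigma A^\top$.

```python
import numpy as np

# If X_n ->d X, then A X_n + b ->d A X + b. Here X_n are CLT-standardized
# means of 2-d Uniform(0,1) vectors, so the limit is N(b, A Sigma A^T).
rng = np.random.default_rng(5)
n, reps = 200, 10_000
A = np.array([[1.0, 2.0], [0.0, 1.0]])
b = np.array([1.0, -1.0])
Sigma = np.eye(2) / 12.0                       # cov of a 2-d Uniform(0,1) vector

x = rng.uniform(size=(reps, n, 2))             # reps datasets of n 2-d vectors
z = np.sqrt(n) * (x.mean(axis=1) - 0.5)        # Z_n, one draw per dataset
y = z @ A.T + b                                # affine transform of each draw
print("sample mean:", y.mean(axis=0).round(3), " (theory:", b, ")")
print("sample cov:\n", np.cov(y.T).round(4))
print("A Sigma A^T:\n", (A @ Sigma @ A.T).round(4))
```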