Moment Generating Function

In this section, we will study the moments of a probability distribution and the moment generating function (MGF).


📘

Moments:
They are statistical measures that describe the characteristics of a probability distribution, such as its central tendency, spread/variability, and asymmetry.
Note: We will discuss ‘raw’ and ‘central’ moments.

Important moments:

  1. Mean : Central tendency
  2. Variance : Spread/variability
  3. Skewness : Asymmetry
  4. Kurtosis : Tailedness
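As a quick illustrative sketch (the dataset is made up for the example, and population rather than sample formulas are used), these four moments can be computed directly:

```python
data = [1, 2, 3, 4, 5]  # hypothetical example data
n = len(data)

mean = sum(data) / n                          # 1st raw moment: central tendency
m2 = sum((x - mean) ** 2 for x in data) / n   # 2nd central moment: variance (spread)
m3 = sum((x - mean) ** 3 for x in data) / n   # 3rd central moment
m4 = sum((x - mean) ** 4 for x in data) / n   # 4th central moment

skewness = m3 / m2 ** 1.5   # standardised 3rd moment: asymmetry
kurtosis = m4 / m2 ** 2     # standardised 4th moment: tailedness

print(mean, m2, skewness, kurtosis)  # 3.0 2.0 0.0 and kurtosis ≈ 1.7
```

The symmetric dataset gives zero skewness, as expected.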


📘

Moment Generating Function:
It is a function that simplifies the computation of moments such as the mean, variance, skewness, and kurtosis, by providing a compact way to derive any moment of a random variable through differentiation.

  • Provides an alternative to PDFs and CDFs of random variables.
  • A powerful property of the MGF is that its \(n^{th}\) derivative, evaluated at \(t = 0\), equals the \(n^{th}\) moment of the random variable.
The MGF of a random variable \(X\) is the expected value of \(e^{tX}\):

\[ M_X(t) = E\left[e^{tX}\right] \]

Not all probability distributions have MGFs: sometimes the defining integral (or sum) does not converge, and the MGF does not exist.

\[ \begin{aligned} e^x &= 1 + \frac{x}{1!} + \frac{x^2}{2!} + \frac{x^3}{3!} + \cdots + \frac{x^n}{n!} + \cdots \\ e^{tX} &= 1 + \frac{tX}{1!} + \frac{(tX)^2}{2!} + \frac{(tX)^3}{3!} + \cdots + \frac{(tX)^n}{n!} + \cdots \end{aligned} \]
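The expansion of \(e^x\) is easy to sanity-check numerically; as a small sketch (the choice of \(x = 1.5\) and 20 terms is arbitrary), a truncated partial sum already matches `math.exp` closely:

```python
import math

def exp_series(x, n_terms):
    """Partial sum of the Taylor series 1 + x/1! + x^2/2! + ..."""
    return sum(x ** k / math.factorial(k) for k in range(n_terms))

print(exp_series(1.5, 20), math.exp(1.5))  # the two values agree to many digits
```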

Since \(M_X(t) = E[e^{tX}]\) and expectation is linear, taking the expected value of the series term by term gives:

\[ \begin{aligned} M_X(t) &= E[e^{tX}] \\ &= 1 + \frac{t}{1!}E[X] + \frac{t^2}{2!}E[X^2] + \frac{t^3}{3!}E[X^3] + \cdots + \frac{t^n}{n!}E[X^n] + \cdots \\ \end{aligned} \]

Say, \( E[X^n] = m_n \), where \(m_n\) is the \(n^{th}\) moment of the random variable, then:

\[ M_X(t) = 1 + \frac{t}{1!}m_1 + \frac{t^2}{2!}m_2 + \frac{t^3}{3!}m_3 + \cdots + \frac{t^n}{n!}m_n + \cdots \]

Important: Differentiating \(M_X(t)\) with respect to \(t\) a total of \(i\) times and setting \(t = 0\) gives us the \(i^{th}\) moment of the random variable.

\[ \frac{dM_X(t)}{dt} = M'_X(t) = 0 + m_1 + \frac{2t}{2!}m_2 + \frac{3t^2}{3!}m_3 + \cdots + \frac{nt^{n-1}}{n!}m_n + \cdots \]

Set \(t = 0\) to get the first moment:

\[ M'_X(0) = m_1 = E[X] \]

Similarly, the second moment is \(M''_X(0) = m_2 = E[X^2]\).
In general, the \(n^{th}\) derivative evaluated at \(t = 0\) gives \(M^{(n)}_X(0) = m_n = E[X^n]\):

\[ E[X^n] = \frac{d^nM_X(t)}{dt^n} \bigg|_{t=0} \]

e.g., \( \mathrm{Var}[X] = M''_X(0) - \left(M'_X(0)\right)^2 \)
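As a sketch of this recipe (the fair six-sided die and the finite-difference step size are assumptions for the example), the first two moments and the variance can be recovered by differentiating the MGF numerically:

```python
import math

# MGF of a fair six-sided die: M(t) = (1/6) * sum of e^{tx} for x = 1..6
def mgf(t):
    return sum(math.exp(t * x) for x in range(1, 7)) / 6

h = 1e-4  # finite-difference step size

m1 = (mgf(h) - mgf(-h)) / (2 * h)              # central difference ~ M'(0) = E[X]
m2 = (mgf(h) - 2 * mgf(0) + mgf(-h)) / h ** 2  # ~ M''(0) = E[X^2]
variance = m2 - m1 ** 2                        # Var[X] = M''(0) - (M'(0))^2

print(round(m1, 4), round(variance, 4))  # ≈ 3.5 and ≈ 2.9167 (= 35/12)
```

In practice one would differentiate a closed-form MGF analytically; the finite differences here just make the "differentiate, then set \(t = 0\)" recipe concrete.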

Note: If we know the MGF of a random variable, we do NOT need to perform any integration or summation to find its moments.
Continuous:

\[ M_X(t) = E[e^{tX}] = \int_{-\infty}^{\infty} e^{tx} f_X(x) dx \]
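As a sketch (the Exponential distribution with rate \(\lambda = 2\), the evaluation point \(t = 1\), and the integration grid are all assumptions for the example), this integral can be approximated numerically and compared with the known closed form \(\lambda/(\lambda - t)\), valid for \(t < \lambda\):

```python
import math

lam, t = 2.0, 1.0  # Exponential(rate=2); its MGF exists only for t < lam

def integrand(x):
    # e^{tx} * f_X(x), with f_X the Exponential(lam) density on [0, inf)
    return math.exp(t * x) * lam * math.exp(-lam * x)

# Trapezoidal rule on [0, 50]; the integrand decays like e^{-x},
# so the truncated tail is negligible.
a, b, n = 0.0, 50.0, 200_000
h = (b - a) / n
total = 0.5 * (integrand(a) + integrand(b))
total += sum(integrand(a + i * h) for i in range(1, n))
mgf_numeric = total * h

mgf_exact = lam / (lam - t)  # closed form: 2 / (2 - 1) = 2
print(mgf_numeric, mgf_exact)
```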

Discrete:

\[ M_X(t) = E[e^{tX}] = \sum_{x} e^{tx} \, P(X=x) \]
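Similarly for the discrete case; as a sketch (the Poisson(\(\lambda = 3\)) distribution, the point \(t = 0.5\), and the truncation at 60 terms are assumptions for the example), the sum can be truncated and checked against the known closed form \(e^{\lambda(e^t - 1)}\):

```python
import math

lam, t = 3.0, 0.5  # Poisson(lam); its MGF has the closed form exp(lam * (e^t - 1))

# Truncated sum over the support {0, 1, 2, ...}; the terms decay
# factorially, so 60 terms are far more than enough here.
mgf_numeric = sum(
    math.exp(t * k) * math.exp(-lam) * lam ** k / math.factorial(k)
    for k in range(60)
)

mgf_exact = math.exp(lam * (math.exp(t) - 1))
print(mgf_numeric, mgf_exact)  # the two values agree to many digits
```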



End of Section