2 minute read
In this section, we will understand the moments of a probability distribution and the Moment Generating Function (MGF).
Moments:
They are statistical measures that describe the characteristics of a probability distribution,
such as its central tendency, spread/variability, and asymmetry.
Note: We will discuss ‘raw’ and ‘central’ moments.
Important moments:
- First raw moment: mean (central tendency)
- Second central moment: variance (spread/variability)
- Third standardized moment: skewness (asymmetry)
- Fourth standardized moment: kurtosis (heaviness of tails)
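The raw and central moments above can be estimated directly from data. A minimal sketch (the sample here is drawn from a hypothetical exponential distribution purely for illustration):

```python
import numpy as np

# Hypothetical sample, used only to illustrate moment estimation
rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=100_000)

# Raw moments: m_n = E[X^n]
m1 = np.mean(x)       # first raw moment (the mean)
m2 = np.mean(x**2)    # second raw moment

# Central moments: mu_n = E[(X - E[X])^n]
mu2 = np.mean((x - m1)**2)   # second central moment (variance)
mu3 = np.mean((x - m1)**3)   # third central moment (captures asymmetry)

# Raw and central moments are related: mu_2 = m_2 - m_1^2
print(np.isclose(mu2, m2 - m1**2))
```

The last line checks the standard identity connecting the second central moment to the first two raw moments.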
Moment Generating Function:
It is a function that simplifies computation of moments, such as, the mean, variance, skewness, and kurtosis,
by providing a compact way to derive any moment of a random variable through differentiation.
Not all probability distributions have MGFs; sometimes the defining integral (or sum) does not converge, and the MGF does not exist.
Since \( M_X(t) = E[e^{tX}] \), expanding \(e^{tX}\) as a Taylor series gives \( M_X(t) = E\left[1 + tX + \frac{t^2X^2}{2!} + \frac{t^3X^3}{3!} + \cdots \right] \).
Say \( E[X^n] = m_n \), where \(m_n\) is the \(n^{th}\) moment of the random variable; then:
\[ M_X(t) = 1 + \frac{t}{1!}m_1 + \frac{t^2}{2!}m_2 + \frac{t^3}{3!}m_3 + \cdots + \frac{t^n}{n!}m_n + \cdots \]Important: Differentiating \(M_X(t)\) with respect to \(t\) \(i\) times and setting \(t = 0\) gives us the \(i^{th}\) moment of the random variable.
\[ \frac{dM_X(t)}{dt} = M'_X(t) = 0 + m_1 + \frac{2t}{2!}m_2 + \frac{3t^2}{3!}m_3 + \cdots + \frac{nt^{n-1}}{n!}m_n + \cdots \]Setting \(t=0\) gives the first moment:
\[ \left. \frac{dM_X(t)}{dt} \right|_{t=0} = M'_X(0) = m_1 \]Similarly, the second moment = \(M''_X(0) = m_2 = E[X^2]\)
And, the \(n^{th}\) derivative gives the \(n^{th}\) moment: \(M^{(n)}_X(0) = m_n = E[X^n]\)
e.g., \( Var[X] = E[X^2] - (E[X])^2 = M''_X(0) - (M'_X(0))^2 \)
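The differentiate-and-evaluate recipe above can be sketched symbolically with sympy. As a concrete example (my choice, not from the text), take the standard closed-form MGF of an Exponential(\(\lambda\)) distribution, \(M_X(t) = \frac{\lambda}{\lambda - t}\):

```python
import sympy as sp

t = sp.symbols('t')
lam = sp.symbols('lambda', positive=True)

# MGF of an Exponential(rate = lambda) random variable (standard closed form)
M = lam / (lam - t)

# n-th moment = n-th derivative of the MGF, evaluated at t = 0
m1 = sp.diff(M, t, 1).subs(t, 0)   # E[X]   = 1/lambda
m2 = sp.diff(M, t, 2).subs(t, 0)   # E[X^2] = 2/lambda^2

# Var[X] = M''(0) - (M'(0))^2, which should simplify to 1/lambda^2
var = sp.simplify(m2 - m1**2)
print(m1, m2, var)
```

No integration is performed anywhere: every moment comes out of differentiation alone, as the note above says.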
Note: If we know the MGF of a random variable, then we do NOT need to do integration or summation to find the moments.
Continuous:
\[ M_X(t) = E[e^{tX}] = \int_{-\infty}^{\infty} e^{tx} f_X(x)\,dx \]
Discrete:
\[ M_X(t) = E[e^{tX}] = \sum_{i=1}^{\infty} e^{tx_i} P(X=x_i) \]End of Section