17 Moment Generating Function, Characteristic Function and Probability Generating Function
17.1 Moment Generating Function
The moment generating function (mgf) \(\phi_X(t)\) (sometimes denoted \(M_X(t)\)) of the random variable \(X\) is defined for all values \(t\) by: \[\phi_X(t) = \text E[e^{tX}] = \begin{cases} \sum_x e^{tx}p(x) & \text{if $X$ is discrete} \\ \int_{-\infty}^\infty e^{tx}f(x)\,dx & \text{if $X$ is continuous}\end{cases}\] We call \(\phi_X(t)\) the moment generating function because all of the moments of \(X\) can be obtained by successively differentiating \(\phi_X(t)\) and evaluating at \(t = 0\): \[\text E[X^n] = \phi^{(n)}_X(0) \qquad \text{for } n \ge 1\] The mgf of a random vector \(\textbf X =(X_1, ..., X_n)\) is defined as: \[\phi_{\textbf X}(\textbf t) = \text E[e^{\textbf t^T\textbf X}]\]
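As a small sanity check of the moment formula, the sketch below (Python with SymPy, not part of the original notes) takes the well-known mgf of an Exponential(\(\lambda\)) variable, \(\phi_X(t) = \lambda/(\lambda - t)\) for \(t < \lambda\), and recovers the first three moments by differentiating at \(t = 0\).

```python
import sympy as sp

t = sp.symbols('t', real=True)
lam = sp.symbols('lam', positive=True)

# Known mgf of an Exponential(lam) variable, valid for t < lam: phi_X(t) = lam / (lam - t)
phi = lam / (lam - t)

# E[X^n] = phi^(n)(0): differentiate n times and evaluate at t = 0
for n in range(1, 4):
    moment = sp.simplify(sp.diff(phi, t, n).subs(t, 0))
    print(f"E[X^{n}] = {moment}")   # 1/lam, 2/lam**2, 6/lam**3, i.e. n!/lam**n
```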
Proposition: \(X_1, ..., X_n\) are independent random variables if and only if \[\phi_{X_1, ..., X_n}(t_1,...,t_n) = \phi_{X_1}(t_1) \cdots \phi_{X_n}(t_n)\] where the marginal mgf is obtained from the joint one by \(\phi_{X_i}(t_i) = \phi_{\textbf X}(t_i e_i)\), with \(e_i\in \mathbb R^n\) the unit vector along the \(i\)-th axis.
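A quick numerical illustration of this factorization (NumPy, with an arbitrary illustrative choice of two independent variables): the joint mgf estimated by a sample mean should agree with the product of the marginal estimates up to Monte Carlo error.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Two independent variables (an arbitrary illustrative choice): N(0, 1) and Exponential(1)
x1 = rng.standard_normal(n)
x2 = rng.exponential(1.0, n)

t1, t2 = 0.3, 0.4   # evaluation point; t2 < 1 keeps the exponential mgf finite

joint   = np.mean(np.exp(t1 * x1 + t2 * x2))                    # estimate of phi_{X1,X2}(t1, t2)
product = np.mean(np.exp(t1 * x1)) * np.mean(np.exp(t2 * x2))   # phi_{X1}(t1) * phi_{X2}(t2)
print(joint, product)   # agree up to Monte Carlo error, as the proposition predicts
```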
Proposition: If \(X\) and \(Y\) are two random variables and \(\phi_X(t) = \phi_Y(t)\) for all values of \(t\), then \(F_X(x) = F_Y(x)\); that is, the mgf (when it exists) uniquely determines the distribution.
However, the mgf might not exist. If some \(n\)-th moment of \(X\) is not finite, then the mgf does not exist; the converse is not true (for example, the log-normal distribution has all moments finite but no mgf).

17.2 Characteristic Function

The characteristic function \(\varphi_X(t)\) of the random variable \(X\) is defined for all values \(t\) by: \[\varphi_X(t) = \text E[e^{itX}]\] where \(i^2 = -1\). The characteristic function of a random vector \(\textbf X =(X_1, ..., X_n)\) is defined analogously as: \[\varphi_{\textbf X}(\textbf t) = \text E[e^{i\textbf t^T\textbf X}]\] The two propositions above also hold for the characteristic function. Unlike the mgf, the characteristic function always exists for any random variable \(X\). This follows from the fact that \(|\varphi_X(t)| \le \text E[|e^{itX}|] = 1\).
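The sketch below (NumPy, again an illustration rather than anything from the original text) estimates the empirical characteristic function of a standard normal sample and compares it with the known closed form \(e^{-t^2/2}\); it also checks that the modulus never exceeds 1.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(1_000_000)

ts = np.linspace(-3.0, 3.0, 7)
# Empirical characteristic function: E[e^{itX}] estimated by a sample mean
emp = np.array([np.mean(np.exp(1j * t * x)) for t in ts])
exact = np.exp(-ts**2 / 2)                 # characteristic function of N(0, 1)

print(np.max(np.abs(emp - exact)))         # small Monte Carlo error
print(np.all(np.abs(emp) <= 1.0))          # |phi_X(t)| <= 1 always holds
```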
Proposition: \[\text E[X^n] = \frac{\varphi^{(n)}_X(0)}{i^n}\]

Proposition: \(\varphi_{aX+b}(t) = e^{ibt} \varphi_X(at)\)

17.3 Probability Generating Function

The probability generating function (pgf) of a discrete random variable supported on the non-negative integers is a power series representation (i.e. a generating function) of the pmf of the random variable. It is defined as \[G_X(z) = \text E[z^X] = \sum_{x=0}^\infty z^x p_X(x) \] The probability mass function of \(X\) is recovered by taking derivatives of \(G_X(z)\): \[p_X(k) = P\{X = k\} = \frac{G_X^{(k)}(0)}{k!}\]

Proposition: \[\text E[X(X-1)\cdots(X-k+1)] = G_X^{(k)} (1)\]

Proposition: Suppose that \(X_1,... , X_n\) are independent random variables, and let \(Y = X_1 + \cdots + X_n\). Then \[G_Y(z) = \prod_{i=1}^n G_{X_i}(z)\] The pgf uniquely determines the set of probabilities \(\{p_X(k)\}\).
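To make these pgf formulas concrete, here is a SymPy sketch (an illustrative choice, not from the original notes) using the Poisson(\(\lambda\)) pgf \(G_X(z) = e^{\lambda(z-1)}\): it recovers the pmf from derivatives at 0, evaluates a factorial moment at 1, and shows that the product of two independent Poisson pgfs is again a Poisson pgf.

```python
import sympy as sp

z = sp.symbols('z')
lam = sp.symbols('lam', positive=True)

# pgf of a Poisson(lam) variable: G_X(z) = E[z^X] = exp(lam * (z - 1))
G = sp.exp(lam * (z - 1))

# Recover the pmf: P{X = k} = G^(k)(0) / k!
for k in range(4):
    p_k = sp.simplify(sp.diff(G, z, k).subs(z, 0) / sp.factorial(k))
    print(f"P(X = {k}) = {p_k}")            # lam**k * exp(-lam) / k!

# Factorial moment: E[X(X-1)] = G''(1) = lam**2 for the Poisson
print(sp.diff(G, z, 2).subs(z, 1))

# Sum of independent variables: the product of the two pgfs has exponent
# (lam1 + lam2)*(z - 1), i.e. it is the pgf of a Poisson(lam1 + lam2)
lam1, lam2 = sp.symbols('lam1 lam2', positive=True)
print(sp.simplify(sp.exp(lam1 * (z - 1)) * sp.exp(lam2 * (z - 1))))
```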