# Generating function example

Okay, so in the last post I used generating functions to compute the expected number of coin flips to get ${r}$ consecutive heads – and the computation was very ugly. In fact there are much easier ways to solve that problem. So here is an example of where generating functions make life easier: Let ${X_n\stackrel{iid }{\sim}\mathrm{Ber}(p)}$ and ${N\sim\mathrm{Pois}(\lambda)}$, with ${N}$ independent of the ${X_n}$. Define ${S_n=\sum_{i=1}^n X_i}$. What is the distribution of ${S_N}$?

The direct computation proceeds as follows:

$\displaystyle \begin{aligned} \mathbb P(S_N=k) &= \sum_{n=0}^\infty \mathbb P(S_N=k\mid N=n)\,\mathbb P(N=n)\\ &= \sum_{n=k}^\infty \mathbb P(S_n=k)\,\mathbb P(N=n)\\ &= \sum_{n=k}^\infty \binom nk p^k(1-p)^{n-k}\frac{e^{-\lambda}\lambda^n}{n!}\\ &= \frac{e^{-\lambda}p^k}{k!} \sum_{n=k}^\infty \frac{(1-p)^{n-k}\lambda^n}{(n-k)!}\\ &= \frac{e^{-\lambda}p^k}{k!} \sum_{n=0}^\infty \frac{(1-p)^n\lambda^{n+k}}{n!}\\ &= \frac{e^{-\lambda}(\lambda p)^k}{k!} \sum_{n=0}^\infty \frac{(\lambda(1-p))^n}{n!}\\ &= \frac{e^{-\lambda p}(\lambda p)^k}{k!}, \end{aligned}$

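Before trusting a computation like this, it never hurts to check it numerically. Here is a quick sketch (the values of $\lambda$ and $p$ are illustrative, not from the problem) comparing a truncation of the series in the third line above against the claimed closed form:

```python
import math

lam, p = 2.0, 0.3  # illustrative parameter values

for k in range(6):
    # Truncate the series sum_{n>=k} C(n,k) p^k (1-p)^{n-k} e^{-lam} lam^n / n!
    series = sum(
        math.comb(n, k) * p**k * (1 - p) ** (n - k)
        * math.exp(-lam) * lam**n / math.factorial(n)
        for n in range(k, 80)
    )
    # The claimed closed form: the Pois(lam * p) pmf at k.
    closed = math.exp(-lam * p) * (lam * p) ** k / math.factorial(k)
    assert abs(series - closed) < 1e-12
```

The tail of the series decays factorially, so truncating at $n = 80$ is far more than enough for agreement to machine precision.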
so that ${S_N\sim\mathrm{Pois}(\lambda p)}$. However, if we observe that the generating functions of ${X_1}$ and ${N}$ are ${P(s) = 1 - p + ps}$ and ${P_N(s) = e^{-\lambda(1-s)}}$, respectively, then (by the composition rule for random sums, ${P_{S_N} = P_N\circ P}$, which follows by conditioning on ${N}$) the generating function of ${S_N}$ is

$\displaystyle P_{S_N}(s) = P_N(P(s)) = e^{-\lambda(1- (1-p + ps))}=e^{-\lambda p(1-s)},$

which is the generating function of a ${\mathrm{Pois}(\lambda p)}$ random variable. Wasn’t that much easier?
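If you want to convince yourself empirically as well, both the composition identity and the resulting distribution are easy to check in a few lines of Python (again, the values of $\lambda$ and $p$ are illustrative):

```python
import math
import random

lam, p = 2.0, 0.3  # illustrative parameter values

# 1. Check the pgf identity: P_N(P(s)) should equal the Pois(lam*p) pgf.
for s in [0.0, 0.25, 0.5, 0.9, 1.0]:
    pgf_ber = 1 - p + p * s                    # P(s) for Ber(p)
    composed = math.exp(-lam * (1 - pgf_ber))  # P_N(P(s))
    target = math.exp(-lam * p * (1 - s))      # pgf of Pois(lam * p)
    assert abs(composed - target) < 1e-12

# 2. Monte Carlo: simulate S_N and compare its empirical pmf to Pois(lam*p).
random.seed(0)

def sample_poisson(rate):
    """Knuth's method: count uniforms until their running product falls below e^{-rate}."""
    threshold = math.exp(-rate)
    k, prod = 0, random.random()
    while prod > threshold:
        k += 1
        prod *= random.random()
    return k

trials = 100_000
samples = [
    sum(random.random() < p for _ in range(sample_poisson(lam)))  # S_N draw
    for _ in range(trials)
]
for k in range(4):
    empirical = samples.count(k) / trials
    theoretical = math.exp(-lam * p) * (lam * p) ** k / math.factorial(k)
    assert abs(empirical - theoretical) < 0.01
```

With $\lambda p = 0.6$ the standard error of each empirical pmf value is on the order of $10^{-3}$, so the $0.01$ tolerance is comfortably wide.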