# Peeter Joot's (OLD) Blog.


## Application of the central limit theorem to a product of random vars

Posted by peeterjoot on February 1, 2013

Our midterm had a question asking what the central limit theorem says about a product of random variables. Say, $Y = X_1 X_2 \cdots X_N$, where the random variables $X_k$ have mean $\mu$ and variance $\sigma^2$. My answer was that the central limit theorem doesn’t apply, since it is a statement about a sum of independent and identically distributed random variables. I also stated the theorem and what it says about such summed random variables.

Wondering if this was really all the question required, I went looking to see if there was in fact some way to apply the central limit theorem to such a product, and found http://math.stackexchange.com/q/82133. The central limit theorem can be applied to the logarithm of such a product (provided all the random variables are strictly positive).

For example, if we write

\begin{aligned}Z = \ln Y = \sum_{k = 1}^N \ln X_k,\end{aligned} \hspace{\stretch{1}}(1.0.1)

now we have something that the central limit theorem can be applied to. It will be interesting to see if this is the answer that the midterm was looking for; it wasn’t obvious enough for me to think of at the time. In fact, it’s also not something that we can state a precise central limit theorem result for, because we don’t have enough information to find the mean and variance of the logarithms of the random variables $X_k$. For example, if the random variables are continuous, we have

\begin{aligned}\left\langle{{\ln X}}\right\rangle = \int \rho(X) \ln X \, dX.\end{aligned} \hspace{\stretch{1}}(1.0.2)

Conceivably, if we knew all the moments of $X$, we could expand the logarithm in a Taylor series. In fact, we need more than that. If we suppose that $0 < X < 2 \mu$, so that $\left\lvert {X/\mu - 1} \right\rvert < 1$, we can write

\begin{aligned}\ln X &= \ln \left( { \mu \left( { 1 + \frac{X - \mu}{\mu} } \right) } \right) \\ &= \ln \mu + \ln \left( { 1 + \left(\frac{X}{\mu} - 1\right) } \right) \\ &= \ln \mu + \sum_{k = 1}^{\infty} (-1)^{k+1} \frac{\left( {\frac{X}{\mu} -1} \right)^k}{k}.\end{aligned} \hspace{\stretch{1}}(1.0.3)

With such a bounding for the random variable $X$ we’d have

\begin{aligned}\left\langle{{\ln X}}\right\rangle = \ln \mu + \sum_{k = 1}^{\infty} \frac{(-1)^{k+1}}{k} \left\langle{{\left( {\frac{X}{\mu} -1} \right)^k}}\right\rangle.\end{aligned} \hspace{\stretch{1}}(1.0.4)

We need all the higher-order moments of $X/\mu - 1$ (or equivalently, all the moments of $X$), and can’t just assume that $\left\langle{{\ln X}}\right\rangle = \ln \mu$.
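As a quick numerical sanity check on this series (my own aside, not part of the midterm), consider a hypothetical distribution for which the moments are simple: $X$ uniform on $(0, 2\mu)$. Then $X/\mu - 1$ is uniform on $(-1, 1)$, so the odd central moments vanish and the even ones are $\left\langle{{(X/\mu - 1)^k}}\right\rangle = 1/(k+1)$. The partial sums of the series should converge to the directly integrated value $\left\langle{{\ln X}}\right\rangle = \ln(2\mu) - 1$, which is clearly not $\ln \mu$.

```python
import math

# Hypothetical example (not from the post): X uniform on (0, 2*mu), so that
# |X/mu - 1| <= 1.  The central moments are <(X/mu - 1)^k> = 1/(k + 1) for
# even k, and 0 for odd k.
mu = 3.0

# Partial sum of the series <ln X> = ln(mu) + sum_k (-1)^(k+1)/k <(X/mu - 1)^k>
series = math.log(mu)
for k in range(1, 20001):
    moment = 1.0 / (k + 1) if k % 2 == 0 else 0.0
    series += (-1) ** (k + 1) / k * moment

# Direct integration for comparison: (1/(2 mu)) int_0^{2 mu} ln x dx = ln(2 mu) - 1
exact = math.log(2 * mu) - 1

# The difference is just the truncation error of the (slowly converging) series.
print(series, exact)
```

The terms fall off only like $1/k^2$, so the convergence is slow, but the partial sum does settle on $\ln(2\mu) - 1$ rather than $\ln \mu$.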

Suppose instead that we just assume that it is possible to find the mean and variance of the logarithm of the random variables $X_k$, say

\begin{subequations}

\begin{aligned}\mu_{\mathrm{ln}} = \left\langle{{\ln X}}\right\rangle\end{aligned} \hspace{\stretch{1}}(1.0.5a)

\begin{aligned}\sigma_{\mathrm{ln}}^2 = \left\langle{{(\ln X)^2}}\right\rangle - \left\langle{{\ln X}}\right\rangle^2.\end{aligned} \hspace{\stretch{1}}(1.0.5b)

\end{subequations}

Now we can state that for large $N$ the random variable $Z$ has a distribution approximated by

\begin{aligned}\rho(Z) = \frac{1}{{\sigma_{\mathrm{ln}} \sqrt{2 \pi N}}} \exp\left( - \frac{ (Z - N \mu_{\mathrm{ln}})^2}{2 N \sigma_{\mathrm{ln}}^2} \right).\end{aligned} \hspace{\stretch{1}}(1.0.6)

Given that, we can say that the random variable $Y = X_1 X_2 \cdots X_N$ is the exponential of a random variable whose distribution is given approximately (for large $N$) by 1.0.6.
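This is easy to see in action with a small Monte Carlo sketch (again my own aside, with an arbitrarily chosen distribution): draw each $X_k$ uniform on $(0.5, 1.5)$, which is strictly positive, form $Z = \ln Y$ for many products, and compare the sample mean and variance of $Z$ against $N \mu_{\mathrm{ln}}$ and $N \sigma_{\mathrm{ln}}^2$, using the closed-form log-moments of the uniform distribution.

```python
import math
import random
import statistics

random.seed(1)

# Hypothetical setup (not from the post): each X_k ~ Uniform(a, b) with a > 0.
a, b = 0.5, 1.5
N = 50           # number of factors in the product Y = X_1 X_2 ... X_N
trials = 20000   # number of independent products sampled

# Z = ln Y = sum_k ln X_k for each sampled product
zs = [sum(math.log(random.uniform(a, b)) for _ in range(N))
      for _ in range(trials)]

# Closed-form log-moments for Uniform(a, b):
#   <ln X>     = (b ln b - a ln a)/(b - a) - 1
#   <(ln X)^2> = [x((ln x)^2 - 2 ln x + 2)] evaluated from a to b, over (b - a)
mu_ln = (b * math.log(b) - a * math.log(a)) / (b - a) - 1
F = lambda x: x * (math.log(x) ** 2 - 2 * math.log(x) + 2)
sigma2_ln = (F(b) - F(a)) / (b - a) - mu_ln ** 2

# Sample statistics of Z should match the CLT prediction for large N.
print(statistics.mean(zs), N * mu_ln)          # sample mean vs N mu_ln
print(statistics.variance(zs), N * sigma2_ln)  # sample variance vs N sigma_ln^2
```

Since $Z$ is approximately normal with mean $N \mu_{\mathrm{ln}}$ and variance $N \sigma_{\mathrm{ln}}^2$, the product $Y = e^Z$ is approximately lognormally distributed for large $N$.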

It will be interesting to see if this is the answer that we were asked to state. I’m guessing not. If it was, then a lot more cleverness than I had was expected.