## PHY452H1S Basic Statistical Mechanics. Problem Set 1: Binomial distributions

Posted by peeterjoot on January 20, 2013


## Disclaimer

This is an ungraded set of answers to the problems posed.

## Question: Limiting form of the binomial distribution

Starting from the simple case of the binomial distribution for $N$ tosses of a fair coin, with $X$ the difference between the number of heads and tails,

$$P_N(X) = \frac{1}{2^N} \binom{N}{\frac{N+X}{2}} = \frac{N!}{2^N \left(\frac{N+X}{2}\right)! \left(\frac{N-X}{2}\right)!},$$

derive the Gaussian distribution which results when $N \gg 1$ and $\left\lvert X \right\rvert \ll N$.

## Answer

We’ll work with the logarithms of $P_N(X)$

$$\ln P_N(X) = -N \ln 2 + \ln N! - \ln \left(\frac{N+X}{2}\right)! - \ln \left(\frac{N-X}{2}\right)!.$$

Note that the logarithm of the Stirling approximation takes the form

$$\ln n! \approx \frac{1}{2} \ln 2 \pi + \left( n + \frac{1}{2} \right) \ln n - n.$$

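A quick numerical aside (not in the original post): the quality of this logarithmic Stirling form is easy to check in Python against $\ln n!$ computed exactly via the log-gamma function:

```python
import math

def ln_factorial_stirling(n):
    """Stirling approximation: ln n! ~ (1/2) ln 2pi + (n + 1/2) ln n - n."""
    return 0.5 * math.log(2 * math.pi) + (n + 0.5) * math.log(n) - n

for n in (10, 100, 1000):
    exact = math.lgamma(n + 1)  # lgamma(n + 1) = ln(n!)
    print(n, exact, ln_factorial_stirling(n), exact - ln_factorial_stirling(n))
```

The absolute error falls off like $1/(12 n)$, so the approximation is already quite good for the modest $n$ values above.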
Using this we have

$$\ln \left(\frac{N \pm X}{2}\right)! \approx \frac{1}{2} \ln 2 \pi + \frac{N \pm X + 1}{2} \left( \ln \frac{N}{2} + \ln\left(1 \pm \frac{X}{N}\right) \right) - \frac{N \pm X}{2}.$$

Adding the logs of the two half factorials, we have

$$\ln \left(\frac{N+X}{2}\right)! + \ln \left(\frac{N-X}{2}\right)! \approx \ln 2 \pi - N + (N+1) \ln \frac{N}{2} + \frac{N+X+1}{2} \ln\left(1 + \frac{X}{N}\right) + \frac{N-X+1}{2} \ln\left(1 - \frac{X}{N}\right).$$

Recall that we can expand the log around $1$ with the slowly converging Taylor series

$$\ln(1 + x) = x - \frac{x^2}{2} + \frac{x^3}{3} - \cdots,$$

but if $x \ll 1$ the leading terms will dominate, so in this case where we assume $X \ll N$, keeping terms up to second order in $X/N$ (the odd powers of $X/N$ cancel between the two logs, leaving $X^2/N - X^2/(2N) - X^2/(2N^2) \approx X^2/(2N)$), we can approximate this sum of factorial logs as

$$\ln \left(\frac{N+X}{2}\right)! + \ln \left(\frac{N-X}{2}\right)! \approx \ln 2 \pi - N + (N+1) \ln \frac{N}{2} + \frac{X^2}{2N}.$$

Putting the bits together, we have

$$\ln P_N(X) \approx -N \ln 2 + \frac{1}{2} \ln 2 \pi + \left(N + \frac{1}{2}\right) \ln N - N - \ln 2 \pi + N - (N+1) \ln \frac{N}{2} - \frac{X^2}{2N} = \frac{1}{2} \ln \frac{2}{\pi N} - \frac{X^2}{2N}.$$

Exponentiating gives us the desired result

$$P_N(X) \approx \sqrt{\frac{2}{\pi N}} e^{-X^2/(2N)}.$$

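As an aside (not part of the original solution), this Gaussian limit is easy to sanity check numerically against the exact fair-coin distribution $P_N(X) = \binom{N}{(N+X)/2}/2^N$:

```python
import math

def p_exact(N, X):
    """Exact fair-coin probability P_N(X), where X = (#heads - #tails).

    X must have the same parity as N.
    """
    return math.comb(N, (N + X) // 2) / 2**N

def p_gauss(N, X):
    """Gaussian limiting form sqrt(2/(pi N)) exp(-X^2 / (2N))."""
    return math.sqrt(2 / (math.pi * N)) * math.exp(-X**2 / (2 * N))

N = 100
for X in (0, 2, 10, 20):
    print(X, p_exact(N, X), p_gauss(N, X))
```

Even for a modest $N = 100$ the two agree to a few parts in a thousand near the peak.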
## Question: Binomial distribution for biased coin

Consider the more general case of a binomial distribution where the probability of a head is $r$ and a tail is $1 - r$ (a biased coin). With $X = N_H - N_T$, the difference between the number of heads and tails, obtain the binomial distribution $P_N(X)$ for obtaining a total of $X$ from $N$ coin tosses. What is the limiting form of this distribution when $N \gg 1$ and $\left\lvert X - \left\langle X \right\rangle \right\rvert \ll N$? The latter condition simply means that I need to carry out any Taylor expansions in $X$ about its mean value $\left\langle X \right\rangle$. The mean $\left\langle X \right\rangle$ can be easily computed first in terms of $r$.

## Answer

Let’s consider 1, 2, 3, and $N$ tosses in sequence to understand the pattern.

**1 toss**

The base case has just two possibilities

- Heads, $X = 1$, with probability $r$
- Tails, $X = -1$, with probability $1 - r$

If we write these as $P_1(X)$ for $X = \pm 1$ respectively, we have

$$P_1(1) = r, \qquad P_1(-1) = 1 - r.$$

As a check, when $r = 1/2$ we have $P_1(\pm 1) = 1/2$, the fair coin result.

**2 tosses**

Our sample space is now a bit bigger

- $HH$, $X = 2$, with probability $r^2$
- $HT$, $X = 0$, with probability $r(1-r)$
- $TH$, $X = 0$, with probability $(1-r)r$
- $TT$, $X = -2$, with probability $(1-r)^2$

Here the probability listed is that of the specific ordered sequence, but we are interested only in the probability of each specific value of $X$. For $X = 0$ there are $\binom{2}{1} = 2$ ways of picking a heads, tails combination.

Enumerating the probabilities, as before, with $P_2(X)$ for $X \in \{2, 0, -2\}$ respectively, we have

$$P_2(2) = r^2, \qquad P_2(0) = 2 r (1 - r), \qquad P_2(-2) = (1-r)^2.$$

**3 tosses**

Increasing our sample space by one more toss, our possibilities for all ordered triplets of toss results are

- $HHH$, $X = 3$, with probability $r^3$
- $HHT$, $X = 1$, with probability $r^2(1-r)$
- $HTH$, $X = 1$, with probability $r^2(1-r)$
- $THH$, $X = 1$, with probability $r^2(1-r)$
- $HTT$, $X = -1$, with probability $r(1-r)^2$
- $THT$, $X = -1$, with probability $r(1-r)^2$
- $TTH$, $X = -1$, with probability $r(1-r)^2$
- $TTT$, $X = -3$, with probability $(1-r)^3$

Here, again, the probability listed is that of the ordered sequence, but we are still interested only in the probability of each specific value of $X$. We see that we have

$$\binom{3}{1} = \binom{3}{2} = 3$$

ways of picking some ordering of either $X = 1$ or $X = -1$.

Now enumerating the possibilities with $P_3(X)$ for $X \in \{3, 1, -1, -3\}$ respectively, we have

$$P_3(3) = r^3, \qquad P_3(1) = 3 r^2 (1-r), \qquad P_3(-1) = 3 r (1-r)^2, \qquad P_3(-3) = (1-r)^3.$$

**n tosses**

To generalize we need a mapping between our random variable $X$ and the binomial index $k$, but we know what that is from the fair coin problem: one of $(N+X)/2$ or $(N-X)/2$. To get the signs right, let’s evaluate both for $N = 3$ and $X \in \{3, 1, -1, -3\}$.

Mapping between $X$, $(N-X)/2$, and $(N+X)/2$ for $N = 3$:

| $X$ | $(N-X)/2$ | $(N+X)/2$ |
| --- | --- | --- |
| $-3$ | $3$ | $0$ |
| $-1$ | $2$ | $1$ |
| $1$ | $1$ | $2$ |
| $3$ | $0$ | $3$ |

Using this, we see that the generalization of the binomial distribution to unfair coins is

$$P_N(X) = \binom{N}{\frac{N+X}{2}} r^{(N+X)/2} (1-r)^{(N-X)/2}.$$

Checking against the fair result, we see that with $r = 1/2$ the two powers of $r$ and $1 - r$ combine into the factor $1/2^N$ as expected. Let’s also check the exponents for $N = 3$, $X = 1$ (two heads, one tail). That is

$$P_3(1) = \binom{3}{2} r^2 (1-r)^1 = 3 r^2 (1 - r).$$

Good, we’ve got a $r^2$ (two heads) factor as desired.

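Though not part of the original writeup, this generalized distribution can also be verified mechanically, by summing the probability of every ordered toss sequence with a given value of $X$ and comparing against the closed form (exact rational arithmetic avoids any floating point fuzz; the bias $r = 2/3$ is an arbitrary choice for the check):

```python
import itertools
import math
from fractions import Fraction

r = Fraction(2, 3)  # an arbitrary bias, chosen just for this check

def p_formula(N, X, r):
    """Closed form: P_N(X) = C(N, (N+X)/2) r^{(N+X)/2} (1-r)^{(N-X)/2}."""
    h = (N + X) // 2
    return math.comb(N, h) * r**h * (1 - r)**(N - h)

def p_brute(N, X, r):
    """Sum the probability of every ordered sequence with heads - tails = X."""
    total = Fraction(0)
    for seq in itertools.product((1, -1), repeat=N):  # 1 = head, -1 = tail
        if sum(seq) == X:
            heads = seq.count(1)
            total += r**heads * (1 - r)**(N - heads)
    return total

N = 5
for X in range(-N, N + 1, 2):
    assert p_formula(N, X, r) == p_brute(N, X, r)
print("formula matches enumeration for N =", N)
```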
**Limiting form**

To determine the limiting behavior, we can utilize the central limit theorem. We first have to calculate the mean and the variance for the $P_N(X)$ case. Writing $X = N_H - N_T = 2 N_H - N$, where the number of heads $N_H$ has mean $N r$ and variance $N r (1 - r)$, the first two moments are

$$\left\langle X \right\rangle = 2 N r - N = N (2 r - 1)$$

$$\left\langle X^2 \right\rangle = \left\langle X \right\rangle^2 + 4 N r (1 - r) = N^2 (2 r - 1)^2 + 4 N r (1 - r),$$

and the variance is

$$\sigma^2 = \left\langle X^2 \right\rangle - \left\langle X \right\rangle^2 = 4 N r (1 - r).$$

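These moment formulas, $\left\langle X \right\rangle = N(2r - 1)$ and $\sigma^2 = 4 N r (1 - r)$, are easy to confirm by direct summation over the distribution $P_N(X) = \binom{N}{(N+X)/2} r^{(N+X)/2} (1-r)^{(N-X)/2}$ derived above; a small check (not from the original post), again in exact rational arithmetic:

```python
import math
from fractions import Fraction

def p(N, X, r):
    """Biased-coin binomial P_N(X) for X = heads - tails."""
    h = (N + X) // 2
    return math.comb(N, h) * r**h * (1 - r)**(N - h)

def moments(N, r):
    """Return (<X>, <X^2> - <X>^2) by direct summation over allowed X."""
    xs = range(-N, N + 1, 2)
    mean = sum(X * p(N, X, r) for X in xs)
    second = sum(X**2 * p(N, X, r) for X in xs)
    return mean, second - mean**2

N, r = 20, Fraction(1, 4)
mean, var = moments(N, r)
assert mean == N * (2 * r - 1)     # <X> = N(2r - 1)
assert var == 4 * N * r * (1 - r)  # <X^2> - <X>^2 = 4 N r (1 - r)
print(mean, var)                   # -10 15
```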
The central limit theorem gives us

$$P(X) \rightarrow \frac{1}{\sqrt{2 \pi \sigma^2}} e^{-(X - \left\langle X \right\rangle)^2/(2 \sigma^2)},$$

however, we saw in [1] that this theorem was derived for continuous random variables. Here we have random variables that take on only either odd or even integer values, with parity depending on whether $N$ is odd or even, so the spacing between allowed values of $X$ is $2$. We’ll need to double the CLT result to account for this. This gives us

$$P_N(X) \approx \frac{2}{\sqrt{8 \pi N r (1 - r)}} e^{-(X - N(2r-1))^2/(8 N r (1 - r))} = \frac{1}{\sqrt{2 \pi N r (1 - r)}} e^{-(X - N(2r-1))^2/(8 N r (1 - r))}.$$

As a check we note that for $r = 1/2$ we have $\left\langle X \right\rangle = 0$ and $\sigma^2 = N$, so we get

$$P_N(X) \approx \sqrt{\frac{2}{\pi N}} e^{-X^2/(2N)},$$

matching the result of the first problem.

Observe that both this and the fair coin result above do not integrate to unity, but to $2$. This is expected given the parity constraint on the discrete random variable $X$. An integral normalization check is really only approximating the sum over integer values of our discrete random variable, and here we want to skip half of those values, since only every second integer is an allowed value of $X$.
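As a final numerical aside (not part of the original solution), the doubled Gaussian can be compared against the exact biased binomial near its peak; the values $N = 400$, $r = 0.3$ are arbitrary choices for the check:

```python
import math

def p_exact(N, X, r):
    """Exact biased-coin P_N(X), X = heads - tails, same parity as N."""
    h = (N + X) // 2
    return math.comb(N, h) * r**h * (1 - r)**(N - h)

def p_limit(N, X, r):
    """Doubled CLT Gaussian with mean N(2r-1) and variance 4 N r (1-r)."""
    mu = N * (2 * r - 1)
    var = 4 * N * r * (1 - r)
    return 2 / math.sqrt(2 * math.pi * var) * math.exp(-(X - mu)**2 / (2 * var))

N, r = 400, 0.3
mu = round(N * (2 * r - 1))  # -160, an allowed (even) value of X for even N
for X in (mu, mu + 10, mu + 20):
    print(X, p_exact(N, X, r), p_limit(N, X, r))
```

Summing `p_exact` over the allowed values of $X$ (every second integer) gives $1$, while integrating `p_limit` over all real $X$ gives $2$, illustrating the parity point above.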

## References

[1] Peter Young. *Proof of the central limit theorem in statistics*, 2009. URL http://physics.ucsc.edu/~peter/116C/clt.pdf. [Online; accessed 13-Jan-2013].
