# Peeter Joot's (OLD) Blog.


## One dimensional random walk

Posted by peeterjoot on February 3, 2013

## Question: One dimensional random walk

Consider a random walk in 1D by unit steps, with probability $p$ of stepping right and probability $1 - p$ of stepping left. What are the first two moments of the final position of the particle?

This was a problem from the first midterm. I ran out of time and didn’t take the answer as far as I figured I should have. Here’s a more casual bash at the problem.

First we need the probabilities.

One step: $N = 1$

Our distance (from the origin) can only be $X = \pm 1$.

\begin{aligned}P_{X = -1} = p^{0} (1 - p)^{1 - 0}\end{aligned} \hspace{\stretch{1}}(1.0.1)

\begin{aligned}P_{X = 1} = p^1 (1-p)^{1 - 1}\end{aligned} \hspace{\stretch{1}}(1.0.2)

Two steps: $N = 2$

We now have three possibilities

\begin{aligned}P_{X = -2} = p^{0} (1 -p)^{2 - 0}\end{aligned} \hspace{\stretch{1}}(1.0.3)

\begin{aligned}P_{X = 0} = 2 p^1 (1-p)^{2 - 1}\end{aligned} \hspace{\stretch{1}}(1.0.4)

\begin{aligned}P_{X = 2} = p^2 (1-p)^{2 - 2}\end{aligned} \hspace{\stretch{1}}(1.0.5)

Three steps: $N = 3$

We now have four possibilities

\begin{aligned}P_{X = -3} = p^{0} (1 - p)^{3 - 0}\end{aligned} \hspace{\stretch{1}}(1.0.6)

\begin{aligned}P_{X = -1} = 3 p^1 (1 - p)^{3 - 1}\end{aligned} \hspace{\stretch{1}}(1.0.7)

\begin{aligned}P_{X = 1} = 3 p^2 (1 - p)^{3 - 2}\end{aligned} \hspace{\stretch{1}}(1.0.8)

\begin{aligned}P_{X = 3} = p^3 (1-p)^{3 - 3}\end{aligned} \hspace{\stretch{1}}(1.0.9)

General case

The pattern is pretty clear, but we need a mapping from the binomial index to the final distance. With an index $k$, try a linear guess

\begin{aligned}D(k) = a k + b,\end{aligned} \hspace{\stretch{1}}(1.0.10)

where

\begin{aligned}D(0) = -N = b\end{aligned} \hspace{\stretch{1}}(1.0.11)

\begin{aligned}D(N) = a N + b = (a - 1)N = N.\end{aligned} \hspace{\stretch{1}}(1.0.12)

so that $a = 2$. This gives

\begin{aligned}D(k) = 2 k - N,\end{aligned} \hspace{\stretch{1}}(1.0.13)

and

\begin{aligned}k = \frac{D + N}{2}.\end{aligned} \hspace{\stretch{1}}(1.0.14)

Our probabilities are therefore

\begin{aligned}\boxed{P_{X = D} = \binom{N}{(N + D)/2} p^{(N + D)/2}(1 - p)^{(N - D)/2}.}\end{aligned} \hspace{\stretch{1}}(1.0.15)
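This boxed formula is easy to sanity check numerically. Here's a minimal Python sketch (the helper name `p_x` is mine), which reproduces the hand-computed $N = 3$ probabilities and confirms normalization:

```python
from math import comb

def p_x(N, D, p):
    """Probability of ending at displacement D after N unit steps,
    each taken rightward with probability p (eq. 1.0.15).
    D must have the same parity as N and satisfy |D| <= N."""
    if (N + D) % 2 != 0 or abs(D) > N:
        return 0.0
    k = (N + D) // 2  # number of right steps
    return comb(N, k) * p**k * (1 - p)**(N - k)

p = 0.3
# Matches the N = 3 case worked above: P(X = -1) = 3 p (1 - p)^2.
assert abs(p_x(3, -1, p) - 3 * p * (1 - p)**2) < 1e-12
# Odd displacements after an even number of steps are impossible.
assert p_x(4, 1, p) == 0.0
# The distribution is normalized for any N.
assert abs(sum(p_x(10, D, p) for D in range(-10, 11)) - 1.0) < 1e-12
```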

First moment

For the expectations let’s work with $k$ instead of $D$, so that the expectation is

\begin{aligned}\left\langle{{X}}\right\rangle &= \sum_{k = 0}^N (2 k - N) \binom{N}{k} p^k (1 - p)^{N - k} \\ &= 2 \sum_{k = 0}^N k \frac{N!}{(N-k)!k!} p^k (1 - p)^{N - k} - N \\ &= 2 N p \sum_{k = 1}^N \frac{(N-1)!}{(N - 1 - (k - 1))!(k-1)!} p^{k-1} (1 - p)^{N - 1 - (k - 1)} - N \\ &= 2 N p \sum_{s = 0}^{N-1} \binom{N-1}{s} p^{s} (1 - p)^{N -1 - s} - N.\end{aligned} \hspace{\stretch{1}}(1.0.16)

This gives us

\begin{aligned}\boxed{\left\langle{{X}}\right\rangle = N( 2 p - 1 ).}\end{aligned} \hspace{\stretch{1}}(1.0.17)
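This closed form can be checked directly against the defining sum over $k$, for example in Python:

```python
from math import comb

N, p = 10, 0.3
# <X> from the defining sum: X = 2k - N weighted by the binomial PMF.
mean = sum((2 * k - N) * comb(N, k) * p**k * (1 - p)**(N - k)
           for k in range(N + 1))
assert abs(mean - N * (2 * p - 1)) < 1e-12
```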

Second moment

\begin{aligned}\left\langle{{X^2}}\right\rangle &= \sum_{k = 0}^N (2 k - N)^2 \binom{N}{k} p^k (1 - p)^{N - k} \\ &= 4 \sum_{k = 0}^N k^2 \binom{N}{k} p^k (1 - p)^{N - k} - 4 N^2 p + N^2 \\ &= 4 N p \sum_{k = 1}^N k \frac{(N-1)!}{(N - 1 - (k - 1))! (k-1)!} p^{k-1} (1 - p)^{N - k} + N^2(1 - 4 p) \\ &= 4 N p \sum_{s = 0}^{N-1} (s + 1) \frac{(N-1)!}{(N - 1 - s)! s!} p^s (1 - p)^{N - 1 - s} + N^2(1 - 4 p) \\ &= 4 N p ((N-1) p + 1) + N^2(1 - 4 p) \\ &= N^2 ( 1 - 4 p + 4 p^2 ) + 4 N p ( 1 - p ).\end{aligned} \hspace{\stretch{1}}(1.0.18)

So the second moment is

\begin{aligned}\boxed{\left\langle{{X^2}}\right\rangle = N^2 ( 1 - 2 p )^2 + 4 N p ( 1 - p )}\end{aligned} \hspace{\stretch{1}}(1.0.19)

From this we see that the variance is just this second term

\begin{aligned}\sigma^2 = \left\langle{{X^2}}\right\rangle - \left\langle{{X}}\right\rangle^2 = 4 N p ( 1 - p ).\end{aligned} \hspace{\stretch{1}}(1.0.20)
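Both the second moment and the variance can be verified against the exact sums the same way:

```python
from math import comb

N, p = 12, 0.7
pmf = [comb(N, k) * p**k * (1 - p)**(N - k) for k in range(N + 1)]
m1 = sum((2 * k - N) * w for k, w in enumerate(pmf))     # <X>
m2 = sum((2 * k - N)**2 * w for k, w in enumerate(pmf))  # <X^2>
# Second moment, eq. (1.0.19).
assert abs(m2 - (N**2 * (1 - 2 * p)**2 + 4 * N * p * (1 - p))) < 1e-12
# Variance, eq. (1.0.20): only the second term survives.
assert abs((m2 - m1**2) - 4 * N * p * (1 - p)) < 1e-12
```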