In the first chapter, we embarked on an exploratory journey into the world of stochastic calculus, starting with the basic yet profound concept of the random walk. We began by demystifying the complexity of probability theory, revealing its inherent beauty and simplicity through the lens of discrete-time random walks. We delved into the fundamental principles of probability, using the concept of a ‘fair coin’ to introduce the Binomial distribution and its application in modeling random walks. This foundation sets the stage for a deeper exploration into the mathematical intricacies of random walks.
By now, we can derive some mathematical representations of the random walk. Let’s start with the simplest one, an equation for the position reached after \(n\) steps:
$$S_n = X_1 + X_2 + \cdots + X_n$$
Where:
\(X_i = \begin{cases} +1 &\text{with probability } \frac{1}{2}, \\ -1 & \text{with probability } \frac{1}{2}. \end{cases}\)
Now, let’s think about what we have here:
- \(S_n\): a number that represents the position after \(n\) steps;
- \(\{X_i\}\): a sequence of random variables; a single realization of this sequence is the trajectory that leads to \(S_n\).
Since \(X_1, X_2, \cdots , X_n\) are i.i.d. (independent and identically distributed), let’s do a trick and rearrange the elements of the series a little bit.
We swap each pair of neighbouring elements:

\(X_1 , X_2 , X_3, X_4 ,\cdots\) ⇾ \(X_2 , X_1 , X_4, X_3 ,\cdots\)

Note that the sum of each series is still \(S_n\), since the values are the same; what changed is the trajectory by which we got to \(S_n\). We can illustrate it by plotting the original and the new trajectory:
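If you’d like to reproduce this numerically, here is a minimal sketch (the variable names are mine, not from the text): we simulate one walk, swap neighbouring steps, and confirm that the endpoint is unchanged while the trajectory differs.

```python
import numpy as np

rng = np.random.default_rng(seed=42)
n = 20  # even, so neighbouring steps pair up cleanly

# n i.i.d. steps of +1 / -1, each with probability 1/2
steps = rng.choice([-1, 1], size=n)

# swap each pair of neighbouring steps: (X1, X2, X3, X4, ...) -> (X2, X1, X4, X3, ...)
swapped = steps.copy()
swapped[0::2], swapped[1::2] = steps[1::2], steps[0::2]

original_path = np.cumsum(steps)   # positions S_1 ... S_n along the original trajectory
swapped_path = np.cumsum(swapped)  # positions along the rearranged trajectory

print("S_n original:", original_path[-1])
print("S_n swapped: ", swapped_path[-1])  # same endpoint
print("same trajectory?", np.array_equal(original_path, swapped_path))  # typically False
```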
The symmetry in these trajectories is striking. It raises the question: are there additional trajectories that would similarly lead to the same destination \(S_n\)?
Here’s another example, where all possible trajectories leading to the point \(S_n\) are plotted:
Interestingly, we can count precisely the number of trajectories that lead to \(S_n\) from the origin, bear with me:
If we forget about the time axis for a second and consider only the position axis (the y-axis), then we can express the position \(S_n\) after \(n\) steps as a function of two variables:
- # of steps forward (positive direction), let’s call it \(k\)
- # of steps backwards (negative direction), it’ll be \(n-k\)
If the walk ends \(m\) steps from the origin, the final position is the number of forward steps minus the number of backward steps: \(m = k - (n-k) = 2k - n\). Solving for \(k\), we get:
\(k = \frac{m+n}{2}\)
Now, you can see that to end up \(m\) steps away from the origin, you need to take \(\frac{m+n}{2}\) steps forward, and the rest backward.
Since we’re looking for all possible combinations (ways to arrange \(k\) \(+1\)s and \((n-k)\) \(-1\)s), we can use the binomial coefficient:
$$\text{# of trajectories}={n \choose k}={n \choose \frac{m+n}{2}}$$
For example, after 10 steps (\(n=10\)), the number of paths leading to the point \(m=2\) is \({10 \choose \frac{10+2}{2}} = {10 \choose 6} = 210\).
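As a sanity check, here is a short sketch (the helper names `n_trajectories` and `n_trajectories_bruteforce` are mine, not from the text) that confirms this count both with the binomial coefficient and by brute-force enumeration of all step sequences:

```python
from math import comb
from itertools import product

def n_trajectories(n: int, m: int) -> int:
    """Number of n-step +/-1 paths from the origin that end at position m."""
    if (n + m) % 2 != 0 or abs(m) > n:
        return 0                          # wrong parity or unreachable: no such path
    return comb(n, (n + m) // 2)

def n_trajectories_bruteforce(n: int, m: int) -> int:
    """Enumerate all 2**n step sequences and count those ending at m (small n only)."""
    return sum(1 for path in product([-1, 1], repeat=n) if sum(path) == m)

print(n_trajectories(10, 2))              # 210, as in the example above
print(n_trajectories_bruteforce(10, 2))   # 210 again, by direct enumeration
```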
Even though the number of trajectories gives us a clue about the likelihood of ending up exactly \(m\) steps from the origin, it’s still hard to compare different outcomes, because the number of trajectories grows with \(n\). We need a factor that also depends on \(n\), but in the opposite direction, so that the measure of likelihood stays on the same scale regardless of \(n\).
Let’s start by rearranging any trajectory \(\left\{ S\right\}_n\) that ends at \(m\), sorting its steps by outcome:
$$S_{n,i} = X_1 + X_2 + \cdots + X_n \;⇾\; S_n^{\text{sorted}} = \text{sort}(S_{n,i}) = \underbrace{X_1^{+} + X_2^{+} + \cdots + X_k^{+}}_{k \text{ steps of } +1} + \underbrace{X_1^{-} + X_2^{-} + \cdots + X_{n-k}^{-}}_{(n-k) \text{ steps of } -1}$$
- \(S_{n,i}\) represents a specific permutation of the random walk after \(n\) steps.
- \(S_n^{\text{sorted}}\) reflects the sorted, common form shared by every permutation of \(S_{n}\).
Now, let’s answer the question: “How likely is it to see \(\left[ X^{+}=+1\right]\) \(k\) times in a row, followed by \(\left[ X^{-}=-1 \right]\) \((n-k)\) times in a row?”
Since we’re dealing with a symmetric random walk and \(P(X^+)= P(X^-)=\frac{1}{2}\), it’ll be:
$$(\frac{1}{2})^k \times (\frac{1}{2})^{n-k}=(\frac{1}{2})^{n}$$
You can see that this scaling factor is exactly what we’re looking for, since it scales down with \(n\), balancing out the combinatorial part. Putting it all together, we can finally derive the probability mass function (PMF) for a simple symmetric random walk:
$$P(m|n) = {n \choose k}p^{k}(1-p)^{n-k}={n \choose \frac{m+n}{2}}\left(\frac{1}{2}\right)^{n}$$
NOTE: for a non-symmetric (drift) random walk, \(P(X^+)\ne P(X^-)\), and hence we should use the full form \({n \choose k}p^{k}(1-p)^{n-k}\) with \(p = P(X^+)\).
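As a quick sanity check, here is a minimal sketch (function and variable names are mine, not from the text) comparing this closed-form PMF against a Monte Carlo estimate for the symmetric case:

```python
from math import comb
import numpy as np

def pmf(m: int, n: int, p: float = 0.5) -> float:
    """P(S_n = m) for a walk with P(X = +1) = p and P(X = -1) = 1 - p."""
    if (n + m) % 2 != 0 or abs(m) > n:
        return 0.0
    k = (n + m) // 2                       # number of +1 steps needed to end at m
    return comb(n, k) * p**k * (1 - p)**(n - k)

rng = np.random.default_rng(0)
n, m, trials = 10, 2, 200_000
endpoints = rng.choice([-1, 1], size=(trials, n)).sum(axis=1)

print("closed form:", pmf(m, n))                # 210 / 2**10 ≈ 0.2051
print("simulation: ", np.mean(endpoints == m))  # should be close to 0.2051
```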
Finally, let’s return once again to the step-by-step structure of the walk that I mentioned in the beginning. From it, we can state a simple recursion for the PMF quite easily:
- Since each step is discrete, \(X=\pm1\), \(S_n\) must be exactly one step away (either below or above) from \(S_{n-1}\), and
- the position \(S_{n}\) depends only on the position \(S_{n-1}\) (the Markov property).

Given that, we can express the PMF of \(S_n\) in terms of the PMF of \(S_{n-1}\):
$$P(m|n)= \frac{1}{2} P(m-1 | n-1) +\frac{1}{2}P(m+1|n-1)$$
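We can also verify this recursion numerically. The sketch below (again with hypothetical helper names, and a self-contained symmetric-only `pmf` for brevity) checks that the recursion reproduces the closed-form PMF for every reachable position:

```python
from math import comb, isclose

def pmf(m: int, n: int) -> float:
    """Closed-form P(S_n = m) for the symmetric walk (p = 1/2)."""
    if (n + m) % 2 != 0 or abs(m) > n:
        return 0.0
    return comb(n, (n + m) // 2) * 0.5**n

n = 10
for m in range(-n, n + 1, 2):               # reachable positions share the parity of n
    recursed = 0.5 * pmf(m - 1, n - 1) + 0.5 * pmf(m + 1, n - 1)
    assert isclose(pmf(m, n), recursed)
print("recursion reproduces the closed-form PMF for n =", n)
```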
The symmetric random walk, despite its seemingly simple nature, is a fundamental model with far-reaching implications and applications. It serves as a cornerstone for understanding and analyzing complex phenomena across various disciplines, including physics, biology, finance, and other fields where even a straightforward mathematical concept can offer profound insight into a complex system.
The versatility of a symmetric random walk allows for modifications and adaptations that can tackle more specific and complex scenarios. For instance, introducing asymmetry or additional constraints can simulate more realistic conditions in various fields.
Its ability to model a wide range of real-world systems underlines the importance of probabilistic thinking and statistical principles in understanding our world.