GS 2003/Math - Collins
Markov Processes and Transition Matrices
In this lab you will create some transition matrices and use
MATLAB to study how the probability distribution changes over time.
A Markov process depends on two things: the transition matrix T
and the initial probability distribution X (the distribution at time 1).
In MATLAB you write matrices starting with [ and ending with ],
using commas (or spaces) to separate elements and semi-colons
or returns to separate rows.
T = [0.5 0.3; 0.5 0.7];
X = [1;0];
You can check that T is a stochastic matrix (each column
sums to 1) by typing sum(T) and checking
that all of the values are 1 (or 1.0000).
Once we have the transition matrix T and starting vector X,
we can compute the probability distributions at future times by
for i = 1:99
    X(:,i+1) = T*X(:,i);
end
After this code runs, X will be a 2x100 matrix, with the
values in column k representing the probability distribution
after k-1 transitions. You can see a plot of the results
by typing plot(X'), and you can add a legend
with the legend command.
For some processes you can assign a value to each state, and you
may want to know the average value at each transition, the
so-called expected value. To compute it in MATLAB, set
up a row vector val that holds the value of each state
and then compute exval = val*X; then
you can plot the expected values with plot(exval).
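Putting the pieces above together into one script (the state values 10 and 20 in val are just made-up numbers for illustration):

```matlab
T = [0.5 0.3; 0.5 0.7];    % transition matrix (columns sum to 1)
X = [1; 0];                % start in state 1
for i = 1:99
    X(:,i+1) = T*X(:,i);   % column i+1 = distribution after i transitions
end
plot(X')                   % one curve per state
legend('state 1', 'state 2')
val = [10 20];             % hypothetical value assigned to each state
exval = val*X;             % expected value after each transition
figure, plot(exval)
```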
1. For each of these transition matrices, determine what the steady state
distribution is (i.e. the distribution as the number of transitions
goes to infinity). Also, compute the probability of being in state 2
after 3 transitions if you started in state 1.
a. T = [0.5 0.3; 0.5 0.7];
b. T = [0.8 0 0.2; 0.1 0.6 0.1; 0.1 0.4 0.7];
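One way to explore questions like these in MATLAB is with matrix powers (a sketch, not a full solution):

```matlab
T = [0.5 0.3; 0.5 0.7];
X = [1; 0];      % start in state 1
T^3 * X          % distribution after 3 transitions; entry 2 is P(state 2)
T^100 * X        % a large power approximates the steady-state distribution
```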
2. A fox hunts in 3 territories: A, B, and C. He never hunts in the same
territory two days in a row. If he hunts in A, he hunts in C the next
day. If he hunts in B or C, he is twice as likely to hunt in A as
in the other territory.
a. Over the long run, what proportion of his time does he spend in
each territory?
b. If he hunts in A on Sunday, what is the probability he will hunt
in A on Wednesday?
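As a hint on the setup, the rules above translate into a transition matrix like the following sketch, with columns giving today's territory and rows tomorrow's:

```matlab
%       A     B     C       (today's territory, by column)
T = [   0    2/3   2/3;   % to A
        0     0    1/3;   % to B
        1    1/3    0 ];  % to C
sum(T)                    % each column should sum to 1
```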
3. You and a friend play a dice game where you both roll one die and
if the number showing on your die is greater than or equal to the
value on your friend's die, then your friend pays you $1. If your
die is less, then you pay your friend $1. You stop the game when
you are broke or have $4.
a. If you start the game with $1, what is the probability that you
will quit with $4? What about if you start with $2?
b. If you start with $1, what is the average amount of money you'll
have after 1, 2 or 3 games?
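To get started, note that you win a round when your die shows a value greater than or equal to your friend's, which happens in 21 of the 36 equally likely outcomes, so p = 21/36 = 7/12. A sketch of the resulting chain, with states $0 through $4 and $0 and $4 absorbing:

```matlab
p = 7/12;  q = 5/12;       % P(win a round), P(lose a round)
%     $0   $1   $2   $3   $4    (current bankroll, by column)
T = [  1    q    0    0    0;
       0    0    q    0    0;
       0    p    0    q    0;
       0    0    p    0    0;
       0    0    0    p    1 ];
X = [0; 1; 0; 0; 0];       % start with $1
val = [0 1 2 3 4];         % dollar value of each state
for i = 1:99
    X(:,i+1) = T*X(:,i);
end
exval = val*X;             % expected bankroll after each game
```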
4. a. Brownian Motion: Consider a particle in the middle of an NxN grid.
At each transition, it has equal probabilities of moving one
unit in each direction (NESW). Assume when it hits the boundary
it stops. Take N=7 to start, increase as desired.
i. What is the probability that the particle is still moving after
100 transitions? 200? 500? 1000?
ii. What is the expected number of transitions before it stops?
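One way to build the grid's transition matrix is to flatten each position (r,c) to a single state index; a sketch, assuming column-by-column numbering:

```matlab
N = 7;  M = N*N;
T = zeros(M,M);
idx = @(r,c) (c-1)*N + r;          % flatten grid position to a state index
for r = 1:N
  for c = 1:N
    j = idx(r,c);
    if r == 1 || r == N || c == 1 || c == N
      T(j,j) = 1;                  % boundary: particle stops (absorbing)
    else
      T(idx(r-1,c), j) = 1/4;      % north
      T(idx(r+1,c), j) = 1/4;      % south
      T(idx(r,c+1), j) = 1/4;      % east
      T(idx(r,c-1), j) = 1/4;      % west
    end
  end
end
X = zeros(M,1);
X(idx((N+1)/2,(N+1)/2)) = 1;       % start in the middle (N odd)
% P(still moving after k transitions) = mass on the interior states of T^k*X
```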
b. RISK: In a typical battle in the game of RISK, the attacker
rolls 3 dice and the defender rolls 2 dice. The top 2 dice
of the attacker are compared one-to-one with the dice
of the defender. When the attacker's die is greater, the
defender loses an army, and when the attacker's die is
less than or equal to the defender's, the attacker loses an
army. There is always a loss of 2 armies with each battle,
i.e. either A loses 2, A and D each lose 1, or D loses 2.
The approximate probabilities for these are: 799/2731, 1420/4229, 792/2131.
Determine the probability that the defender's armies will
be reduced to 1 or fewer before the attacker's armies are reduced
to 3 or fewer, if the attacker starts with 20 armies and the
defender starts with 5, 10, or 15 armies.
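A sketch of one way to set this up, flattening each (attacker, defender) pair to a single state index and using the handout's probabilities (note they sum to 1 only up to rounding):

```matlab
p = [799/2731, 1420/4229, 792/2131];  % [A loses 2, each lose 1, D loses 2]
D0 = 5;                               % also try 10 and 15
idx = @(a,d) (a-2)*(D0+1) + d + 1;    % flatten (a,d) to a state index
M = 19*(D0+1);                        % a runs over 2..20, d over 0..D0
T = zeros(M,M);
for a = 2:20
  for d = 0:D0
    j = idx(a,d);
    if d <= 1 || a <= 3
      T(j,j) = 1;                     % absorbing: the battle is over
    else
      T(idx(a-2,d),   j) = p(1);      % attacker loses 2
      T(idx(a-1,d-1), j) = p(2);      % each lose 1
      T(idx(a,d-2),   j) = p(3);      % defender loses 2
    end
  end
end
X = zeros(M,1);  X(idx(20,D0)) = 1;   % attacker starts with 20 armies
for i = 1:200                         % enough transitions to be absorbed
  X = T*X;
end
win = 0;                              % mass on states with d <= 1 but a > 3
for a = 4:20
  for d = 0:1
    win = win + X(idx(a,d));
  end
end
disp(win)
```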
Last Modified: November 11, 1923