The reading for the first Homework consists of the Review Chapter (Chapter 1 of Lefebvre), along with a review of
STAT 410-level material from any other book.
Homework 1 (due Friday Feb. 5 in class, 8 problems in all): Do problems 4, 6, 15, 19, 30, 33, 43, pp. 35-43
in Lefebvre. Also do the same problem as #26 but with the N(μ, σ^2) distribution replaced by the
Gamma(α, β) distribution (which has density of the form
C x^{α-1} exp(-βx) for x > 0).
Homework 2 (due Monday Feb. 22 in class, 8 problems): Do problems 11, 15, 20, 21, 23, 29, 31, and 38
from pp. 134-159 in Lefebvre (Sec. 3.4), using the different 4x4 transition matrix given below in place of the one printed in #38.
For problem #38, use the 4x4 transition matrix P on the state space S = {0, 1, 2, 3}
given by
| 0.50  0.50  0.00  0.00 |
| 0.25  0.25  0.50  0.00 |
| 0.00  0.50  0.00  0.50 |
| 0.00  0.00  1.00  0.00 |
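If you want to check hand computations for this chain numerically, here is a minimal Python sketch (my own
illustration, assuming numpy is available; it is not part of the assignment) that stores P and approximates its
stationary distribution by power iteration:

    import numpy as np

    # Transition matrix for HW2 problem #38 on the state space S = {0, 1, 2, 3}
    P = np.array([[0.50, 0.50, 0.00, 0.00],
                  [0.25, 0.25, 0.50, 0.00],
                  [0.00, 0.50, 0.00, 0.50],
                  [0.00, 0.00, 1.00, 0.00]])

    assert np.allclose(P.sum(axis=1), 1.0)   # each row of a transition matrix sums to 1

    # The chain is irreducible and aperiodic, so pi P^n converges to the stationary distribution
    pi = np.full(4, 0.25)
    for _ in range(5000):
        pi = pi @ P
    print("approximate stationary distribution:", pi)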
NOTE that as a result of snow and student requests, the due date for HW2 was changed to Monday,
February 22. I also sent a course-mail message to everybody (2/17) about HW2 Problem #15.
That message is reproduced in full here.
Homework 3 (due Wednesday March 2 in class, 8 problems in all): Do problems 33, 43, 49, 51, 61, 67 from pp. 152-162 in Lefebvre.
In addition, do the following two problems.
(I). Look at the statement of Proposition 3.2.2, p. 86 in Lefebvre. This assertion
(which the book says is "easy to show") is actually wrong. Give a concrete example of a finite-state, homogeneous-transition
Markov chain for which it fails.
(II). Consider the process X_n = Z_{n-1} + Z_{n-2} for n > 2, where the Z_k are the
binary {0,1} outcomes of independent Bernoulli(1/2) trials. Using the definitions, show that Y_n = (Z_n, Z_{n-1}) is an HMC,
but that X_n is not a Markov chain.
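The argument in (II) must go through the definitions, but to see informally why X_n fails the Markov property you can
estimate conditional probabilities from simulated trials. A small Python sketch (purely illustrative; the index shift
relative to the problem's notation does not matter because the Z_k are i.i.d.):

    import numpy as np

    rng = np.random.default_rng(0)
    Z = rng.integers(0, 2, size=1_000_000)   # Bernoulli(1/2) trials
    X = Z[1:] + Z[:-1]                       # sums of two consecutive trials

    # If X were Markov, P(next = 2 | current = 1, previous = 0) and
    # P(next = 2 | current = 1, previous = 2) would be equal; they are not.
    prev, cur, nxt = X[:-2], X[1:-1], X[2:]
    for prev_val in (0, 2):
        sel = (cur == 1) & (prev == prev_val)
        print("previous =", prev_val, " estimate:", (nxt[sel] == 2).mean())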
Homework 4 (extended due date Wednesday March 23 in class, 8 problems in all): Do problem 53 in Lefebvre
and problems 39, 43,
and 47 in Serfozo (Chapter 1, pp. 91-93). But note that there are a couple of
misprints in Serfozo's Problem 47:
first, the definition of the waiting time τ should be
τ = inf{n ≥ 0 : X_n = 101}. Also, the equation for G_1(s) that you
are supposed to verify should be G_1(s) = s (p G_1(s) + q G_2(s)).
In addition, as part of the same HW4 set, do the following problems:
(I). Let {X_n : n = 0, 1, 2, ...} be a branching process for which X_0 = 1 and the
offspring distribution is
p_Z(k) = 0.1 for k = 0, 0.6 for k = 1, and 0.3 for k = 2.
(a) Find the extinction probability q = P(X_n = 0 for some n > 0 | X_0 = 1).
(b) Find the expected value and variance of X_4.
(c) Find E(X_4 | X_3 > 1).
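For (I)(a), recall that the extinction probability is the smallest nonnegative root of s = G(s), where G is the
offspring probability generating function. A short Python snippet (offered only as a numerical check of your algebra)
iterates G from 0, which converges to that root:

    # Offspring pgf for p_Z: G(s) = 0.1 + 0.6 s + 0.3 s^2
    def G(s):
        return 0.1 + 0.6 * s + 0.3 * s ** 2

    q = 0.0
    for _ in range(200):
        q = G(q)                 # q_{m+1} = G(q_m) increases to the extinction probability
    print("numerical extinction probability:", q)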
(II). (a) Let M_n denote the size at time n of a population governed by a pure-death Markov chain with M_0 = 100
and transition probabilities P_{k,k-1} = min(0.5, 0.01 k) = 1 - P_{k,k} for k = 0, 1, 2, .... Find the
expectation and variance of the time T_0 until the population dies out. (b) Suppose we modify the chain
so that P_{0,k} = 0.01 for k = 1, ..., 100 (leaving all other transition probabilities P_{k,j} for k > 0 just as
in part (a)). Find the stationary distribution for the chain.
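A Monte Carlo cross-check for (II)(a) (illustrative only; your written solution should be analytic) can be coded in a
few lines of Python:

    import numpy as np

    rng = np.random.default_rng(1)

    def time_to_extinction(m0=100):
        k, t = m0, 0
        while k > 0:
            t += 1
            if rng.random() < min(0.5, 0.01 * k):   # death probability P_{k,k-1}
                k -= 1                              # otherwise the chain stays at k
        return t

    samples = np.array([time_to_extinction() for _ in range(2000)])
    print("estimated E(T_0):", samples.mean(), " estimated Var(T_0):", samples.var(ddof=1))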
(III). (a) In problem #50 in Lefebvre, find E(T_0 | X_0 = 0).
(b) In problem #43 in Lefebvre, find E(T_5 | X_0 = 4).
Hint: use first-step analysis.
(IV). Let X(n) be a birth-death Markov chain with μ_k = P_{k,k+1} = p if k is odd
and r if k is even, for k = 1, 2, ..., where both p and r are between 0 and 1, and where λ_k = P_{k,k-1} =
1 - P_{k,k+1} for positive k, and P_{0,1} = 1. Give a necessary and sufficient condition on p, r
for the chain to be recurrent, and a necessary and sufficient condition for the chain to be positive-recurrent.
Homework 5 (due Wednesday March 30 in class, 6 problems, on continuous-time Markov chain definitions and
basic properties): Do problems #18, 72, and 87 in Lefebvre. In addition:
(I). Define a continuous-time chain with state space S = {0, 1, 2, 3, 4} from the discrete-time transition
matrix P with p_{0,1} = p_{4,3} = 1 and, for k = 1, 2, 3,
p_{k,k+1} = p = 1 - p_{k,k-1}, letting the rates λ_k of
transition away from each state k ∈ S be 1.
(a) Find E_1(T_0) for this continuous-time
chain.
(b) Show that the stationary distribution π for the
discrete-time chain also has the property that if P(X_0 = j) = π_j, then for all
t > 0, P(X_t = j) = π_j.
(II). Give and solve an ordinary differential equation satisfied by P_{ij}(t) for the
continuous-time chain with discrete transitions on {0, 1, 2, 3} governed by the transition matrix with
p_{3,2} = 1 = p_{0,1}, p_{1,0} = p_{1,2} = p_{2,1} = p_{2,3} = 1/2, and all rates λ_j = 1.
(III). For the chain in problem (II), find E_0(X_1) and
E_0(T_0).
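For (II) and (III), a convenient numerical cross-check (not a substitute for setting up and solving the Kolmogorov
differential equations) is the matrix exponential P(t) = exp(Qt) with generator Q = λ(P - I), since all rates equal 1.
A sketch assuming numpy and scipy are available:

    import numpy as np
    from scipy.linalg import expm

    # Embedded jump chain on {0, 1, 2, 3}:
    # p_{0,1} = p_{3,2} = 1, p_{1,0} = p_{1,2} = p_{2,1} = p_{2,3} = 1/2
    P = np.array([[0.0, 1.0, 0.0, 0.0],
                  [0.5, 0.0, 0.5, 0.0],
                  [0.0, 0.5, 0.0, 0.5],
                  [0.0, 0.0, 1.0, 0.0]])
    Q = 1.0 * (P - np.eye(4))        # all exponential holding rates lambda_j = 1

    Pt = expm(Q * 1.0)               # transition probabilities P_{ij}(t) at t = 1
    states = np.arange(4)
    print("E_0(X_1) (numerical):", Pt[0] @ states)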
Reading for the next segment of the course is: Lefebvre Chapter 3 Secs. 3.3.2-3.3.5 (pp.121-142) and
Serfozo Ch.4 (pp.241-255, through Sec.4.5) on continuous-time Markov chains, and Lefebvre Ch.5 pp. 231-247
on Poisson processes. Course-Outline topics that we are covering now and for the next 5 or 6 lectures are:
3(a) long-time limit theory and explicit solutions in continuous-time chains, and 6(a),(c),(d) on
Poisson processes and birth-death processes, with some queueing examples (Sec.7(a)).
Homework 6 (due Monday April 18 in class, 7 problems): Do problems #77, 79, 92 in Lefebvre Ch.3,
pp. 165-170, and problems #1, 8, 16 in Serfozo Ch.4, on pp. 323-327.
In addition, do:
(I). Find the explicit solution for the transition probability functions P_{1,k}(t) for
k = 1, 2, 3, 4, 5 in the continuous-time homogeneous Markov chain with transition rates λ_{j,k}
(for j, k = 1, ..., 5) equal to 1 for |j-k| equal to 1 or 2, equal to 0 for |j-k| > 2, and, on the diagonal (j = k),
equal respectively to -2, -3, -4, -3, -2 for j = 1, 2, 3, 4, 5. Give the value of the limiting probabilities
P_{1,k}(∞) = π_k, and the dominant terms c_k exp(-βt)
of the differences P_{1,k}(t) - π_k when t is large.
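The explicit solution in (I) comes from diagonalizing the generator. The sketch below (illustrative only, assuming
numpy and scipy) builds the rate matrix just described, prints its eigenvalues (the nonzero eigenvalue closest to 0
determines the decay rate β of the dominant correction term), and evaluates the row P_{1,k}(t) for a large t as a
check on the limiting probabilities; note that array index 0 corresponds to state 1 in the problem's labeling.

    import numpy as np
    from scipy.linalg import expm

    # Rates lambda_{j,k} = 1 when |j - k| is 1 or 2, 0 when |j - k| > 2;
    # the diagonal entries then come out as -2, -3, -4, -3, -2
    Q = np.zeros((5, 5))
    for j in range(5):
        for k in range(5):
            if j != k and abs(j - k) <= 2:
                Q[j, k] = 1.0
    np.fill_diagonal(Q, -Q.sum(axis=1))

    print("diagonal of Q:", np.diag(Q))
    print("eigenvalues of Q:", np.sort(np.linalg.eigvalsh(Q)))
    print("P_{1,k}(t) at t = 10:", expm(10.0 * Q)[0])   # nearly the limiting probabilities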
For the next portion of the course and the next HW, read: Serfozo Section 1.11, especially Proposition 69 about "Cycles and Costs", and Serfozo Chapter 2 on Renewal and Regenerative processes, through Section 2.6; also read Section 5.6 in Lefebvre on Renewal ideas.
Homework 7 (due Friday April 29 in class, 7 problems): Do problem #22 (p. 161) in Serfozo Chapter 2 and #5, 13, 19 (pp. 226-229) in Serfozo Chapter 3. Also do problems #9, 13, 25 in Lefebvre Chapter 5, pp. 292-295.
For the final portion of the course and the next HW, read: Lefebvre pp. 101-104 on martingales (and Serfozo Section 5.5 if you can); Lefebvre Sections 5.2-5.3 on nonhomogeneous and compound Poisson processes, and Serfozo Sec. 3.5 on Poisson processes on more general spaces like R^k.
Homework 8 (due Wednesday May 11 at the review session, 7 problems): Do problems #29(c) and 36 in Ch.5 in Lefebvre (pp.296,298) and #17 in Chapter 3 (p.229) of Serfozo.
In addition, do:
(I). Suppose that N(t) is a nonhomogeneous Poisson process with rate function 2t^2 for t ≥ 0. Show that for any bounded function a(s), Z(t) = ∫_0^t a(s) dN(s) is a compound Poisson random variable of the form ∑_{i=1}^{ν} X_i for a Poisson random variable ν and i.i.d. random variables X_i independent of ν; give the parameter of ν and the distribution of the X_i, and interpret the meaning of Z(t) in a setting where N(t) is counting failure events.
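If it helps to build intuition for (I), the following Python sketch (my own illustration, not required) simulates the
nonhomogeneous process with intensity 2t^2 on [0, T] by thinning a homogeneous process and then evaluates Z(T) as the
sum of a(s) over the simulated event times s, which is exactly the compound-Poisson form the problem asks you to
identify. The choice a(s) = cos(s) is arbitrary, just some bounded function.

    import numpy as np

    rng = np.random.default_rng(2)
    T = 2.0
    lam_max = 2 * T ** 2                         # upper bound for the intensity 2 s^2 on [0, T]

    def a(s):
        return np.cos(s)                         # an arbitrary bounded function, for illustration

    # Thinning: homogeneous Poisson(lam_max) points on [0, T], each kept with probability 2 s^2 / lam_max
    n = rng.poisson(lam_max * T)
    s = np.sort(rng.uniform(0.0, T, size=n))
    points = s[rng.random(n) < (2 * s ** 2) / lam_max]

    Z = a(points).sum()                          # Z(T) = integral of a(s) dN(s) = sum over event times
    print("number of events:", points.size, " Z(T):", Z)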
(II). Suppose that N(E) is a homogeneous Poisson(λ)
process on the positive quadrant of the plane, with coordinates t_1 ≥ 0,
t_2 ≥ 0. Show that the processes M_1(t) = N({(t_1, t_2) : 0 ≤ t_1 ≤ t_2 ≤ t}) and M_2(t) = N({(t_1, t_2) : 0 ≤
t_1 ≤ t, 0 ≤ t_2 ≤ 1}) are (possibly nonhomogeneous) Poisson processes, and give their rate functions.
(III). Suppose that Y(t) = ∑_{k=1}^{N(t)} X_k is a compound Poisson process, where the X_k are i.i.d. with probability mass function p(0) = 0.2, p(1) = 0.5, p(2) = 0.3. Find a number r ≠ 1 such that r^{Y(t)} is a martingale with mean 1. (This number r should not depend on the rate λ of the Poisson process N(t).)
(IV). A gambling game is conducted by giving a player an initial stake of $10 and performing a series of Bernoulli(0.4) trials, each of which adds $1 to his fortune with probability 0.4 and subtracts $1 from it with the remaining probability 0.6. The game ends the first time that the player's fortune hits $15 or $0, whichever comes first. What is the expected fortune at the end?
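A quick simulation cross-check for (IV) (again only an illustration, using the stated success probability 0.4 and the
$0 and $15 absorbing barriers):

    import numpy as np

    rng = np.random.default_rng(3)

    def final_fortune(start=10, lo=0, hi=15, p=0.4):
        x = start
        while lo < x < hi:
            x += 1 if rng.random() < p else -1   # win $1 with probability p, else lose $1
        return x

    results = np.array([final_fortune() for _ in range(100_000)])
    print("estimated expected final fortune:", results.mean())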