CMPSC 497: Advanced Algorithms
Notice: Type your answers using LaTeX and make sure to upload the answer file on Gradescope before the deadline.
For more details and instructions, read the syllabus.
Problem 1. Stationarity
(a) A transition matrix P is symmetric if P(x,y) = P(y,x) for all x,y∈Ω. Show that if P is symmetric, then the uniform
distribution on Ω is stationary for P.
(b) A transition matrix P is doubly stochastic if the sum of each of its rows and columns is 1. Show that if P is doubly
stochastic, then the uniform distribution on Ω is stationary for P.
(c) Let P be a transition matrix for a Markov chain with state space Ω. The matrix P is reversible with respect to the
probability distribution π over Ω if for all x,y ∈ Ω: π(x)P(x,y) = π(y)P(y,x). Show that if P is reversible with respect
to π, then π is a stationary distribution.
(d) Show that the transition matrix P² corresponding to two steps of the chain is also reversible with respect to π.
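As a quick numerical sanity check (not a proof) of parts (a) and (b), one can verify on a small example that a symmetric transition matrix leaves the uniform distribution fixed; the 3×3 matrix below is an arbitrary illustrative choice:

```python
# Sanity check for (a)/(b): for a symmetric (hence doubly stochastic)
# transition matrix P, the uniform distribution satisfies pi P = pi.
# The matrix below is an arbitrary illustrative example.

P = [
    [0.4, 0.3, 0.3],
    [0.3, 0.4, 0.3],
    [0.3, 0.3, 0.4],
]
n = len(P)
pi = [1.0 / n] * n  # uniform distribution on a 3-state Omega

# (pi P)(y) = sum over x of pi(x) * P(x, y)
pi_P = [sum(pi[x] * P[x][y] for x in range(n)) for y in range(n)]
assert all(abs(p - 1.0 / n) < 1e-12 for p in pi_P)  # pi P equals pi
```

The check succeeds because every column of a doubly stochastic matrix sums to 1, which is exactly the mechanism the proof in (b) should exploit.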
Problem 2. Card Shuffling
Consider the following variation on shuffling for a deck of n cards. At each step, two specific cards are chosen
uniformly at random from the deck, and their positions are exchanged. (It is possible both choices give the same card,
in which case no change occurs.)
(a) Argue that the following is an equivalent process: at each step, a specific card is chosen uniformly at random from
the deck, and a position i from [1,n] is chosen uniformly at random; then the card at position i exchanges positions
with the specific card chosen.
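To make the comparison in (a) concrete, here is a minimal Python sketch of both update rules (function names are ours, and the sketch only illustrates the dynamics; the distributional equivalence is what part (a) asks you to argue):

```python
import random

def step_two_cards(deck, rng):
    # Original process: choose two cards uniformly at random (possibly
    # the same card twice) and exchange their positions.
    a, b = rng.choice(deck), rng.choice(deck)
    i, j = deck.index(a), deck.index(b)
    deck[i], deck[j] = deck[j], deck[i]

def step_card_and_position(deck, rng):
    # Process from (a): choose a card uniformly and a position i
    # uniformly; the chosen card swaps with the card at position i.
    c = rng.choice(deck)
    i = rng.randrange(len(deck))
    j = deck.index(c)
    deck[i], deck[j] = deck[j], deck[i]

rng = random.Random(0)
deck = list(range(10))
for _ in range(100):
    step_two_cards(deck, rng)
    step_card_and_position(deck, rng)
assert sorted(deck) == list(range(10))  # each step preserves the deck
```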
(b) Consider the coupling where the two choices of card and position are the same for both copies of the chain. Let
Xt be the number of cards whose positions are different in the two copies of the chain. Show that Xt is non-increasing
over time.
(c) Show that:
Pr[Xt+1 ≤ Xt − 1 | Xt > 0] ≥ (Xt/n)².
(d) Argue that the expected time until Xt is 0 is O(n²) regardless of the starting state of the two chains.
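The coupling in (b) is easy to simulate; the following sketch (assuming cards are labeled 0..n−1) checks the claimed monotonicity of Xt empirically:

```python
import random

def coupled_step(deck_x, deck_y, rng):
    # Coupling from (b): both chains use the SAME card label c and
    # the SAME position i in each step.
    n = len(deck_x)
    c = rng.randrange(n)   # card label, assuming cards are 0..n-1
    i = rng.randrange(n)
    for deck in (deck_x, deck_y):
        j = deck.index(c)
        deck[i], deck[j] = deck[j], deck[i]

def disagreements(deck_x, deck_y):
    # X_t: number of positions holding different cards in the two copies.
    return sum(a != b for a, b in zip(deck_x, deck_y))

rng = random.Random(0)
n = 10
x = list(range(n))
y = list(range(n))
rng.shuffle(y)
history = [disagreements(x, y)]
for _ in range(1000):
    coupled_step(x, y, rng)
    history.append(disagreements(x, y))
# Empirically, X_t never increases, as claim (b) asserts.
assert all(b <= a for a, b in zip(history, history[1:]))
```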
Problem 3. The hard-core model: weighted independent sets
Consider the set Ω of independent sets of a graph G = (V,E). Suppose that each independent set I of G is assigned
probability:
µ(I) = λ^|I| / Z,
where λ is a model parameter and Z = ∑_{I∈Ω} λ^|I| is the normalizing constant. (This distribution is known as the
hard-core model in statistical physics.)
Consider now the following Markov chain to sample from µ. Given an independent set It at time t, one step of the
chain is as follows:
1. Pick a vertex v uniformly at random.
2. If any neighbor of v is in It, do nothing (set It+1 = It).
3. Otherwise, set It+1 = It ∪{v} with probability λ/(1+λ) and It+1 = It \{v} with probability 1/(1+λ).
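One step of this chain can be sketched in a few lines of Python (the adjacency-list representation and the choice λ = 1 in the demo are illustrative assumptions):

```python
import random

def hardcore_step(I, adj, lam, rng):
    # One step of the chain above; I is the current independent set,
    # adj[v] lists the neighbors of vertex v.
    n = len(adj)
    v = rng.randrange(n)                 # 1. pick a vertex uniformly
    if any(u in I for u in adj[v]):      # 2. a neighbor of v is in I_t:
        return I                         #    do nothing
    if rng.random() < lam / (1 + lam):   # 3. add v w.p. lambda/(1+lambda),
        return I | {v}
    return I - {v}                       #    else drop v w.p. 1/(1+lambda)

# Demo on the path 0-1-2 with lambda = 1 (arbitrary choices).
adj = [[1], [0, 2], [1]]
rng = random.Random(0)
I = set()
for _ in range(500):
    I = hardcore_step(I, adj, 1.0, rng)
    # Invariant: I_t is always an independent set.
    assert not any(u in I for w in I for u in adj[w])
```

Note that step 2 guarantees the chain never leaves Ω: a vertex is only added when none of its neighbors is currently in the set.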
(a) Prove that this Markov chain is reversible with respect to µ and thus converges to it.
(b) Using the coupling method, show that when λ < 1/(∆−1) the mixing time of this chain is O(n log n), where ∆ is
the maximum degree of G.
Problem 4. Conductance for random walks on graphs
(a) Consider the lazy random walk on the complete graph on n vertices Kn. Show that its conductance is Θ(1).
(b) Let K2n/3 = (V1,E1) and Kn/3 = (V2,E2) be the cliques (i.e., complete graphs) with 2n/3 and n/3 vertices respec-
tively. Let G = (V,E) be the n-vertex connected graph that results from joining K2n/3 and Kn/3 with a single edge
{u,v}. Formally, V = V1 ∪V2 and E = E1 ∪E2 ∪{{u,v}}, where u ∈ V1 and v ∈ V2. Show that the conductance of the
random walk on G is Θ(n⁻²).
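For small n, part (a) can be checked by brute force. The sketch below assumes the bottleneck-ratio definition of conductance, Φ = min over S with π(S) ≤ 1/2 of Q(S, Sᶜ)/π(S) with Q(x,y) = π(x)P(x,y); if your course uses a different normalization, the constant changes but not the Θ(1) conclusion:

```python
from itertools import combinations

def lazy_Kn_conductance(n):
    # Brute-force bottleneck ratio of the lazy random walk on K_n.
    # Lazy walk on K_n: P(x, x) = 1/2, P(x, y) = 1/(2(n-1)) for y != x,
    # and the stationary distribution pi is uniform, pi(x) = 1/n.
    best = float("inf")
    for k in range(1, n // 2 + 1):          # pi(S) = k/n <= 1/2
        for S in combinations(range(n), k):
            S = set(S)
            # Q(S, S^c) = sum over x in S, y not in S of pi(x) P(x, y)
            Q = sum((1.0 / n) * (1.0 / (2 * (n - 1)))
                    for x in S for y in range(n) if y not in S)
            best = min(best, Q / (k / n))
    return best

phi = lazy_Kn_conductance(6)  # (n - n/2) / (2(n-1)) = 0.3 for n = 6
```

For a set S with |S| = k, the ratio works out to (n−k)/(2(n−1)), so the minimum over π(S) ≤ 1/2 is attained at k = ⌊n/2⌋ and stays bounded below by roughly 1/4, consistent with Θ(1).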
Problem 5. Summing sets
Consider two sets A and B of integers in the [0,n] range. Let A+B= {a+b | a ∈ A,b ∈ B}. That is, A+B contains all
numbers c that can be obtained as the sum of an element of A and an element of B. For example, {1,3}+{3,4} =
{4,5,6,7}. Note that the values of A+B can be computed in O(n²) time with a brute-force algorithm. Design an
algorithm that finds the elements of A+B in O(n log n) time.
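As a correctness reference for the faster algorithm the problem asks for, the brute-force computation mentioned above can be written directly (a minimal sketch; the function name is ours):

```python
def sumset_bruteforce(A, B):
    # O(|A| * |B|) = O(n^2) reference implementation of A + B; useful
    # as a checker for a faster (e.g. convolution-based) solution.
    return sorted({a + b for a in A for b in B})

assert sumset_bruteforce({1, 3}, {3, 4}) == [4, 5, 6, 7]  # example above
```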