This is the final episode of coin flipping problems (previous discussions can be found here and here). Last time we ended with the expected number of tosses until a sequence such as HH (or other combinations like HT or THH) appears for the first time. Now we consider games involving H and T combinations. The first type of game concerns the probability that combination A appears before combination B while flipping an unbiased coin.
CT08
Keep flipping a fair coin. What is the probability that the sequence HTT appears before HHT does?
CT08 - Answer:
This can be solved by setting up the problem as a Markov chain and computing the absorption probabilities. Alternatively, let A be the event that HTT appears before HHT, and condition on the first 3 flips:
P(A)
= 1/8 * [P(A|HHH) + P(A|HHT) + P(A|HTH) + P(A|HTT) + P(A|THH) + P(A|THT) + P(A|TTH) + P(A|TTT)]
Now it's time to simplify terms:
P(A|HHH) = 0 (once HH has appeared, the first T that follows completes HHT, so HTT cannot come first)
P(A|HHT) = 0 (HHT has already appeared)
P(A|HTH) = P(A|H) (only the trailing H matters; the earlier flips can no longer complete either pattern)
P(A|HTT) = 1
P(A|T) = P(A) (a leading T helps neither pattern, so the game effectively restarts), and since P(A) = 1/2 * P(A|H) + 1/2 * P(A|T), it follows that P(A|H) = P(A) as well. The four T-prefixed terms therefore collapse into a single 1/2 * P(A) term.
So
P(A) = 1/8 * 0 + 1/8 * 0 + 1/8 * P(A) + 1/8 * 1 + 1/2 * P(A)
=> (1 - 1/8 - 1/2) * P(A) = 1/8 => P(A) = 1/3
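For readers who like to verify such results numerically, here is a minimal Monte Carlo sketch in Python (the function name and trial count are arbitrary choices, not part of the original solution): it flips a fair coin until one of the two patterns shows up and estimates the probability that HTT comes first, which should land close to 1/3.

import random

def first_pattern(pattern_a="HTT", pattern_b="HHT"):
    # Flip a fair coin until one of the two patterns appears; return the winner.
    flips = ""
    while True:
        flips += random.choice("HT")
        if flips.endswith(pattern_a):
            return pattern_a
        if flips.endswith(pattern_b):
            return pattern_b

trials = 100000
wins = sum(first_pattern() == "HTT" for _ in range(trials))
print(wins / trials)   # roughly 0.333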
The second type of game is a little different from the previous case:
CT09
Two players take turns tossing a fair coin. When the sequence HT shows up for the first time, the player who flipped the T wins. What is each player's probability of winning?
CT09 - Answer:
Once again conditional probability saves the day. Let A = First player wins,
P(A) = 0.5 * P(A|H) + 0.5 * P(A|T)
But P(A|T) = P(B) = 1 - P(A), where B = second player wins (a leading T is useless to both players, so "the second player becomes the first" if the first toss gives a T).
Also, after an initial H the second player flips next: a T completes HT and wins the game for the second player, so P(A|HT) = 0, while an H simply swaps the two players' positions, so P(A|HH) = 1 - P(A|H). Hence
P(A|H) = 0.5 * P(A|HT) + 0.5 * P(A|HH) = 0.5 * 0 + 0.5 * (1 - P(A|H))
=> P(A|H) = 1/3
So
P(A) = 0.5 * 1/3 + 0.5 * (1 - P(A))
=> 1.5 * P(A) = 2/3 => P(A) = 4/9
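Again, a quick simulation can back up the answer (a rough Python sketch; the function name and trial count are illustrative): players alternate flips, and whoever flips the T that completes the first HT wins. The estimated winning probability for the first player should be close to 4/9 ≈ 0.444.

import random

def first_player_wins():
    # Alternate flips between player 0 (flips first) and player 1;
    # the player whose T completes the first HT wins.
    player = 0
    last = None
    while True:
        flip = random.choice("HT")
        if last == "H" and flip == "T":
            return player == 0
        last = flip
        player = 1 - player

trials = 100000
wins = sum(first_player_wins() for _ in range(trials))
print(wins / trials)   # roughly 0.444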
See here for more interview questions/brainteasers