The first part is the classic coin flipping question: For a fair coin, what is the expected number of trials in order to get the first Head? And the answer is 2 (see here for discussion and related problems).
Now, put the coin aside and consider instead a deck of 52 cards. What is the expected number of trials in order to get the first red card? Would it be less than, equal to, or larger than 2? In the coin case, one way to get the answer is to do the infinite sum
$$ E[H] = \sum_{i=1}^{\infty} \frac{1}{2^i} i $$
Now, along the same line we can have a finite series
$$ E[Red] = \sum_{i=1}^{52} P_i i$$
The only complication is that here the $P_i$'s are uglier. Writing down the first few terms, we see
$$ E[Red] = \frac {26}{52}1 + \frac {26}{52} \frac {26}{51}2 + \frac {26}{52} \frac {25}{51} \frac {26}{50}3 + \sum_{i=4}^{52} P_i i$$
As you can see, some of the fractions are greater than 1/2 and some are less. Can we easily conclude if the finite sum is less than or greater than 2?
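We don't have to grind through the algebra by hand: the finite sum can be evaluated exactly with rational arithmetic. A minimal sketch (the function name and defaults are just for illustration):

```python
from fractions import Fraction

def expected_first_red(reds=26, blacks=26):
    """E[position of first red] = sum_i i * P(first red at draw i), exactly."""
    total = reds + blacks
    e = Fraction(0)
    all_black_so_far = Fraction(1)  # P(first i-1 draws are all black)
    for i in range(1, blacks + 2):  # the first red appears at draw 1 .. blacks+1
        p_i = all_black_so_far * Fraction(reds, total - (i - 1))
        e += i * p_i
        if i <= blacks:
            all_black_so_far *= Fraction(blacks - (i - 1), total - (i - 1))
    return e

print(expected_first_red())  # 53/27, about 1.963 -- strictly less than 2
```

So the finite sum comes out below 2, matching the intuition that a finite supply of black cards forces a red card to appear sooner than in the memoryless coin case.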
See here for more interview questions/brainteasers
Friday, October 11, 2013
Tuesday, January 8, 2013
More Coin Problems
CT10
What is P(# of H out of 4 tosses >= # of H out of 5 tosses)?
CT10 - Answer:
1/2 (symmetry consideration)
CT11
Flip n unbiased coins, n even. What is P(# of H = n/2)?
CT11 - Answer:
C(n, n/2) / 2^n
CT12
Toss 4 coins and win the number of H. There is an option to re-toss. What is the value of the game?
CT12 - Answer:
([1 4 6 4 1] * [2 2 2 3 4]') / 16 = $2.375 (dynamic programming/backward induction)
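The backward induction is short enough to check directly; a sketch, assuming the option re-tosses all four coins and the re-toss cannot itself be repeated:

```python
from math import comb

# A fresh toss of 4 fair coins is worth 2 in expectation, so on seeing k heads
# you keep k if k > 2 and re-toss otherwise: value = E[max(k, 2)].
value = sum(comb(4, k) * max(k, 2) for k in range(5)) / 2**4
print(value)  # 2.375
```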
CT13
What is the expected product of (# of H) * (# of T) when tossing 10 fair coins? (Hint: the answer is not 25)
CT13 - Answer:
22.5 (E[(# of H) * (10 - # of H)])
CT14
Keep flipping a fair coin. What is the probability that the sequence HHT appears before THH does?
CT14 - Answer:
1/4 (Once you have T, THH would always precede HHT; likewise, once you have HH, HHT would always precede THH)
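A quick Monte Carlo sanity check of the 1/4 (simulation only; the seed is arbitrary):

```python
import random

random.seed(0)

def hht_before_thh():
    s = ""
    while True:
        s += random.choice("HT")
        if s.endswith("HHT"):
            return True
        if s.endswith("THH"):
            return False

trials = 100_000
p = sum(hht_before_thh() for _ in range(trials)) / trials
print(p)  # close to 0.25
```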
CT15
Toss 4 coins and win the number of H. There is an option to re-toss one coin. What is the value of the game?
CT15 - Answer:
2 + 0.5 * 15/16 = $2.469 (dynamic programming/backward induction)
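The same check works for the one-coin re-toss, assuming the optimal policy of re-tossing a Tail whenever one exists (which adds 0.5 in expectation):

```python
from math import comb

# Unless all four coins land heads (prob 1/16), re-tossing one Tail adds 0.5.
value = sum(comb(4, k) / 2**4 * (k + (0.5 if k < 4 else 0.0)) for k in range(5))
print(value)  # 2.46875, i.e. 2 + 0.5 * 15/16
```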
See here for more interview questions/brainteasers
Tuesday, November 27, 2012
Those damn coins... (Part III)
This is the final episode of coin flipping problems (previous discussions can be found here and here). Last time we ended with the expected number of tosses until a sequence such as HH (or HT, or THH) appears for the first time. Now we consider playing games involving H and T combinations. The first type of game concerns the probability of combination A appearing before B does while flipping an unbiased coin.
CT08
Keep flipping a fair coin. What is the probability that the sequence HTT appears before HHT does?
CT08 - Answer:
This can be solved by setting up the problem as a Markov chain and calculating the absorption probabilities. Alternatively, letting A = HTT appears before HHT, we can consider the conditional probabilities conditioning on the first 3 flips:
P(A) = (1/8) [ P(A|HHH) + P(A|HHT) + P(A|HTH) + P(A|HTT) + P(A|THH) + P(A|THT) + P(A|TTH) + P(A|TTT) ]
Now it's time to simplify terms:
P(A|HHH) = 0
P(A|HHT) = 0
P(A|HTH) = P(A|H)
P(A|HTT) = 1
P(A|T) = P(A) (a leading T is a prefix of neither sequence), and then P(A) = 0.5 * P(A|H) + 0.5 * P(A|T) forces P(A|H) = P(A) as well
So
P(A) = 1/8 * 0 + 1/8 * 0 + 1/8 * P(A) + 1/8 * 1 + 1/2 * P(A)
=> P(A)= 1/3
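The last step is just a linear equation in P(A); solving it exactly:

```python
from fractions import Fraction

# P(A) = 1/8 * P(A) + 1/8 + 1/2 * P(A)  =>  P(A) * (1 - 1/8 - 1/2) = 1/8
P = Fraction(1, 8) / (1 - Fraction(1, 8) - Fraction(1, 2))
print(P)  # 1/3
```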
The second type of game is a little different from the previous case:
CT09
Two players take turns tossing a fair coin. When the sequence HT shows up for the first time, the person who flipped the T wins. What is each player's probability of winning?
CT09 - Answer:
Once again conditional probability saves the day. Let A = First player wins,
P(A) = 0.5 * P(A|H) + 0.5 * P(A|T)
But P(A|T) = P(B) = 1 - P(A) (because "the second player becomes the first" if the first toss gives a T)
Also, P(A|H) = 0.5 * P(A|HT) + 0.5 * P(A|HH) = 0.5 * 0 + 0.5 * (1 - P(A|H))
=> P(A|H) = 1/3
So
P(A) = 0.5 * 1/3 + 0.5 * (1 - P(A))
P(A) = 4/9, so the second player wins with probability 5/9
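A Monte Carlo check of the 4/9 (simulation only; the seed is arbitrary):

```python
import random

random.seed(1)

def first_player_wins():
    last = None
    current = 0  # 0 = first player, 1 = second player
    while True:
        flip = random.choice("HT")
        if last == "H" and flip == "T":
            return current == 0  # whoever flipped this T wins
        last = flip
        current ^= 1

trials = 100_000
p = sum(first_player_wins() for _ in range(trials)) / trials
print(p)  # close to 4/9
```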
See here for more interview questions/brainteasers
Tuesday, November 6, 2012
Those damn coins... (Part II)
Last time we saw a number of coin-tossing related interview questions. In this post we look at two types of more advanced coin-related problems:
1) (possibly) unfair coins with odds drawn from a continuous distribution, and
2) games involving trying to obtain a specific sequence of Heads and Tails
1) Coins with odds drawn from a continuous distribution
If we cannot be sure about the odds of the coin, there is an extra degree of randomness.
CT05
Suppose you have an unfair coin with P(Tail) = p, where p is an unknown constant. What is the probability distribution of N, the number of the toss on which the first Head appears, and what is the expectation of N?
CT05 - Answer:
The distribution is simply the geometric distribution: P(N = n) = p^(n-1) (1-p).
The expectation value of N is E[N] = 1/(1-p), which can be found by
E[N] = Sum_(i=1)^(inf) i*p^(i-1) (1-p)
Noting that
i*p^(i-1) = (d/dp)p^i
Using this and exchanging the order of summation and differentiation, we get
E[N] = 1/(1-p)
Meanwhile, if the odds of Head itself is drawn from a known distribution, then we can compute the desired expected values with integrals:
CT06
Suppose you have an unfair coin, with probability of a head p ~ uniform(0,1). What is the probability of getting 2 (or n) heads with 2 (or n) tosses?
CT06 - Answer:
If the coin is fair, then P(2 Heads) = 1/4. But now we would have to integrate
\int_0^1 p^n U(0,1) dp = 1/(n+1)
So for 2 tosses P(2 Heads) = 1/3.
Note that while for a fair coin with n tosses the probabilities of obtaining {0,1,...,n} Heads are proportional to the binomial coefficients, here the probabilities of obtaining {0,1,...,n} Heads are all identical (each equal to 1/(n+1)).
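The "all head counts are equally likely" claim can be verified exactly: integrating the binomial likelihood against the uniform prior gives a Beta integral. A sketch:

```python
from fractions import Fraction
from math import comb, factorial

def p_heads(n, k):
    # P(k heads in n tosses) = C(n,k) * Integral_0^1 p^k (1-p)^(n-k) dp
    #                        = C(n,k) * k! (n-k)! / (n+1)!   (a Beta integral)
    return comb(n, k) * Fraction(factorial(k) * factorial(n - k), factorial(n + 1))

n = 5
probs = [p_heads(n, k) for k in range(n + 1)]
print(probs)  # six copies of 1/6
```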
2) Games involving trying to obtain a specific sequence of Heads and Tails
This is yet another popular genre of interview questions, possibly because the questions can be varied easily (for example, by modifying the desired sequence of H and T). There are two sub-types: i) what is the probability that sequence A shows up before sequence B does? and ii) what is the probability that sequence A is obtained during the first/second player's turn? (assuming that the two take turns) The two sub-types are solved in different ways. Let's start with the expected number of tosses for obtaining a particular sequence.
CT07
Keep flipping a fair coin. What is the expected number of tosses until the sequence HH appears for the first time? What about HT and HHH?
CT07 - Answer:
One can take the most general route and solve this as a Markov chain problem. For short sequences, especially repeating sequences (e.g. HH, TT, HHH), writing down a recursive formula is faster. Let N(A) be the expected number of tosses to obtain the sequence A. Then
N(T) = N(H) = 2
N(HH) = N(H) + 0.5 * 1 + 0.5 * (1 + N(HH)) => N(HH) = 6
The recursive formula can be understood as the two possible outcomes upon obtaining the first H: either another H immediately, in which case the sequence HH is complete, or a T, in which case one has to start over from scratch, hence the N(HH) on the right hand side. It is not hard to prove by induction that, for B = k consecutive H,
N(B) = 2^(k+1) - 2
is a closed-form solution. For general sequences, the recursive formula approach still works, though you need the expected number of tosses for the sub-sequences. For instance:
N(HT) = N(H) + 0.5 * 1 + 0.5 * (1 + N(T))
N(HTH) = N(HT) + 0.5 * 1 + 0.5 * (1 + N(HTH))
N(HHT) = N(HH) + 0.5 * 1 + 0.5 * (1 + N(T))
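Plugging through the recursions numerically (with the HTH recursion restarting from scratch after an HTT, since HTT leaves no useful prefix):

```python
N_H = N_T = 2.0
# N(HH) = N(H) + 0.5*1 + 0.5*(1 + N(HH))   =>  0.5 * N(HH) = N(H) + 1
N_HH = (N_H + 1.0) / 0.5
N_HT = N_H + 0.5 * 1 + 0.5 * (1 + N_T)
# N(HTH) = N(HT) + 0.5*1 + 0.5*(1 + N(HTH)) => 0.5 * N(HTH) = N(HT) + 1
N_HTH = (N_HT + 1.0) / 0.5
N_HHT = N_HH + 0.5 * 1 + 0.5 * (1 + N_T)
print(N_HH, N_HT, N_HTH, N_HHT)  # 6.0 4.0 10.0 8.0
```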
What if we want to know the probability that sequence A would show up before sequence B does? We'll discuss that next time.
See here for more interview questions/brainteasers
Monday, October 15, 2012
Those damn coins...
Coin tossing (or coin flipping) is one of the popular settings for probability questions. It might be of interest to contrast coin tossing and dice rolling:
- Coin tossing follows a binomial distribution (which converges to Gaussian); die rolling follows a uniform distribution.
- Tossing m coins gives an expectation value of m/2; rolling an m-sided die gives an expectation value of (m+1)/2.
Because of the binomial distribution, coin tossing problems usually involve more arithmetic than dice rolling. The most straightforward kind of coin problem simply asks for the mean and variance of the number of Heads or Tails:
CT01
What is the expected gain if you are paid $1 for each head in tossing 4 fair coins? What is the standard deviation?
CT01 - Answer:
Remember that in n binary trials with single success probability p, the mean is np and the variance is np(1-p). So the expected gain is $2 and the standard deviation is $1.
Sometimes the question is framed as a game:
CT02
What is the probability that the number of Heads is greater than or equal to 2 when tossing 4 fair coins?
CT02 - Answer:
The binomial coefficients for 4 trials (row 4 of Pascal's triangle) are {1,4,6,4,1}. Hence the required probability is (6+4+1)/16 = 11/16.
Once the number of tosses becomes large, the normal approximation (via the Central Limit Theorem) is handy.
CT03
What is the probability that there are more than 60 Heads out of tossing 100 coins?
CT03 - Answer:
The last thing you want to do is calculate the binomial coefficients explicitly. Instead, observe that the distribution converges to a Gaussian as the number of trials grows, with mean equal to 50 and variance equal to 25 (hence a standard deviation of 5). 60 is two standard deviations above the mean, and the 68-95-99.7 rule gives (1 - 95%)/2 = 2.5%.
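For comparison, the exact tail is cheap to compute (here reading "more than 60" strictly, i.e. at least 61 heads); the 2.5% rule of thumb overshoots slightly, mainly because it skips the continuity correction:

```python
from math import comb

# Exact P(at least 61 heads in 100 fair tosses)
p_exact = sum(comb(100, k) for k in range(61, 101)) / 2**100
print(p_exact)  # roughly 0.018
```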
Another way to ask coin-tossing brainteasers is to play with Bayesian statistics.
CT04
There are three coins: one with an H side and a T side (coin A); one with both sides T (coin B); one with both sides H (coin C). If you draw one of them at random, toss it, and observe a Head, what is the conditional probability that coin C was picked?
CT04 - Answer:
This is a typical Bayesian inference question. Let X = [coin C is picked] and Y = [the toss gives an H], then
P(X|Y) = P(Y|X)*P(X)/P(Y) = 1 * (1/3) / (1/3*1/2 + 1/3*0 + 1/3*1) = 2/3
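The arithmetic, spelled out with the coin labels as in the question:

```python
# Priors are 1/3 each; the likelihood of a Head is 1/2 for A, 0 for B, 1 for C.
prior = {"A": 1 / 3, "B": 1 / 3, "C": 1 / 3}
likelihood = {"A": 0.5, "B": 0.0, "C": 1.0}
evidence = sum(prior[c] * likelihood[c] for c in prior)  # P(Head) = 1/2
posterior_C = prior["C"] * likelihood["C"] / evidence
print(posterior_C)  # 2/3
```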
Hope you find this helpful. More to come!
See here for more interview questions/brainteasers
Friday, June 10, 2011
Quant Interview Questions (Model Validation 2)
1) Consider the SDE dr = a r dt + b dW. What is the solution for r?
Ans:
This is a linear SDE (an Ornstein-Uhlenbeck-type process). Try applying Itô's lemma to $ f \equiv r_t e^{-at} $.
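Carrying the hint through (this is the standard solution of a linear SDE of this form):

$$ d\left(r_t e^{-at}\right) = e^{-at}\,dr_t - a r_t e^{-at}\,dt = b\, e^{-at}\,dW_t $$

Integrating from 0 to t and multiplying back by $e^{at}$:

$$ r_t = r_0 e^{at} + b \int_0^t e^{a(t-s)}\,dW_s $$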
2) Toss a fair coin 100 times. What is (approximately) the probability Pr(# of H >= 60)?
Ans:
Consider the binomial distribution
n C k*p^k*(1-p)^(n-k)
The mean of the distribution is n*p and the variance is n*p*(1-p). When n is large, the binomial distribution converges to a normal distribution with mean n*p and variance n*p*(1-p). Hence the standard deviation here is 5, and H >= 60 is two standard deviations above the mean. The answer is therefore (1 - 95.4%)/2 = 2.3%.
3) What is a copula and how can it be used for modeling correlated processes?
Ans:
Let F_X and G_Y be the cdfs of random variables X and Y; they are the marginal distributions of the bivariate distribution H_XY. Note that U = F_X(X) and V = G_Y(Y) are both uniformly distributed on (0,1) (because F_X(X) is the percentile transform). Sklar's Theorem states that there exists a copula function C such that
H_XY = C(F_X, G_Y)
In other words, we can construct the joint distribution using the marginal distributions.
Reference: http://en.wikipedia.org/wiki/Copula_%28statistics%29
4) Why is holding the equity piece of a CDO a long position in default correlation?
Ans:
Holding the senior tranche is a short position in correlation; that part is easy to understand, because the senior tranche only suffers losses when many assets in the pool default together in a bad credit environment, which is more likely under high correlation. The equity piece, at first glance, does not seem to benefit from high correlation. The way to see it is to consider a good credit environment: there, high correlation makes it likely that no asset in the pool defaults at all, which is exactly the scenario the equity holder needs.
5) Suppose x is a continuous observable variable, Y* is a binary observable variable, and Y is a latent variable. The following relations hold:
Y = b x + e, e ~ N(0,1)
Y* = 1 (if Y >= 0) or 0 (if Y < 0)
Suppose we have a set of data (x, Y*). How can we calibrate the model (i.e., fit b)?
Ans:
This problem can be solved by MLE. Since e ~ N(0,1), we have Y ~ N(b x, 1), so the probability of Y* = 1 (equivalently Y >= 0) is Phi(b x), the standard normal cdf. Each data point (x_i, Y*_i) contributes Phi(b x_i) if Y*_i = 1 and 1 - Phi(b x_i) if Y*_i = 0. The likelihood function is therefore
Product_i Phi(b x_i)^(Y*_i) * (1 - Phi(b x_i))^(1 - Y*_i)
and we fit b by maximizing it (this is the standard probit model).
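A minimal sketch of the calibration on synthetic data; the true slope, sample size, and grid search are made up for illustration (a proper optimizer would replace the grid):

```python
import math
import random

random.seed(0)

def norm_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Synthetic data from the latent model Y = b*x + e, Y* = 1{Y >= 0}
b_true = 1.5
data = []
for _ in range(2000):
    x = random.gauss(0.0, 1.0)
    y = b_true * x + random.gauss(0.0, 1.0)
    data.append((x, 1 if y >= 0 else 0))

def neg_loglik(b):
    ll = 0.0
    for x, ystar in data:
        p = min(max(norm_cdf(b * x), 1e-12), 1.0 - 1e-12)
        ll += math.log(p) if ystar else math.log(1.0 - p)
    return -ll

# Crude grid search over b in (0, 3]
b_hat = min((b_i / 100.0 for b_i in range(1, 301)), key=neg_loglik)
print(b_hat)  # lands near b_true = 1.5
```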
6) A, B and C are playing a game with gemstones. Each of them starts with 5 stones of different colors (blue, red, green, yellow, white). There are 3 rounds. In the first round, A picks one stone at random from B and one from C; in the second round B picks one stone at random from C and one from A; in the third round C picks one stone at random from A and one from B. What is the probability that at the end of the game each person again holds 5 stones of 5 different colors?
Ans:
0.015873 (MC verification)
Edit: Thanks to M Millar, the right answer should be 0.0413. The tree looks like
-- C gets the right stones (1/9)
-- B takes original from A (3/7) --
| -- C fails (dead)
-- A draws same color (1/5) --
| -- B fails to take original from A (dead)
--
| -- B takes original from A (2/7) -- C gets the right stones (1/9)
| |
| | -- B takes the right color from C (1/4) -- C gets the right stones (1/9)
-- A draws diff colors (4/5) ---- B takes the other duplicated color from A (2/7) --
| -- B takes the wrong color from C (dead)
-- B removes one of the three lone colors (dead)
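The "MC verification" mentioned above is easy to reproduce; a sketch, representing stones purely by their colors 0-4:

```python
import random

random.seed(0)

def all_rainbow_after_game():
    hands = [list(range(5)) for _ in range(3)]  # A, B, C each hold colors 0..4
    # Round 1: A draws from B and C; round 2: B from C and A; round 3: C from A and B.
    for taker, givers in [(0, (1, 2)), (1, (2, 0)), (2, (0, 1))]:
        for g in givers:
            hands[taker].append(hands[g].pop(random.randrange(len(hands[g]))))
    return all(sorted(h) == [0, 1, 2, 3, 4] for h in hands)

trials = 200_000
p = sum(all_rainbow_after_game() for _ in range(trials)) / trials
print(p)  # close to 0.0413
```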
See here for more interview questions/brainteasers
Thursday, June 9, 2011
Quant Interview Questions (Interest Rate Strat.)
1)
Suppose you have a fair coin. How do you create an event A such that Pr(A) = 1/3?
Ans:
Toss the coin twice, producing the sample space {HH, HT, TH, TT}. Discard one outcome, say TT: if it occurs, toss twice again. The three remaining outcomes are equally likely, so each occurs with probability 1/3 and any one of them can serve as the event A.
Note: This recipe can be extended to produce an event with probability m/n, for any integers m and n.
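A quick rejection-sampling check of the recipe (simulation only; the seed is arbitrary):

```python
import random

random.seed(0)

def event_A():
    # Toss twice; on the discarded outcome TT, start over.
    while True:
        pair = random.choice("HT") + random.choice("HT")
        if pair != "TT":
            return pair == "HH"  # A = {HH}, one of the three surviving outcomes

trials = 90_000
p = sum(event_A() for _ in range(trials)) / trials
print(p)  # close to 1/3
```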
2)
Suppose you have a fair coin. How do you create an event A such that Pr(A) = r, r being an irrational number between 0 and 1?
Ans:
Translate r into a binary string; for example, 0.27314... would be {10}{0111}{011}{01}{100}... We keep tossing the coin until we can tell whether the string formed by the tosses is greater or less than r in binary, with H = 1 and T = 0. For example, the first step is to toss twice. If we get {TH} = {01}, the string formed by the tosses must be less than r in binary; we terminate and conclude that the outcome belongs to A. If we get {HH} = {11}, the string must be greater than r in binary; we terminate and conclude that the outcome does not belong to A. If the two tosses produce {HT} = {10}, we cannot conclude, and move on to the next digit {111}, and so on. (MC verification)
Note: We have to append a zero in front of {111}, {11} and {1} because otherwise we cannot create a greater binary, i.e. 111 is the greatest 3-digit binary number.
See here for more interview questions/brainteasers
Wednesday, June 8, 2011
Quant Interview Questions (Model Validation)
1) Suppose Prob(Tail) = p. What is the probability distribution of N, where N is the number of the toss on which the first Head appears?
Ans:
The (discrete) probability distribution is simply
Pr(N = n) = p^(n-1) (1-p)
What is the expectation value of N?
Ans:
E[N] = Sum_(i=1)^(inf) i*p^(i-1) (1-p)
How to do this sum? Trick:
i*p^(i-1) = (d/dp)p^i
Using this and exchanging the order of summation and differentiation, we get
E[N] = 1/(1-p)
2) Suppose X and Y are two random variables with the same cdf, X, Y > 0 and E[X/Y] = 1. Show that E[X] = E[Y].
Ans:
Cauchy-Schwarz inequality, but how? (Note that identical cdfs already give E[X] = E[Y] whenever the mean exists; the interesting companion fact, for independent X and Y, is that E[X/Y] >= 1, which follows from Cauchy-Schwarz.)
See here for more interview questions/brainteasers
Thursday, March 10, 2011
Interview Question
Suppose you have an unfair coin, with probability of a head p ~ uniform(0,1). What is the probability of getting 2 (or n) heads in 2 (or n) tosses?
Hint: For 2 heads the answer is not 1/4.
Ans: The answer is 1/3 for 2 heads and 1/(n+1) for n heads. Just integrate
\int_0^1 p^n U(0,1) dp = 1/(n+1)
The implication is that under the uniform prior, the number of heads is uniformly distributed: with n tosses, each count in {0, 1, ..., n} occurs with probability 1/(n+1).
See here for more interview questions/brainteasers
Wednesday, November 17, 2010
Combinations of H and T
One of the very popular brain teasers:
What is the mean number of coin tosses to get HH/HT/HHH... etc.?
Approach 1: Consider the absorption states of a Markov chain -> solve a system of simultaneous equations
Approach 2: Recursive formulae. For example, let H = expected number of tosses to get a head, HH = expected number of tosses to get two consecutive heads and so on. Then
H = 0 + 0.5*1 + 0.5*(1+H); T = 0 + 0.5*1 + 0.5*(1+T)
HH = H + 0.5*1 + 0.5*(1+HH)
HT = H + 0.5*1 + 0.5*(1+T)
HTH = HT + 0.5*1 + 0.5*(1+HTH)
HHT = HH + 0.5*1 + 0.5*(1+T)
See here for more interview questions/brainteasers
Thursday, October 28, 2010
Quant Interview Questions 2
1. sigma_A = 0.2, sigma_B = 0.3 and correlation = 0.5. Find the portfolio that has the lowest portfolio sigma.
Ans: 6/7 of A and 1/7 of B
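The minimum-variance weight follows from differentiating the portfolio variance; a quick check:

```python
sigma_a, sigma_b, rho = 0.2, 0.3, 0.5
cov = rho * sigma_a * sigma_b
# w minimizes w^2*sA^2 + (1-w)^2*sB^2 + 2*w*(1-w)*cov
w = (sigma_b**2 - cov) / (sigma_a**2 + sigma_b**2 - 2 * cov)
print(w)  # 6/7 of the portfolio in A
```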
2. Each cereal box contains a piece of toy. There are 4 kinds in total. What is the expected number of boxes you have to buy in order to get the entire collection?
Ans: 8.33333
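The 8.33 comes from the coupon-collector sum n * (1/1 + 1/2 + ... + 1/n); exactly:

```python
from fractions import Fraction

n = 4  # kinds of toys
# Expected boxes = n * (1/1 + 1/2 + ... + 1/n)
e = n * sum(Fraction(1, k) for k in range(1, n + 1))
print(e, float(e))  # 25/3, about 8.333
```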
3. What is the expected number of toss in order to get 3 consecutive H from a fair coin?
Ans: 1 H takes 2 tosses; 2 H's takes 6 tosses; 3 H's takes 14 tosses; n H's takes 2^(n+1)-2 tosses. See Zhou, under Markov chain, for details.
4. There are n people in a room, and everyone has shaken hands with everyone else. If there are 66 handshakes in total, what is n?
Ans: 12
See here for more interview questions/brainteasers