# Math 1580 — HW Solutions

$\def\ZZ{{\mathbb{Z}}} \def\FF{{\mathbb{F}}} \def\a{\alpha} \def\b{\beta} \def\g{\gamma} \def\d{\delta} \def\Prob{\operatorname{Pr}}$

Problem # 5.20 Use the definition (5.15) of the probability of an event to prove the following basic facts about probability theory.

(a) Let $E$ and $F$ be disjoint events. Then $\Prob(E\cup F) = \Prob(E)+\Prob(F).$

(b) Let $E$ and $F$ be events that need not be disjoint. Then $\Prob(E\cup F) = \Prob(E)+\Prob(F) - \Prob(E\cap F).$

(c) Let $E$ be an event. Then $\Prob(E^c)=1-\Prob(E)$.

The formula in (b) and its generalization to $n$ events are known as the inclusion--exclusion principle.
Solution # 5.20

(a) Label the elements of $\Omega$ so that $E=\{\omega_1,\ldots,\omega_m\}$ and $F=\{\omega_{m+1},\ldots,\omega_n\}$. Then $E\cup F=\{\omega_1,\ldots,\omega_n\}$, so $\Prob(E\cup F) = \sum_{i=1}^n \Prob(\omega_i) = \sum_{i=1}^m \Prob(\omega_i) + \sum_{i=m+1}^n \Prob(\omega_i) = \Prob(E) + \Prob(F).$

(b) Label the elements of $\Omega$ so that $E\cap F=\{\a_1,\ldots,\a_m\}$ and $E=\{\a_{1},\ldots,\a_m,\b_1,\ldots,\b_n\}$ and $F=\{\a_{1},\ldots,\a_m,\g_1,\ldots,\g_\ell\}$. Then $E\cup F=\{\a_{1},\ldots,\a_m,\b_1,\ldots,\b_n,\g_1,\ldots,\g_\ell\}$, so $\Prob(E)+\Prob(F)-\Prob(E\cap F) = \left( \sum_{i=1}^m \Prob(\a_i) + \sum_{i=1}^n \Prob(\b_i) \right) + \left( \sum_{i=1}^m \Prob(\a_i) + \sum_{i=1}^\ell \Prob(\g_i) \right) - \sum_{i=1}^m \Prob(\a_i) = \sum_{i=1}^m \Prob(\a_i) + \sum_{i=1}^n \Prob(\b_i) + \sum_{i=1}^\ell \Prob(\g_i) = \Prob(E\cup F).$

(c) Label the elements of $\Omega$ so that $E=\{\omega_1,\ldots,\omega_m\}$ and $\Omega=\{\omega_1,\ldots,\omega_n\}$. Then $E^c=\{\omega_{m+1},\ldots,\omega_n\}$, so $\Prob(E^c) = \sum_{i=m+1}^n \Prob(\omega_i) = \sum_{i=1}^n \Prob(\omega_i) - \sum_{i=1}^m \Prob(\omega_i) = \Prob(\Omega) - \Prob(E) = 1- \Prob(E).$
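As a quick numerical sanity check (not part of the proofs above), the three identities can be verified on a small sample space. This Python sketch uses a made-up five-point distribution; the weights are hypothetical, chosen only to sum to $1$:

```python
# Sanity check of (a)-(c) on a small sample space (illustration only).
from fractions import Fraction as Fr

# A hypothetical distribution on Omega = {0,1,2,3,4}; weights sum to 1.
P = {0: Fr(1, 10), 1: Fr(2, 10), 2: Fr(3, 10), 3: Fr(1, 10), 4: Fr(3, 10)}

def prob(event):
    """Probability of an event, per definition (5.15): sum of point masses."""
    return sum(P[w] for w in event)

E, F = {0, 1}, {1, 2, 3}   # overlapping events
D = {3, 4}                 # an event disjoint from E

assert prob(E | D) == prob(E) + prob(D)                # part (a)
assert prob(E | F) == prob(E) + prob(F) - prob(E & F)  # part (b)
assert prob(set(P) - E) == 1 - prob(E)                 # part (c)
print("all identities hold")
```

Using `Fraction` keeps the arithmetic exact, so each assertion checks the identity with no floating-point slack.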

Problem # 5.21 We continue with the coin tossing scenario from Example 5.23, so our experiment consists in tossing a fair coin ten times. Compute the probabilities of the following events.

(a) The first and last tosses are both heads.

(b) Either the first toss or the last toss (or both) are heads.

(c) Either the first toss or the last toss (but not both) are heads.
Solution # 5.21 We label the events in the parts of this problem as $E_{\text{(a)}}$, $E_{\text{(b)}}$, $E_{\text{(c)}}$.

(a) $\Prob(E_{\text{(a)}}) = \frac{1}{4}$.

(b) $\Prob(E_{\text{(b)}}) = 1 - \Prob(E_{\text{(b)}}^c) = 1 - \frac{1}{4}=\frac{3}{4}$.

(c) $\Prob(E_{\text{(c)}}) = \Prob(E_{\text{(b)}}) - \Prob(E_{\text{(a)}}) = \frac{1}{2}$.
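Since the sample space has only $2^{10}=1024$ equally likely outcomes, all three answers can be confirmed by exhaustive enumeration. A short Python check (not part of the solution itself):

```python
# Exhaustive check of 5.21 by enumerating all 2^10 equally likely outcomes.
from itertools import product
from fractions import Fraction as Fr

outcomes = list(product("HT", repeat=10))
total = len(outcomes)  # 1024

def prob(pred):
    """Fraction of outcomes satisfying the predicate."""
    return Fr(sum(1 for w in outcomes if pred(w)), total)

p_a = prob(lambda w: w[0] == "H" and w[-1] == "H")     # first AND last heads
p_b = prob(lambda w: w[0] == "H" or w[-1] == "H")      # first OR last (or both)
p_c = prob(lambda w: (w[0] == "H") != (w[-1] == "H"))  # exactly one of the two

assert p_a == Fr(1, 4) and p_b == Fr(3, 4) and p_c == Fr(1, 2)
print(p_a, p_b, p_c)
```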

Problem # 5.22 Alice offers to make the following bet with you. She will toss a fair coin 14 times. If exactly 7 heads come up, she will give you \$4; otherwise you must give her \$1. Would you take this bet? If so, and if you repeated the bet 10000 times, how much money would you expect to win or lose?
Solution # 5.22 The probability of winning the bet is $\binom{14}{7}\cdot\frac{1}{2^{14}} = \frac{3432}{16384} = \frac{429}{2048} \approx 0.2095.$ Thus your probability of winning the bet is slightly larger than $\frac{1}{5}$, so it is worthwhile making the bet. (Note that if the probability of winning were exactly $\frac{1}{5}$, then in five trials you would expect to win once for plus \$4 and lose four times for minus \$4, so you would end up even.) In 10000 trials, you would expect to win the bet approximately 2095 times, for a gain of \$8380, and to lose the bet approximately 7905 times, for a loss of \$7905. Hence your average net gain for 10000 trials is \$475.

Problem # 5.23 Let $E$ and $F$ be events.

(a) Prove that $\Prob(E\mid E)=1$. Explain in words why this is reasonable.

(b) If $E$ and $F$ are disjoint, prove that $\Prob(F\mid E)=0$. Explain in words why this is reasonable.

Solution # 5.23
(a) $\displaystyle \Prob(E\mid E)=\frac{\Prob(E\cap E)}{\Prob(E)}=\frac{\Prob(E)}{\Prob(E)}=1$. It is clear that if we know that $E$ occurs, then the probability that $E$ occurs is $1$.

(b) $\displaystyle \Prob(F\mid E)=\frac{\Prob(F\cap E)}{\Prob(E)} =\frac{\Prob(\emptyset)}{\Prob(E)}=0$. If $E$ occurs and $F$ is disjoint from $E$, then none of the individual outcomes in $F$ can possibly occur, so the probability of $F$ is clearly $0$.

Problem # 5.24 There are two urns containing pens and pencils. Urn #1 contains three pens and seven pencils and Urn #2 contains eight pens and four pencils.

(a) An urn is chosen at random and an object is drawn. What is the probability that it is a pencil?

(b) An urn is chosen at random and an object is drawn. If the object drawn is a pencil, what is the probability that it came from Urn #1?

(c) If an urn is chosen at random and two objects are drawn simultaneously, what is the probability that both are pencils?

Solution # 5.24 Define events
\begin{align*}
E&=\{\text{Urn #1 is selected}\},\\
F&=\{\text{A pencil is selected}\}.
\end{align*}

(a)
\begin{align*}
\Prob(F) &= \Prob(F\mid E)\Prob(E) + \Prob(F\mid E^c)\Prob(E^c) \\
&= \frac{7}{10}\cdot\frac{1}{2} + \frac{4}{12}\cdot\frac{1}{2} \\
&= \frac{31}{60}\approx 0.517.
\end{align*}

(b) We compute
\begin{align*}
\Prob(E\mid F) &= \frac{\Prob(F\mid E)\Prob(E)}{\Prob(F)} &&\text{Bayes's law,}\\
&=\frac{(7/10)\cdot(1/2)}{31/60} &&\text{using (a) to get $\Prob(F)$,} \\
&= \frac{21}{31} \approx 0.677.
\end{align*}

(c) We need slightly different events, so we let
\begin{align*}
E&=\{\text{Urn #1 is selected}\},\\
F&=\{\text{First item selected is a pencil}\},\\
G&=\{\text{Second item selected is a pencil}\}.
\end{align*}
We condition on which urn is chosen, as in (a). Note that once the urn is fixed and the first pencil has been removed, the second item is drawn from the remaining items in that urn. Thus
\begin{align*}
\Prob(F~\&~G) &= \Prob(F~\&~G\mid E)\cdot\Prob(E) + \Prob(F~\&~G\mid E^c)\cdot\Prob(E^c) \\
&= \Prob(G\mid F~\&~E)\cdot\Prob(F\mid E)\cdot\Prob(E) + \Prob(G\mid F~\&~E^c)\cdot\Prob(F\mid E^c)\cdot\Prob(E^c) \\
&= \frac{6}{9}\cdot\frac{7}{10}\cdot\frac{1}{2} + \frac{3}{11}\cdot\frac{4}{12}\cdot\frac{1}{2} \\
&=\frac{46}{165}\approx 0.279.
\end{align*}

Problem # 5.25 An urn contains 20 silver coins and 10 gold coins. You are the sixth person in line to randomly draw and keep a coin from the urn.

(a) What is the probability that you draw a gold coin?

(b) If you draw a gold coin, what is the probability that the five people ahead of you all drew silver coins?

Solution # 5.25
(a) It doesn't matter whether you are the sixth to draw a coin, or the first, or the last; your chance of getting a gold coin is $10/30$, since there are $10$ gold coins and $30$ coins altogether. (If you had some information about the color of the coins drawn by the people ahead of you, that would change the answer, but the problem does not give you any such information.)

(b) This part is more difficult.
We define events:
\begin{align*}
E &=\{\text{You draw a gold coin}\}, \\
F &=\{\text{Previous 5 people drew silver coins}\}.
\end{align*}
We want to compute $\Prob(F\mid E)$, and we will use Bayes's law in the form
$\Prob(F\mid E) = \frac{\Prob(E\mid F)\Prob(F)}{\Prob(E)}.$
As already explained, $\Prob(E)=1/3$. Similarly, it is easy to compute $\Prob(E\mid F)$. The assumption that $F$ is true means that when you draw your coin, the urn contains $15$ silver coins and $10$ gold coins, so your probability of drawing a gold coin is $\Prob(E\mid F)=10/25=2/5$.

Finally, to compute $\Prob(F)$, define events $F_1,\ldots,F_5$ by $F_i=\{\text{Person #$i$ draws a silver coin}\}.$ Then
\begin{align*}
\Prob(F) &=\Prob(F_1~\&~F_2~\&~F_3~\&~F_4~\&~F_5) \\
&=\Prob(F_1)\cdot\Prob(F_2\mid F_1)\cdot \Prob(F_3\mid F_1~\&~F_2)\cdot \Prob(F_4\mid F_1~\&~F_2~\&~F_3)\\
&\hspace{5em}\cdot\Prob(F_5\mid F_1~\&~F_2~\&~F_3~\&~ F_4)\\
&=\frac{20}{30}\cdot\frac{19}{29}\cdot\frac{18}{28}\cdot\frac{17}{27}\cdot\frac{16}{26}\\
&= \frac{2584}{23751} \approx 0.109.
\end{align*}
We now have the values needed to solve the problem:
$\Prob(F\mid E) = \frac{\Prob(E\mid F)\Prob(F)}{\Prob(E)} = \frac{(2/5)\cdot(2584/23751)}{1/3} = \frac{5168}{39585} \approx 0.131.$
Thus with no other knowledge, there is approximately an $11\%$ chance that the first five coins chosen are silver, but if we know that the sixth coin chosen is gold, then the probability that the first five were silver increases to approximately $13\%$.

Problem # 5.28 Let $\mathcal{S}$ be a set, let $A$ be a property of interest, and suppose that for $m\in\mathcal{S}$, we have
$\Prob(\text{$m$ does not have property $A$})=\delta.$
Suppose further that a Monte Carlo algorithm applied to $m$ and a random number $r$ satisfies:

1. If the algorithm returns YES, then $m$ definitely has property $A$.

2. If $m$ has property $A$, then the probability that the algorithm returns YES is at least $p$.
Notice that we can restate (1) and (2) as conditional probabilities:

1. $\Prob(\text{$m$ has property $A$} \mid \text{algorithm returns YES}) = 1$,

2. $\Prob(\text{algorithm returns YES} \mid \text{$m$ has property $A$}) \ge p$.

Suppose that we run the algorithm $N$ times on the number $m$, and suppose that the algorithm returns NO every single time. Derive a lower bound, in terms of $\delta$, $p$, and $N$, for the probability that $m$ does not have property $A$. (This generalizes the version of the Monte Carlo method that we studied in Section 5.3.3 with $\delta=0.01$ and $p=\frac{1}{2}$. Be careful to distinguish $p$ from $1-p$ in your calculations.)

Solution # 5.28 Let
\begin{align*}
E&=\{\text{an element $m\in\mathcal{S}$ does not have property $A$}\}, \\
F&=\{\text{the algorithm returns NO $N$ times in a row}\}.
\end{align*}
We want a lower bound for the conditional probability $\Prob(E\mid F)$, that is, the probability that $m$ does not have property $A$ despite the fact that the algorithm returned NO $N$ times. We compute this probability using Bayes's formula
$\Prob(E\mid F) = \frac{\Prob(F\mid E)\Prob(E)} {\Prob(F\mid E)\Prob(E) + \Prob(F\mid E^c)\Prob(E^c)}.$
We are given that the probability of not having property $A$ is $\d$, so
$\Prob(E) = \Prob(\text{not $A$}) = \d \qquad\text{and}\qquad \Prob(E^c)= \Prob(A) = 1-\d.$
Next consider $\Prob(F\mid E)$. If $m$ does not have property $A$, which is our assumption in this conditional probability, then the algorithm always returns NO, since Property (1) tells us that a YES output forces $m$ to have property $A$. Thus
$\Prob(\text{NO}\mid \text{not $A$}) = 1,$
from which it follows that $\Prob(F\mid E)=\Prob(\text{NO}\mid \text{not $A$})^N=1$. Finally, we must compute the value of $\Prob(F\mid E^c)$.
Since the algorithm is run $N$ independent times, we have
\begin{align*}
\Prob(F\mid E^c) &= \Prob(\text{Output is NO} \mid \text{$m$ has property $A$})^N \\
&= \bigl(1 - \Prob(\text{Output is YES} \mid \text{$m$ has property $A$})\bigr)^N \\
&\le \left(1 - p\right)^N \qquad\text{from Property (2) of the Monte Carlo method.}
\end{align*}
Substituting these values into Bayes's formula, we find that if the algorithm returns NO $N$ times in a row, then the probability that $m$ does not have property $A$ satisfies
$\Prob(E\mid F) \ge \frac{1\cdot\d}{1\cdot\d + (1-p)^N\cdot(1-\d)} = \frac{\d}{\d + (1-p)^N\cdot(1-\d)}.$
If $\d$ and $p$ are not too small and $N$ is large, this can be approximated by
$\Prob(E\mid F) \ge 1 - \frac{(1-p)^N\cdot(1-\d)}{\d + (1-p)^N\cdot(1-\d)} \approx 1 - \frac{(1-p)^N\cdot(1-\d)}{\d} = 1 - (1-p)^N\cdot(\d^{-1}-1).$

Problem # 5.30 If an integer $n$ is composite, then the Miller–Rabin test has at least a $75\%$ chance of succeeding in proving that $n$ is composite, while it never misidentifies a prime as being composite. (See Table 3.2 in Section 3.4 for a description of the Miller–Rabin test.) Suppose that we run the Miller–Rabin test $N$ times on the integer $n$ and that it fails to prove that $n$ is composite. Show that the probability that $n$ is prime satisfies (approximately)
$\Prob(\text{$n$ is prime}\mid \text{the Miller–Rabin test fails $N$ times}) \ge 1 - \frac{\ln(n)}{4^N}.$
(Hint. Use Exercise 5.28 with appropriate choices of $A$, $\mathcal{S}$, $\d$, and $p$. You may also use the estimate from Section 3.4.1 that the probability that $n$ is prime is approximately $1/\ln(n)$.)

Solution # 5.30 In Exercise 5.28 we let $A$ be the property of being composite and we let $p=\frac34$, since we know that if $n$ is composite, then the Miller–Rabin test returns YES at least $75\%$ of the time. Further, we have $\d\approx1/\ln(n)$, since $\d$ is the probability that $n$ does not have property $A$, that is, the probability that $n$ is prime.
The solution to that exercise says that (approximately)
\begin{align*}
\Prob(\text{$n$ is prime} &\mid \text{the Miller–Rabin test fails $N$ times}) \\
&\ge 1 - (1-p)^N\cdot(\d^{-1}-1) \\
&\approx 1 -\frac{\ln(n)-1}{4^N} \\
&\approx 1 -\frac{\ln(n)}{4^N}.
\end{align*}

Problem # 5.32 Let $f_X(k)$ be the binomial density function (5.23). Prove directly, using the binomial theorem, that $\sum_{k=0}^n f_X(k)=1$.

Solution # 5.32 Let $q=1-p$, so $p+q=1$. Then we use the binomial theorem to compute
$\sum_{k=0}^n f_X(k) = \sum_{k=0}^n \binom{n}{k}p^k(1-p)^{n-k} = \sum_{k=0}^n \binom{n}{k}p^kq^{n-k} = (p+q)^n = 1^n = 1.$

Problem # 5.34 In each case, compute the expectation of the random variable $X$.

(a) The values of $X$ are uniformly distributed on the set $\{0,1,2,\ldots,N-1\}$. (See Example 5.28.)

(b) The values of $X$ are uniformly distributed on the set $\{1,2,\ldots,N\}$.

Solution # 5.34
(a)
\begin{align*}
E(X) &= 0\cdot\frac{1}{N}+1\cdot\frac{1}{N}+2\cdot\frac{1}{N}+\cdots +(N-1)\cdot\frac{1}{N} \\
&=\frac{0+1+2+\cdots+(N-1)}{N} \\
&= \frac{\frac12(N-1)N}{N} \\
&= \frac{N-1}{2}.
\end{align*}

(b)
\begin{align*}
E(X) &= 1\cdot\frac{1}{N}+2\cdot\frac{1}{N}+3\cdot\frac{1}{N}+\cdots +N\cdot\frac{1}{N} \\
&=\frac{1+2+3+\cdots+N}{N} \\
&= \frac{\frac12N(N+1)}{N} \\
&= \frac{N+1}{2}.
\end{align*}

Problem # 5.36 (a) In a group of 23 strangers, what is the probability that at least two of them have the same birthday? How about if there are 40 strangers? In a group of 200 strangers, what is the probability that one of them has the same birthday as your birthday? (Hint. See the discussion in Section 5.4.1.)

(b) Suppose that there are $N$ days in a year (where $N$ could be any number) and that there are $n$ people. Develop a general formula, analogous to (5.28), for the probability that at least two of them have the same birthday.
(Hint. Do a calculation similar to the proof of (5.28) in the collision theorem (Theorem 5.38), but note that the formula is a bit different because the birthdays are being selected from a single list of $N$ days.)

(c) Find a lower bound of the form
$\Prob(\text{at least one match}) \ge 1 - e^{-\text{(some function of $n$ and $N$)}}$
for the probability in (b), analogous to the estimate (5.29).

Solution # 5.36 We start by doing (b).

(b)
\begin{align*}
\Prob( \text{at least one match in $n$ attempts} ) &= 1- \Prob(\text{all $n$ birthdays are different}) \\
&= 1 - \prod_{i=1}^{n} \Prob(\text{$i^{\text{th}}$ birthday is different from all of the previous $i-1$ birthdays}) \\
&= 1 - \prod_{i=1}^{n} \frac{N - (i-1)}{N} \\
&= 1 - \prod_{i=1}^{n-1} \left(1 - \frac{i}{N} \right).
\end{align*}

(a) The answer to the first part of (a) is obtained by setting $N=365$ and $n=23$, which gives
$\Prob(\text{match}) = 1 - \prod_{i=1}^{22} \left(1 - \frac{i}{365} \right) \approx 50.73\%.$
Similarly, $N=365$ and $n=40$ gives the answer to the second part of (a),
$\Prob(\text{match}) = 1 - \prod_{i=1}^{39} \left(1 - \frac{i}{365} \right) \approx 89.12\%.$
The final part of (a) is
\begin{align*}
\Prob(\text{someone has your birthday}) &= 1 - \Prob(\text{no one has your birthday}) \\
&= 1 - \Prob(\text{one person does not have your birthday})^{200} \\
&= 1 - \left(\frac{364}{365}\right)^{200} \\
&\approx 42.23\%.
\end{align*}

(c) We use the inequality $e^{-x}\ge1-x$ with $x=i/N$ to compute
\begin{align*}
\Prob(\text{at least one match in $n$ attempts}) &= 1 - \prod_{i=1}^{n-1} \left(1 - \frac{i}{N} \right) \\
&\ge 1 - \prod_{i=1}^{n-1} e^{-i/N} \\
&= 1 - e^{-(1+2+\cdots+(n-1))/N} \\
&= 1 - e^{-n(n-1)/2N} \\
&\approx 1 - e^{-n^2/2N}.
\end{align*}
Notice that we have used the well-known formula
$1+2+\cdots+(n-1) = \frac{n(n-1)}{2}.$

Problem # 5.37 A deck of cards is shuffled and the top eight cards are turned over.

(a) What is the probability that the king of hearts is visible?
(b) A second deck is shuffled and its top eight cards are turned over. What is the probability that a visible card from the first deck matches a visible card from the second deck? (Note that this is slightly different from Example 5.39 because the cards in the second deck are not being replaced.)

Solution # 5.37
(a) There are 52 cards in the deck, so the probability of any particular card coming up in the first 8 is $\frac{8}{52}=\frac{2}{13}\approx15.38\%$.

(b) We calculate the probability that there are no matches. The probability that the first card is not a match is $\frac{44}{52}$; then, having used up that card, there are 51 cards left, of which 8 are still bad, so the probability that the second card is not a match is $\frac{43}{51}$. And so on. Hence
$\Prob(\text{no match in 8 tries}) = \prod_{i=0}^7 \frac{44-i}{52-i} = \frac{13633279}{57887550} \approx 23.55\%.$
Thus
$\Prob(\text{match}) = 1 - \Prob(\text{no match}) = 1 - \frac{13633279}{57887550} = \frac{44254271}{57887550} \approx 76.45\%.$

Problem # 5.38 (a) Prove that
$e^{-x}\ge1-x\quad\text{for all values of $x$.}$
(Hint. Look at the graphs of $e^{-x}$ and $1-x$, or use calculus to compute the minimum of the function $f(x)=e^{-x}-(1-x)$.)

Solution # 5.38 (a) Let $f(x) = e^{-x} - 1 + x$. Then $f(0)=f'(0)=0$. Taylor's theorem with remainder says that
$f(x) = f(0) + f'(0)x + \frac{1}{2}f''(z)x^2 \quad\text{for some $z$ between $0$ and $x$,}$
so we find that $f(x)=\frac{1}{2}e^{-z}x^2\ge0$. This is the desired inequality.

Problem # 5.39 Solve the discrete logarithm problem $10^x=106$ in the finite field $\mathbb{F}_{811}$ by finding a collision among the random powers $10^i$ and $106\cdot 10^i$ that are listed in Table 5.17.
Solution # 5.39 From Table 5.17 we see that
$10^{234} = 106\cdot 10^{399} = 304 \quad\text{in } \mathbb{F}_{811}.$
Hence
$10^{234}\cdot 10^{-399}=10^{-165}=10^{645}=106 \quad\text{in } \mathbb{F}_{811},$
so $x=645$.
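The collision and the resulting discrete logarithm are easy to confirm with modular exponentiation. A short Python check (here $-165\equiv 645 \pmod{810}$, since the exponents work modulo the group order $810$ by Fermat's little theorem):

```python
# Verify the collision from Table 5.17 and the discrete log x = 645 in F_811.
p = 811

assert pow(10, 234, p) == 304                # 10^234 = 304
assert (106 * pow(10, 399, p)) % p == 304    # 106 * 10^399 = 304, the collision

# Exponents of 10 work modulo p - 1 = 810 (Fermat's little theorem),
# so 234 - 399 = -165 = 645 (mod 810):
assert pow(10, 810, p) == 1
assert pow(10, 645, p) == 106                # hence x = 645 solves 10^x = 106
print("x = 645 verified")
```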