Math 302
Final Examination, December 2015

Instructions: The duration of this exam is 150 minutes (2 hours 30 minutes). There are 8 questions worth a total of 100 points. A scientific calculator is allowed and a table of common distributions is provided.
1. (15 points)

(a) Define the moment generating function of a random variable $X$.

Solution: The moment generating function $M_X(t)$ of $X$ is defined by $M_X(t) = E(e^{tX})$, whenever the expectation is finite.

(b) Define the covariance of two random variables $X, Y$.

Solution: The covariance of two random variables $X, Y$ is defined by the formula $\mathrm{Cov}(X, Y) = E\big((X - E(X))(Y - E(Y))\big) = E(XY) - E(X)E(Y)$.

(c) Define what it means for two events $A$ and $B$ to be independent.

Solution: $A$ and $B$ are independent if their probabilities satisfy $P(A \cap B) = P(A)P(B)$.

(d) State the Chebyshev Inequality.

Solution: If $X$ has mean $\mu$ and variance $\sigma^2$, then for all $k > 0$, $P(|X - \mu| \ge k\sigma) \le \frac{1}{k^2}$.

(e) State the Central Limit Theorem.

Solution: Let $\{X_i\}$ be independent and identically distributed random variables, each with mean $\mu$ and variance $\sigma^2$, and let $S_n = \sum_{i=1}^{n} X_i$. Then $\frac{S_n - n\mu}{\sigma\sqrt{n}}$ converges in distribution to $Z$, the standard normal random variable.
2. (8 points) Assume that $X, Y$ are independent random variables such that $E(X) = 1$, $\mathrm{Var}(X) = 2$, $E(Y) = 3$ and $\mathrm{Var}(Y) = 4$.

(a) Calculate $E(2X + 4XY - Y)$.

Solution: Using linearity of expectation, and since $E(XY) = E(X)E(Y)$ when $X, Y$ are independent,
$$E(2X + 4XY - Y) = 2E(X) + 4E(X)E(Y) - E(Y) = 2 + 12 - 3 = 11.$$

(b) Find $\mathrm{Var}(5X + 3Y)$.

Solution: Since $X, Y$ are independent, $\mathrm{Var}(X + Y) = \mathrm{Var}(X) + \mathrm{Var}(Y)$. Furthermore, since $\mathrm{Var}(aX) = a^2\,\mathrm{Var}(X)$,
$$\mathrm{Var}(5X + 3Y) = 25\,\mathrm{Var}(X) + 9\,\mathrm{Var}(Y) = 50 + 36 = 86.$$
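As a quick sanity check (an addition, not part of the original solutions), both answers can be verified by a short Monte Carlo simulation. The choice of normal distributions below is an arbitrary assumption; only the stated means and variances matter.

```python
import random

N = 200_000
# Any independent X, Y with E(X)=1, Var(X)=2, E(Y)=3, Var(Y)=4 will do;
# normals are an arbitrary choice for this check.
xs = [random.gauss(1, 2 ** 0.5) for _ in range(N)]
ys = [random.gauss(3, 2.0) for _ in range(N)]

a = [2 * x + 4 * x * y - y for x, y in zip(xs, ys)]
b = [5 * x + 3 * y for x, y in zip(xs, ys)]

mean_a = sum(a) / N                              # should be close to 11
mean_b = sum(b) / N
var_b = sum((v - mean_b) ** 2 for v in b) / N    # should be close to 86
print(round(mean_a, 2), round(var_b, 2))
```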
3. (14 points) A die is tossed seven times.
(a) What is the probability that exactly two outcomes are 1?
Solution: The number of 1s is distributed binomially with $n = 7$, $p = \frac{1}{6}$. Hence the probability that there are exactly two 1s is
$$\binom{7}{2}\left(\frac{1}{6}\right)^2\left(\frac{5}{6}\right)^5 = \frac{21 \cdot 5^5}{6^7} \approx 0.23.$$
(b) What is the probability that all outcomes are odd, given that the first outcome was greater than 3?

Solution: Let $A$ be the event that all outcomes are odd and $B$ the event that the first outcome was greater than 3. We want to compute $P(A \mid B)$. We have $P(B) = \frac{1}{2}$ and $P(A \cap B) = \frac{1}{6}\left(\frac{1}{2}\right)^6$, since $A \cap B$ is the event that the first roll is 5 and all of the remaining six rolls are odd. Hence,
$$P(A \mid B) = \frac{P(A \cap B)}{P(B)} = \frac{1}{6}\left(\frac{1}{2}\right)^5 = \frac{1}{192} \approx 0.0052.$$
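A small simulation (an addition, not part of the exam) can be used to check both answers; it assumes a fair six-sided die.

```python
import random

N = 500_000
count_a = 0                 # exactly two 1s in seven rolls
count_b = 0                 # first roll greater than 3
count_all_odd_and_b = 0

for _ in range(N):
    rolls = [random.randint(1, 6) for _ in range(7)]
    if rolls.count(1) == 2:
        count_a += 1
    if rolls[0] > 3:
        count_b += 1
        if all(r % 2 == 1 for r in rolls):
            count_all_odd_and_b += 1

print(count_a / N)                      # ~0.23
print(count_all_odd_and_b / count_b)    # ~1/192 ~ 0.0052
```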
4. (11 points) Let $X, Y$ be independent exponential random variables with the same parameter $\lambda$. Find the probability density function of $X - Y$.

Solution: We will compute the cumulative distribution function of $X - Y$ and then differentiate it to obtain the probability density function. The joint density function of $(X, Y)$ is $\lambda^2 e^{-\lambda(x+y)}$ for $x, y \ge 0$, since $X, Y$ are independent and each exponentially distributed with parameter $\lambda$. Let $c \in \mathbb{R}$. Note that $X - Y \le c$ is equivalent to the condition $Y \ge X - c$, and furthermore each of $X, Y$ must be positive. Hence, if $c$ is negative, then
$$P(X - Y \le c) = \int_0^\infty \int_{x-c}^\infty \lambda^2 e^{-\lambda(x+y)}\,dy\,dx = \lambda^2 \int_0^\infty e^{-\lambda x}\left[\frac{e^{-\lambda y}}{-\lambda}\right]_{x-c}^{\infty} dx = \lambda \int_0^\infty e^{-2\lambda x + \lambda c}\,dx = \lambda e^{\lambda c}\left[\frac{e^{-2\lambda x}}{-2\lambda}\right]_0^{\infty} = \frac{e^{\lambda c}}{2}, \tag{1}$$
and if $c$ is positive, then
$$P(X - Y \le c) = \lambda^2\left[\int_0^c \int_0^\infty e^{-\lambda(x+y)}\,dy\,dx + \int_c^\infty \int_{x-c}^\infty e^{-\lambda(x+y)}\,dy\,dx\right]. \tag{2}$$
Since
$$\int_c^\infty \int_{x-c}^\infty \lambda^2 e^{-\lambda(x+y)}\,dy\,dx = \lambda e^{\lambda c}\int_c^\infty e^{-2\lambda x}\,dx = \lambda e^{\lambda c}\,\frac{e^{-2\lambda c}}{2\lambda} = \frac{e^{-\lambda c}}{2}$$
and
$$\int_0^c \int_0^\infty \lambda^2 e^{-\lambda(x+y)}\,dy\,dx = \lambda^2 \int_0^c e^{-\lambda x}\,dx \int_0^\infty e^{-\lambda y}\,dy = \lambda^2\left(\frac{1 - e^{-\lambda c}}{\lambda}\right)\left(\frac{1}{\lambda}\right) = 1 - e^{-\lambda c},$$
we get
$$P(X - Y \le c) = 1 - e^{-\lambda c} + \frac{e^{-\lambda c}}{2} = 1 - \frac{e^{-\lambda c}}{2}$$
for positive $c$.

Therefore, since
$$P(X - Y \le c) = \begin{cases} \dfrac{e^{\lambda c}}{2} & c \le 0 \\[4pt] 1 - \dfrac{e^{-\lambda c}}{2} & c > 0, \end{cases}$$
the probability density function is
$$f(c) = \begin{cases} \dfrac{\lambda e^{\lambda c}}{2} & c \le 0 \\[4pt] \dfrac{\lambda e^{-\lambda c}}{2} & c > 0 \end{cases} \;=\; \frac{\lambda e^{-\lambda|c|}}{2}.$$
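The result is the Laplace (double-exponential) density centred at 0. A short simulation (an addition, not from the exam) confirms the CDF derived above for a sample value of $\lambda$; the parameter choice is arbitrary.

```python
import random, math

lam = 1.5          # arbitrary parameter value for the check
N = 400_000
diffs = [random.expovariate(lam) - random.expovariate(lam) for _ in range(N)]

def cdf(c):
    # CDF derived above: e^{lam c}/2 for c <= 0, 1 - e^{-lam c}/2 for c > 0
    return math.exp(lam * c) / 2 if c <= 0 else 1 - math.exp(-lam * c) / 2

for c in (-1.0, -0.2, 0.5, 2.0):
    empirical = sum(d <= c for d in diffs) / N
    print(c, round(empirical, 3), round(cdf(c), 3))
```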
5. (14 points) A random variable $X$ has moment generating function $g_X(t) = (1 - 3t)^{-2}$, defined for $t < \frac{1}{3}$.

(a) Find $E(X)$.

Solution: By definition of the moment generating function, $E(X) = g'_X(0)$. Since $g'_X(t) = 6(1 - 3t)^{-3}$, we get $E(X) = 6$.

(b) Find $\mathrm{Var}(X)$.

Solution: Since $E(X^2) = g''_X(0)$ and $g''_X(t) = 54(1 - 3t)^{-4}$,
$$\mathrm{Var}(X) = E(X^2) - E(X)^2 = 54 - 6^2 = 18.$$
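The two derivatives can also be checked symbolically (an added verification, not part of the exam); this sketch assumes SymPy is available.

```python
import sympy as sp

t = sp.symbols('t')
g = (1 - 3 * t) ** -2                           # the given moment generating function

mean = sp.diff(g, t).subs(t, 0)                 # E(X) = g'(0)
second_moment = sp.diff(g, t, 2).subs(t, 0)     # E(X^2) = g''(0)
variance = second_moment - mean ** 2

print(mean, variance)                           # expected: 6 and 18
```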
6. (16 points) Assume that the random variables $X, Y$ have joint density function
$$f(x, y) = \begin{cases} c(3x^2 + 2y) & 0 \le x \le 1,\ 0 \le y \le 2 \\ 0 & \text{otherwise.} \end{cases}$$

(a) Find the constant $c$.

Solution: Since $\int_0^2 \int_0^1 3x^2\,dx\,dy = \int_0^2 dy = 2$ and $\int_0^2 \int_0^1 2y\,dx\,dy = 4$,
$$1 = \int_0^2 \int_0^1 f(x, y)\,dx\,dy = c(2 + 4) = 6c,$$
which implies $c = \frac{1}{6}$ for $f(x, y)$ to be a probability density function.

(b) Find the marginal probability density function of $Y$.

Solution: The marginal probability density of $Y$ is defined by $f_Y(y) = \int f(x, y)\,dx$. Hence, for $0 \le y \le 2$,
$$f_Y(y) = \int_0^1 c(3x^2 + 2y)\,dx = c\left(x^3 + 2xy\right)\Big|_0^1 = c(1 + 2y) = \frac{1}{6}(1 + 2y).$$

(c) Find the conditional expectation $E(X \mid Y = 1)$.

Solution: The conditional density of $X$ given $Y = 1$ is
$$f_{X \mid Y=1}(x) = \frac{f_{X,Y}(x, 1)}{f_Y(1)} = \frac{\frac{1}{6}(3x^2 + 2)}{\frac{1}{2}} = \frac{3x^2 + 2}{3} = x^2 + \frac{2}{3}.$$
Hence,
$$E(X \mid Y = 1) = \int_0^1 \left(x^3 + \frac{2x}{3}\right)dx = \left(\frac{x^4}{4} + \frac{x^2}{3}\right)\Big|_0^1 = \frac{1}{4} + \frac{1}{3} = \frac{7}{12}.$$
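All three parts can be re-derived by symbolic integration (an added check, not part of the exam); this sketch assumes SymPy is available.

```python
import sympy as sp

x, y, c = sp.symbols('x y c', positive=True)
f = c * (3 * x**2 + 2 * y)

# (a) solve for the normalising constant
c_val = sp.solve(sp.integrate(f, (x, 0, 1), (y, 0, 2)) - 1, c)[0]
f = f.subs(c, c_val)

# (b) marginal density of Y
f_Y = sp.integrate(f, (x, 0, 1))

# (c) conditional expectation E(X | Y = 1)
cond_density = f.subs(y, 1) / f_Y.subs(y, 1)
e_x_given_y1 = sp.integrate(x * cond_density, (x, 0, 1))

print(c_val, sp.simplify(f_Y), e_x_given_y1)   # 1/6, (2*y + 1)/6, 7/12
```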
7. (11 points) Let $X$ and $Y$ be independent geometric random variables with the same parameter $p$. What is the probability that $X + Y$ is odd?

Solution: $X + Y$ is odd exactly when one of $X, Y$ is even and the other is odd. Let $A$ be the event that $X$ is even and $B$ the event that $Y$ is even. Since $X, Y$ have the same distribution, $P(A) = P(B)$, and since $X, Y$ are independent,
$$P(X + Y \text{ is odd}) = P(A \cap B^c) + P(A^c \cap B) = 2P(A)\big(1 - P(A)\big).$$
Hence it suffices to find $P(A)$. The probability $P(A)$ can be expressed as the sum
$$P(A) = \sum_{\substack{k \ge 2 \\ k \text{ even}}} p(1 - p)^{k-1} = p\sum_{i=1}^{\infty} (1 - p)^{2i - 1} = \frac{p}{1 - p}\sum_{i=1}^{\infty} (1 - p)^{2i}.$$
Using the identity $\sum_{i=1}^{\infty} a^i = \frac{a}{1 - a}$ (valid for $|a| < 1$),
$$P(A) = \frac{p}{1 - p}\cdot\frac{(1 - p)^2}{1 - (1 - p)^2} = \frac{p(1 - p)}{2p - p^2} = \frac{1 - p}{2 - p}.$$
Hence,
$$P(X + Y \text{ is odd}) = \frac{2 - 2p}{2 - p}\left(1 - \frac{1 - p}{2 - p}\right) = \frac{2 - 2p}{(2 - p)^2}.$$
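A quick simulation (an addition, not in the exam) agrees with the closed form; the value of $p$ below is arbitrary, and the geometric convention is "number of trials up to and including the first success", matching the solution.

```python
import random

def geometric(p):
    """Number of Bernoulli(p) trials up to and including the first success."""
    k = 1
    while random.random() >= p:
        k += 1
    return k

p = 0.3                       # arbitrary test value
N = 300_000
odd = sum((geometric(p) + geometric(p)) % 2 == 1 for _ in range(N))
print(odd / N)                                   # empirical estimate
print((2 - 2 * p) / (2 - p) ** 2)                # formula: ~0.4844 for p = 0.3
```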
8. (11 points) Let $S_{200}$ be the number of heads that turn up in 200 tosses of a fair coin. Use the Central Limit Theorem to estimate the probability $P(90 \le S_{200} \le 110)$.

Solution: Let $X_i$ be the result of a single coin toss, taking the value 1 if it turns up heads and 0 if it turns up tails. Since the coin is fair, $E(X_i) = \frac{1}{2}$ and $\mathrm{Var}(X_i) = \frac{1}{4}$. Since $S_{200} = \sum_{i=1}^{200} X_i$ and the $X_i$ are independent and identically distributed, the Central Limit Theorem gives that
$$\frac{S_{200} - 200\left(\frac{1}{2}\right)}{\frac{\sqrt{200}}{2}} = \frac{S_{200} - 100}{5\sqrt{2}}$$
is approximately distributed as a standard normal random variable. Hence, using a standard normal table and the observation above,
$$P(90 \le S_{200} \le 110) \approx P\left(-\frac{10}{5\sqrt{2}} \le Z \le \frac{10}{5\sqrt{2}}\right) = P\big(|Z| \le \sqrt{2}\big) \approx 2(0.4207) \approx 0.84.$$
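For comparison (an added note, not part of the exam), the exact binomial probability can be computed directly with the standard library and sits close to the CLT estimate.

```python
from math import comb

# Exact P(90 <= S_200 <= 110) for S_200 ~ Binomial(200, 1/2)
exact = sum(comb(200, k) for k in range(90, 111)) / 2 ** 200
print(round(exact, 4))   # roughly 0.86, versus ~0.84 from the CLT estimate
                         # without a continuity correction
```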
Math 302
Final Examination, April 2013

Instructions: The duration of this exam is 150 minutes (2 hours 30 minutes). There are 6 questions worth 17 points each (102 points total). Simplify your answers as much as possible, but answers may include factorials, "choose" symbols, or exponentials. You may also use the function $\varphi(a) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{a} e^{-x^2/2}\,dx$ in your answer. No calculators, books, notebooks, or any other written materials are allowed.
1. (a) Carefully define (with formulas) what it means for three events $A, B$ and $C$ to be independent.

Solution: The three events are independent if $P(A \cap B \cap C) = P(A)P(B)P(C)$, $P(A \cap B) = P(A)P(B)$, $P(B \cap C) = P(B)P(C)$ and $P(A \cap C) = P(A)P(C)$.
(b) Let $A, B, C$ be independent events with $P(A) = P(B) = P(C) = \frac{1}{2}$.

i. Compute $P(A \cup B \cup C)$.

Solution (Method 1): The event $A \cup B \cup C$ is the same as $(A^c \cap B^c \cap C^c)^c$. Since $A, B, C$ are independent, so are $A^c, B^c, C^c$. Hence,
$$P(A \cup B \cup C) = 1 - P(A^c \cap B^c \cap C^c) = 1 - P(A^c)P(B^c)P(C^c).$$
Since $P(A^c) = P(B^c) = P(C^c) = \frac{1}{2}$ by the assumption that $P(A) = P(B) = P(C) = \frac{1}{2}$,
$$P(A \cup B \cup C) = 1 - \left(\frac{1}{2}\right)^3 = \frac{7}{8}.$$

Method 2: The principle of inclusion and exclusion may be used to compute $P(A \cup B \cup C)$. Since
$$P(A \cup B \cup C) = P(A) + P(B) + P(C) - P(A \cap B) - P(B \cap C) - P(A \cap C) + P(A \cap B \cap C),$$
and by independence $P(A \cap B) = P(B \cap C) = P(A \cap C) = \frac{1}{4}$ and $P(A \cap B \cap C) = \frac{1}{8}$,
$$P(A \cup B \cup C) = \frac{3}{2} - \frac{3}{4} + \frac{1}{8} = \frac{7}{8}.$$
ii. Let $X$ be the indicator random variable of the event $A \cup B$ and $Y$ the indicator random variable of the event $B \cup C$ (that is, $X = 1$ if $A \cup B$ occurred and 0 otherwise, $Y = 1$ if $B \cup C$ occurred and 0 otherwise). Compute $E(XY)$.

Solution: The random variable $XY$ takes the value 1 when both events $A \cup B$ and $B \cup C$ occur, which is exactly when the event $B \cup (A \cap C)$ occurs, and is 0 otherwise. Hence,
$$E(XY) = P\big(B \cup (A \cap C)\big) = P(A \cap C) + P(B) - P(A \cap C \cap B).$$
By independence,
$$E(XY) = \frac{1}{4} + \frac{1}{2} - \frac{1}{8} = \frac{5}{8}.$$
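A brief simulation (an addition, not part of the exam) reproduces both the union probability and $E(XY)$ for three independent events of probability $\frac{1}{2}$.

```python
import random

N = 400_000
union_count = 0
xy_sum = 0

for _ in range(N):
    a, b, c = (random.random() < 0.5 for _ in range(3))
    union_count += (a or b or c)
    xy_sum += (a or b) * (b or c)   # X * Y for the two indicator variables

print(union_count / N)   # ~7/8 = 0.875
print(xy_sum / N)        # ~5/8 = 0.625
```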
2. (a) Die #1 has 6 sides numbered $1, \dots, 6$ and die #2 has 8 sides numbered $1, \dots, 8$. One of these two dice is chosen at random and rolled 10 times. Find the conditional probability that you have selected die #1 given that precisely three 1s were rolled.

Solution: The number of 1s rolled if die #1 is selected is distributed as a binomial random variable with $n = 10$, $p = \frac{1}{6}$, and similarly the number of 1s rolled if die #2 is selected is binomially distributed with $n = 10$, $p = \frac{1}{8}$. From this we can conclude that
$$P(\text{die \#1 selected and three 1s rolled}) = \frac{1}{2}\binom{10}{3}\left(\frac{1}{6}\right)^3\left(\frac{5}{6}\right)^7$$
and
$$P(\text{three 1s rolled}) = \frac{1}{2}\binom{10}{3}\left[\left(\frac{1}{6}\right)^3\left(\frac{5}{6}\right)^7 + \left(\frac{1}{8}\right)^3\left(\frac{7}{8}\right)^7\right].$$
Hence, using the definition of conditional probability,
$$P(\text{die \#1 selected} \mid \text{three 1s rolled}) = \frac{\left(\frac{1}{6}\right)^3\left(\frac{5}{6}\right)^7}{\left(\frac{1}{6}\right)^3\left(\frac{5}{6}\right)^7 + \left(\frac{1}{8}\right)^3\left(\frac{7}{8}\right)^7} \approx 0.63.$$
(b) Let $X$ and $Y$ be independent Poisson random variables with mean 1. Are $X - Y$ and $X + Y$ independent? Justify your answer.

Solution: No, these random variables are not independent. First recall that $X, Y$ can take only the values $0, 1, 2, \dots$.

Method 1: If the random variables were independent, then $P(X - Y = 0 \mid X + Y = 0) = P(X - Y = 0) < 1$, but in fact $P(X - Y = 0 \mid X + Y = 0) = 1$, since if $X + Y = 0$, then $X = 0$ and $Y = 0$ must hold.

Method 2: If the random variables were independent, then $P\big((X + Y = 1) \cap (X - Y = 0)\big) = P(X + Y = 1)\,P(X - Y = 0) \ne 0$. However, $P\big((X + Y = 1) \cap (X - Y = 0)\big) = 0$, since $X, Y$ must be integer-valued (the two conditions would force $X = Y = \frac{1}{2}$).
(c) Let $X$ be a geometric random variable with parameter $p$. Find $E[\min(X, 5)]$.

Solution: Let $Y = \min(X, 5)$. Since $Y = X$ when $X < 5$,
$$P(Y = 1) = p, \quad P(Y = 2) = (1 - p)p, \quad P(Y = 3) = (1 - p)^2 p, \quad P(Y = 4) = (1 - p)^3 p.$$
The probability that $Y = 5$ can be found using $P(Y = 5) = 1 - \sum_{i=1}^{4} P(Y = i)$. Hence,
$$P(Y = 5) = 1 - p\left[1 + (1 - p) + (1 - p)^2 + (1 - p)^3\right] = 1 - p\,\frac{1 - (1 - p)^4}{1 - (1 - p)} = (1 - p)^4.$$
Hence, using the definition of expectation,
$$E(Y) = p + 2p(1 - p) + 3p(1 - p)^2 + 4p(1 - p)^3 + 5(1 - p)^4.$$
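The closed form can be checked against a direct simulation (an addition, not part of the exam); the value of $p$ is arbitrary.

```python
import random

def geometric(p):
    """Number of Bernoulli(p) trials up to and including the first success."""
    k = 1
    while random.random() >= p:
        k += 1
    return k

p = 0.25                      # arbitrary test value
N = 300_000
empirical = sum(min(geometric(p), 5) for _ in range(N)) / N

q = 1 - p
formula = p + 2*p*q + 3*p*q**2 + 4*p*q**3 + 5*q**4
print(round(empirical, 3), round(formula, 3))   # both ~3.05 for p = 0.25
```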
3. Five distinct families arrive at a party. Each family consists of 3 people. The 15 participants of the party are arranged randomly in a line.

(a) What is the probability that members of the Smith family sit next to each other?

Solution (Method 1): The first member of the Smith family may sit in position $i$, where $i = 1, \dots, 13$, and the other two members then sit in positions $i + 1$, $i + 2$. There are $3! = 6$ possible ways to arrange the Smith family members within positions $i, i+1, i+2$, and $12!$ possible ways to arrange everyone else. Hence, there are $(13)(6)(12!)$ arrangements out of $15!$ in which the Smith family sits next to each other. Since each arrangement is equally probable, the probability is
$$\frac{(13)(6)(12!)}{15!} = \frac{1}{35}.$$

Method 2: The set of positions occupied by a family can be represented as a 3-element subset of $\{1, \dots, 15\}$, and there are 13 sets of the form $\{i, i+1, i+2\}$, which represent a family sitting in consecutive seats. Hence, the probability that the Smith family sits next to each other is
$$\frac{13}{\binom{15}{3}} = \frac{1}{35}.$$
(b) What is the probability that all the members of the Smith family sit next to each other, but not all the members of the Johnson family sit next to each other?

Solution (Method 1): Let $A$ be the event that the Smith family sits next to each other and $B$ the event that the Johnson family sits next to each other. We are asked to compute the probability $P(A \cap B^c)$, which can be computed from $P(A \cap B^c) = P(A) - P(A \cap B)$. Hence, we need to count the number of configurations where both the Smith and the Johnson families sit next to each other.

- If the Smith family starts at position 1 or 13, then the Johnson family can start in 10 possible positions.
- If the Smith family starts at position 2 or 12, then the Johnson family can start in 9 possible positions.
- If the Smith family starts at position 3 or 11, then the Johnson family can start in 8 possible positions.
- If the Smith family starts at any other position $j$, then the Johnson family can start in $(j - 3) + (13 - (j + 3) + 1) = 8$ possible positions.

Hence, there are $[2(10) + 2(9) + 9(8)]\,9!\,(3!)^2 = 110\,(9!)(3!)^2 = 11!\,(3!)^2$ possible arrangements where both the Smith and Johnson families sit next to each other, since there are $2(10) + 2(9) + 9(8) = 110$ possible pairs of starting positions for the two families, $3!$ ways to arrange each family, and $9!$ ways to arrange the remaining people. Hence
$$P(A \cap B) = \frac{11!\,(3!)^2}{15!} = \frac{36}{15 \cdot 14 \cdot 13 \cdot 12} = \frac{1}{910}.$$
Using the result of part (a), this implies that
$$P(A \cap B^c) = \frac{1}{35} - \frac{1}{910} = \frac{25}{910} = \frac{5}{182}.$$

Method 2: To compute $P(A \cap B)$, we can count the number of ways to pick two disjoint subsets $A, B \subset \{1, \dots, 15\}$, each consisting of consecutive numbers. The same analysis as before yields 110 pairs $(A, B)$ satisfying the given condition, and there are $\binom{15}{3}\binom{12}{3}$ possible pairs of subsets in total. Hence
$$P(A \cap B) = \frac{110}{\binom{15}{3}\binom{12}{3}} = \frac{1}{910}.$$
The probability $P(A \cap B^c)$ can then be computed from $P(A)$ and $P(A \cap B)$ as before.
(c) Let $X$ be the number of families whose members sit next to each other. Find $E(X)$ and $\mathrm{Var}(X)$.

Solution: Let $U_i$ be the random variable which takes the value 1 if positions $i, i+1, i+2$ are occupied by the same family and 0 otherwise. Then $X = \sum_{i=1}^{13} U_i$.

Since the probability that positions $i$ and $i+1$ are from the same family is $\frac{2}{14}$, and the probability that position $i+2$ is from the same family given that $i$ and $i+1$ are is $\frac{1}{13}$, we have $P(U_i = 1) = E(U_i) = \frac{1}{91}$. By linearity of expectation, $E(X) = \frac{13}{91} = \frac{1}{7}$.

To compute the variance, we must first compute $E(U_i U_j)$ for every pair $(i, j)$ with $1 \le i \le 13$, $j \ge i$. Since $U_i^2 = U_i$, we have $E(U_i^2) = \frac{1}{91}$. Next, $E(U_i U_j) = 0$ if $j = i + 1$ or $j = i + 2$, since each family has only 3 members. Finally, if $j \ge i + 3$, then
$$E(U_i U_j) = \frac{2}{14}\cdot\frac{1}{13}\cdot\frac{2}{11}\cdot\frac{1}{10} = \frac{4}{(14)(13)(11)(10)} = \frac{1}{5 \cdot 7 \cdot 11 \cdot 13}$$
(following the previous calculation for $E(U_i)$). We are now ready to compute $\mathrm{Var}(X)$ using the formula
$$\mathrm{Var}(X) = \sum_{i=1}^{13} \mathrm{Var}(U_i) + 2\sum_{1 \le i < j \le 13} \mathrm{Cov}(U_i, U_j).$$
For each $U_i$, $\mathrm{Var}(U_i) = E(U_i^2) - E(U_i)^2 = \frac{1}{91}\left(1 - \frac{1}{91}\right)$. Next, $\mathrm{Cov}(U_i, U_j) = E(U_i U_j) - E(U_i)E(U_j)$, so
$$\mathrm{Cov}(U_i, U_j) = \begin{cases} -\dfrac{1}{91^2} & j = i + 1,\ i + 2 \\[6pt] \dfrac{1}{5 \cdot 7 \cdot 11 \cdot 13} - \dfrac{1}{91^2} & j \ge i + 3. \end{cases}$$
All together, this implies that
$$\mathrm{Var}(X) = \frac{13}{91}\left(1 - \frac{1}{91}\right) + 2\left[\sum_{\substack{1 \le i < j \le 13 \\ j - i \le 2}} \left(-\frac{1}{91^2}\right) + \sum_{\substack{1 \le i < j \le 13 \\ j - i \ge 3}} \left(\frac{1}{5 \cdot 7 \cdot 11 \cdot 13} - \frac{1}{91^2}\right)\right].$$
Now we evaluate the sums. The first sum has $12 + 11 = 23$ terms and the second has $\frac{(10)(11)}{2} = 55$ terms. Hence,
$$\mathrm{Var}(X) = \frac{13}{91}\left(1 - \frac{1}{91}\right) - \frac{46}{91^2} + \frac{110}{5 \cdot 7 \cdot 11 \cdot 13} - \frac{110}{91^2} = \frac{15}{91} - \frac{169}{91^2} = \frac{92}{637}.$$
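A direct simulation of the random line (an added check, not part of the exam) reproduces both moments.

```python
import random

N = 200_000
people = [f for f in range(5) for _ in range(3)]   # family label for each of 15 people
total = 0
total_sq = 0

for _ in range(N):
    random.shuffle(people)
    # X = number of families whose three members occupy consecutive seats
    x = sum(
        people[i] == people[i + 1] == people[i + 2]
        for i in range(13)
    )
    total += x
    total_sq += x * x

mean = total / N
var = total_sq / N - mean ** 2
print(round(mean, 4), round(var, 4))   # expect ~1/7 = 0.1429 and ~92/637 = 0.1444
```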
4. Let $R$ be the triangle in the $xy$-plane with corners at $(-1, 0)$, $(0, 1)$ and $(1, 0)$. Assume $(X, Y)$ is uniformly distributed over $R$, that is, $X$ and $Y$ have a joint density which is a constant $c$ on $R$, and equal to 0 on the complement of $R$.

(a) Find $c$.

Solution: The area of the triangle is $\frac{(2)(1)}{2} = 1$, so $c = \frac{1}{1} = 1$, since $(X, Y)$ is jointly uniformly distributed over $R$.

(b) Find the marginal densities of $X$ and $Y$.

Solution: For a given $x$ with $-1 \le x \le 1$, the point $(x, y)$ lies in $R$ when $0 \le y \le 1 - |x|$. Hence,
$$f_X(x) = \int_{-\infty}^{\infty} f(x, y)\,dy = \int_0^{1 - |x|} 1\,dy = 1 - |x|,$$
and $f_X(x) = 0$ otherwise. For a given $y$ with $0 \le y \le 1$, $x$ ranges from $y - 1$ to $1 - y$. Hence,
$$f_Y(y) = \int_{-\infty}^{\infty} f(x, y)\,dx = \int_{y-1}^{1-y} 1\,dx = 2 - 2y,$$
and $f_Y(y) = 0$ otherwise.

(c) Are $X$ and $Y$ independent? Justify your answer.

Solution: The variables $X$ and $Y$ are independent if and only if the joint density $f(x, y)$ is equal to the product $f_X(x)f_Y(y)$. Since $1 \ne (2 - 2y)(1 - |x|)$ in general on $R$, $X$ and $Y$ are not independent.

(d) Are $X$ and $Y$ uncorrelated? Justify your answer.

Solution: To determine whether $X$ and $Y$ are correlated, we compute $\mathrm{Cov}(X, Y) = E(XY) - E(X)E(Y)$. Since $E(X) = 0$ by symmetry, $E(X)E(Y) = 0$. Next,
$$E(XY) = \int_0^1 \int_{y-1}^{1-y} xy\,dx\,dy = \int_0^1 \frac{x^2 y}{2}\Big|_{y-1}^{1-y}\,dy = \frac{1}{2}\int_0^1 \left[(1 - y)^2 y - (y - 1)^2 y\right]dy = 0.$$
Hence $\mathrm{Cov}(X, Y) = 0$, and $X$ and $Y$ are uncorrelated.
5. (a) Let $Z$ be a normal random variable with mean 0 and variance 3. Compute $E(|Z|)$.

Solution: The probability density of $Z$ is given by $f_Z(z) = \frac{1}{\sqrt{6\pi}}\,e^{-z^2/6}$. Hence
$$E(|Z|) = \int_{-\infty}^{\infty} |z|\,\frac{1}{\sqrt{6\pi}}\,e^{-z^2/6}\,dz = \frac{2}{\sqrt{6\pi}}\int_0^{\infty} z e^{-z^2/6}\,dz.$$
Using the substitution $u = \frac{z^2}{6}$,
$$\int_0^{\infty} z e^{-z^2/6}\,dz = \int_0^{\infty} 3e^{-u}\,du = 3.$$
Therefore,
$$E(|Z|) = \frac{6}{\sqrt{6\pi}} = \sqrt{\frac{6}{\pi}}.$$
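A one-line simulation (an addition, not part of the exam) confirms the value $\sqrt{6/\pi} \approx 1.382$.

```python
import random, math

N = 500_000
# |Z| for Z ~ N(0, 3): standard deviation is sqrt(3)
est = sum(abs(random.gauss(0, math.sqrt(3))) for _ in range(N)) / N
print(round(est, 3), round(math.sqrt(6 / math.pi), 3))   # both ~1.382
```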
(b) Let $X$ and $Y$ be independent exponential random variables with mean 1. Find the density function of $X - Y$.

Solution: We will first determine the cumulative distribution function of $X - Y$ and then differentiate to obtain the density function. Since $X$ and $Y$ are independent, the joint density of $(X, Y)$ is $e^{-(x+y)}$ for all $x, y \ge 0$ (and is zero otherwise).

Let $c \le 0$ be given. Then $X - Y \le c$ when $Y \ge X - c$. Hence
$$P(X - Y \le c) = \int_0^{\infty} \int_{x-c}^{\infty} e^{-(x+y)}\,dy\,dx = \int_0^{\infty} \left[-e^{-x}e^{-y}\right]_{x-c}^{\infty} dx = \int_0^{\infty} e^{-2x + c}\,dx = \left[\frac{e^{-2x + c}}{-2}\right]_0^{\infty} = \frac{e^{c}}{2}.$$
Now let $c > 0$ be given. If $X - Y \le c$, then $Y \ge \max(0, X - c)$. Hence,
$$P(X - Y \le c) = \int_0^{c} \int_0^{\infty} e^{-(x+y)}\,dy\,dx + \int_c^{\infty} \int_{x-c}^{\infty} e^{-(x+y)}\,dy\,dx.$$
The first integral is
$$\int_0^{c} \int_0^{\infty} e^{-(x+y)}\,dy\,dx = \left(\int_0^{c} e^{-x}\,dx\right)\left(\int_0^{\infty} e^{-y}\,dy\right) = 1 - e^{-c}$$
and the second integral is
$$\int_c^{\infty} \int_{x-c}^{\infty} e^{-(x+y)}\,dy\,dx = \int_c^{\infty} e^{-2x + c}\,dx = \frac{e^{-2c + c}}{2} = \frac{e^{-c}}{2}.$$
Hence, for $c > 0$, $P(X - Y \le c) = 1 - e^{-c} + \frac{e^{-c}}{2} = 1 - \frac{e^{-c}}{2}$. Therefore the cumulative distribution function is
$$P(X - Y \le c) = \begin{cases} \dfrac{e^{c}}{2} & c \le 0 \\[4pt] 1 - \dfrac{e^{-c}}{2} & c > 0, \end{cases}$$
which implies that the density function of $X - Y$ is
$$f(c) = \begin{cases} \dfrac{e^{c}}{2} & c \le 0 \\[4pt] \dfrac{e^{-c}}{2} & c > 0 \end{cases} \;=\; \frac{1}{2}e^{-|c|}.$$
6. The waiting time in hours of Mrs. Cohen at the clinic is a continuous random variable with density
$$f(y) = \begin{cases} cy(2 - y) & 0 \le y \le 2 \\ 0 & \text{otherwise.} \end{cases}$$

(a) Find $c$.

Solution: The constant $c$ satisfies $\int_0^2 cy(2 - y)\,dy = 1$. Since
$$\int_0^2 y(2 - y)\,dy = \left(y^2 - \frac{y^3}{3}\right)\Big|_0^2 = 4 - \frac{8}{3} = \frac{4}{3},$$
the equation $\frac{4c}{3} = 1$ implies that $c = \frac{3}{4}$.

(b) What is the probability that she waits more than an hour?

Solution: The probability that she waits more than an hour is given by the integral $\int_1^2 \frac{3}{4}y(2 - y)\,dy$. Since
$$\frac{3}{4}\int_1^2 (2y - y^2)\,dy = \frac{3}{4}\left(y^2 - \frac{y^3}{3}\right)\Big|_1^2 = \frac{3}{4}\left(\frac{4}{3} - \frac{2}{3}\right) = \frac{1}{2},$$
the probability that she waits more than one hour is $\frac{1}{2}$.

(c) Mrs. Cohen goes to the clinic each day for 100 days. The waiting time on each day is independent and has the same distribution. Let $A$ be the event that Mrs. Cohen waits more than an hour on at least ($\ge$) 60 days. Use Markov's inequality to bound $P(A)$.

Solution: Let $1_{A_i}$ be the indicator random variable which takes the value 1 if Mrs. Cohen waits more than one hour on day $i$ and is 0 otherwise, and let $X = \sum_{i=1}^{100} 1_{A_i}$; the problem asks for a bound on $P(X \ge 60)$. Since $E(X) = 100\,P(A_i) = \frac{100}{2} = 50$ by linearity of expectation, Markov's inequality implies that
$$P(A) = P(X \ge 60) \le \frac{E(X)}{60} = \frac{5}{6}.$$

(d) Use the central limit theorem to approximate $P(A)$.

Solution: If $1_{A_i}$ are the indicator random variables defined in the previous part, then $\mathrm{Var}(1_{A_i}) = E(1_{A_i}) - E(1_{A_i})^2 = \frac{1}{4}$. Therefore, by the central limit theorem, if $X$ is the number of days on which Mrs. Cohen waits more than an hour, then
$$\frac{X - 100\left(\frac{1}{2}\right)}{\frac{10}{2}} = \frac{X - 50}{5}$$
is approximately normal. Hence
$$P(A) = P(X \ge 60) \approx P\left(Z \ge \frac{60 - 50}{5}\right) = P(Z \ge 2) = 1 - \Phi(2),$$
where $\Phi$ is the standard normal cumulative distribution function $\Phi(x) = \int_{-\infty}^{x} \frac{1}{\sqrt{2\pi}}\,e^{-z^2/2}\,dz$. Numerically $\Phi(2) \approx 0.977$, so $P(A) \approx 0.023$.
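As an added comparison (not from the exam), the exact binomial tail can be put beside the Markov bound and the CLT estimate.

```python
from math import comb

# X ~ Binomial(100, 1/2); exact P(X >= 60)
exact = sum(comb(100, k) for k in range(60, 101)) / 2 ** 100
print(round(exact, 4))     # ~0.028: close to the CLT estimate (0.023) and
                           # far below the Markov bound (5/6)
```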
Math 302
Final Examination, December 2012

Instructions: Calculators are allowed but no other materials are permitted. A table of common distributions and values for the cumulative distribution function of the normal distribution were given.
1. (15 points)

(a) Define precisely the covariance of random variables $X$ and $Y$.

Solution: The covariance of two random variables $X, Y$ is defined by the formula $\mathrm{Cov}(X, Y) = E\big((X - E(X))(Y - E(Y))\big)$, or equivalently $\mathrm{Cov}(X, Y) = E(XY) - E(X)E(Y)$.

(b) Define precisely the correlation coefficient of random variables $X$ and $Y$.

Solution: The correlation coefficient $\rho(X, Y)$ is defined by
$$\rho(X, Y) = \frac{\mathrm{Cov}(X, Y)}{\sqrt{\mathrm{Var}(X)\,\mathrm{Var}(Y)}},$$
where $\mathrm{Cov}$ is the covariance of $(X, Y)$ and $\mathrm{Var}$ is the variance of a random variable.

(c) Define precisely what it means for events $A, B, C$ to be independent.

Solution: The events $A, B, C$ are said to be independent if $P(A \cap B \cap C) = P(A)P(B)P(C)$, $P(A \cap B) = P(A)P(B)$, $P(B \cap C) = P(B)P(C)$ and $P(A \cap C) = P(A)P(C)$.
2. (10 points) A fair coin is tossed 4 times.

(a) What is the probability of getting exactly 3 heads?

Solution: If $X$ is the random variable denoting the number of heads, then $X$ is binomially distributed with $n = 4$ and $p = \frac{1}{2}$. Therefore, the probability of getting exactly 3 heads is
$$\binom{4}{3}\frac{1}{2^4} = \frac{1}{4}.$$
Equivalently, the four sequences of coin tosses $THHH$, $HTHH$, $HHTH$, $HHHT$ are the only sequences in which exactly 3 heads are obtained. Since there are 16 possible sequences of coin tosses and they are all equally likely, the probability is $\frac{4}{16} = \frac{1}{4}$.

(b) What is the probability of getting exactly 3 heads conditioned on the event that the first two tosses came out the same?

Solution: The only sequences in which the first two tosses are the same and exactly 3 heads are obtained are $HHTH$ and $HHHT$. Next, the probability that the first two tosses are the same is $\frac{1}{2}$. By the definition of conditional probability,
$$P(\text{3 heads} \mid \text{first two same}) = \frac{2/16}{1/2} = \frac{1}{4}.$$
3. (5 points) Let $A, B$ be events so that $P(A) = 0.5$, $P(B) = 0.4$ and $P(A \cup B) = 0.7$. What is $P(A \mid B)$?

Solution: The definition of conditional probability states that $P(A \mid B) = \frac{P(A \cap B)}{P(B)}$. Since
$$P(A \cap B) = P(A) + P(B) - P(A \cup B) = 0.2,$$
we get $P(A \mid B) = \frac{0.2}{0.4} = 0.5$.
4. (5 points) If $X, Y$ are independent random variables with $X \sim \mathrm{Exp}(1)$ and $Y \sim \mathrm{Bin}(n, p)$, what is $P(X > Y)$?

Solution: We will evaluate this probability by conditioning on the value of $Y$. Since $Y$ can take integer values from 0 to $n$, the law of total probability gives
$$P(X > Y) = \sum_{i=0}^{n} P(X > i \mid Y = i)\,P(Y = i) = \sum_{i=0}^{n} P(X > i)\,P(Y = i),$$
noting that $P(X > i \mid Y = i) = P(X > i)$ since $X$ and $Y$ are independent. Since $X$ is exponentially distributed with parameter $\lambda = 1$,
$$P(X > i) = \int_i^{\infty} e^{-x}\,dx = e^{-i}.$$
Since $Y$ is binomially distributed with parameters $n$ and $p$, $P(Y = i) = \binom{n}{i}p^i(1 - p)^{n-i}$ for $0 \le i \le n$. Therefore, using the binomial theorem,
$$P(X > Y) = \sum_{i=0}^{n} \binom{n}{i} e^{-i} p^i (1 - p)^{n-i} = \sum_{i=0}^{n} \binom{n}{i}\left(\frac{p}{e}\right)^i (1 - p)^{n-i} = \left(\frac{p}{e} + 1 - p\right)^n.$$
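The closed form can be checked by simulation (an addition, not part of the exam); the values of $n$ and $p$ below are arbitrary.

```python
import random
from math import e

n, p = 7, 0.4            # arbitrary test values
N = 300_000
hits = 0
for _ in range(N):
    x = random.expovariate(1.0)                       # Exp(1)
    y = sum(random.random() < p for _ in range(n))    # Bin(n, p)
    hits += x > y

print(round(hits / N, 4), round((p / e + 1 - p) ** n, 4))
```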
5. (20 points) Consider variables $(X, Y)$ which are jointly uniformly distributed with density $a$ over the triangle with corners $(0, 0)$, $(6, 0)$ and $(6, 3)$.

(a) Find $a$.

Solution: The area of the triangle formed by the three corners is $\frac{6 \cdot 3}{2} = 9$. Since $(X, Y)$ is jointly uniformly distributed over the triangle, $a = \frac{1}{9}$.

(b) Find the marginal densities of $X$ and $Y$.

Solution: The triangle is described by $0 \le x \le 6$, $0 \le y \le \frac{1}{2}x$, or equivalently $0 \le y \le 3$, $2y \le x \le 6$. Hence the marginal density of $X$ is
$$f_X(x) = \int f(x, y)\,dy = \int_0^{\frac{1}{2}x} \frac{1}{9}\,dy = \frac{x}{18}, \qquad 0 \le x \le 6,$$
and the marginal density of $Y$ is
$$f_Y(y) = \int f(x, y)\,dx = \int_{2y}^{6} \frac{1}{9}\,dx = \frac{1}{9}(6 - 2y), \qquad 0 \le y \le 3.$$

(c) Find $E(XY)$.

Solution: The expectation of $XY$ can be computed by the integral
$$E(XY) = \int\!\!\int xy\,f_{(X,Y)}(x, y)\,dy\,dx = \int_0^6 \int_0^{\frac{1}{2}x} \frac{xy}{9}\,dy\,dx.$$
The integral evaluates to
$$\int_0^6 \frac{xy^2}{18}\Big|_0^{\frac{1}{2}x}\,dx = \int_0^6 \frac{x^3}{72}\,dx = \frac{6^4}{4 \cdot 72} = \frac{9}{2}.$$

(d) Find $P(X > 6Y)$.

Solution: If $x > 6y$, then $y < \frac{1}{6}x$. Hence,
$$P(X > 6Y) = \int_0^6 \int_0^{\frac{1}{6}x} \frac{1}{9}\,dy\,dx = \frac{1}{9}\int_0^6 \frac{x}{6}\,dx = \frac{36}{108} = \frac{1}{3}.$$
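Sampling the triangle uniformly by rejection from the bounding rectangle (an added check, not part of the exam) reproduces parts (c) and (d).

```python
import random

N = 400_000
sum_xy = 0.0
count_x_gt_6y = 0
accepted = 0

while accepted < N:
    x = random.uniform(0, 6)
    y = random.uniform(0, 3)
    if y <= x / 2:              # keep only points inside the triangle
        accepted += 1
        sum_xy += x * y
        count_x_gt_6y += (x > 6 * y)

print(round(sum_xy / N, 3))          # ~9/2 = 4.5
print(round(count_x_gt_6y / N, 3))   # ~1/3
```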
6. (15 points)

(a) Precisely state the central limit theorem.

Solution: Let $X_1, X_2, X_3, \dots$ be a sequence of independent and identically distributed random variables, each with mean $\mu$ and variance $\sigma^2$, and let $S_n = \sum_{i=1}^{n} X_i$. Then the distribution of the random variable $\frac{S_n - n\mu}{\sigma\sqrt{n}}$ converges to the standard normal $Z$ as $n \to \infty$. That is,
$$\lim_{n \to \infty} P\left(\frac{S_n - n\mu}{\sigma\sqrt{n}} \le a\right) = \frac{1}{\sqrt{2\pi}}\int_{-\infty}^{a} e^{-x^2/2}\,dx.$$

(b) Suppose the weight of a person is a random variable with mean 75 (kg) and variance $\sigma^2 = 100$. An airline has 400 passengers on a flight. Assuming that the weights of the passengers are independent, use the Central Limit Theorem to estimate the probability that the total weight exceeds 30500 kg.

Solution: Let $T$ be the total weight on the flight. By the central limit theorem,
$$\frac{T - 400(75)}{10\sqrt{400}} = \frac{T - 30000}{200}$$
is approximately distributed as a standard normal. Therefore,
$$P(T > 30500) = P\left(\frac{T - 30000}{200} > 2.5\right) \approx P(Z > 2.5).$$
Since $P(Z \le 2.5) = 0.9938$ from a standard normal table, $P(Z > 2.5) = 1 - 0.9938 = 0.0062$. Hence the probability that the total weight exceeds 30500 kg is approximately 0.0062.

(c) Use Chebyshev's inequality to give a bound on the probability that the total weight exceeds 30500 kg.

Solution: Let $T$ be the total weight. By linearity of expectation and of variance (since the weights are independent), $E(T) = 400(75) = 30000$ and $\mathrm{Var}(T) = 100(400) = 40000$. Hence the event $T > 30500$ is the event $T - 30000 > 500$.

Solution 1: Using the two-sided Chebyshev inequality $P(|X - \mu| \ge a) \le \frac{\mathrm{Var}(X)}{a^2}$,
$$P(T - 30000 \ge 500) \le P(|T - 30000| \ge 500) \le \frac{\mathrm{Var}(T)}{500^2} = \frac{40000}{250000} = 0.16.$$

Solution 2: Using the one-sided Chebyshev inequality $P(X - \mu \ge a) \le \frac{\mathrm{Var}(X)}{\mathrm{Var}(X) + a^2}$,
$$P(T - 30000 \ge 500) \le \frac{40000}{40000 + 250000} = \frac{40000}{290000} = \frac{4}{29} \approx 0.14.$$
7. (10 points) If $Z_1, Z_2$ are independent $N(0, 1)$ random variables, what is the distribution of
(a) $2Z_1 + Z_2$?
(b) $2Z_1 - Z_2$?

Solution: If $Z_1, Z_2$ are independent standard normal random variables, then the linear combination $a_1 Z_1 + a_2 Z_2$ is a normal random variable with mean $\mu = a_1(0) + a_2(0) = 0$ and variance $\sigma^2 = a_1^2 + a_2^2$.

The above claim is most easily proved using moment generating functions. The moment generating function of a standard normal is $\varphi_Z(t) = e^{t^2/2}$. Therefore, $a_1 Z_1 + a_2 Z_2$ has moment generating function
$$e^{\frac{a_1^2 t^2 + a_2^2 t^2}{2}} = e^{\frac{(a_1^2 + a_2^2)t^2}{2}},$$
which is exactly the moment generating function of a normal random variable with $\mu = 0$ and $\sigma^2 = a_1^2 + a_2^2$.

This implies that both $2Z_1 + Z_2$ and $2Z_1 - Z_2$ are distributed normally with $\mu = 0$ and $\sigma^2 = 2^2 + (\pm 1)^2 = 5$.
8. (10 points)

(a) Let $X \sim \mathrm{Poi}(\lambda)$ for some $\lambda > 0$. For which values of $t$ is $E(e^{tX})$ finite? When it is finite, what is $E(e^{tX})$?

Solution: If $X$ has the Poisson distribution with parameter $\lambda$, then
$$E(e^{tX}) = \sum_{n=0}^{\infty} e^{tn}\,\frac{e^{-\lambda}\lambda^n}{n!} = e^{-\lambda}\sum_{n=0}^{\infty} \frac{(e^t\lambda)^n}{n!}.$$
The series $e^x = \sum_{n=0}^{\infty} \frac{x^n}{n!}$ converges for all $x$, so $E(e^{tX})$ is finite for all $t$. Therefore, using the same series,
$$E(e^{tX}) = e^{-\lambda}e^{e^t\lambda} = e^{\lambda(e^t - 1)}.$$

(b) Let $Y \sim \mathrm{Exp}(\lambda)$ for some $\lambda > 0$. For which values of $t$ is $E(e^{tY})$ finite? When it is finite, what is $E(e^{tY})$?

Solution: If $Y$ has the exponential distribution with parameter $\lambda$, then
$$E(e^{tY}) = \int_0^{\infty} e^{tx}\,\lambda e^{-\lambda x}\,dx = \lambda\int_0^{\infty} e^{(t - \lambda)x}\,dx.$$
The integral converges if and only if $t - \lambda < 0$, so $E(e^{tY})$ is finite for $t < \lambda$ only. When $t < \lambda$,
$$\int_0^{\infty} e^{(t - \lambda)x}\,dx = \frac{e^{(t - \lambda)x}}{t - \lambda}\Big|_0^{\infty} = \frac{-1}{t - \lambda} = \frac{1}{\lambda - t},$$
since $\lim_{x \to \infty} e^{(t - \lambda)x} = 0$ when $t - \lambda < 0$. Therefore, $E(e^{tY}) = \frac{\lambda}{\lambda - t}$.
9. (15 points) Alice and Bob arrange the digits $1, \dots, 9$ in independent random orders, and compare the resulting numbers digit by digit. Let $Q$ be the number of digits in agreement. For example, if the numbers happen to be 475619283 and 374956182, then $Q = 2$ (the digits 7 and 8 are in the same position).

(a) What approximation rule gives an estimate for the distribution of $Q$?

Solution: Let
$$X_i = \begin{cases} 1 & \text{the digits in position } i \text{ agree} \\ 0 & \text{the digits in position } i \text{ disagree.} \end{cases}$$
Then $Q = \sum_{i=1}^{9} X_i$. Since the $\{X_i\}$ are "weakly dependent" and $P(X_i = 1)$ is small, we expect that the Poisson distribution with parameter $E(Q)$ gives an estimate for the distribution of $Q$. This is known as the "Poisson paradigm".

(b) Find $E(Q)$.

Solution: Using the definition of $X_i$ in part (a), linearity of expectation implies $E(Q) = \sum_{i=1}^{9} E(X_i)$. Since $E(X_i) = P(\text{digits in position } i \text{ agree}) = \frac{9}{81} = \frac{1}{9}$, by considering all possible pairs of digits in position $i$, we get $E(Q) = 1$.

(c) Find $\mathrm{Var}(Q)$.

Solution: Since the $X_i$ are not independent, we apply the formula
$$\mathrm{Var}(Q) = \sum_{i=1}^{9} \mathrm{Var}(X_i) + 2\sum_{1 \le i < j \le 9} \mathrm{Cov}(X_i, X_j).$$
Since $X_i^2 = X_i$ for every $i$,
$$\mathrm{Var}(X_i) = E(X_i^2) - E(X_i)^2 = E(X_i) - E(X_i)^2 = \frac{1}{9}\left(1 - \frac{1}{9}\right).$$
Next, let $i < j$. Since $E(X_i X_j)$ is the probability that the digits in position $i$ and in position $j$ both agree,
$$E(X_i X_j) = \frac{9}{81}\cdot\frac{8}{64} = \frac{1}{9}\cdot\frac{1}{8} = \frac{1}{72},$$
by considering all possible pairs of digits at positions $i$ and $j$. Hence,
$$\mathrm{Cov}(X_i, X_j) = E(X_i X_j) - E(X_i)E(X_j) = \frac{1}{72} - \frac{1}{81}.$$
Putting this together, since there are $\frac{8 \cdot 9}{2}$ choices of indices $i, j$ with $1 \le i < j \le 9$,
$$\mathrm{Var}(Q) = \frac{9}{9}\left(1 - \frac{1}{9}\right) + 2\cdot\frac{8 \cdot 9}{2}\left(\frac{1}{72} - \frac{1}{81}\right) = 1 - \frac{1}{9} + \frac{72}{72} - \frac{72}{81} = \frac{8}{9} + 1 - \frac{8}{9} = 1.$$
These calculations also justify the approximation rule stated in (a), since a Poisson random variable with parameter $\lambda$ has mean $\lambda$ and variance $\lambda$, and we expect $Q$ to be approximately Poisson with parameter $\lambda = E(Q) = 1$.
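A permutation simulation (an addition, not part of the exam) reproduces the mean, the variance, and the Poisson(1) shape suggested by the Poisson paradigm.

```python
import random
from collections import Counter

N = 200_000
digits = list(range(1, 10))
counts = Counter()

for _ in range(N):
    a = digits[:]
    b = digits[:]
    random.shuffle(a)
    random.shuffle(b)
    q = sum(x == y for x, y in zip(a, b))   # Q = number of matching positions
    counts[q] += 1

mean = sum(q * c for q, c in counts.items()) / N
var = sum(q * q * c for q, c in counts.items()) / N - mean ** 2
print(round(mean, 3), round(var, 3))   # both ~1, as computed above
print(round(counts[0] / N, 3))         # ~e^{-1} ~ 0.368 under the Poisson paradigm
```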