In Example 6.14, Y1 and Y2 were independent exponentially distributed random variables, both with mean β. We defined U1 = Y1/(Y1 + Y2) and U2 = Y1 + Y2 and determined the joint density of (U1, U2) to be f(u1, u2) = (1/β²) u2 e^(−u2/β) for 0 < u1 < 1 and u2 > 0, and zero otherwise.
- a Show that U1 is uniformly distributed over the interval (0, 1).
- b Show that U2 has a gamma density with parameters α = 2 and β.
- c Establish that U1 and U2 are independent.
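A Monte Carlo sketch can sanity-check all three claims before deriving them analytically; the mean β = 2, the seed, and the sample size below are assumed for illustration:

```python
import math
import random

random.seed(7)
beta = 2.0          # assumed mean for the two exponentials
n = 200_000

# Simulate Y1, Y2 ~ iid Exponential(mean beta) and form U1, U2.
y1 = [random.expovariate(1 / beta) for _ in range(n)]
y2 = [random.expovariate(1 / beta) for _ in range(n)]
u1 = [a / (a + b) for a, b in zip(y1, y2)]
u2 = [a + b for a, b in zip(y1, y2)]

# (a) U1 ~ Uniform(0, 1): mean 1/2, variance 1/12.
m1 = sum(u1) / n
v1 = sum((u - m1) ** 2 for u in u1) / n
print(round(m1, 2), round(v1, 2))   # ≈ 0.5, ≈ 0.08 (= 1/12)

# (b) U2 ~ Gamma(alpha=2, beta): mean 2*beta, variance 2*beta**2.
m2 = sum(u2) / n
v2 = sum((u - m2) ** 2 for u in u2) / n
print(round(m2, 1), round(v2, 0))   # ≈ 4.0 (= 2β), ≈ 8.0 (= 2β²)

# (c) Independence: the sample correlation of U1, U2 should be near 0.
cov = sum((a - m1) * (b - m2) for a, b in zip(u1, u2)) / n
corr = cov / math.sqrt(v1 * v2)
print(abs(corr) < 0.02)             # True
```

The analytic route is to integrate the joint density over u2 for (a) and over u1 for (b), then observe for (c) that the joint density factors into a function of u1 alone times a function of u2 alone.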
Chapter 6 Solutions
Mathematical Statistics with Applications
- Suppose that the random variables X, Y, Z have the multivariate PDF fXYZ(x, y, z) = (x + y)e^(−z) for 0 < x < 1, 0 < y < 1, and z > 0. Find (d) fZ|XY(z | x, y) and (e) fX|YZ(x | y, z).
- Let X1, ..., Xn be i.i.d. random variables with Xi ~ U(0, 1). Find the pdf of Q = X1 · X2 · … · Xn. Note first that −log(Xi) follows an exponential distribution.
- Suppose the joint probability density of X and Y is fX,Y(x, y) = 3y² for 0 ≤ x ≤ 1 and 0 ≤ y ≤ 1, and zero everywhere else. 1. Compute E[X | Y = y]. 2. Compute E[X³ + X | X < 0.5].
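For the product-of-uniforms question above, the hint −log(Xi) ~ Exp(1) gives −log(Q) ~ Gamma(n, 1), which transforms back to f_Q(q) = (−ln q)^(n−1)/(n−1)! on (0, 1). A numeric and Monte Carlo cross-check of that candidate pdf (n = 3, the seed, and the sample sizes are assumed):

```python
import math
import random

random.seed(1)
n, trials = 3, 400_000   # assumed number of uniforms and simulation size

# Candidate pdf of Q = X1*X2*...*Xn implied by -log(Q) ~ Gamma(n, 1).
def f_q(q):
    return (-math.log(q)) ** (n - 1) / math.factorial(n - 1)

# Check 1: the candidate density integrates to 1 (midpoint rule on (0, 1)).
steps = 100_000
total = sum(f_q((k + 0.5) / steps) for k in range(steps)) / steps
print(round(total, 3))   # ≈ 1.0

# Check 2: P(Q <= 0.1) from the density vs. direct simulation.
cut = 0.1
p_density = sum(f_q((k + 0.5) * cut / steps) for k in range(steps)) * cut / steps
hits = sum(
    math.prod(random.random() for _ in range(n)) <= cut for _ in range(trials)
)
print(abs(p_density - hits / trials) < 0.01)   # True
```

Both checks only corroborate the formula; the exercise still asks for the derivation (e.g. by induction on the convolution, or via the gamma density of −log Q).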
- If X and Y are independent exponential random variables, each having parameter λ: (a) Find the density function of U = X + Y by using the convolution of fX and fY. (b) Find the density function of V = X − Y by using the method of transformation. (c) Are U and V independent?
- 9.19 Let X and Y be two continuous random variables with joint probability density function f(x, y) = (30/π) e^(−50x² − 50y² + 80xy) for −∞ < x < ∞ and −∞ < y < ∞.
- Suppose that X1, X2, X3 are independent with the common probability mass function P{Xi = 0} = 0.2, P{Xi = 1} = 0.3, P{Xi = 3} = 0.5, for i = 1, 2, 3. a. Plot the probability mass function of X2_average = (X1 + X2)/2. b. Determine E[X2_average] and Var[X2_average]. c. Plot the probability mass function of X3_average = (X1 + X2 + X3)/3. d. Determine E[X3_average] and Var[X3_average].
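For part (a) of the first question above, the convolution ∫ fX(x) fY(u − x) dx should collapse to the Gamma(2, 1/λ) density λ²u e^(−λu). A numeric sketch (λ = 1.5 is an assumed value):

```python
import math

lam = 1.5   # assumed common rate for both exponentials

def f_exp(x):
    return lam * math.exp(-lam * x) if x >= 0 else 0.0

# Convolution f_U(u) = integral of f_X(x) * f_Y(u - x) over 0 <= x <= u,
# evaluated by the midpoint rule.
def f_u_numeric(u, steps=20_000):
    h = u / steps
    return sum(
        f_exp((k + 0.5) * h) * f_exp(u - (k + 0.5) * h) for k in range(steps)
    ) * h

# Closed form claimed for U = X + Y: the Gamma(2, 1/lam) density.
def f_u_closed(u):
    return lam ** 2 * u * math.exp(-lam * u)

for u in (0.5, 1.0, 2.0, 4.0):
    print(round(f_u_numeric(u), 4), round(f_u_closed(u), 4))  # pairs agree
```

The agreement is in fact exact: for 0 < x < u the integrand is λe^(−λx) · λe^(−λ(u−x)) = λ²e^(−λu), constant in x, so the integral over an interval of length u is λ²u e^(−λu).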
- Let X1, ..., Xn be a random sample from a population with θ unknown and density f(x; θ) = (1/(2θ)) √(2/x) e^(−√(2x)/θ) if x > 0, and f(x; θ) = 0 if x ≤ 0. 1. Show that E(2X) = 2θ² and E(√(2X)) = θ (Hint: you may use ∫₀^∞ e^(−z) z^(α−1) dz = (α − 1)! for every α ∈ ℕ). 2. Show that the statistic θ̂n := (1/n) Σ_{i=1}^{n} √(2Xi)  (1) is an unbiased estimator of θ. 3. Give the definition of a consistent estimator. 4. Show that the estimator θ̂n given in relation (1) is a consistent estimator of θ. 5. Show that the estimator θ̂n is a minimum-variance estimator of θ. (Hint: use the Cramér–Rao inequality var(θ̂) ≥ 1 / (n E[(∂ ln f(X; θ)/∂θ)²]).)
- Let random variables X and Y have the joint pdf fX,Y(x, y) = 4xy for 0 < x < 1, 0 < y < 1, and 0 otherwise. Find the joint pdf of U = X² and V = XY.
- Let X1, X2, ... be a sequence of identically distributed random variables with E|X1| < ∞, and let Yn = n^(−1) max_{1≤i≤n} |Xi|. Show that lim_n E(Yn) = 0.
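For the estimation question above, assuming the density is f(x; θ) = (1/(2θ)) √(2/x) e^(−√(2x)/θ) for x > 0 (under which √(2X) is exponential with mean θ), a simulation can check that θ̂n = (1/n) Σ √(2Xi) is unbiased; θ = 1.8, the seed, n, and the number of replications are assumed:

```python
import random

random.seed(3)
theta = 1.8        # assumed true parameter
n, reps = 50, 20_000

# Under the assumed density, sqrt(2X) ~ Exponential(mean theta), so we can
# simulate X as E**2 / 2 with E exponential; the estimator is then simply
# the sample mean of the E_i = sqrt(2 * X_i).
est = []
for _ in range(reps):
    e = [random.expovariate(1 / theta) for _ in range(n)]
    est.append(sum(e) / n)           # = (1/n) * sum of sqrt(2 * X_i)

print(round(sum(est) / reps, 1))     # ≈ 1.8, i.e. ≈ theta: unbiased
```

Consistency (part 4) follows the same way in simulation by letting n grow; analytically it is the law of large numbers applied to the i.i.d. terms √(2Xi).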
- Find the maximum likelihood estimator for θ in the pdf f(y; θ) = 2y/(1 − θ²), θ ≤ y ≤ 1.
- Consider a random process X(t) defined by X(t) = U cos t + (V + 1) sin t, −∞ < t < ∞, where U and V are independent random variables with E(U) = E(V) = 0 and E(U²) = E(V²) = 1. (a) Find the autocovariance function KX(t, s) of X(t). (b) Is X(t) WSS?
- X1 and X2 are two discrete random variables. X1 takes the values x1 = 1, x1 = 2, and x1 = 3, while X2 takes the values x2 = 10, x2 = 20, and x2 = 30. The joint probability mass function pX1,X2(x1, x2) is given in the table below. a) Find the marginal probability mass function pX1(x1) of X1. b) Find the marginal probability mass function pX2(x2) of X2. c) Find the expected value of X1. d) Find the expected value of X2. e) Find the variance of X1. f) Find the variance of X2. g) Find the conditional probability mass function pX1|X2(x1 | x2 = 10). h) Find the conditional probability mass function pX2|X1(x2 | x1 = 2). i) Are X1 and X2 independent? Show it.
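For the first question above, assuming θ ∈ [0, 1), the likelihood ∏ 2yi/(1 − θ²) is increasing in θ while the support forces θ ≤ min(yi), so the MLE should be the sample minimum Y(1). A grid-search sketch (the true θ, seed, and sample size are assumed):

```python
import math
import random

random.seed(5)
theta_true = 0.4   # assumed true parameter in [0, 1)

# Sample from f(y; theta) = 2y / (1 - theta**2) on [theta, 1] by inversion:
# F(y) = (y**2 - theta**2) / (1 - theta**2)  =>  y = sqrt(theta**2 + u*(1 - theta**2)).
ys = [
    math.sqrt(theta_true ** 2 + random.random() * (1 - theta_true ** 2))
    for _ in range(200)
]

def log_lik(t):
    if t >= min(ys):          # support requires theta <= every y_i
        return -math.inf
    return sum(math.log(2 * y / (1 - t * t)) for y in ys)

# Grid search over theta: the maximizer should sit at the support boundary.
grid = [k / 10_000 for k in range(10_000)]
t_hat = max(grid, key=log_lik)
print(abs(t_hat - min(ys)) < 1e-3)    # True: the MLE is min(Y_i)
```

The grid maximizer lands one grid step below min(ys), matching the analytic answer θ̂ = Y(1) = min(Y1, ..., Yn).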