Let ...
(a) Show that the pdf of ...
(b) Let ... Show that the joint pdf of ...
(c) Show that the pdf of ...
(d) Show that ...; that is, ...
Chapter 5 Solutions
Probability And Statistical Inference (10th Edition)
- Let X and Y be discrete random variables with joint pmf f(x, y) given by the following table:

      f(x, y)   y = 1   y = 2   y = 3
      x = 1     0.1     0.2     0
      x = 2     0       0.167   0.4
      x = 3     0.067   0.022   0.033

  Find the marginal pmfs of X and Y. Are X and Y independent?
- Let X be a random variable with pdf f(x) = 4x^3 for 0 < x < 1 and zero otherwise. Use the cumulative-distribution (CDF) technique to determine the pdf of each of the following random variables: (1) Y = X^4, (2) W = e^(-X), (3) Z = 1 - e^(-X), (4) U = X(1 - X).
- Let X denote the temperature at which a certain chemical reaction takes place, and suppose that X has pdf f(x) = (1/9)(4 - x^2) for -1 <= x <= 2 and 0 otherwise. (a) Compute P(0 <= X <= 1). (b) Obtain E(X) and the variance of X.
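The first exercise in the list above can be checked numerically. A minimal sketch in plain Python (variable names are illustrative, not from the source; the table entries sum to 0.989 because of rounding in the original): build the marginals by summing the joint pmf over the other variable, then test whether every cell factors as fX(x) * fY(y).

```python
# Joint pmf table from the exercise, keyed by (x, y).
joint = {
    (1, 1): 0.1,   (1, 2): 0.2,   (1, 3): 0.0,
    (2, 1): 0.0,   (2, 2): 0.167, (2, 3): 0.4,
    (3, 1): 0.067, (3, 2): 0.022, (3, 3): 0.033,
}
xs = sorted({x for x, _ in joint})
ys = sorted({y for _, y in joint})

# Marginal pmfs: sum the joint pmf over the other variable.
fX = {x: sum(joint[(x, y)] for y in ys) for x in xs}
fY = {y: sum(joint[(x, y)] for x in xs) for y in ys}

# Independence requires f(x, y) = fX(x) * fY(y) in every cell.
independent = all(
    abs(joint[(x, y)] - fX[x] * fY[y]) < 1e-9 for x in xs for y in ys
)
print(fX, fY, independent)
# f(1, 3) = 0 while fX[1] > 0 and fY[3] > 0, so X and Y are not independent.
```

The zero cell at (x, y) = (1, 3) alone settles the question: a product of strictly positive marginals can never be zero.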
- Let X be a uniform random variable over the interval (0, 1), and let Y = X^2. (a) Determine F_Y(y) = P(Y <= y) for real y, and from it the pdf of Y. (b) Calculate E[X^2] using the pdf of X. (c) Calculate E[Y] using the pdf of Y and compare with part (b).
- Consider a random sample X_1, ..., X_n from the pdf f(x; theta) = 0.5(1 + theta*x) for -1 <= x <= 1, where -1 <= theta <= 1 (this distribution arises in particle physics). Show that theta_hat = 3*X_bar is an unbiased estimator of theta. [Hint: first determine mu = E(X) = E(X_bar).]
- Let f(x) = 1/2 for -1 < x < 1 and 0 otherwise be the pdf of the random variable X. Find the distribution function and the pdf of Y = X^2.
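For the uniform exercise above, the CDF technique gives F_Y(y) = P(X^2 <= y) = P(X <= sqrt(y)) = sqrt(y) on (0, 1), so f_Y(y) = 1/(2*sqrt(y)). A small plain-Python sketch (illustrative only) confirms numerically that E[Y] computed from f_Y agrees with E[X^2] computed from f_X, both near 1/3:

```python
import math

# CDF technique for Y = X^2 with X ~ Uniform(0, 1):
#   F_Y(y) = sqrt(y) for 0 < y < 1, hence f_Y(y) = 1 / (2 * sqrt(y)).
def f_Y(y):
    return 1.0 / (2.0 * math.sqrt(y))

# Midpoint rule on (0, 1); the integrand y * f_Y(y) = sqrt(y)/2 is bounded,
# so the singularity of f_Y at 0 causes no trouble.
n = 100_000
mids = [(k + 0.5) / n for k in range(n)]
E_Y = sum(y * f_Y(y) for y in mids) / n   # E[Y] via the pdf of Y
E_X2 = sum(x ** 2 for x in mids) / n      # E[X^2] via f_X(x) = 1
print(E_Y, E_X2)  # both should be close to 1/3
```

The agreement is expected: E[Y] and E[X^2] are the same number by the law of the unconscious statistician, which is exactly what part (c) of the exercise asks you to observe.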
- Let X_1, X_2, ... be a sequence of identically distributed random variables with E|X_1| < infinity, and let Y_n = n^(-1) max_{1<=i<=n} |X_i|. Show that lim_n E(Y_n) = 0.
- Consider a random variable Y with a quasi-Bernoulli structure: with probability p in [0, 1] it takes the value 0, and with probability 1 - p it is described by a continuous random variable X with pdf
      f_X(x) = 1 + x for -1 <= x < 0,
      f_X(x) = 1 - x for 0 <= x <= 1,
      f_X(x) = 0 otherwise.
  Obtain the CDF of Y, F_Y(y), and sketch it.
- Let X_1, ..., X_n be a random sample from a population with theta unknown and density
      f(x; theta) = (1/(2*theta)) * sqrt(2/x) * e^(-sqrt(2x)/theta) for x > 0, and 0 for x <= 0.
  1. Show that E(X) = theta^2 and E(sqrt(2X)) = theta. (Hint: you may use that the integral of e^(-z) * z^(alpha - 1) over (0, infinity) equals (alpha - 1)! for every natural number alpha.)
  2. Show that the statistic theta_hat_n := (1/n) * sum_{i=1}^n sqrt(2*X_i) is an unbiased estimator of theta.
  3. Give the definition of a consistent estimator.
  4. Show that theta_hat_n is a consistent estimator of theta.
  5. Show that theta_hat_n is a minimum-variance estimator of theta. (Hint: use the Cramer-Rao inequality var(theta_hat) >= 1 / (n * E[(d ln f(X; theta) / d theta)^2]).)
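For the quasi-Bernoulli exercise above, integrating the triangular pdf and adding the atom of mass p at 0 gives a piecewise CDF: F_Y(y) = (1 - p)(1 + y)^2 / 2 on [-1, 0), with a jump of size p at y = 0, and F_Y(y) = p + (1 - p)(1/2 + y - y^2/2) on [0, 1]. A sketch of that computation (the helper name `F_Y` and the value of p are illustrative, not from the source):

```python
# Mixed CDF of Y: an atom of mass p at 0, plus (1 - p) times the CDF of the
# triangular density f_X(x) = 1 + x on [-1, 0) and 1 - x on [0, 1].
def F_Y(y, p):
    if y < -1:
        return 0.0
    if y < 0:
        # continuous triangular part only: (1 - p) * integral of (1 + x)
        return (1 - p) * (1 + y) ** 2 / 2
    if y <= 1:
        # atom at 0 plus triangular part up to y
        return p + (1 - p) * (0.5 + y - y ** 2 / 2)
    return 1.0

p = 0.3
print(F_Y(-1.0, p), F_Y(0.0, p), F_Y(1.0, p))
# F_Y is 0 at y = -1, jumps by p at y = 0, and reaches 1 at y = 1.
```

The sketch the exercise asks for is a parabola rising from 0 at y = -1 to (1 - p)/2 just left of 0, a vertical jump of height p at y = 0, then a concave rise to 1 at y = 1.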