B2. (a) Consider X₁, ..., Xₙ to be a random sample from the geometric distribution, with probability mass function P(X = x) = p(1 − p)ˣ, for x = 0, 1, 2, 3, ..., and p ∈ (0, 1]. (i) Using the MGF M(t) = p / (1 − (1 − p)eᵗ), derive E[X] and Var[X]. (ii) Find the Maximum Likelihood Estimator (MLE) for p.
Question
B2. (a) Consider X₁, ..., Xₙ to be a random sample from the geometric distribution, with probability mass function P(X = x) = p(1 − p)ˣ, for x = 0, 1, 2, 3, ..., and p ∈ (0, 1].

(i) Using the MGF M(t) = p / (1 − (1 − p)eᵗ), derive E[X] and Var[X].

(ii) Find the Maximum Likelihood Estimator (MLE) for p.

(b) Suppose X₁, ..., Xₙ is a random sample from a Beta(θ₁, 1) population, and Y₁, ..., Yₘ is an independent random sample from a Beta(θ₂, 1) population. We want to find the approximate Likelihood Ratio Test for H₀: θ₁ = θ₂ = θ₀ versus H₁: θ₁ ≠ θ₂. To this aim:

(i) Under the alternative hypothesis H₁: θ₁ ≠ θ₂, show that the MLEs for θ₁ and θ₂ are:

θ̂₁ = −n / Σᵢ log(xᵢ)  and  θ̂₂ = −m / Σᵢ log(yᵢ).

(ii) Under the null hypothesis H₀: θ₁ = θ₂ = θ₀, show that the MLE for θ₀ is:

θ̂₀ = −(n + m) / [Σᵢ log(xᵢ) + Σᵢ log(yᵢ)].

(Recall that the PDF of Beta(a, b) is f_Y(y) = [Γ(a + b) / (Γ(a) Γ(b))] y^(a−1) (1 − y)^(b−1), and that Γ(a) = (a − 1)! for all positive integers a, with Γ(1) = 1.)
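For part (b), the Beta(θ, 1) density reduces to f(y) = θ y^(θ−1), so a sample's log-likelihood is n log θ + (θ − 1) Σ log(yᵢ). The sketch below, using made-up data, checks numerically that the closed-form estimators θ̂₁ = −n/Σ log(xᵢ) and θ̂₀ = −(n + m)/[Σ log(xᵢ) + Σ log(yᵢ)] do maximise the corresponding log-likelihoods; it is an illustration under those assumed samples, not the requested proof.

```python
import math

# Beta(theta, 1) log-likelihood: l(theta) = n*log(theta) + (theta - 1)*sum(log y_i).
def loglik(theta, data):
    s = sum(math.log(v) for v in data)
    return len(data) * math.log(theta) + (theta - 1) * s

x = [0.2, 0.5, 0.9, 0.4]   # hypothetical sample from Beta(theta1, 1)
y = [0.7, 0.3, 0.6]        # hypothetical, independent sample from Beta(theta2, 1)

# Closed-form MLEs: per-sample under H1, pooled under H0.
theta1_hat = -len(x) / sum(math.log(v) for v in x)
theta0_hat = -(len(x) + len(y)) / (sum(math.log(v) for v in x)
                                   + sum(math.log(v) for v in y))

# Grid search over theta; l is concave (l'' = -n/theta**2 < 0), so the grid
# argmax can differ from the closed form by at most one grid step.
grid = [k / 1000 for k in range(1, 5000)]
best1 = max(grid, key=lambda t: loglik(t, x))
best0 = max(grid, key=lambda t: loglik(t, x + y))

print(abs(best1 - theta1_hat) < 2e-3)   # H1 estimator for theta1
print(abs(best0 - theta0_hat) < 2e-3)   # pooled H0 estimator
```

The same check applied to y alone would confirm θ̂₂ = −m/Σ log(yᵢ); note that every log(yᵢ) is negative for data in (0, 1), so all three estimators come out positive as they must.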