
Question
Let $X_1, \ldots, X_n$ be an iid random sample from the Binomial($k$, $p$) distribution, where $p$ is known but
$k$ is unknown. For example, this could happen in an experiment where we flip a coin that we know
is fair and observe $x_i$ heads, but we do not know how many times the coin was flipped. Suppose
we observe values $x_1, \ldots, x_n$ of our random sample. In this problem, you will show that there is a
unique maximum likelihood estimator of $k$.
(a) What is the likelihood function, $L(k)$? Why is maximization by differentiation difficult here?
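For orientation (a sketch, not part of the original problem statement), with the usual Binomial($k$, $p$) pmf the likelihood in (a) can be written as
\[
L(k) = \prod_{i=1}^{n} \binom{k}{x_i} \, p^{x_i} (1-p)^{k - x_i},
\]
where $k$ ranges over the nonnegative integers; the discreteness of $k$ and its appearance inside the binomial coefficients are what make calculus-based maximization awkward here.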
(b) We have to take a different approach to finding the maximum likelihood estimator of $k$. What is the value of the likelihood when $k < \max_{i=1,\ldots,n} x_i$? Based on this, what can you conclude about the range of values the maximum likelihood estimator can take?
(c) Based on part (a), what can you conclude about the ratio of two likelihoods evaluated at $k$ and $k + 1$,
\[
\frac{L(k)}{L(k-1)}, \qquad \frac{L(k+1)}{L(k)},
\]
for $k$ in the range found in part (b)?
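As a hedged sketch of the algebra connecting (c) to (d) (a reasoning aid, not text from the source): cancelling the $p^{x_i}$ factors in two consecutive likelihoods gives
\[
\frac{L(k)}{L(k-1)} = (1-p)^n \prod_{i=1}^{n} \frac{k}{k - x_i} = \frac{(1-p)^n}{\prod_{i=1}^{n}\bigl(1 - \tfrac{x_i}{k}\bigr)},
\]
so whether the likelihood rises or falls at $k$ is decided by comparing $(1-p)^n$ with $\prod_{i=1}^{n}(1 - x_i/k)$, which is the comparison behind the sandwich inequality in part (d).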
(d) Using part (c), show that the maximum likelihood estimator is the value of $k$ that satisfies
\[
\prod_{i=1}^{n}\left(1 - \frac{x_i}{k}\right) \;\le\; (1-p)^n \;\le\; \prod_{i=1}^{n}\left(1 - \frac{x_i}{k+1}\right)
\]
for $k \ge \max_{i=1,\ldots,n} x_i$.
(e) Show that there is a unique integer value of $k$ satisfying the inequalities in part (d). This description of the MLE for $k$ was found by Feldman and Fox (1968).