Question
X_1, \ldots, X_n \sim X are i.i.d. random variables with density f_\theta, for some unknown \theta \in (0,1):
f_\theta (x)=\left\{ \begin{array}{ll} \theta ^2& \text {if } -1 \le x<0\\ 1- \theta ^2 & \text {if } 0\leq x\leq 1\\ 0 & \text {otherwise.} \end{array}\right.
Consider the following hypotheses:
H_0: X \sim \textsf{Unif}(-1,1)
H_1: X \text{ not distributed as } \textsf{Unif}(-1,1).
Write down the test statistic T_n^{\text{Wald}} (that is quadratic in \hat\theta^{\text{MLE}}) for Wald's test (involving a chi-square distribution) for the above hypotheses. Use the value of \theta that defines H_0 as the argument of the asymptotic variance V(\theta).
Answers
Answered by GPT 3.5
The likelihood depends on the data only through the counts \(n_1\) and \(n_2\) of observations falling in the intervals \([-1, 0)\) and \([0, 1]\) respectively (with \(n_1 + n_2 = n\)):
\(L_n(\theta) = (\theta^2)^{n_1}(1-\theta^2)^{n_2}\)
Setting the derivative of the log-likelihood \(2n_1\log\theta + n_2\log(1-\theta^2)\) to zero gives \(\hat{\theta}^2 = n_1/n\), so the maximum likelihood estimator (MLE) of θ is:
\(\hat{\theta}^{\text{MLE}} = \sqrt{\frac{n_1}{n}}\)
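For concreteness, here is a minimal Python sketch (my addition, not part of the original problem) that computes this MLE on simulated data; the helper name theta_mle and the chosen true value of θ are illustrative assumptions:

```python
import numpy as np

def theta_mle(x):
    """MLE of theta: square root of the fraction of observations in [-1, 0)."""
    x = np.asarray(x)
    return np.sqrt(np.sum(x < 0) / len(x))

# Simulate n draws from f_theta for an assumed true theta = 0.6:
# P(X < 0) = theta^2, and f_theta is constant on each half-interval,
# so X is uniform on [-1, 0) or on [0, 1] given its sign.
rng = np.random.default_rng(0)
n, theta_true = 10_000, 0.6
is_negative = rng.random(n) < theta_true**2
x = np.where(is_negative, rng.uniform(-1, 0, n), rng.uniform(0, 1, n))

print(theta_mle(x))  # should be close to theta_true = 0.6
```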
The test statistic \(T_n^{\text{Wald}}\) is quadratic in \(\hat{\theta}^{\text{MLE}}\) and is given by:
\(T_n^{\text{Wald}} = \frac{n(\hat{\theta}^{\text{MLE}} - \theta_0)^2}{V(\theta_0)}\)
where \(\theta_0\) is the value of θ that defines \(H_0\) and \(V(\theta)\) is the asymptotic variance of \(\hat{\theta}^{\text{MLE}}\); the factor of \(n\) is what gives the statistic its limiting chi-square distribution.
Under \(H_0\), the \(\textsf{Unif}(-1,1)\) density equals \(\frac{1}{2}\) on all of \([-1,1]\), so \(\theta_0^2 = \frac{1}{2}\), i.e. \(\theta_0 = \frac{1}{\sqrt{2}}\), since θ enters the density only through \(\theta^2\).
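A quick numerical check of this identification (my addition): plugging \(\theta_0 = 1/\sqrt{2}\) into both branches of \(f_\theta\) should recover the \(\textsf{Unif}(-1,1)\) density:

```python
import numpy as np

theta0 = 1 / np.sqrt(2)
print(theta0**2, 1 - theta0**2)  # both approximately 0.5, the Unif(-1, 1) density on [-1, 1]
```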
To find the asymptotic variance \(V(\theta)\), we first need to find the Fisher information, which is given by:
\(I(\theta) = -E\left(\frac{d^2}{d\theta^2}\log f_{\theta}(X)\right)\)
The log-density of a single observation is:
\(\log f_{\theta}(x) = \begin{cases} \log(\theta^2) & \text{if } -1 \le x < 0 \\ \log(1-\theta^2) & \text{if } 0 \le x \le 1 \end{cases}\)
Differentiating twice with respect to θ (using \(\log(\theta^2) = 2\log\theta\)), we get:
\(\frac{d^2}{d\theta^2}\log f_{\theta}(x) = \begin{cases} -\frac{2}{\theta^2} & \text{if } -1 \le x < 0 \\ -\frac{2(1+\theta^2)}{(1-\theta^2)^2} & \text{if } 0 \le x \le 1 \end{cases}\)
Taking the negative expectation, each branch is weighted by the probability of its interval, \(P(X < 0) = \theta^2\) and \(P(X \ge 0) = 1-\theta^2\):
\(I(\theta) = \frac{2}{\theta^2}\cdot\theta^2 + \frac{2(1+\theta^2)}{(1-\theta^2)^2}\cdot(1-\theta^2)\)
Simplifying, we get:
\(I(\theta) = 2 + \frac{2(1+\theta^2)}{1-\theta^2} = \frac{4}{1-\theta^2}\)
The asymptotic variance is the inverse of the Fisher information, \(V(\theta) = \frac{1}{I(\theta)} = \frac{1-\theta^2}{4}\), so \(V(\theta_0) = V\left(\frac{1}{\sqrt{2}}\right) = \frac{1}{8}\).
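As a sanity check on the differentiation and the expectation (my addition, assuming the sympy library is available):

```python
import sympy as sp

theta = sp.symbols('theta', positive=True)

# Second derivative of the log-density on each branch
d2_neg = sp.diff(sp.log(theta**2), theta, 2)      # branch -1 <= x < 0
d2_pos = sp.diff(sp.log(1 - theta**2), theta, 2)  # branch 0 <= x <= 1

# I(theta) = -E[d^2/dtheta^2 log f_theta(X)], weighting the branches by
# P(X < 0) = theta^2 and P(X >= 0) = 1 - theta^2
fisher = sp.simplify(-(d2_neg * theta**2 + d2_pos * (1 - theta**2)))
variance = sp.simplify(1 / fisher)

print(fisher)                                # equals 4/(1 - theta**2)
print(variance)                              # equals (1 - theta**2)/4
print(variance.subs(theta, 1 / sp.sqrt(2)))  # 1/8
```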
Therefore, the Wald test statistic is:
\(T_n^{\text{Wald}} = \frac{n(\hat{\theta}^{\text{MLE}} - \theta_0)^2}{V(\theta_0)} = 8n\left(\sqrt{\frac{n_1}{n}} - \frac{1}{\sqrt{2}}\right)^2\)
Under \(H_0\), \(T_n^{\text{Wald}}\) converges in distribution to \(\chi^2_1\) (a chi-square distribution with one degree of freedom), so Wald's test rejects \(H_0\) at asymptotic level \(\alpha\) when \(T_n^{\text{Wald}}\) exceeds the \((1-\alpha)\)-quantile of \(\chi^2_1\).
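Putting everything together, an end-to-end sketch of the test (my addition; the function name wald_test is illustrative, and scipy supplies the \(\chi^2_1\) quantile and p-value):

```python
import numpy as np
from scipy.stats import chi2

def wald_test(x, alpha=0.05):
    """Wald test of H0: X ~ Unif(-1, 1), i.e. theta_0 = 1/sqrt(2)."""
    x = np.asarray(x)
    n = len(x)
    theta_hat = np.sqrt(np.sum(x < 0) / n)  # MLE derived above
    theta0 = 1 / np.sqrt(2)
    # T = n * (theta_hat - theta0)^2 / V(theta0), with V(theta0) = 1/8
    T = 8 * n * (theta_hat - theta0) ** 2
    p_value = chi2.sf(T, df=1)              # upper tail of chi^2_1
    reject = T > chi2.ppf(1 - alpha, df=1)
    return T, p_value, reject

# Under H0 the data really are Unif(-1, 1), so rejections should be rare:
rng = np.random.default_rng(1)
print(wald_test(rng.uniform(-1, 1, 1000)))
```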