To compute the posterior distribution [mathjaxinline]\pi(\lambda | X_1, X_2, \ldots, X_n)[/mathjaxinline], we can use Bayes' theorem:
[mathjaxinline]\pi(\lambda | X_1, X_2, \ldots, X_n) \propto \pi(\lambda) \cdot f(X_1, X_2, \ldots, X_n | \lambda)[/mathjaxinline],
where [mathjaxinline]\pi(\lambda)[/mathjaxinline] is the prior distribution and [mathjaxinline]f(X_1, X_2, \ldots, X_n | \lambda)[/mathjaxinline] is the likelihood function.
Given that the prior [mathjaxinline]\pi(\lambda)[/mathjaxinline] is the exponential distribution [mathjaxinline]\textsf{Exp}(a)[/mathjaxinline], with density [mathjaxinline]\pi(\lambda) = a e^{-a\lambda}[/mathjaxinline] for [mathjaxinline]\lambda \ge 0[/mathjaxinline] (and [mathjaxinline]0[/mathjaxinline] otherwise), and that, conditional on [mathjaxinline]\lambda[/mathjaxinline], the observations [mathjaxinline]X_1, X_2, \ldots, X_n[/mathjaxinline] are i.i.d. [mathjaxinline]\textsf{N}(\lambda, 1)[/mathjaxinline], the likelihood factorizes by conditional independence:
[mathjaxinline]f(X_1, X_2, \ldots, X_n | \lambda) = \prod_{i=1}^{n} f(X_i|\lambda)[/mathjaxinline].
Substituting the probability density function (PDF) of the normal distribution [mathjaxinline]\textsf{N}(\lambda, 1)[/mathjaxinline], we have:
[mathjaxinline]\begin{align*}
f(X_1, X_2, \ldots, X_n | \lambda) &= \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi}} \exp\left(-\frac{1}{2}(X_i - \lambda)^2\right) \\
&= \frac{1}{(2\pi)^{\frac{n}{2}}} \exp\left(-\frac{1}{2}\sum_{i=1}^{n}(X_i - \lambda)^2\right) \\
&= \frac{1}{(2\pi)^{\frac{n}{2}}} \exp\left(-\frac{1}{2}\left(\sum_{i=1}^{n}X_i^2 - 2\lambda\sum_{i=1}^{n}X_i + n\lambda^2\right)\right) \\
&= \frac{1}{(2\pi)^{\frac{n}{2}}} \exp\left(-\frac{1}{2}\sum_{i=1}^{n}X_i^2 + \lambda\sum_{i=1}^{n}X_i - \frac{n}{2}\lambda^2\right) \\
&\propto \exp\left(\lambda\sum_{i=1}^{n}X_i - \frac{n}{2}\lambda^2\right).
\end{align*}[/mathjaxinline]
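As a quick numerical sanity check of this simplification, here is a minimal Python sketch (assuming NumPy and SciPy are available; the values of [mathjaxinline]n[/mathjaxinline], [mathjaxinline]\lambda[/mathjaxinline], and the data are hypothetical, chosen only for illustration):

```python
import numpy as np
from scipy.stats import norm

# Illustrative values only (hypothetical n, lambda, and data).
n, lam = 5, 0.8
rng = np.random.default_rng(1)
X = rng.normal(loc=lam, scale=1.0, size=n)

# Left side: product of the individual N(lambda, 1) densities.
lhs = np.prod(norm.pdf(X, loc=lam, scale=1.0))
# Right side: the simplified closed form derived above.
rhs = (2 * np.pi) ** (-n / 2) * np.exp(-0.5 * np.sum((X - lam) ** 2))
print(np.isclose(lhs, rhs))  # expected: True
```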
Next, we compute the posterior by multiplying the prior and the likelihood. The prior's support matters here: [mathjaxinline]\pi(\lambda) \propto e^{-a\lambda}\,\mathbf{1}\{\lambda \ge 0\}[/mathjaxinline], so for [mathjaxinline]\lambda \ge 0[/mathjaxinline],
[mathjaxinline]\begin{align*}
\pi(\lambda | X_1, X_2, \ldots, X_n) &\propto \pi(\lambda) \cdot f(X_1, X_2, \ldots, X_n | \lambda) \\
&\propto \exp(-a\lambda) \cdot \exp\left(\lambda\sum_{i=1}^{n}X_i - \frac{n}{2}\lambda^2\right) \\
&= \exp\left(\lambda\left(\sum_{i=1}^{n}X_i - a\right) - \frac{n}{2}\lambda^2\right) \\
&\propto \exp\left(-\frac{n}{2}\left(\lambda - \frac{\sum_{i=1}^{n}X_i - a}{n}\right)^2\right),
\end{align*}[/mathjaxinline]
where the last step completes the square in [mathjaxinline]\lambda[/mathjaxinline]. For [mathjaxinline]\lambda < 0[/mathjaxinline] the posterior is [mathjaxinline]0[/mathjaxinline].
We can recognize this as the kernel of a normal distribution with mean [mathjaxinline]\mu = \frac{\sum_{i=1}^{n}X_i - a}{n}[/mathjaxinline] and variance [mathjaxinline]\sigma^2 = \frac{1}{n}[/mathjaxinline], but restricted to [mathjaxinline]\lambda \ge 0[/mathjaxinline]. The posterior is therefore a truncated normal:
[mathjaxinline]\pi(\lambda | X_1, X_2, \ldots, X_n) = \textsf{N}\left(\frac{\sum_{i=1}^{n}X_i - a}{n}, \frac{1}{n}\right) \text{ restricted to } \lambda \ge 0.[/mathjaxinline]
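The proportionality to a truncated normal can also be checked numerically. The sketch below, again with hypothetical values for [mathjaxinline]n[/mathjaxinline], [mathjaxinline]a[/mathjaxinline], and the data, compares the unnormalized kernel above to SciPy's truncated-normal density on a grid; their ratio should be constant:

```python
import numpy as np
from scipy.stats import truncnorm

# Illustrative values only (hypothetical n, a, and data).
n, a = 10, 0.5
rng = np.random.default_rng(2)
X = rng.normal(loc=1.0, scale=1.0, size=n)
mu, sigma = (X.sum() - a) / n, 1 / np.sqrt(n)

# Unnormalized posterior kernel on a grid over lambda >= 0.
lam = np.linspace(0.01, 3.0, 200)
kernel = np.exp(lam * (X.sum() - a) - n * lam**2 / 2)

# Truncated-normal pdf: scipy's bounds are standardized, so the
# lower bound 0 becomes (0 - mu) / sigma.
pdf = truncnorm.pdf(lam, -mu / sigma, np.inf, loc=mu, scale=sigma)

# If the two agree up to normalization, their ratio is constant.
ratio = kernel / pdf
print(np.allclose(ratio, ratio[0]))  # expected: True
```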
Because the posterior is a normal distribution truncated to [mathjaxinline][0, \infty)[/mathjaxinline], its median is no longer simply the mean. Writing [mathjaxinline]\mu = \frac{\sum_{i=1}^{n}X_i - a}{n}[/mathjaxinline] and [mathjaxinline]\sigma = \frac{1}{\sqrt{n}}[/mathjaxinline] (note [mathjaxinline]\mu > 0[/mathjaxinline] since [mathjaxinline]a < \sum_{i=1}^{n}X_i[/mathjaxinline]), the posterior CDF for [mathjaxinline]\lambda \ge 0[/mathjaxinline] is
[mathjaxinline]F(\lambda) = \frac{\Phi\left(\frac{\lambda - \mu}{\sigma}\right) - \Phi\left(-\frac{\mu}{\sigma}\right)}{1 - \Phi\left(-\frac{\mu}{\sigma}\right)}[/mathjaxinline].
Setting [mathjaxinline]F(m) = \frac{1}{2}[/mathjaxinline] and solving for the median [mathjaxinline]m[/mathjaxinline] gives [mathjaxinline]\Phi\left(\frac{m - \mu}{\sigma}\right) = \frac{1 + \Phi\left(-\frac{\mu}{\sigma}\right)}{2}[/mathjaxinline], so
[mathjaxinline]m = \mu + \sigma\,\Phi^{-1}\left(\frac{1 + \Phi\left(-\frac{\mu}{\sigma}\right)}{2}\right) = \frac{\sum_{i=1}^{n}X_i - a}{n} + \frac{1}{\sqrt{n}}\,\Phi^{-1}\left(\frac{1 + \Phi\left(-\frac{\sum_{i=1}^{n}X_i - a}{\sqrt{n}}\right)}{2}\right).[/mathjaxinline]
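As a sanity check on this formula, the following sketch (hypothetical [mathjaxinline]n[/mathjaxinline], [mathjaxinline]a[/mathjaxinline], and data; assuming SciPy) evaluates the closed form and compares it to the median SciPy computes for the same truncated normal:

```python
import numpy as np
from scipy.stats import norm, truncnorm

# Illustrative values only (hypothetical n, a, and data).
n, a = 10, 0.5
rng = np.random.default_rng(0)
X = rng.normal(loc=1.0, scale=1.0, size=n)
assert a < X.sum()  # the stated assumption a < SumXi

mu = (X.sum() - a) / n    # mean of the untruncated normal
sigma = 1 / np.sqrt(n)    # its standard deviation

# Closed-form median of N(mu, sigma^2) truncated to [0, inf).
m = mu + sigma * norm.ppf((1 + norm.cdf(-mu / sigma)) / 2)

# Cross-check against scipy's truncated-normal median.
m_scipy = truncnorm.median(-mu / sigma, np.inf, loc=mu, scale=sigma)
print(np.isclose(m, m_scipy))  # expected: True
```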
For reference, the problem statement: suppose that we instead have the proper prior [mathjaxinline]\pi(\lambda) \sim \textsf{Exp}(a)[/mathjaxinline] ([mathjaxinline]a > 0[/mathjaxinline]). Again, just as in part (b): conditional on [mathjaxinline]\lambda[/mathjaxinline], we have observations [mathjaxinline]X_1, X_2, \ldots, X_n \stackrel{\text{i.i.d.}}{\sim} \textsf{N}(\lambda, 1)[/mathjaxinline]. You may assume that [mathjaxinline]a < \sum_{i=1}^{n} X_i[/mathjaxinline]. Compute the posterior distribution [mathjaxinline]\pi(\lambda | X_1, X_2, \ldots, X_n)[/mathjaxinline], then provide the following statistics on the posterior distribution. Write Phi for the CDF function [mathjaxinline]\Phi()[/mathjaxinline] and PhiInv for its inverse [mathjaxinline]\Phi^{-1}()[/mathjaxinline]. Use SumXi for [mathjaxinline]\sum_{i=1}^{n} X_i[/mathjaxinline].
median: [mathjaxinline]\frac{\textsf{SumXi} - a}{n} + \frac{1}{\sqrt{n}}\,\Phi^{-1}\left(\frac{1 + \Phi\left(-\frac{\textsf{SumXi} - a}{\sqrt{n}}\right)}{2}\right)[/mathjaxinline], i.e., in the requested notation: (SumXi - a)/n + (1/sqrt(n)) * PhiInv((1 + Phi(-(SumXi - a)/sqrt(n)))/2).