# Definition
An empirical Bayes estimator is a [[Bayes Estimator]] whose prior distribution is estimated from the data.
The hyperparameter of the prior is estimated by [[Maximum Likelihood Estimation]] on the marginal likelihood, the posterior distribution is then computed with the estimated prior, and the estimator is obtained from that posterior.
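In symbols, the recipe is (a generic sketch; $\eta$ denotes the unknown hyperparameter of the prior $\pi(\theta;\eta)$, a notation introduced here just for illustration):
$$\hat{\eta}_{\text{MLE}} = \underset{\eta}{\operatorname{argmax}} \prod_{i=1}^{n} \int p(x_{i}|\theta)\,\pi(\theta;\eta)\,d\theta, \qquad p(\theta|x;\hat{\eta}_{\text{MLE}}) \propto p(x|\theta)\,\pi(\theta;\hat{\eta}_{\text{MLE}})$$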
# Examples
Consider data following a [[Poisson Distribution]], $x_{i}|\lambda_{i} \sim \operatorname{Pois}(\lambda_{i})$, with a prior following a [[Gamma Distribution]], $\lambda_{i} \sim \Gamma(\alpha, \beta)$, where $\alpha$ is known and $\beta$ is unknown.
Then the marginal likelihood is defined as
$$\begin{aligned} p(x_{i}|\beta) &= \int \operatorname{Pois}(x_{i}|\lambda_{i})\,\Gamma(\lambda_{i};\alpha, \beta)\, d\lambda_{i} = \int \left[ \frac{e^{-\lambda_{i}}\lambda_{i}^{x_{i}}}{x_{i}!} \right]\left[ \frac{\beta^{\alpha}\lambda_{i}^{\alpha-1}e^{-\beta \lambda_{i}}}{\Gamma(\alpha)} \right]d\lambda_{i}\\ &= \binom{x_{i}+\alpha-1}{\alpha-1}\left( \frac{\beta}{\beta+1} \right)^{\alpha} \left( \frac{1}{\beta+1} \right)^{x_{i}} \sim \operatorname{NB}\left( \alpha, \frac{\beta}{\beta+1} \right) \end{aligned}$$
And the [[Maximum Likelihood Estimation|MLE]] of $\beta$, $\hat{\beta}_\text{MLE}$, is
$$\hat{\beta}_\text{MLE} = \underset{\beta}{\operatorname{argmax}} \prod_{i=1}^{n}p(x_{i}|\beta) = \frac{\alpha}{\bar{X}}$$
The posterior distribution with $\hat{\beta}_{\text{MLE}}$ plugged in is
$$p(\lambda_{i}|x_{i}; \hat{\beta}_{\text{MLE}}) \propto p(x_{i}|\lambda_{i})\,\pi(\lambda_{i};\alpha, \hat{\beta}_{\text{MLE}}) \sim \Gamma(x_{i}+\alpha, 1+\hat{\beta}_{\text{MLE}})$$
Under [[Squared Error Loss]], the [[Bayes Estimator]] is the mean of the posterior distribution $\Gamma(x_{i}+\alpha, 1+\hat{\beta}_{\text{MLE}})$; substituting $\hat{\beta}_{\text{MLE}} = \alpha/\bar{X}$ gives
$$\hat{\delta}_{\text{Bayes}} = \frac{x_{i}+\alpha}{1+\hat{\beta}_{\text{MLE}}} = \frac{\bar{X}(x_{i}+\alpha)}{\bar{X} + \alpha}$$
# Facts
> Assume that $\mathbf{z} \sim N_{p}(\boldsymbol{\mu}, \mathbf{I})$ and the prior distribution for $\boldsymbol{\mu}$ is $\boldsymbol{\mu} \sim N_{p}(\mathbf{0}, \sigma^{2}\mathbf{I})$.
> Then the empirical Bayes estimator under [[Squared Error Loss]] is the [[James-Stein Estimator]] $\hat{\boldsymbol{\mu}}_{JS} = \left( 1 - \cfrac{p-2}{\mathbf{z}^{\intercal}\mathbf{z}} \right)\mathbf{z}$
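As a quick numerical check of the Poisson-Gamma example above, here is a minimal Python sketch; the values of `alpha`, `beta_true`, and `n` are illustrative choices, not from the note:
```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative values: known shape alpha, true (unknown) rate beta.
alpha, beta_true, n = 3.0, 2.0, 500

# Simulate lambda_i ~ Gamma(alpha, rate=beta_true), then x_i ~ Pois(lambda_i).
lam = rng.gamma(shape=alpha, scale=1.0 / beta_true, size=n)
x = rng.poisson(lam)

# Step 1: MLE of beta from the Negative Binomial marginal: beta_hat = alpha / x_bar.
x_bar = x.mean()
beta_hat = alpha / x_bar

# Step 2: posterior is Gamma(x_i + alpha, rate 1 + beta_hat); under squared error
# loss the estimator is its mean, equal to x_bar * (x_i + alpha) / (x_bar + alpha).
delta_eb = (x + alpha) / (1.0 + beta_hat)

print("EB  MSE:", np.mean((delta_eb - lam) ** 2))  # shrinkage estimate
print("MLE MSE:", np.mean((x - lam) ** 2))         # raw x_i as estimate
```
On simulated data the shrinkage estimate typically has lower mean squared error than the raw $x_{i}$, which is the point of borrowing strength across the $\lambda_{i}$.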
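And a minimal sketch of the Gaussian fact, under the same setup as the blockquote ($\boldsymbol{\mu} \sim N_{p}(\mathbf{0}, \sigma^{2}\mathbf{I})$, $\mathbf{z} \sim N_{p}(\boldsymbol{\mu}, \mathbf{I})$); `p` and `sigma` are illustrative:
```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative dimensions.
p, sigma = 50, 2.0

mu = rng.normal(0.0, sigma, size=p)    # mu ~ N_p(0, sigma^2 I)
z = mu + rng.normal(0.0, 1.0, size=p)  # z  ~ N_p(mu, I)

# The Bayes estimator is (1 - 1/(1 + sigma^2)) z, and (p - 2) / z'z is an
# unbiased estimate of 1/(1 + sigma^2); plugging it in gives James-Stein.
mu_js = (1.0 - (p - 2) / (z @ z)) * z

print("JS  MSE:", np.mean((mu_js - mu) ** 2))
print("MLE MSE:", np.mean((z - mu) ** 2))
```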