sufficient statistic for a parameter
Ch 7. Sufficiency
Ch 7.1 A sufficient statistic for a parameter (p.433)
- Example 7.1
▶ Experiment A: Observe an IID sample X1, X2 from b(1, θ), 0 < θ < 1
▶ Experiment B: Observe X1 and X1 + X2, where X1, X2 iid∼ b(1, θ)
▶ Experiment C
1st step: Observe Y = X1 + X2
2nd step: Observe X1 given Y = y
▶ Probability Structure of Experiment A
(x1, x2)        (0, 0)     (0, 1)     (1, 0)     (1, 1)
P(x1, x2)       (1 − θ)²   (1 − θ)θ   θ(1 − θ)   θ²
▶ Probability Structure of Experiment B
(x1, x1 + x2)   (0, 0)     (0, 1)     (1, 1)     (1, 2)
P(x1, x1 + x2)  (1 − θ)²   (1 − θ)θ   θ(1 − θ)   θ²
▶ Probability Structure of Experiment C
1. P((X1, X2) = (0, 0) | Y = 0) = 1
2. P((X1, X2) = (0, 1) | Y = 1) = P((X1, X2) = (1, 0) | Y = 1) = 0.5
3. P((X1, X2) = (1, 1) | Y = 2) = 1
▶ From this probability structure, we can see that the conditional
probability of (X1, X2) given Y does not depend on θ. This implies
that once Y is known, also knowing (X1, X2) gives no additional
information for inference about θ.
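This independence can also be checked numerically. The sketch below (the function name is mine) simulates Experiment A for two different values of θ and estimates P(X1 = 1 | Y = 1); both estimates come out near 0.5:

```python
import random

def cond_prob_x1_given_y1(theta, n_trials=200_000, seed=0):
    """Estimate P(X1 = 1 | Y = 1) for X1, X2 iid b(1, theta), Y = X1 + X2."""
    rng = random.Random(seed)
    hits = total = 0
    for _ in range(n_trials):
        x1 = 1 if rng.random() < theta else 0
        x2 = 1 if rng.random() < theta else 0
        if x1 + x2 == 1:        # condition on Y = 1
            total += 1
            hits += x1
    return hits / total

# The conditional probability is 1/2 no matter what theta is
print(cond_prob_x1_given_y1(0.3))
print(cond_prob_x1_given_y1(0.8))
```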
Definition
Let X1, . . . , Xn be a random sample from pdf f(x; θ), θ ∈ Ω.
T(X) = T(X1, . . . , Xn) is called a sufficient statistic for θ if
(i) T(X) is a statistic, and
(ii) Pθ[(X1, . . . , Xn) ∈ A | T(X) = y] does not depend on θ for
each y and all A.
- Example 7.2 (p.434)
For X1, . . . , Xn iid∼ b(1, θ), Y = X1 + · · · + Xn is a sufficient
statistic for θ.
Theorem (Factorization theorem, p.436)
Let X1, . . . , Xn be a random sample from pdf f(x; θ), θ ∈ Ω.
Y = T(X) is a sufficient statistic for θ if and only if there exist two
nonnegative functions k1 and k2 such that
L(θ) = ∏ⁿᵢ₌₁ f(xᵢ; θ) = k1(y, θ) k2(x),
where y = T(x) and k2(x) does not depend on θ.
- Example 7.2 (revisited): X1, . . . , Xn iid∼ b(1, θ)
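The factorization behind this example can be written out as follows (a standard computation, reconstructed here since the slide leaves it as board work):

```latex
L(\theta) = \prod_{i=1}^{n} \theta^{x_i}(1-\theta)^{1-x_i}
          = \underbrace{\theta^{y}(1-\theta)^{n-y}}_{k_1(y,\,\theta)}
            \cdot \underbrace{1}_{k_2(x)},
\qquad y = \sum_{i=1}^{n} x_i,
```

so Y = ∑ Xᵢ is sufficient for θ by the factorization theorem, consistent with Example 7.2.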
- Example 7.3: X1, . . . , Xn iid∼ N(µ, 1)
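A sketch of the factorization for this case (reconstructed board work):

```latex
L(\mu) = (2\pi)^{-n/2} \exp\!\Big(-\tfrac{1}{2}\sum_{i=1}^{n}(x_i-\mu)^2\Big)
       = \underbrace{\exp\!\Big(\mu\sum_{i=1}^{n} x_i - \tfrac{n\mu^2}{2}\Big)}_{k_1\left(\sum x_i,\,\mu\right)}
         \cdot \underbrace{(2\pi)^{-n/2}\exp\!\Big(-\tfrac{1}{2}\sum_{i=1}^{n} x_i^2\Big)}_{k_2(x)},
```

so ∑ Xᵢ (equivalently X̄) is sufficient for µ.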
- Example 7.4 (p.438): X1, . . . , Xn iid∼ Beta(θ, 1)
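Here f(x; θ) = θx^(θ−1) on (0, 1), and the factorization (reconstructed board work) is:

```latex
L(\theta) = \prod_{i=1}^{n} \theta\, x_i^{\theta-1}
          = \underbrace{\theta^{n}\Big(\prod_{i=1}^{n} x_i\Big)^{\theta}}_{k_1\left(\prod x_i,\,\theta\right)}
            \cdot \underbrace{\Big(\prod_{i=1}^{n} x_i\Big)^{-1}}_{k_2(x)},
```

so ∏ Xᵢ (equivalently ∑ log Xᵢ) is sufficient for θ.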
- Example 7.5: X1, . . . , Xn iid∼ U[0, θ]
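In this case the θ-dependence sits inside indicator functions (reconstructed board work):

```latex
L(\theta) = \prod_{i=1}^{n} \frac{1}{\theta}\, I(0 \le x_i \le \theta)
          = \underbrace{\theta^{-n}\, I\big(y_n \le \theta\big)}_{k_1(y_n,\,\theta)}
            \cdot \underbrace{I\Big(\min_i x_i \ge 0\Big)}_{k_2(x)},
\qquad y_n = \max_i x_i,
```

so Yn = max(X1, . . . , Xn) is sufficient for θ.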
- Fact: a one-to-one function of a sufficient statistic for θ is also
a sufficient statistic for θ.
Ch 7.2 Properties of a sufficient statistic
Theorem (Rao-Blackwell, p.441)
Let X1, . . . , Xn be a random sample from pdf f(x; θ), θ ∈ Ω. If
Y = T(X) is a sufficient statistic for θ and θ̂1 is an unbiased
estimator of θ, then θ̂2 = E(θ̂1 | T(X)) is a statistic satisfying
E(θ̂2) = E(θ̂1) = θ and V(θ̂2) ≤ V(θ̂1).
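The variance reduction is easy to see numerically. For X1, . . . , Xn iid b(1, θ), θ̂1 = X1 is unbiased, and conditioning on the sufficient statistic Y = ∑ Xᵢ gives θ̂2 = E(X1 | Y) = Y/n = X̄ by symmetry. A quick simulation sketch (the function name is mine):

```python
import random

def simulate(theta=0.4, n=10, reps=100_000, seed=1):
    """Compare the variances of theta1_hat = X1 and its
    Rao-Blackwellization theta2_hat = E(X1 | sum Xi) = Xbar."""
    rng = random.Random(seed)
    est1, est2 = [], []
    for _ in range(reps):
        xs = [1 if rng.random() < theta else 0 for _ in range(n)]
        est1.append(xs[0])            # unbiased but crude
        est2.append(sum(xs) / n)      # E(X1 | Y) = Y/n by symmetry
    def var(v):
        m = sum(v) / len(v)
        return sum((x - m) ** 2 for x in v) / len(v)
    return var(est1), var(est2)

v1, v2 = simulate()
print(v1, v2)   # v1 near theta(1-theta) = 0.24, v2 near 0.24/n
```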
Remarks
▶ Given a sufficient statistic u1(X), for any unbiased estimator θ̂1
of θ we can always find an unbiased estimator at least as good,
namely θ̂2 = E(θ̂1 | u1(X)).
▶ Although a sufficient statistic (SS) is not unique, this tells us
that a desirable estimator should be at least a function of a
sufficient statistic.
Theorem
If a unique MLE θ̂ of θ exists, then θ̂ is a function of any sufficient
statistic.
Ch 7.3 Completeness and uniqueness
- Fact: an MVUE with finite variance is unique when it exists.
Proof.
Let θ̂1 and θ̂2 be MVUEs of θ. We will show that P(θ̂1 = θ̂2) = 1.
Define θ̂3 = (θ̂1 + θ̂2)/2. Then θ̂3 is unbiased, so
V(θ̂3) ≥ V(θ̂1) = V(θ̂2). Since
V(θ̂3) = [V(θ̂1) + V(θ̂2) + 2Cov(θ̂1, θ̂2)]/4, this gives
Cov(θ̂1, θ̂2) ≥ V(θ̂1). By the Cauchy-Schwarz inequality,
Cov(θ̂1, θ̂2) ≤ √(V(θ̂1)V(θ̂2)) = V(θ̂1), so Cov(θ̂1, θ̂2) = V(θ̂1),
and hence V(θ̂1 − θ̂2) = V(θ̂1) + V(θ̂2) − 2Cov(θ̂1, θ̂2) = 0.
Definition
X1, . . . , Xn: a random sample from pdf f(x; θ), θ ∈ Ω.
(a) Y = u(X) is a complete statistic for θ if and only if
Eθ(g(Y)) = 0 for all θ ∈ Ω implies Pθ(g(Y) = 0) = 1 for all θ ∈ Ω.
(b) Y = u(X) is a complete sufficient statistic (CSS) for θ if Y is
both complete and sufficient for θ.
Theorem (Lehmann-Scheffé, p.446)
If T is a CSS for θ and E(ϕ(T)) = θ, then ϕ(T) is the MVUE of θ.
- Why does this theorem work?
▶ From Rao-Blackwell, we know that a good estimator should be a
function of a SS.
▶ If a given sufficient statistic T is also complete, then h(T) = 0
(with probability 1) whenever Eθ(h(T)) = 0 for all θ ∈ Ω.
▶ When T is a CSS for θ, suppose there are two unbiased estimators
of θ that are functions of T: ϕ(T) and ψ(T). Since
Eθ(ϕ(T) − ψ(T)) = 0 for all θ ∈ Ω, the completeness of T implies
ϕ(T) = ψ(T).
▶ This means there is only one unbiased estimator that is a function
of the CSS.
▶ How can completeness be shown? No general method exists.
- Example 7.5 (revisited): X1, . . . , Xn iid∼ U[0, θ], θ > 0
→ From Example 7.5, we know that Yn = max(X1, . . . , Xn) is a
sufficient statistic for θ.
1. Is Yn also a complete statistic?
(i) Assume Eθ(g(Yn)) = 0 for all θ > 0.
(ii) Show that g(y) = 0.
- Example 7.5 (revisited): X1, . . . , Xn iid∼ U[0, θ], θ > 0
→ From Example 7.5, we know that Yn = max(X1, . . . , Xn) is a
sufficient statistic for θ.
1. Is Yn also a complete statistic? Yes!
2. What is the MVUE of θ?
3. Is there an easier way to show completeness? For some families
of distributions, we can show completeness directly.
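The computation behind "Yes!" and the answer to item 2 can be sketched as follows (standard working, reconstructed here). Since Yn has pdf f(y) = n y^(n−1)/θⁿ on (0, θ):

```latex
E_\theta\big(g(Y_n)\big) = \frac{n}{\theta^n}\int_0^\theta g(y)\,y^{n-1}\,dy = 0
\ \text{ for all } \theta > 0
\ \Longrightarrow\ \int_0^\theta g(y)\,y^{n-1}\,dy = 0 \ \text{ for all } \theta > 0.
```

Differentiating with respect to θ gives g(θ)θ^(n−1) = 0 for all θ > 0, so g ≡ 0 and Yn is complete. Since E(Yn) = nθ/(n + 1), the unbiased function of Yn is (n + 1)Yn/n, which is therefore the MVUE of θ.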
- Example 7.6: X1, . . . , Xn iid∼ Gamma(1, θ), θ > 0
Definition (Laplace transform)
The Laplace transform of f(x) is
(Lf)(x) = ∫₀^∞ f(t) e^(−tx) dt,
provided that ∫₀^∞ |f(t)| e^(−tx) dt < ∞.
Theorem (Uniqueness of the Laplace transform)
(Lf)(x) = (Lg)(x) for all x if and only if f(x) = g(x) (almost
everywhere).
- Example 7.6: X1, . . . , Xn iid∼ Gamma(1, θ), θ > 0
→ ∑ⁿᵢ₌₁ Xᵢ is a complete and sufficient statistic for θ.
Definition (Exponential family of distributions, p.449)
f(x; θ) is said to be a member of the regular exponential family if
(i) the support of f(x; θ) does not depend on θ;
(ii) f(x; θ) = exp(p(θ)K(x) + H(x) + q(θ));
(iii) p(θ) is a nontrivial continuous function of θ, and K(x) is a
nontrivial function of x.
- Examples
▶ Binomial distributions
▶ Gamma distributions
▶ Normal distributions
▶ Beta distributions
▶ Uniform distributions → NOT exponential family!
Theorem (p.450)
If f(x; θ) belongs to the regular exponential family, i.e.
f(x; θ) = exp(p(θ)K(x) + H(x) + q(θ)), then
(i) q(θ) = − log(∫ exp(p(θ)K(x) + H(x)) dx)
(ii) E(K(X)) = −q′(θ)/p′(θ)
(iii) V(K(X)) = −[p″(θ)E(K(X)) + q″(θ)]/(p′(θ))²
             = [p″(θ)q′(θ) − q″(θ)p′(θ)]/(p′(θ))³
If p(θ) = θ, then E(K(X)) = −q′(θ) and V(K(X)) = −q″(θ).
Proof.
Differentiate ∫ f(x; θ) dx = 1 with respect to θ, noting that
(d/dθ) f(x; θ) = (p′(θ)K(x) + q′(θ)) f(x; θ),
(d²/dθ²) f(x; θ) = (p″(θ)K(x) + q″(θ)) f(x; θ) + (p′(θ)K(x) + q′(θ))² f(x; θ).
- Example 7.7
(i) X ∼ b(n, θ)
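The board work for this case can be reconstructed by writing the binomial pmf in exponential-family form:

```latex
f(x;\theta) = \binom{n}{x}\theta^x(1-\theta)^{n-x}
            = \exp\!\Big(\underbrace{\log\tfrac{\theta}{1-\theta}}_{p(\theta)}
              \underbrace{x}_{K(x)}
              + \underbrace{\log\binom{n}{x}}_{H(x)}
              + \underbrace{n\log(1-\theta)}_{q(\theta)}\Big).
```

Then q′(θ) = −n/(1 − θ) and p′(θ) = 1/(θ(1 − θ)), so E(X) = −q′(θ)/p′(θ) = nθ, matching the known binomial mean.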
- Example 7.7
(ii) X follows exponential distribution having mean λ.
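Similarly, for the exponential distribution with mean λ (reconstructed board work):

```latex
f(x;\lambda) = \frac{1}{\lambda}e^{-x/\lambda}
             = \exp\!\Big(\underbrace{\big(-\tfrac{1}{\lambda}\big)}_{p(\lambda)}
               \underbrace{x}_{K(x)}
               \underbrace{-\log\lambda}_{q(\lambda)}\Big), \qquad x > 0,
```

so E(X) = −q′(λ)/p′(λ) = (1/λ)/(1/λ²) = λ, and formula (iii) gives V(X) = [p″q′ − q″p′]/(p′)³ = λ².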
Theorem (p.451)
For X1, . . . , Xn iid∼ f(x; θ), if f(x; θ) is a member of the regular
exponential family, then ∑ⁿᵢ₌₁ K(Xᵢ) is a CSS for θ.
- Example 7.7 (revisited)
(i) If X1, . . . , Xn iid∼ b(1, θ), find the MVUE of θ.
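A sketch of the answer: here K(x) = x, so ∑ Xᵢ is a CSS for θ, and

```latex
E(\bar{X}) = \theta, \qquad \bar{X} = \frac{1}{n}\sum_{i=1}^{n} X_i
\ \text{ is a function of the CSS } \sum_{i=1}^{n} X_i,
```

so by Lehmann-Scheffé, X̄ is the MVUE of θ.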
- Example 7.7 (revisited)
(ii) If X1, . . . , Xn is a random sample from the exponential
distribution having mean λ, find the MVUE of λ.
- Example 6.9 (revisited)
If X1, . . . , Xn iid∼ Beta(θ, 1), find the MVUE of θ.
In Example 6.12, the MLE of θ is θ̂_MLE = −n/∑ log Xᵢ and
E(θ̂_MLE) = nθ/(n − 1). What is the MVUE of θ?
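Since Beta(θ, 1) is in the exponential family with K(x) = log x, ∑ log Xᵢ is a CSS; and since −∑ log Xᵢ ~ Gamma(n, θ), the bias-corrected estimator (n − 1)/(−∑ log Xᵢ) has expectation θ and is therefore the MVUE. A Monte Carlo sketch of its unbiasedness (the function name is mine; Beta(θ, 1) draws use the inverse cdf X = U^(1/θ)):

```python
import math
import random

def mvue_beta_sim(theta=2.0, n=8, reps=100_000, seed=2):
    """Monte Carlo check that (n - 1)/(-sum log Xi) is unbiased for
    theta when Xi iid Beta(theta, 1)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(reps):
        # -log(U**(1/theta)) is Exponential with rate theta
        s = -sum(math.log(rng.random() ** (1.0 / theta)) for _ in range(n))
        total += (n - 1) / s
    return total / reps

print(mvue_beta_sim())   # close to theta = 2.0
```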
▶ Rao-Blackwell + Lehmann-Scheffé
If X1, . . . , Xn iid∼ f(x; θ), θ̂1 is unbiased for θ, and Y is a CSS,
then θ̂2 = E(θ̂1 | Y) is the MVUE of θ.
▶ Remark
1. If your estimator is a function of the CSS and unbiased, then it
is the unique MVUE.
2. If you have an unbiased estimator θ̂ that is not a function of the
CSS, then θ̂* = E(θ̂ | Y) is the MVUE.
- Example 7.8
If X1, . . . , Xn iid∼ b(1, p), find the MVUE of p.
- Example 7.9
If X1, . . . , Xn is a random sample from the exponential distribution
having mean θ, find the MVUE of η = e−1/θ = P (X1 > 1).
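A standard route (reconstructing the usual solution): T = ∑ Xᵢ is a CSS, I(X1 > 1) is unbiased for η, and given T = t, X1/t ~ Beta(1, n − 1), so E(I(X1 > 1) | T = t) = (1 − 1/t)^(n−1) for t > 1 and 0 otherwise. The MVUE is therefore η̂ = (1 − 1/T)^(n−1) I(T > 1). A Monte Carlo sketch of its unbiasedness (the function name is mine):

```python
import math
import random

def mvue_eta_sim(theta=1.5, n=10, reps=100_000, seed=3):
    """Monte Carlo check that eta_hat = (1 - 1/T)**(n-1) * I(T > 1),
    with T = sum of n iid Exponential(mean theta) draws, is unbiased
    for eta = exp(-1/theta) = P(X1 > 1)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(reps):
        t = sum(rng.expovariate(1.0 / theta) for _ in range(n))
        if t > 1:
            total += (1 - 1 / t) ** (n - 1)
    return total / reps

print(mvue_eta_sim(), math.exp(-1 / 1.5))   # the two values nearly agree
```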