When we study a population, its parameters are usually unknown, so we cannot draw conclusions about the population directly. To learn about an unknown parameter, we draw a sample from the population and compute a function of the sample values that is reasonably close to the parameter. The value so obtained is called an estimate of the parameter, the process is called estimation, and the estimating function is called an estimator.

A good estimator should satisfy the four properties which we briefly explain below:

**(a) Unbiasedness.** A statistic t is said to be an unbiased estimator of a parameter θ if

E [t] = θ.

Otherwise, t is said to be biased.

**Theorem 1.** The sample mean x̄ is an unbiased estimator of the population mean µ.

**Proof.** Let x1, x2, …, xn be a simple random sample drawn with replacement from a finite population of size N, say X1, X2, …, XN, with population mean

µ = (X1 + X2 + … + XN)/N.

To prove that E(x̄) = µ.

While drawing xi, it can be any one of the population members, i.e., the probability distribution of xi can be taken as

P(xi = Xj) = 1/N, for j = 1, 2, …, N.

Therefore,

E(xi) = (X1 + X2 + … + XN)/N = µ, for i = 1, 2, …, n,

and so

E(x̄) = E[(x1 + x2 + … + xn)/n] = (1/n)[E(x1) + E(x2) + … + E(xn)] = (1/n) · nµ = µ.

The same result also holds for an infinite population and for sampling without replacement.
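As a quick numerical illustration of Theorem 1 (a minimal Monte Carlo sketch; the population values below are illustrative, not from the text), the average of many with-replacement sample means settles near µ:

```python
import random

# Illustrative finite population (assumed values for this sketch)
population = [2, 4, 6, 8, 10]
mu = sum(population) / len(population)   # population mean µ = 6

random.seed(0)
n, trials = 3, 200_000
total = 0.0
for _ in range(trials):
    # simple random sample of size n, drawn with replacement
    sample = [random.choice(population) for _ in range(n)]
    total += sum(sample) / n             # accumulate sample means

print(total / trials, mu)                # the two values should be close
```

The long-run average of the sample means approximates E(x̄), which the theorem says equals µ exactly.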


**Theorem 2.** The sample variance

S² = (1/n) Σ (xi − x̄)²

is a biased estimator of the population variance σ².


**Proof.** Let x1, x2, …, xn be a random sample from an infinite population with mean µ and variance σ².

Then E(xi) = µ, Var(xi) = E(xi − µ)² = σ², for i = 1, 2, …, n, and also E(x̄) = µ, Var(x̄) = E(x̄ − µ)² = σ²/n.

Now, using the identity Σ(xi − x̄)² = Σ(xi − µ)² − n(x̄ − µ)²,

E(S²) = (1/n) Σ E(xi − µ)² − E(x̄ − µ)² = σ² − σ²/n = ((n − 1)/n) σ² ≠ σ².

Hence S² is a biased estimator of σ².

**Note.** If instead we divide by n − 1 and define s² = (1/(n − 1)) Σ (xi − x̄)², then E(s²) = (n/(n − 1)) E(S²) = σ². Thus s² is an unbiased estimator of σ².
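Theorem 2 and the note can be checked exactly by enumeration. The sketch below uses the same population 3, 7, 11, 15 as Example 1 and averages both variance estimators over all with-replacement samples of size 2:

```python
from itertools import product

population = [3, 7, 11, 15]
N = len(population)
mu = sum(population) / N                              # µ = 9
sigma2 = sum((x - mu) ** 2 for x in population) / N   # σ² = 20

n = 2
S2_total = 0.0   # estimator with divisor n     (biased)
s2_total = 0.0   # estimator with divisor n - 1 (unbiased)
for sample in product(population, repeat=n):          # all N**n equally likely samples
    xbar = sum(sample) / n
    ss = sum((x - xbar) ** 2 for x in sample)
    S2_total += ss / n
    s2_total += ss / (n - 1)

count = N ** n
print(S2_total / count, (n - 1) / n * sigma2)  # E(S²) = ((n-1)/n)·σ² = 10
print(s2_total / count, sigma2)                # E(s²) = σ² = 20
```

The exact averages confirm E(S²) = ((n − 1)/n)σ² and E(s²) = σ².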

**Example 1.** A population consists of the 4 values 3, 7, 11, 15. Draw all possible samples of size two with replacement. Verify that the sample mean is an unbiased estimator of the population mean.

**Solution.** Number of samples = 4² = 16, which are listed below:

(3, 3), (3, 7), (3, 11), (3, 15),

(7, 3), (7, 7), (7, 11), (7, 15),

(11, 3), (11, 7), (11, 11), (11, 15),

(15, 3), (15, 7), (15, 11), (15, 15)

Population mean, µ = (3 + 7 + 11 + 15)/4 = 36/4 = 9.

Sampling distribution of the sample mean:

| Sample mean x̄ | Frequency f | x̄ · f |
|---|---|---|
| 3 | 1 | 3 |
| 5 | 2 | 10 |
| 7 | 3 | 21 |
| 9 | 4 | 36 |
| 11 | 3 | 33 |
| 13 | 2 | 26 |
| 15 | 1 | 15 |
| **Total** | **16** | **144** |

**Mean of the sample means,** E(x̄) = Σ x̄ f / Σ f = 144/16 = 9 = µ.

Since E(x̄) = µ, the sample mean is an unbiased estimator of the population mean.
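Example 1 can be reproduced in a few lines by enumerating all 16 samples and tallying the sample means:

```python
from itertools import product
from collections import Counter

population = [3, 7, 11, 15]
samples = list(product(population, repeat=2))   # all 4² = 16 samples
means = [sum(s) / 2 for s in samples]

freq = Counter(means)                  # sampling distribution of the sample mean
print(sorted(freq.items()))            # frequencies 1, 2, 3, 4, 3, 2, 1
print(sum(means) / len(means))         # E(x̄) = 144/16 = 9 = µ
```

The tallied frequencies and the overall mean of 9 agree with the worked table.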

**(b) Consistency.** A statistic t_{n} obtained from a random sample of size n is said to be a consistent estimator of a parameter θ if it converges in probability to θ as n tends to infinity.

Alternatively, if E[t_{n}] → θ and Var[t_{n}] → 0 as n → ∞, then the statistic t_{n} is said to be a consistent estimator of θ.

For example, in sampling from a normal population N(µ, σ²),

E(x̄) = µ and Var(x̄) = σ²/n → 0 as n → ∞.

Hence the sample mean is a consistent estimator of the population mean.
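A short simulation makes the shrinking variance visible (a sketch with assumed illustrative values µ = 5, σ = 2): the variance of x̄ at sample size n should be close to σ²/n, so it falls as n grows.

```python
import random
import statistics

random.seed(1)
mu, sigma, trials = 5.0, 2.0, 20_000

def var_of_mean(n):
    # empirical variance of the sample mean at sample size n
    means = [statistics.fmean(random.gauss(mu, sigma) for _ in range(n))
             for _ in range(trials)]
    return statistics.pvariance(means)

v10, v100 = var_of_mean(10), var_of_mean(100)
print(v10, v100)   # near σ²/10 = 0.4 and σ²/100 = 0.04
```

Tenfold more data gives roughly a tenth of the variance, as Var(x̄) = σ²/n predicts.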

**(c) Efficiency.** There may exist more than one consistent estimator of a parameter. Let T_{1} and T_{2} be two consistent estimators of a parameter θ. If

Var (T_{1}) < Var (T_{2}) for all n,

then T_{1} is said to be more efficient than T_{2} for all sample sizes.
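A classic illustration (not worked in the text above): for a normal population, both the sample mean and the sample median are consistent estimators of µ, but the mean is more efficient. The sketch below, with assumed values µ = 0, σ = 1, n = 25, compares their empirical variances:

```python
import random
import statistics

random.seed(2)
n, trials = 25, 20_000
means, medians = [], []
for _ in range(trials):
    sample = [random.gauss(0.0, 1.0) for _ in range(n)]
    means.append(statistics.fmean(sample))
    medians.append(statistics.median(sample))

print(statistics.pvariance(means))    # near σ²/n = 0.04
print(statistics.pvariance(medians))  # near πσ²/(2n) ≈ 0.063, larger
```

The sample mean's variance is smaller at every n (asymptotically by a factor of 2/π), so the mean is the more efficient of the two.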

**(d) Sufficiency.** Let x1, x2, …, xn be a random sample from a population whose p.m.f. or p.d.f. is f(x, θ). Then a statistic T is said to be a sufficient estimator of θ if we can express the following:

f(x1, θ) · f(x2, θ) ··· f(xn, θ) = g_{1}(T, θ) · g_{2}(x1, x2, …, xn),

where g_{1}(T, θ) is the sampling distribution of T and contains θ, and g_{2}(x1, x2, …, xn) is independent of θ.

Sufficient estimators exist only in a few cases. However, in random sampling from a normal population, the sample mean x̄ is a sufficient estimator of µ.
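To see the factorization in the normal case (a sketch, with σ² assumed known so that µ is the only parameter), use the identity Σ(xi − µ)² = Σ(xi − x̄)² + n(x̄ − µ)²:

```latex
\prod_{i=1}^{n} f(x_i,\mu)
= (2\pi\sigma^2)^{-n/2}
  \exp\!\Big(-\tfrac{1}{2\sigma^2}\sum_{i=1}^{n}(x_i-\mu)^2\Big)
= \underbrace{\exp\!\Big(-\tfrac{n(\bar{x}-\mu)^2}{2\sigma^2}\Big)}_{g_1(\bar{x},\,\mu)}
  \cdot
  \underbrace{(2\pi\sigma^2)^{-n/2}
  \exp\!\Big(-\tfrac{1}{2\sigma^2}\sum_{i=1}^{n}(x_i-\bar{x})^2\Big)}_{g_2(x_1,\ldots,x_n)}
```

The factor g_{1} depends on the data only through x̄ and contains µ, while g_{2} is free of µ, so x̄ is sufficient for µ.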
