Abstract

Let X_1, X_2, …, X_n be a random sample from a normal N(θ, σ²) distribution with an unknown integer mean θ = 0, ±1, ±2, … . Hammersley (1950) proposed the maximum likelihood estimator (MLE) d = [X̄_n], the nearest integer to the sample mean, as an unbiased estimator of θ and extended the Cramér-Rao inequality to this setting. The Hammersley lower bound for the variance of any unbiased estimator of θ is significantly improved, and the asymptotic (as n → ∞) limit of the Fraser-Guttman-Bhattacharyya bounds is also determined. A limiting property of a suitable distance is used to explain why such bounds cannot be attained. An almost uniformly minimum variance unbiased (UMVU)-like property of d is exhibited.
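As a minimal illustrative sketch (not part of the paper), the estimator d = [X̄_n] can be computed by rounding the sample mean of simulated N(θ, σ²) data to the nearest integer; the function name `hammersley_estimate` and the parameter values below are hypothetical choices for the demonstration.

```python
import random
import statistics

def hammersley_estimate(sample):
    """Hammersley's MLE of an integer mean: the nearest integer to the
    sample mean X̄_n. Ties (half-integer means) occur with probability
    zero, so the specific tie-breaking rule of round() is immaterial."""
    return round(statistics.fmean(sample))

# Simulated N(theta, sigma^2) data with an integer mean theta = 3.
# With n = 200 the standard error is sigma/sqrt(n) ≈ 0.07, so the
# rounded mean recovers theta with overwhelming probability.
random.seed(0)
theta, sigma, n = 3, 1.0, 200
sample = [random.gauss(theta, sigma) for _ in range(n)]
print(hammersley_estimate(sample))
```

Note that although d is the MLE, its unbiasedness and the variance bounds discussed in the paper are nontrivial because the parameter space is the integer lattice rather than the real line.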