
Saturday, April 05, 2014

 

Estimator

A point estimate, $\hat\theta = g(x)$, is a number, whereas an estimator, $\hat\Theta = g(X)$, is a random variable.
\[
\begin{aligned}
  \hat\theta_{\text{MAP}} &= g_{\text{MAP}}(x): \text{ maximises } p_{\Theta\mid X}(\theta\mid x) \\
  \hat\theta_{\text{LMS}} &= g_{\text{LMS}}(x) = E[\Theta\mid X=x]
\end{aligned}
\]
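
Not from the lecture, but a minimal numerical sketch of the two rules on a discrete posterior; the prior, the binomial likelihood model, and the observed value below are made up purely for illustration:

  import numpy as np
  from scipy.stats import binom

  # Hypothetical discrete example: Theta takes four values with a made-up prior,
  # and X | Theta = theta is Binomial(5, p) with p depending on theta.
  thetas = np.array([0, 1, 2, 3])
  prior = np.array([0.1, 0.4, 0.3, 0.2])
  p_success = np.array([0.2, 0.4, 0.6, 0.8])
  x_observed = 3                                   # assumed observation

  # Posterior p_{Theta|X}(theta | x) via Bayes' rule
  likelihood = binom.pmf(x_observed, 5, p_success)
  posterior = prior * likelihood
  posterior /= posterior.sum()

  theta_map = thetas[np.argmax(posterior)]         # MAP rule: maximises the posterior
  theta_lms = np.dot(thetas, posterior)            # LMS rule: posterior mean E[Theta | X = x]
  print(theta_map, theta_lms)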

Conditional probability of error

\[
  P(\hat\theta \ne \Theta \mid X = x) \quad \leftarrow \text{smallest under the MAP rule}
\]

Overall probability of error

\[
  P(\hat\Theta \ne \Theta) = \int P(\hat\Theta \ne \Theta \mid X = x)\, f_X(x)\, dx
  = \sum_{\theta} P(\hat\Theta \ne \Theta \mid \Theta = \theta)\, p_\Theta(\theta)
\]
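
A sketch of the second (discrete $\Theta$) form of this identity, reusing the same made-up binomial model as above, with the MAP rule applied to every possible observation:

  import numpy as np
  from scipy.stats import binom

  thetas = np.array([0, 1, 2, 3])                  # same made-up model as above
  prior = np.array([0.1, 0.4, 0.3, 0.2])
  p_success = np.array([0.2, 0.4, 0.6, 0.8])
  n = 5

  # g_MAP(x) for every possible observation x = 0, ..., n
  xs = np.arange(n + 1)
  likelihood = binom.pmf(xs[:, None], n, p_success[None, :])   # shape (n+1, len(thetas))
  posterior = prior * likelihood                               # unnormalised is enough for argmax
  map_est = thetas[np.argmax(posterior, axis=1)]

  # Overall error: sum over theta of P(g_MAP(X) != theta | Theta = theta) * p_Theta(theta)
  p_error = sum(prior[i] * binom.pmf(xs, n, p_success[i])[map_est != thetas[i]].sum()
                for i in range(len(thetas)))
  print(p_error)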

Mean squared error (MSE)

\[
  E[(\Theta - \hat\theta)^2]
\]
Minimized when $\hat\theta = E[\Theta]$, so that
\[
  E[(\Theta - \hat\theta)^2] = E[(\Theta - E[\Theta])^2] = \mathrm{var}(\Theta) \quad \leftarrow \text{least mean square (LMS)}
\]
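
A quick Monte Carlo check of this claim; the distribution of $\Theta$ below is arbitrary, chosen only for illustration:

  import numpy as np

  rng = np.random.default_rng(0)
  theta_samples = rng.gamma(2.0, 1.5, size=100_000)    # arbitrary made-up distribution for Theta

  # Scan candidate constants theta_hat and evaluate the empirical mean squared error
  candidates = np.linspace(0.0, 10.0, 1001)
  mse = np.array([np.mean((theta_samples - c) ** 2) for c in candidates])

  print(candidates[np.argmin(mse)], theta_samples.mean())   # minimiser is (approx.) E[Theta]
  print(mse.min(), theta_samples.var())                     # minimum MSE is (approx.) var(Theta)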

Conditional mean squared error

\[
  E[(\Theta - \hat\theta)^2 \mid X = x] \quad \leftarrow \text{with observation } x
\]
Minimized when $\hat\theta = E[\Theta \mid X = x]$, so that
\[
  E[(\Theta - \hat\theta)^2 \mid X = x] = E[(\Theta - E[\Theta \mid X = x])^2 \mid X = x] = \mathrm{var}(\Theta \mid X = x) \quad \leftarrow \text{expected performance, given a measurement}
\]
Expected performance of the design:
\[
  E[(\Theta - E[\Theta \mid X])^2] = E[\mathrm{var}(\Theta \mid X)]
\]
Note that $\hat\theta$ is an estimate whereas $\hat\Theta = E[\Theta \mid X]$ is an estimator.
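
As a sketch of the last identity, an assumed normal model ($\Theta$ normal, $X$ equal to $\Theta$ plus independent normal noise), where $E[\Theta \mid X]$ and $\mathrm{var}(\Theta \mid X)$ have standard closed forms; the two printed quantities should agree:

  import numpy as np

  rng = np.random.default_rng(1)
  mu0, var0, var_w = 2.0, 4.0, 1.0                 # assumed prior mean/variance and noise variance

  theta = rng.normal(mu0, np.sqrt(var0), size=1_000_000)
  x = theta + rng.normal(0.0, np.sqrt(var_w), size=theta.size)

  # For this normal model, E[Theta | X] and var(Theta | X) have closed forms
  post_mean = (var_w * mu0 + var0 * x) / (var0 + var_w)     # E[Theta | X]
  post_var = var0 * var_w / (var0 + var_w)                  # var(Theta | X), constant in x

  print(np.mean((theta - post_mean) ** 2))    # E[(Theta - E[Theta|X])^2]
  print(post_var)                             # E[var(Theta | X)]; the two should agree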

Linear least mean square (LLMS) estimation

Minimize $E[(\Theta - aX - b)^2]$ w.r.t. $a, b$:
\[
  \hat\Theta_L = E[\Theta] + \frac{\mathrm{cov}(\Theta, X)}{\mathrm{var}(X)}\,(X - E[X])
  = E[\Theta] + \rho\,\frac{\sigma_\Theta}{\sigma_X}\,(X - E[X]) \quad \leftarrow \text{only means, variances and covariances matter}
\]
Error variance:
\[
  E[(\hat\Theta_L - \Theta)^2] = (1 - \rho^2)\,\mathrm{var}(\Theta)
\]
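
A minimal sketch of the LLMS formulas on simulated data with a made-up nonlinear observation model; only means, variances and the covariance enter, and the empirical error variance should match $(1 - \rho^2)\,\mathrm{var}(\Theta)$:

  import numpy as np

  rng = np.random.default_rng(2)
  theta = rng.uniform(0.0, 4.0, size=1_000_000)             # made-up prior for Theta
  x = theta ** 2 + rng.normal(0.0, 1.0, size=theta.size)    # a nonlinear, noisy observation model

  # LLMS coefficients from means, variances and the covariance only
  cov_tx = np.cov(theta, x, ddof=0)[0, 1]
  a = cov_tx / x.var()
  b = theta.mean() - a * x.mean()
  theta_L = a * x + b                                       # = E[Theta] + cov/var * (X - E[X])

  rho = cov_tx / (theta.std() * x.std())
  print(np.mean((theta_L - theta) ** 2))                    # empirical error variance
  print((1.0 - rho ** 2) * theta.var())                     # (1 - rho^2) var(Theta)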

Source: MITx 6.041x, Lectures 16 and 17.

