Title | SSP 2021 Chapter 1-2
---|---
Author | Yongkang Ding
Course | Statistical Signal Processing
Institution | Technische Universität München
Statistical Signal Processing
Univ.-Prof. Dr.-Ing. Wolfgang Utschick
Technische Universität München
Summer 2021
© 2021 Univ.-Prof. Dr.-Ing. Wolfgang Utschick. Circulation of this document to other parties without the written consent of the author is forbidden. Email: [email protected]. Layout by LaTeX2ε.
A few starting remarks

This course contains an introduction to some basic concepts of classical statistical signal processing and its applications:
- Estimating parameters based on likelihood models
- Estimating the outcomes of random variables
- Estimating parameters based on linear modelling
- Estimating the realization of random sequences
The content is not taken directly from a well-known textbook, but reflects the author's view after more than 20 years of teaching and research in this field. Nevertheless, there has been a strong influence by some textbooks, in particular:
- Steven M. Kay, Fundamentals of Statistical Signal Processing, Volume I: Estimation Theory, Prentice Hall, 1993.
- Steven M. Kay, Fundamentals of Statistical Signal Processing, Volume II: Detection Theory, Prentice Hall, 1998.
- H. Stark and J. W. Woods, Probability, Random Processes, and Estimation Theory for Engineers, 2nd edition, Prentice Hall, 1994.
- Louis L. Scharf, Statistical Signal Processing: Detection, Estimation, and Time Series Analysis, Addison-Wesley Publishing Company, 1991.
- B. Hajek, Random Processes for Engineers, Cambridge University Press, 2015.
- G. R. Grimmett, D. R. Stirzaker, Probability and Random Processes, Oxford University Press, 2001. (advanced reader!)
- D. Simon, Optimal State Estimation, Wiley, 2006. (advanced reader!)
- B. Ristic, S. Arulampalam, and N. Gordon, Beyond the Kalman Filter, Artech House, 2004.
Part I Parameter Estimation
1. Statistical Modeling

Statistical Estimation treats the problem of inferring underlying Parameters of unknown random variables on the basis of observations of Outcomes of those random variables. The resulting basic task of statistical estimation is to infer the most appropriate Probability Measure P, which the Realizations of the respective random variables X : Ω → X are subject to. The most difficult part of any parameter estimation problem is the choice of an appropriate Statistical Model (Ω, F, Pθ), with the metric space (X, B) and

Observation Space: X,                  (1.1)
Sigma Algebra: F,                      (1.2)
Probability Measure: Pθ, θ ∈ Θ.        (1.3)

In other words, the stochastic model is a set of Probability Spaces and the task of statistical estimation is to select the most appropriate candidate on the basis of observed outcomes of a random experiment.
1.1 Standard Model
Definition. We call the introduced statistical model Standard Model and the inference problem Parameter Estimation, if the set of potential parameters

Θ ⊂ R^D,        (1.4)

and the random variable X is either Discrete or Continuous.

Commonly used terminology:
- Random variables Xi : Ω → X are Statistics, and
- the Outcomes xi of Xi are called Realizations, Observations, Samples, Measurements, etc.

Definition. A special statistic is the random variable T : X → Θ, which maps one or multiple observations to θ̂ ∈ Θ or another parameter depending thereon. The random variable or statistic T : X → Θ is called Estimator. Note that the Estimator T is itself a random variable, since it is a function of the random variable X.
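To make the definition concrete, here is a minimal sketch (assuming NumPy; the particular estimator and observation model are illustrative, not from the lecture) of an estimator T as a plain function mapping observations to a parameter estimate:

```python
import numpy as np

def T_mean(x):
    """A simple estimator: map observations x_1, ..., x_N to a number in Θ.

    Since the inputs X_i are random variables, the output T(X_1, ..., X_N)
    is itself a random variable, as noted above.
    """
    return float(np.mean(x))

# Hypothetical observations: N = 1000 samples of a uniform random variable.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 4.0, size=1000)

theta_hat = T_mean(x)  # one realization of the estimator
```

Calling `T_mean` on a different batch of observations yields a different realization θ̂, which is exactly why the distribution of T matters when judging an estimator.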
1.2 Introductory Example
Given stochastically independent statistics X1, ..., XN of a uniformly distributed random variable X : Ω → [0, θ], with [0, θ] ⊂ R, such that

F_X(ξ) = ξ/θ,        (1.5)

and f_X(ξ) = 1/θ, if 0 ≤ ξ ≤ θ.
The unknown parameter θ, which describes the random variable X, is Deterministic and Unknown.

[Figure: observed samples x1, ..., xN marked on the interval from 0 to θ, with max xi lying just below θ.]
Fig. 1.1: Estimating the upper bound of an interval.

How to estimate the upper bound? Any guesses?
How to estimate the upper bound?

1. Attempt: Given E[X] = θ/2, we conclude for the statistics Xi

T1 = 2 · (1/N) ∑_{i=1}^N Xi : x1, ..., xN ↦ θ̂1,        (1.6)

where (1/N) ∑_{i=1}^N Xi is the sample Average.

2. Attempt: Since for large N the maximum observed value will be close to the upper bound, we conclude

T2 = max_{i=1,...,N} {Xi} : x1, ..., xN ↦ θ̂2.        (1.7)

How reliable are these attempts?
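The two attempts translate directly into code; below is a minimal sketch assuming NumPy (the true θ is fixed only to generate synthetic observations, which in practice would be given):

```python
import numpy as np

def T1(x):
    # Attempt 1: since E[X] = theta/2, twice the sample average estimates theta.
    return 2.0 * float(np.mean(x))

def T2(x):
    # Attempt 2: for large N the maximum observation approaches the upper bound.
    return float(np.max(x))

rng = np.random.default_rng(1)
theta = 3.0                                # true parameter (unknown in practice)
x = rng.uniform(0.0, theta, size=10_000)   # N observations of X ~ U[0, theta]

print(T1(x), T2(x))  # both estimates should lie close to theta
```

Note the structural difference: T2(x) can never exceed θ (it always underestimates), while T1(x) may land on either side of θ.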
Estimator T1: According to the Law of Large Numbers (via the Chebyshev Inequality):

Pθ({|T1/2 − θ/2| ≥ ε}) ≤ Var[X] / (N ε²) → 0 as N → ∞.        (1.8)
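The Chebyshev bound in (1.8) can be checked numerically. The following Monte Carlo sketch (an illustration with hypothetical parameter values, not part of the lecture) compares the empirical deviation probability of the sample mean T1/2 against the bound Var[X]/(N ε²), using Var[X] = θ²/12 for X ~ U[0, θ]:

```python
import numpy as np

rng = np.random.default_rng(2)
theta, N, eps, trials = 3.0, 200, 0.1, 5000

# Empirical probability that the sample mean deviates from theta/2 by >= eps.
x = rng.uniform(0.0, theta, size=(trials, N))
sample_means = x.mean(axis=1)
p_emp = float(np.mean(np.abs(sample_means - theta / 2) >= eps))

# Chebyshev upper bound: Var[X] / (N * eps^2), with Var[X] = theta^2 / 12.
bound = (theta**2 / 12) / (N * eps**2)

print(p_emp, bound)  # the empirical probability should not exceed the bound
```

As is typical for Chebyshev, the bound is loose: the empirical probability sits well below it, but both shrink as N grows, confirming the consistency claim in (1.8).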
Estimator T2: Again with an asymptotic approach (note that T2 never exceeds θ, so the deviation event reduces to all samples falling below θ − ε):

Pθ({|T2 − θ| ≥ ε}) = Pθ({X1 ≤ θ − ε, ..., XN ≤ θ − ε})
                   = ∏_{i=1}^N Pθ({Xi ≤ θ − ε})
                   = ∏_{i=1}^N (θ − ε)/θ
                   = ((θ − ε)/θ)^N
                   = (1 − ε/θ)^N → 0 as N → ∞.
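The closed-form probability (1 − ε/θ)^N can likewise be compared against simulation; a hedged sketch with hypothetical parameter values, assuming NumPy:

```python
import numpy as np

rng = np.random.default_rng(3)
theta, eps, trials = 3.0, 0.2, 20_000

results = {}
for N in (10, 50, 200):
    # T2 deviates by >= eps exactly when every sample falls below theta - eps.
    x = rng.uniform(0.0, theta, size=(trials, N))
    p_emp = float(np.mean(x.max(axis=1) <= theta - eps))
    p_theory = (1 - eps / theta) ** N
    results[N] = (p_emp, p_theory)
    print(N, p_emp, p_theory)  # both columns shrink towards 0 as N grows
```

The geometric decay (1 − ε/θ)^N vanishes much faster than the 1/N Chebyshev bound for T1, which already hints that the maximum-based estimator is the stronger of the two attempts.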