Title: Lecture notes, Lecture 3
Course: Kalman Filter With Application
Institution: George Mason University


Equations for X_k(+)

X_k(+) = E[X_k | y^k] = X_{k/k}        (y^k denotes the measurements up to and including time k)

X*_{k+1}(-) = E[X_{k+1} | y^k] = E[φ_k X_k + u_k | y^k]
            = E[φ_k X_k | y^k]          (because y^k and u_k are independent and E[u_k] = 0)
            = φ_k X_k(+)

On the other hand, from the Kalman filter equations on the next page,

X*_{k+1}(-) = φ_k X*_k(-) + Δ*_k [y_k - M_k X*_k(-)]
            = φ_k X*_k(-) + φ_k P_k M_k^T [M_k P_k M_k^T + R_k]^{-1} [y_k - M_k X*_k(-)]
            = φ_k { X*_k(-) + P_k M_k^T [M_k P_k M_k^T + R_k]^{-1} [y_k - M_k X*_k(-)] }

Since φ_k = e^{F(t_{k+1} - t_k)} is invertible, comparing the two expressions for X*_{k+1}(-) gives

X_k(+) = X*_k(-) + P_k M_k^T [M_k P_k M_k^T + R_k]^{-1} [y_k - M_k X*_k(-)]

Let K_k = P_k M_k^T [M_k P_k M_k^T + R_k]^{-1}. Then

X_k(+) = X*_k(-) + K_k [y_k - M_k X*_k(-)],

where [y_k - M_k X*_k(-)] is the correction due to the residual, or, alternatively,

X_k(+) = [I - K_k M_k] X*_k(-) + K_k y_k.

Compare with the textbook's equations (4.7) and (4.8).
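A minimal numerical sketch of this update may help (the function name, dimensions, and test matrices below are illustrative, not from the notes): it forms K_k = P_k M_k^T [M_k P_k M_k^T + R_k]^{-1} and checks that the two equivalent update forms above agree. If needed, the transition matrix φ_k = e^{F(t_{k+1}-t_k)} could be obtained with scipy.linalg.expm(F * dt), though it does not enter the update itself.

```python
# A minimal sketch of the measurement update derived above (function name,
# dimensions, and test matrices are illustrative, not from the notes).
import numpy as np

def measurement_update(x_prior, P_prior, y, M, R):
    """One measurement update: X_k(+) = X*_k(-) + K_k [y_k - M_k X*_k(-)]."""
    S = M @ P_prior @ M.T + R                  # innovation covariance M P M^T + R
    K = P_prior @ M.T @ np.linalg.inv(S)       # gain K_k = P M^T S^{-1}
    residual = y - M @ x_prior                 # correction due to the residual
    return x_prior + K @ residual, K

# Quick check that the two equivalent update forms in the notes agree.
rng = np.random.default_rng(1)
n, m = 3, 2                                    # arbitrary state/measurement sizes
A = rng.normal(size=(n, n))
P = A @ A.T + n * np.eye(n)                    # any symmetric positive-definite P_k
M = rng.normal(size=(m, n))
R = np.eye(m)
x_prior = rng.normal(size=n)
y = rng.normal(size=m)

x_post, K = measurement_update(x_prior, P, y, M, R)
x_post_alt = (np.eye(n) - K @ M) @ x_prior + K @ y   # [I - K_k M_k] X*_k(-) + K_k y_k
assert np.allclose(x_post, x_post_alt)
```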

Error equation for X̃_k(+)

X̃_k(+) = X_k - X_k(+)
        = X_k - X*_k(-) - K_k M_k X_k - K_k v_k + K_k M_k X*_k(-)
        = [I - K_k M_k] X̃_k(-) - K_k v_k

Covariance:

P_k(+) = E[X̃_k(+) X̃_k^T(+)]
       = [I - K_k M_k] P_k(-) [I - K_k M_k]^T + K_k R_k K_k^T,        with P_k(-) = P_k.

Hence we have the "Joseph form" of the covariance equation:

P_k(+) = [I - K_k M_k] P_k [I - K_k M_k]^T + K_k R_k K_k^T

This equation has a symmetric form and is most suitable for numerical computation. Another equation, a non-symmetric one, is also often used, but one has to watch for a potential non-symmetry of P_k(+) due to numerical errors/round-off.

Non-symmetric form of the covariance equation:

P_k(+) = [I - K_k M_k] P_k = P_k [I - M_k^T K_k^T]

Proof: Multiply through the first term in the Joseph form:

[I - K_k M_k] P_k [I - K_k M_k]^T = P_k - K_k M_k P_k - P_k M_k^T K_k^T + K_k M_k P_k M_k^T K_k^T

The last term above, combined with the last term of the Joseph form, is

K_k M_k P_k M_k^T K_k^T + K_k R_k K_k^T = K_k [M_k P_k M_k^T + R_k] K_k^T
    = K_k [M_k P_k M_k^T + R_k] [M_k P_k M_k^T + R_k]^{-1} M_k P_k = K_k M_k P_k

or, equivalently, substituting for K_k on the left instead of K_k^T on the right,

K_k [M_k P_k M_k^T + R_k] K_k^T = P_k M_k^T [M_k P_k M_k^T + R_k]^{-1} [M_k P_k M_k^T + R_k] K_k^T = P_k M_k^T K_k^T.

Cancelling this term against -P_k M_k^T K_k^T gives P_k(+) = [I - K_k M_k] P_k, and cancelling it against -K_k M_k P_k gives P_k(+) = P_k [I - M_k^T K_k^T], which is the result.
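A small numerical sketch (arbitrary test matrices, not from the notes) makes the point concrete: with the optimal gain the Joseph form and the short non-symmetric form agree, exactly as the proof above shows, while only the Joseph form is symmetric by construction.

```python
# A numerical sketch comparing the Joseph form with the short non-symmetric
# form of the covariance update (arbitrary test matrices, illustrative only).
import numpy as np

def joseph_update(P, K, M, R):
    """P_k(+) = [I - K M] P [I - K M]^T + K R K^T  (symmetric by construction)."""
    I_KM = np.eye(P.shape[0]) - K @ M
    return I_KM @ P @ I_KM.T + K @ R @ K.T

def short_update(P, K, M):
    """P_k(+) = [I - K M] P  (equal for the optimal gain, but not forced symmetric)."""
    return (np.eye(P.shape[0]) - K @ M) @ P

rng = np.random.default_rng(2)
n, m = 4, 2
A = rng.normal(size=(n, n))
P = A @ A.T + n * np.eye(n)                         # symmetric positive-definite P_k
M = rng.normal(size=(m, n))
R = np.eye(m)
K = P @ M.T @ np.linalg.inv(M @ P @ M.T + R)        # the optimal gain K_k from above

P_joseph = joseph_update(P, K, M, R)
P_short = short_update(P, K, M)
print("max |Joseph - short|  :", np.abs(P_joseph - P_short).max())      # ~ round-off
print("Joseph asymmetry      :", np.abs(P_joseph - P_joseph.T).max())   # ~ 0
print("short-form asymmetry  :", np.abs(P_short - P_short.T).max())     # ~ round-off
```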

Properties of the Residuals

e_k = y_k - M_k X*_k(-) is the sequence of residuals.

In the Kalman filter the residuals are uncorrelated, that is, the sequence {e_k} is a white noise. Comment: this statement refers only to the residuals of the output's one-step prediction, e_k = y_k - M_k X*_k(-). (The state-variable prediction errors X̃_k(-) are correlated.)

Proof: Let

E[u_k v_j^T] = 0,    cov[v_k] = R_k,    X_{k+1} = φ_k X_k + u_k,    y_k = M_k X_k + v_k.

X*_{k+1}(-) = φ_k X*_k(-) + Δ*_k e_k
            = φ_k X*_k(-) + φ_k P_k M_k^T [M_k P_k M_k^T + R_k]^{-1} [y_k - M_k X*_k(-)]
            = [φ_k - Δ*_k M_k] X*_k(-) + Δ*_k M_k X_k + Δ*_k v_k

X̃_{k+1}(-) = X_{k+1} - X*_{k+1}(-) = [φ_k - Δ*_k M_k] X̃_k(-) + u_k - Δ*_k v_k

or, writing Φ*_k = φ_k - Δ*_k M_k for the closed-loop transition matrix,

(1)   X̃_{k+1}(-) = Φ*_k X̃_k(-) + u_k - Δ*_k v_k

(2)   e_k = M_k X_k + v_k - M_k X*_k(-) = M_k X̃_k(-) + v_k
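As a sanity check, the sketch below (arbitrary dimensions and random test values, purely illustrative) verifies numerically that recursion (1) and expression (2) reproduce the directly computed one-step prediction error X_{k+1} - X*_{k+1}(-).

```python
# A one-step numeric check (random illustrative values) that recursion (1) and
# the residual expression (2) agree with a direct computation from the model.
import numpy as np

rng = np.random.default_rng(3)
n, m = 3, 2
phi = rng.normal(size=(n, n))                        # transition matrix phi_k
M = rng.normal(size=(m, n))
R = np.eye(m)
A = rng.normal(size=(n, n))
P = A @ A.T + n * np.eye(n)                          # prediction covariance P_k

Delta = phi @ P @ M.T @ np.linalg.inv(M @ P @ M.T + R)   # predictor gain Delta*_k
Phi_cl = phi - Delta @ M                                  # closed-loop Phi*_k

x = rng.normal(size=n)                               # true state X_k
x_pred = rng.normal(size=n)                          # prediction X*_k(-)
u = rng.normal(size=n)                               # process noise u_k
v = rng.normal(size=m)                               # measurement noise v_k

y = M @ x + v                                        # measurement y_k
e = y - M @ x_pred                                   # residual e_k
x_next = phi @ x + u                                 # X_{k+1}
x_pred_next = phi @ x_pred + Delta @ e               # X*_{k+1}(-)

err_direct = x_next - x_pred_next                            # prediction error directly
err_recursion = Phi_cl @ (x - x_pred) + u - Delta @ v        # equation (1)
assert np.allclose(err_direct, err_recursion)
assert np.allclose(e, M @ (x - x_pred) + v)                  # equation (2)
```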

Step 1: Let i > j and compute E[X̃_i(-) X̃_j^T(-)].

For i > k let

Φ*_{i,k} = closed-loop transition matrix from step k to step i = Φ*_{i-1} ... Φ*_{k+1} Φ*_k.

From (1),

X̃_i(-) = Φ*_{i,j} X̃_j(-) + Σ_{k=j}^{i-1} Φ*_{i,k+1} [u_k - Δ*_k v_k].

Note that X̃_j(-) is orthogonal to u_k and v_k for k >= j (i > j), so

E[X̃_i(-) X̃_j^T(-)] = Φ*_{i,j} E[X̃_j(-) X̃_j^T(-)] + 0 = Φ*_{i,j} P_j.

Similarly,

E[X̃_i(-) v_j^T] = Φ*_{i,j} E[X̃_j(-) v_j^T] + Σ_{k=j}^{i-1} Φ*_{i,k+1} E[(u_k - Δ*_k v_k) v_j^T].

The first term is zero because v_j has no effect on X*_j(-), and thus none on X̃_j(-). In the second term on the RHS we have

E[u_k v_j^T] = 0 for all k, j   (by assumption),        E[v_k v_j^T] = R_j δ_{kj},

hence

E[X̃_i(-) v_j^T] = -Φ*_{i,j+1} Δ*_j R_j.

Step 2:

E[e_i e_j^T] = E[(M_i X̃_i(-) + v_i)(M_j X̃_j(-) + v_j)^T]
             = M_i E[X̃_i(-) X̃_j^T(-)] M_j^T + M_i E[X̃_i(-) v_j^T] + E[v_i X̃_j^T(-)] M_j^T + E[v_i v_j^T]

Since i > j, the last two terms are zero, because v_i occurs after X̃_j(-) and so is orthogonal to it, and E[v_i v_j^T] = 0 for all i > j.

Hence, for all i > j,

E[e_i e_j^T] = M_i Φ*_{i,j} P_j M_j^T - M_i Φ*_{i,j+1} Δ*_j R_j.

Now

Δ*_j = φ_j P_j M_j^T [M_j P_j M_j^T + R_j]^{-1}
Δ*_j M_j P_j M_j^T + Δ*_j R_j = φ_j P_j M_j^T
Δ*_j R_j = φ_j P_j M_j^T - Δ*_j M_j P_j M_j^T = [φ_j - Δ*_j M_j] P_j M_j^T = Φ*_j P_j M_j^T

so

M_i Φ*_{i,j+1} Δ*_j R_j = M_i Φ*_{i,j+1} Φ*_j P_j M_j^T = M_i Φ*_{i,j} P_j M_j^T.

So

E[e_i e_j^T] = M_i Φ*_{i,j} P_j M_j^T - M_i Φ*_{i,j} P_j M_j^T = 0.
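The key identity in this step, Δ*_j R_j = Φ*_j P_j M_j^T, is easy to confirm numerically; the short sketch below (random positive-definite test matrices, illustrative only) does exactly that.

```python
# A quick numeric check (random positive-definite test matrices, illustrative
# only) of the gain identity used above: Delta*_j R_j = Phi*_j P_j M_j^T.
import numpy as np

rng = np.random.default_rng(4)
n, m = 4, 2
phi = rng.normal(size=(n, n))                        # phi_j
M = rng.normal(size=(m, n))                          # M_j
B = rng.normal(size=(m, m))
R = B @ B.T + m * np.eye(m)                          # R_j
A = rng.normal(size=(n, n))
P = A @ A.T + n * np.eye(n)                          # P_j

Delta = phi @ P @ M.T @ np.linalg.inv(M @ P @ M.T + R)   # Delta*_j
Phi_cl = phi - Delta @ M                                  # Phi*_j = phi_j - Delta*_j M_j
assert np.allclose(Delta @ R, Phi_cl @ P @ M.T)
```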

Step 3: If i < j, the same argument applies with the roles of i and j interchanged (equivalently, E[e_i e_j^T] = (E[e_j e_i^T])^T = 0 for j > i).

Therefore the residuals e_k are a white sequence, and E[e_i e_i^T] = M_i P_i M_i^T + R_i ...
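Both conclusions can be seen numerically. The Monte Carlo sketch below uses a hypothetical scalar system with made-up parameters (φ, M, Q, R, where Q = cov[u_k] is the process-noise covariance, not named explicitly in this extract): across many independent runs, E[e_i e_j] for i > j comes out near zero, while E[e_i e_i] matches M P_i M^T + R_i.

```python
# A Monte Carlo sketch (hypothetical scalar system, made-up parameters) checking
# that the one-step-prediction residuals are uncorrelated across time and that
# E[e_i e_i^T] = M_i P_i M_i^T + R_i.
import numpy as np

rng = np.random.default_rng(0)
phi, M, Q, R = 0.9, 1.0, 0.2, 0.5           # assumed scalar system parameters
n_steps, n_runs = 12, 200_000

# Propagate the prediction-error covariance P_k(-) of the filter exactly.
P = np.zeros(n_steps)
P[0] = 1.0                                  # assumed initial prediction covariance
for k in range(n_steps - 1):
    K = P[k] * M / (M * P[k] * M + R)       # K_k = P M^T [M P M^T + R]^{-1}
    P[k + 1] = phi * (1 - K * M) * P[k] * phi + Q

# Simulate many independent runs of the system together with the predictor.
e = np.zeros((n_runs, n_steps))
x = rng.normal(0.0, np.sqrt(P[0]), n_runs)  # true state, consistent with X*_0(-) = 0
x_pred = np.zeros(n_runs)                   # one-step prediction X*_k(-)
for k in range(n_steps):
    y = M * x + rng.normal(0.0, np.sqrt(R), n_runs)
    e[:, k] = y - M * x_pred                # residual e_k
    K = P[k] * M / (M * P[k] * M + R)
    x_pred = phi * (x_pred + K * e[:, k])   # X*_{k+1}(-) = phi_k [X*_k(-) + K_k e_k]
    x = phi * x + rng.normal(0.0, np.sqrt(Q), n_runs)

i, j = 8, 3
print("E[e_i e_j], i > j (should be ~0):", np.mean(e[:, i] * e[:, j]))
print("E[e_i e_i] vs M P_i M^T + R     :", np.mean(e[:, i] ** 2), M * P[i] * M + R)
```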

