

STA457H1 (Statistical Sciences)
Zhou Zhou

Chapter 5

5.1 – Preliminary Estimation

5.1.1 – Yule-Walker Estimation

AR(p):

    X_t - \phi_1 X_{t-1} - \dots - \phi_p X_{t-p} = Z_t,  Z_t \sim WN(0, \sigma^2)   (*)

Multiply (*) by X_{t-i}, i = 1, \dots, p, and take expectations:

    E[X_t X_{t-i}] - \phi_1 E[X_{t-1} X_{t-i}] - \dots - \phi_p E[X_{t-p} X_{t-i}] = E[Z_t X_{t-i}] = 0

(since Z_t is uncorrelated with X_{t-i} for i \ge 1), i.e.

    \gamma_i - \phi_1 \gamma_{i-1} - \dots - \phi_p \gamma_{i-p} = 0,
    \gamma_i = \phi_1 \gamma_{i-1} + \dots + \phi_p \gamma_{i-p}.   (**)

In matrix form, (**) reads

    \gamma_p = \Gamma_p \phi_p,  where  \gamma_p = (\gamma_1, \dots, \gamma_p)^T  and  \Gamma_p = [\gamma_{i-j}]_{i,j=1}^p.

(**) implies that we can use the Method of Moments: replacing \gamma_p and \Gamma_p by \hat{\gamma}_p and \hat{\Gamma}_p gives the Yule-Walker estimator

    \hat{\phi}_p = \hat{\Gamma}_p^{-1} \hat{\gamma}_p.

Multiply (*) by X_t and take expectations:

    \gamma_0 - \phi_1 \gamma_1 - \dots - \phi_p \gamma_p = E[Z_t X_t] = E[Z_t (\phi_1 X_{t-1} + \dots + \phi_p X_{t-p} + Z_t)] = \sigma^2,

so \sigma^2 = \gamma_0 - \phi_p^T \gamma_p. By the Method of Moments,

    \hat{\sigma}^2 = \hat{\gamma}_0 - \hat{\phi}_p^T \hat{\gamma}_p.

Theorem (Sample Yule-Walker Estimation):

    \hat{\phi}_p = \hat{R}_p^{-1} \hat{\rho}_p,  \hat{\sigma}^2 = \hat{\gamma}_0 (1 - \hat{\rho}_p^T \hat{R}_p^{-1} \hat{\rho}_p),

where R_p = [\rho_{i-j}]_{i,j=1}^p is the autocorrelation matrix and \rho_p = (\rho_1, \dots, \rho_p)^T.
Proof: HW.

Proposition: \hat{\sigma}^2 \to_P \sigma^2 as n \to \infty. Proof: not required.

Theorem: For large n,

    \sqrt{n} (\hat{\phi}_p - \phi_p) \to_D N(0, \sigma^2 \Gamma_p^{-1}).

Remark: For AR(p) models, the Yule-Walker estimates are as efficient as the Maximum Likelihood Estimates.
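The Yule-Walker procedure above is easy to sketch in code. The following is a minimal illustration (the function names and the simulated AR(1) example are my own, not from the notes): compute the sample ACVF, solve \hat{\Gamma}_p \hat{\phi}_p = \hat{\gamma}_p, and recover \hat{\sigma}^2 = \hat{\gamma}_0 - \hat{\phi}_p^T \hat{\gamma}_p.

```python
import numpy as np

def sample_acvf(x, lag):
    """Sample autocovariance: gamma_hat_i = (1/n) * sum (X_j - Xbar)(X_{j+i} - Xbar)."""
    n = len(x)
    xc = x - x.mean()
    return np.dot(xc[:n - lag], xc[lag:]) / n

def yule_walker(x, p):
    """Return (phi_hat, sigma2_hat) for an AR(p) fit via the Yule-Walker equations."""
    gammas = np.array([sample_acvf(x, i) for i in range(p + 1)])
    # Gamma_hat_p = [gamma_hat_{i-j}] and gamma_hat_p = (gamma_hat_1, ..., gamma_hat_p)'
    Gamma = np.array([[gammas[abs(i - j)] for j in range(p)] for i in range(p)])
    gamma_p = gammas[1:]
    phi = np.linalg.solve(Gamma, gamma_p)      # phi_hat_p = Gamma_hat_p^{-1} gamma_hat_p
    sigma2 = gammas[0] - phi @ gamma_p         # sigma2_hat = gamma_hat_0 - phi_hat' gamma_hat_p
    return phi, sigma2

# Check on a simulated AR(1) with phi = 0.6 and sigma^2 = 1.
rng = np.random.default_rng(0)
n = 5000
z = rng.normal(size=n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.6 * x[t - 1] + z[t]
phi_hat, sigma2_hat = yule_walker(x, 1)
```

With n = 5000 the estimates land close to the true (0.6, 1), consistent with the \sqrt{n}-rate in the theorem above.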
Sketch of proof. Start from \hat{\phi}_p = \hat{\Gamma}_p^{-1} \hat{\gamma}_p.

Note:

    \hat{\gamma}_i = (1/n) \sum_{j=1}^{n-i} (X_j - \bar{X})(X_{j+i} - \bar{X}),  i = 0, \dots, p.

Note: E[X_i] = 0, so \bar{X} \approx 0 and

    \hat{\gamma}_i \approx (1/n) \sum_{j=1}^{n-i} X_j X_{j+i} = (1/n) \sum_{j=1}^{n-i} X_j (\phi_1 X_{j+i-1} + \dots + \phi_p X_{j+i-p} + Z_{j+i}).

Writing \tilde{X}_k = (X_{k-1}, \dots, X_{k-p})^T, this gives

    \hat{\gamma}_p \approx \hat{\Gamma}_p \phi_p + (1/n) \sum_{k=p+1}^{n} \tilde{X}_k Z_k,

and therefore

    \sqrt{n} (\hat{\phi}_p - \phi_p) \approx \hat{\Gamma}_p^{-1} (1/\sqrt{n}) \sum_{k=p+1}^{n} \tilde{X}_k Z_k.

Now 2 steps:

(1) Show (1/\sqrt{n}) \sum_{k=p+1}^{n} \tilde{X}_k Z_k \to_D N(0, \sigma^2 \Gamma_p).
(2) Show \hat{\Gamma}_p \to_P \Gamma_p.

Then Slutsky's Theorem implies

    \hat{\Gamma}_p^{-1} (1/\sqrt{n}) \sum_{k=p+1}^{n} \tilde{X}_k Z_k \to_D \Gamma_p^{-1} N(0, \sigma^2 \Gamma_p) = N(0, \sigma^2 \Gamma_p^{-1} \Gamma_p \Gamma_p^{-1}) = N(0, \sigma^2 \Gamma_p^{-1}).

For (2): by the weak law of large numbers, for each k,

    \hat{\gamma}_k = (1/n) \sum_{j=k+1}^{n} X_j X_{j-k} \to_P E[X_j X_{j-k}] = \gamma_k.

For (1): apply the CLT of STA457. Note: Y_k = \tilde{X}_k Z_k is a mean-zero stationary sequence.

Lemma: If {Z_i} are iid with E[Z_i] = 0 and Var[Z_i] ...

    ... > 1.96 \sqrt{0.04}/\sqrt{100}.  \therefore Reject H_0.

5.1.3 – Innovations Algorithm

MA(q).

Theorem: The fitted innovations MA(m) model is

    X_t = Z_t + \hat{\theta}_{m1} Z_{t-1} + \dots + \hat{\theta}_{mm} Z_{t-m},  Z_t \sim WN(0, \hat{v}_m),

where \hat{\theta}_m and \hat{v}_m are obtained by the innovations algorithm with the ACVF replaced by the sample ACVF. Choose m such that m \to \infty with m^3/n \to 0.

Note: Suppose X_t is MA(q). One cannot just take the \hat{\theta}_{qi}'s as estimates of \theta_1, \dots, \theta_q, since \hat{\theta}_{qi} \nrightarrow \theta_i.
Right way: select a relatively large m, find \hat{\theta}_{m1}, \dots, \hat{\theta}_{mm} and \hat{v}_m, and set

    \hat{\theta}_1 = \hat{\theta}_{m1}, \dots, \hat{\theta}_q = \hat{\theta}_{mq},  \hat{\sigma}^2 = \hat{v}_m.

5.1.4 – Hannan-Rissanen Algorithm

ARMA(p,q).

Step 1: A high-order AR(m) model is fitted to the data. Then obtain the residuals

    \hat{Z}_t = X_t - \hat{\phi}_{m1} X_{t-1} - \dots - \hat{\phi}_{mm} X_{t-m},  t = m+1, m+2, \dots, n.

Rationale: \hat{Z}_t \approx Z_t.

Step 2: Once the \hat{Z}_t are obtained, fit by ordinary least squares, minimizing

    S(\beta) = \sum_{t=m+1+q}^{n} (X_t - \phi_1 X_{t-1} - \dots - \phi_p X_{t-p} - \theta_1 \hat{Z}_{t-1} - \dots - \theta_q \hat{Z}_{t-q})^2

over \beta = (\phi_1, \dots, \phi_p, \theta_1, \dots, \theta_q)^T.
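The recursion behind the innovations fit of Section 5.1.3 can be sketched as follows (a minimal illustration; the function name and the MA(1) example are my own). For clarity it is run on the *true* ACVF of an MA(1) with \theta = 0.5, \sigma^2 = 1, i.e. \gamma(0) = 1.25, \gamma(1) = 0.5, \gamma(h) = 0 otherwise; in practice one plugs in the sample ACVF instead.

```python
import numpy as np

def innovations(gamma, m):
    """Innovations algorithm: given ACVF values gamma[0..m], return (theta, v)
    where theta[n, j] = theta_{n,j} and v[n] is the one-step prediction MSE."""
    theta = np.zeros((m + 1, m + 1))
    v = np.zeros(m + 1)
    v[0] = gamma[0]
    for n in range(1, m + 1):
        for k in range(n):
            # theta_{n,n-k} = v_k^{-1} (gamma(n-k) - sum_{j<k} theta_{k,k-j} theta_{n,n-j} v_j)
            s = sum(theta[k, k - j] * theta[n, n - j] * v[j] for j in range(k))
            theta[n, n - k] = (gamma[n - k] - s) / v[k]
        # v_n = gamma(0) - sum_{j<n} theta_{n,n-j}^2 v_j
        v[n] = gamma[0] - sum(theta[n, n - j] ** 2 * v[j] for j in range(n))
    return theta, v

m = 20
gamma = np.zeros(m + 1)
gamma[0], gamma[1] = 1.25, 0.5   # true ACVF of MA(1), theta = 0.5, sigma^2 = 1
theta, v = innovations(gamma, m)
# As m grows, theta[m, 1] -> theta = 0.5 and v[m] -> sigma^2 = 1,
# which is the "choose m relatively large" prescription above.
```

This also illustrates the warning in the note: at small m (e.g. theta[1, 1] = 0.4 here) the coefficient is visibly biased, and only the large-m values \hat{\theta}_{m1}, \hat{v}_m recover (\theta, \sigma^2).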
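The two Hannan-Rissanen steps can be sketched in the same way (a minimal illustration, assuming the long AR(m) in Step 1 is itself fitted by least squares; the function name and the simulated ARMA(1,1) example are my own).

```python
import numpy as np

def hannan_rissanen(x, p, q, m):
    """Two-step Hannan-Rissanen sketch for ARMA(p,q); m is the long-AR order."""
    n = len(x)
    # Step 1: fit AR(m) by least squares and form residuals Zhat_t, t = m+1, ..., n.
    X1 = np.column_stack([x[m - i:n - i] for i in range(1, m + 1)])
    phi_m, *_ = np.linalg.lstsq(X1, x[m:], rcond=None)
    zhat = np.zeros(n)
    zhat[m:] = x[m:] - X1 @ phi_m
    # Step 2: OLS of X_t on (X_{t-1},...,X_{t-p}, Zhat_{t-1},...,Zhat_{t-q}).
    t0 = m + q
    cols = [x[t0 - i:n - i] for i in range(1, p + 1)]
    cols += [zhat[t0 - j:n - j] for j in range(1, q + 1)]
    X2 = np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(X2, x[t0:], rcond=None)
    return beta[:p], beta[p:]

# Check on a simulated ARMA(1,1): X_t = 0.5 X_{t-1} + Z_t + 0.4 Z_{t-1}.
rng = np.random.default_rng(1)
n = 20000
z = rng.normal(size=n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + z[t] + 0.4 * z[t - 1]
phi_hat, theta_hat = hannan_rissanen(x, p=1, q=1, m=20)
```

Step 2 is exactly the least-squares problem S(\beta) above: because the \hat{Z}_t stand in for the unobserved Z_t, the ARMA fit reduces to an ordinary linear regression.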