Models Theory Flashcards

(24 cards)

1
Q

Assumptions of OLS

A

Residuals are white noise:
homoskedastic (constant variance)
no serial correlation = no information left in the residuals

2
Q

R²

A

Determination coefficient R² ∈ [0, 1]
the share of the variation in Y explained by the model

3
Q

ESS

A

Explained sum of squares
ESS = Σ [Y(est) − Y(mean)]²

4
Q

TSS

A

Total sum of squares
TSS = Σ [Y(actual) − Y(mean)]²

5
Q

RSS

A

Residual sum of squares
RSS = Σ [Y(actual) − Y(est)]² = Σ u²
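The three sums of squares fit together as TSS = ESS + RSS (when the regression includes an intercept), and R² = ESS/TSS. A minimal pure-Python sketch with made-up data:

```python
# Check TSS = ESS + RSS and R² = ESS/TSS for a simple OLS fit (illustrative data).
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 8.1, 9.8]

n = len(xs)
x_mean = sum(xs) / n
y_mean = sum(ys) / n

# OLS slope and intercept for y = a + b*x
b = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys)) / \
    sum((x - x_mean) ** 2 for x in xs)
a = y_mean - b * x_mean
fitted = [a + b * x for x in xs]

ess = sum((f - y_mean) ** 2 for f in fitted)          # explained sum of squares
rss = sum((y - f) ** 2 for y, f in zip(ys, fitted))   # residual sum of squares
tss = sum((y - y_mean) ** 2 for y in ys)              # total sum of squares

r_squared = ess / tss
assert abs(tss - (ess + rss)) < 1e-9  # decomposition holds with an intercept
```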

6
Q

more variables (k)

A

better in-sample fit (R² never falls)
less efficiency = higher estimator variance (degrees of freedom used up)

7
Q

Inversion of Lag Polynomial

A

A stationary AR(1) can be rewritten as an MA(∞), and an invertible MA(1) as an AR(∞)

8
Q

Assumptions of ( AR(1) into MA(∞) )

A
  • stationarity
  • All roots of the AR lag polynomial ϕ(L)=0 must lie outside the unit circle.

Ensures the infinite sum 1 + ϕL + ϕ²L² + ⋯ converges
Guarantees constant mean and variance over time

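A quick numerical check of this inversion (a sketch with simulated data; the numbers are illustrative): the AR(1) recursion and its MA(∞) expansion produce the same series when both start from a zero initial condition.

```python
import random

# AR(1): y_t = phi*y_{t-1} + e_t with |phi| < 1 matches the MA(inf) form
# y_t = sum_{j>=0} phi^j * e_{t-j} (here truncated at t, zero initial condition).
random.seed(0)
phi = 0.6
eps = [random.gauss(0, 1) for _ in range(200)]

# AR(1) recursion
y = [0.0] * len(eps)
y[0] = eps[0]
for t in range(1, len(eps)):
    y[t] = phi * y[t - 1] + eps[t]

# MA representation: y_t = sum_{j=0}^{t} phi^j * eps_{t-j}
y_ma = [sum(phi ** j * eps[t - j] for j in range(t + 1)) for t in range(len(eps))]

max_gap = max(abs(u - v) for u, v in zip(y, y_ma))
assert max_gap < 1e-9  # the two representations coincide
```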
9
Q

Assumptions of ( MA(1) into AR(∞) )

A
  • invertibility
  • All roots of the MA lag polynomial θ(L)=0 must lie outside the unit circle.

MA(1): |θ| < 1
For MA(q): all roots of 1 + θ₁z + θ₂z² + ⋯ + θ_q z^q = 0 satisfy |zᵢ| > 1

10
Q

Stationarity assumptions

A

E(Yt) = constant
V(Yt) = constant
Cov(Yt, Yt−h) depends only on the lag h
for all t

11
Q

PACF => better for

A

identifying AR order: the PACF cuts off at lag p (for MA it only decays)

12
Q

ACF -> MA

A

Cuts off to zero after lag q

13
Q

PACF -> MA

A

Decays gradually to zero (no sharp cutoff)

14
Q

PACF -> AR

A

Cuts off to zero after lag p

15
Q

ACF -> AR

A

Decays gradually to zero (no sharp cutoff)
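These ACF/PACF patterns can be checked empirically. A sketch (pure Python, simulated MA(1); parameter values are illustrative): the sample ACF should be clearly nonzero at lag 1 and near zero at lag 2, matching the cutoff-after-lag-q rule.

```python
import random

# Simulate an MA(1): x_t = e_t + theta*e_{t-1}, then compute sample ACF.
random.seed(1)
theta = 0.8
n = 5000
eps = [random.gauss(0, 1) for _ in range(n + 1)]
x = [eps[t + 1] + theta * eps[t] for t in range(n)]

mean = sum(x) / n
c0 = sum((v - mean) ** 2 for v in x) / n  # lag-0 autocovariance

def acf(series, lag, mu, gamma0, length):
    """Sample autocorrelation of `series` at `lag`."""
    cov = sum((series[t] - mu) * (series[t - lag] - mu)
              for t in range(lag, length)) / length
    return cov / gamma0

rho1 = acf(x, 1, mean, c0, n)  # theory for MA(1): theta / (1 + theta²) ≈ 0.49
rho2 = acf(x, 2, mean, c0, n)  # theory for MA(1): 0 (cutoff after lag 1)
```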

16
Q

Box Jenkins

A

Fit an ARMA:

1) Clean the data (transform until stationary)
2) Identify p and q (from ACF/PACF)
3) Estimate the model
4) Check diagnostics and adjust

Prefer the model with the lower BIC / AIC

17
Q

BIC AIC

A

BIC tends to underestimate the number of lags
AIC tends to overestimate the number of lags
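The asymmetry comes from the penalty terms: with AIC = n·ln(RSS/n) + 2k and BIC = n·ln(RSS/n) + k·ln(n), BIC's penalty is harsher whenever ln(n) > 2. A sketch with hypothetical RSS values (the numbers are made up for illustration, not from a real fit):

```python
import math

n = 200
fits = {1: 210.0, 2: 195.0, 3: 192.0}  # hypothetical RSS by lag order p

def aic(rss, k):
    # AIC = n*ln(RSS/n) + 2k
    return n * math.log(rss / n) + 2 * k

def bic(rss, k):
    # BIC = n*ln(RSS/n) + k*ln(n); ln(200) ≈ 5.3 > 2, so extra lags cost more
    return n * math.log(rss / n) + k * math.log(n)

best_aic = min(fits, key=lambda p: aic(fits[p], p))
best_bic = min(fits, key=lambda p: bic(fits[p], p))
# With these numbers AIC picks p = 3 while the harsher BIC penalty picks p = 2.
```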

18
Q

Test for white noise

A

Box–Pierce Q statistic
Ljung–Box (LB) statistic

19
Q

LB Statistic decision process

A

LB ≤ χ² critical value => fail to reject H0: the ACF pattern is white noise (no significant autocorrelation)
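A sketch of the computation (pure Python, simulated i.i.d. data): the Ljung–Box statistic is Q = n(n+2) Σ ρ_k²/(n−k) over lags k = 1..h, compared to a χ² critical value (18.307 is the 95th percentile with 10 degrees of freedom).

```python
import random

random.seed(2)
n = 500
x = [random.gauss(0, 1) for _ in range(n)]  # white noise by construction

mean = sum(x) / n
c0 = sum((v - mean) ** 2 for v in x) / n

def rho(lag):
    """Sample autocorrelation at `lag`."""
    cov = sum((x[t] - mean) * (x[t - lag] - mean) for t in range(lag, n)) / n
    return cov / c0

h = 10
q_lb = n * (n + 2) * sum(rho(k) ** 2 / (n - k) for k in range(1, h + 1))
chi2_crit_5pct_10df = 18.307  # chi-square 95th percentile, 10 df
# For genuinely i.i.d. data Q usually falls below the critical value,
# so we fail to reject H0 (white noise).
white_noise = q_lb <= chi2_crit_5pct_10df
```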

20
Q

VAR(p): p too short

A

The model is misspecified: omitted lags leave serial correlation in the errors

21
Q

VAR(p): p too long

A

Too many degrees of freedom are lost, so estimates become imprecise

22
Q

VAR assumptions

A
  • Stationarity I(0)
  • No Perfect Multicollinearity
  • Errors: White Noise
23
Q

VECM assumptions

A
  • Cointegration between the variables
  • Variables not stationary in levels: I(1), not I(0)
  • Errors: White Noise