Assumptions of OLS
Residuals are white noise
Homoskedastic
No info left in the residuals = no serial correlation
R²
Determination coefficient ∈ [0, 1]
share of the variation in Y explained by the model
ESS
Explained sum of squares
Σ [Y(est) − Y(mean)]²
TSS
Total sum of squares
Σ [Y(actual) − Y(mean)]²
RSS
Residual sum of squares
Σ [Y(actual) − Y(est)]² = Σ û²
Identity: TSS = ESS + RSS, so R² = ESS/TSS = 1 − RSS/TSS
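A minimal numpy sketch of the decomposition on simulated data (all names here are illustrative): TSS = ESS + RSS holds for OLS with an intercept, and both expressions for R² agree.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 1.0 + 2.0 * x + rng.normal(size=100)      # true line + noise

X = np.column_stack([np.ones_like(x), x])      # design matrix with intercept
beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # OLS estimates
y_hat = X @ beta

tss = np.sum((y - y.mean()) ** 2)              # total sum of squares
ess = np.sum((y_hat - y.mean()) ** 2)          # explained sum of squares
rss = np.sum((y - y_hat) ** 2)                 # residual sum of squares

print(tss, ess + rss)                          # equal up to rounding
print("R2 =", ess / tss, "=", 1 - rss / tss)
```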
more variables (k)
better in-sample fit (R² never decreases)
less efficiency = variance of the estimators increases (degrees of freedom lost); see the sketch below
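A quick sketch of the first point: on simulated data, appending a pure-noise regressor never lowers R² (variable names are hypothetical).

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.normal(size=n)

def r2(X, y):
    # OLS fit, then R² = 1 - RSS/TSS
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

X1 = np.column_stack([np.ones(n), x])
X2 = np.column_stack([X1, rng.normal(size=n)])   # add an irrelevant regressor
print(r2(X1, y), "<=", r2(X2, y))                # R² can only go up
```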
Inversion of Lag Polynomial
AR(1) → MA(∞) and vice versa
Assumptions of ( AR(1) into MA(∞) )
|ϕ| < 1: ensures the infinite sum 1 + ϕL + ϕ²L² + ⋯ converges
Guarantees constant mean and variance over time
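A hedged sketch of the inversion using statsmodels' arma2ma (note its sign convention: the AR polynomial is passed as [1, −ϕ]); with |ϕ| < 1 the MA weights are ϕ^j, matching the hand computation.

```python
import numpy as np
from statsmodels.tsa.arima_process import arma2ma

phi = 0.6                                      # must satisfy |phi| < 1
psi = arma2ma(ar=[1, -phi], ma=[1], lags=10)   # MA(∞) weights of the AR(1)
print(psi)                                     # 1, phi, phi², ...
print(phi ** np.arange(10))                    # same weights by hand
```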
Assumptions of ( MA(1) into AR(∞) )
MA(1): |θ| < 1
For MA(q): all roots zᵢ of 1 + θ₁z + θ₂z² + ⋯ = 0 satisfy |zᵢ| > 1
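A small numpy sketch of the root check (the θ values are made up for illustration):

```python
import numpy as np

theta = [0.5, -0.3]                       # hypothetical MA(2) coefficients
poly = [1.0] + list(theta)                # 1 + θ1·z + θ2·z², ascending powers
roots = np.roots(poly[::-1])              # np.roots wants highest degree first
print(roots, np.all(np.abs(roots) > 1))   # True => invertible into AR(∞)
```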
Stationarity assumptions
E(Y) = cst
V(Y) = cst
Cov(Yt, Yt−h) = cst for each fixed lag h (depends on h only, not on t)
for all t
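In practice stationarity is checked with a unit-root test; a minimal sketch with statsmodels' adfuller on a simulated random walk (which is non-stationary by construction):

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(2)
y = rng.normal(size=200).cumsum()   # random walk: non-stationary
stat, pvalue, *_ = adfuller(y)      # ADF test for a unit root
print(stat, pvalue)                 # large p-value: cannot reject unit root
```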
PACF => better for identifying AR (cuts off); for MA it only decays
ACF -> MA
cuts off to zero after lag q
PACF -> MA
decays to zero
PACF -> AR
cuts off to zero after lag p
ACF -> AR
decays to zero
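A short sketch reading the patterns off the sample ACF/PACF with statsmodels (simulated MA(1), so the ACF should cut off after lag 1 while the PACF decays):

```python
import numpy as np
from statsmodels.tsa.stattools import acf, pacf

rng = np.random.default_rng(3)
e = rng.normal(size=500)
y = e[1:] + 0.8 * e[:-1]       # simulated MA(1) with θ = 0.8

print(acf(y, nlags=5))         # cuts off to ~0 after lag 1
print(pacf(y, nlags=5))        # decays gradually to zero
```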
Box-Jenkins
Fit an ARMA(p, q)
1) Clean the data (transform/difference until stationary)
2) Guess p and q from the ACF/PACF
3) Run the estimation
4) Adjust: compare nearby candidate models
keep the lower BIC, AIC (see the grid-search sketch below)
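A minimal sketch of steps 2-4 as a grid search over small (p, q) orders, keeping the lowest BIC; the simulated data and the order range are arbitrary choices.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(4)
y = np.zeros(300)
for t in range(1, 300):                 # simulate an AR(1) with phi = 0.5
    y[t] = 0.5 * y[t - 1] + rng.normal()

results = {}
for p in range(3):
    for q in range(3):
        res = ARIMA(y, order=(p, 0, q)).fit()
        results[(p, q)] = (res.aic, res.bic)

best = min(results, key=lambda k: results[k][1])   # lowest BIC
print(best, results[best])                         # should pick (1, 0) here
```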
BIC vs AIC
AIC = −2 ln L + 2k ; BIC = −2 ln L + k ln T
BIC tends to underestimate the nb of lags (heavier penalty k·ln T)
AIC tends to overestimate the nb of lags (lighter penalty 2k)
Test for white noise
Box-Pierce Q statistic
Ljung-Box (LB) statistic = small-sample refinement of Box-Pierce
LB statistic decision process
LB ≤ χ² critical value => fail to reject: the residual ACF is consistent with white noise
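A sketch of the test with statsmodels' acorr_ljungbox (boxpierce=True also returns the Box-Pierce Q); resid here is only a stand-in for model residuals.

```python
import numpy as np
from statsmodels.stats.diagnostic import acorr_ljungbox

rng = np.random.default_rng(5)
resid = rng.normal(size=300)     # stand-in for fitted-model residuals

# LB and Box-Pierce statistics with p-values at lags 5 and 10;
# high p-values => cannot reject white noise
print(acorr_ljungbox(resid, lags=[5, 10], boxpierce=True))
```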
VAR(p): p too short
the model is poorly specified (omitted dynamics, autocorrelated residuals)
VAR(p): p too long
too many degrees of freedom are lost (each extra lag adds k² coefficients); see the lag-selection sketch below
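A sketch of picking p by information criteria with statsmodels' VAR.select_order (the two simulated series are only placeholders for real stationary data):

```python
import numpy as np
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(6)
data = rng.normal(size=(200, 2))     # placeholder (T x k) stationary series

model = VAR(data)
print(model.select_order(maxlags=8).summary())   # AIC/BIC/HQIC for each lag
res = model.fit(maxlags=8, ic='bic')             # fit with the BIC-chosen p
print(res.k_ar)                                  # selected lag order
```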
VAR assumptions
all series stationary; residuals white noise
VECM assumptions
series are I(1) and cointegrated (rank from Johansen's test, sketched below)
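A hedged sketch of the VECM workflow under these assumptions, using Johansen's test for the cointegration rank; the simulated series share one I(1) trend by construction, so rank 1 is expected.

```python
import numpy as np
from statsmodels.tsa.vector_ar.vecm import coint_johansen, VECM

rng = np.random.default_rng(7)
common = rng.normal(size=300).cumsum()    # shared I(1) stochastic trend
data = np.column_stack([common + rng.normal(size=300),
                        0.5 * common + rng.normal(size=300)])

jres = coint_johansen(data, det_order=0, k_ar_diff=1)
print(jres.lr1, jres.cvt)                 # trace statistics vs critical values

res = VECM(data, k_ar_diff=1, coint_rank=1).fit()
print(res.beta)                           # estimated cointegrating vector
```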