Basics Flashcards

(17 cards)

1
Q

What is white noise?

A

A series of uncorrelated random variables with: Mean = 0, Variance = σ², Cov(Xₜ, Xₛ) = 0 for t ≠ s.
If normal ⇒ White Gaussian Noise
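As a quick sanity check, these defining properties can be verified numerically — a minimal NumPy sketch with an arbitrary seed and sample size:

```python
import numpy as np

# Simulate white Gaussian noise: i.i.d. N(0, 1) draws (seed and n are arbitrary).
rng = np.random.default_rng(0)
n = 10_000
x = rng.normal(loc=0.0, scale=1.0, size=n)

# Defining properties: mean ~ 0, variance ~ sigma^2 = 1,
# and zero correlation between different time points.
mean, var = x.mean(), x.var()
rho1 = np.corrcoef(x[:-1], x[1:])[0, 1]  # sample lag-1 autocorrelation, near 0
```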

2
Q

What are weak and strong stationarity?

A

Weak: mean and variance are constant over time, and the autocovariance depends only on the lag, not on time.
Strong: the full joint distribution of the process is time-invariant.

3
Q

AR(1) process equation

A

X_t = φ * X_{t-1} + ε_t, where ε_t is white noise.
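A minimal NumPy sketch of simulating this recursion (φ = 0.7 is an illustrative choice); for a stationary AR(1), the lag-1 autocorrelation of the simulated series should come out close to φ:

```python
import numpy as np

# Simulate X_t = phi * X_{t-1} + eps_t (phi and n are illustrative choices).
rng = np.random.default_rng(1)
phi, n = 0.7, 5_000
eps = rng.normal(size=n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + eps[t]

# For a stationary AR(1), Corr(X_t, X_{t-1}) = phi.
rho1 = np.corrcoef(x[:-1], x[1:])[0, 1]
```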

4
Q

What is the condition for an MA(q) process to be invertible?

A

The roots of the MA polynomial theta(B) = 0 must lie outside the unit circle.

5
Q

General ARMA(p,q) equation

A

X_t = φ₁X_{t-1} + φ₂X_{t-2} + … + φ_pX_{t-p} + ε_t + θ₁ε_{t-1} + … + θ_qε_{t-q}
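The same simulation idea extends to the full ARMA case. A sketch of ARMA(1,1) with illustrative coefficients, checked against the known closed-form lag-1 autocorrelation ρ₁ = (1 + φθ)(φ + θ) / (1 + θ² + 2φθ):

```python
import numpy as np

# Simulate ARMA(1,1): X_t = phi*X_{t-1} + eps_t + theta*eps_{t-1} (illustrative values).
rng = np.random.default_rng(2)
phi, theta, n = 0.5, 0.4, 5_000
eps = rng.normal(size=n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + eps[t] + theta * eps[t - 1]

# Closed-form lag-1 ACF of a stationary ARMA(1,1).
rho1_theory = (1 + phi * theta) * (phi + theta) / (1 + theta**2 + 2 * phi * theta)
rho1_sample = np.corrcoef(x[:-1], x[1:])[0, 1]
```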

6
Q

What does the autocorrelation function (ACF) measure, and what are its properties?

A

Linear dependence between Xₜ and Xₜ₋ₖ.
ACF is always 1 at lag 0.
If the ACF decays very slowly (stays high across many lags), the series is likely non-stationary.

7
Q

Autocovariance (γ) and autocorrelation (ρ) formula

A

γ_k = Cov(X_t, X_{t-k})
ρ_k = γ_k / γ_0

Autocovariance measures covariance between time-lagged values; autocorrelation is its normalized form (between -1 and 1).
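These two formulas translate directly into code — a sketch of the standard sample estimators (dividing by n, the usual convention):

```python
import numpy as np

def autocovariance(x, k):
    """Sample gamma_k = (1/n) * sum_t (x_t - xbar)(x_{t-k} - xbar)."""
    x = np.asarray(x, dtype=float)
    xbar, n = x.mean(), len(x)
    return np.sum((x[k:] - xbar) * (x[: n - k] - xbar)) / n

def autocorrelation(x, k):
    """rho_k = gamma_k / gamma_0; always in [-1, 1], with rho_0 = 1."""
    return autocovariance(x, k) / autocovariance(x, 0)

rng = np.random.default_rng(3)
x = rng.normal(size=1_000)  # white noise example
r0, r1 = autocorrelation(x, 0), autocorrelation(x, 1)  # r0 is exactly 1, r1 near 0
```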

8
Q

What is the lag operator and how is it used?

A

L^k X_t = X_{t-k}  (also written B, the backshift operator)

Used to write models compactly, e.g. AR(1) as (1 - φL)X_t = ε_t and ARMA(p,q) as φ(L)X_t = θ(L)ε_t.

9
Q

What’s the difference operator Δ?

A

ΔX_t = X_t - X_{t-1} = (1 - L)X_t

Used to difference non-stationary series.
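A small NumPy illustration: a random walk (cumulative sum of shocks) is non-stationary, but one application of Δ = 1 - L recovers the white-noise shocks:

```python
import numpy as np

rng = np.random.default_rng(4)
eps = rng.normal(size=2_000)   # white-noise shocks
walk = np.cumsum(eps)          # random walk: X_t = X_{t-1} + eps_t, non-stationary
diffed = np.diff(walk)         # Delta X_t = X_t - X_{t-1} = (1 - L)X_t

recovered = np.allclose(diffed, eps[1:])  # differencing recovers the shocks
```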

10
Q

What’s the purpose of the Box–Pierce and Ljung–Box tests? Which is better?

A

They test whether a series’ residuals are white noise (i.e., no autocorrelation up to a chosen lag). Ljung–Box applies a finite-sample correction to the Box–Pierce statistic, so it is generally preferred, especially for small samples.
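Both statistics are simple enough to compute by hand: Q_BP = n Σ ρ̂ₖ² and Q_LB = n(n+2) Σ ρ̂ₖ²/(n-k). A sketch applied to a white-noise series, where H0 should hold and both statistics are approximately χ² with h degrees of freedom:

```python
import numpy as np

def sample_acf(x, k):
    x = np.asarray(x, dtype=float)
    xbar = x.mean()
    return np.sum((x[k:] - xbar) * (x[: len(x) - k] - xbar)) / np.sum((x - xbar) ** 2)

def box_pierce(x, h):
    n = len(x)
    return n * sum(sample_acf(x, k) ** 2 for k in range(1, h + 1))

def ljung_box(x, h):
    n = len(x)
    return n * (n + 2) * sum(sample_acf(x, k) ** 2 / (n - k) for k in range(1, h + 1))

rng = np.random.default_rng(5)
x = rng.normal(size=500)   # white noise, so neither statistic should be large
q_bp, q_lb = box_pierce(x, 10), ljung_box(x, 10)
# Compare each statistic to a chi-squared(10) critical value (about 18.3 at 5%).
```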

11
Q

What does the Durbin–Watson test check?

A

Detects first-order autocorrelation in regression residuals.
DW ≈ 0 → positive autocorrelation
DW ≈ 2 → no autocorrelation
DW ≈ 4 → negative autocorrelation
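The statistic itself is a one-liner, DW = Σ(e_t - e_{t-1})² / Σe_t² — a sketch applied to uncorrelated residuals, where the value should land near 2:

```python
import numpy as np

def durbin_watson(resid):
    """DW = sum (e_t - e_{t-1})^2 / sum e_t^2, roughly 2*(1 - rho_1)."""
    e = np.asarray(resid, dtype=float)
    return np.sum(np.diff(e) ** 2) / np.sum(e ** 2)

rng = np.random.default_rng(6)
e = rng.normal(size=2_000)   # uncorrelated "residuals"
dw = durbin_watson(e)        # near 2 when there is no lag-1 autocorrelation
```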

12
Q

For AR(1), which of |φ| ≥ 1 and |φ| < 1 gives a stationary process?

A

If |φ| ≥ 1, shocks never die out: for |φ| = 1 (random walk) the variance grows without bound, and for |φ| > 1 the process explodes, so it is non-stationary.
If |φ| < 1, shocks decay geometrically and the process is stationary with variance σ² / (1 - φ²).

13
Q

Breusch–Godfrey test

A

Tests for higher-order autocorrelation (up to lag q) in regression residuals. A more general and robust version of the Durbin–Watson test.

14
Q

What are unit roots, and what are the characteristic equations of AR and MA models?

A

AR(p) is stationary if all roots of its characteristic equation
1 - φ₁z - φ₂z² - … - φₚzᵖ = 0
lie outside the unit circle (i.e., |z| > 1). A root with |z| = 1 is a unit root and makes the process non-stationary (e.g., the random walk has root z = 1).

MA(q) is invertible if all roots of
1 + θ₁z + θ₂z² + … + θ_qz^q = 0
lie outside the unit circle (i.e., |z| > 1).
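The root condition is easy to check numerically — a sketch that builds the AR characteristic polynomial and tests whether every root lies outside the unit circle (coefficient values are illustrative):

```python
import numpy as np

def ar_stationary(phis):
    """True iff all roots of 1 - phi_1*z - ... - phi_p*z^p = 0 satisfy |z| > 1."""
    # np.roots expects coefficients ordered from highest degree down to the constant.
    poly = [-p for p in phis[::-1]] + [1.0]
    roots = np.roots(poly)
    return bool(np.all(np.abs(roots) > 1.0))

stable = ar_stationary([0.5])          # root z = 2, outside unit circle -> True
unit = ar_stationary([1.0])            # root z = 1, a unit root (random walk) -> False
explosive = ar_stationary([0.5, 0.6])  # a root inside the unit circle -> False
```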

15
Q

Stationarity Tests (DF/ADF steps)

A

Augmented Dickey–Fuller test:
1) Start from an AR model for X_t.
2) Rewrite it in differenced form: ΔX_t = γX_{t-1} + δ₁ΔX_{t-1} + … + δ_pΔX_{t-p} + ε_t
3) Choose p lagged differences so the residuals are white noise.
4) Test H0: γ = 0 (unit root, non-stationary) against H1: γ < 0 (stationary).

16
Q

For each stationarity test, is stationarity the null or the alternative hypothesis?

A

Augmented Dickey–Fuller (alternative), Phillips–Perron (alternative), and KPSS (null).

17
Q

What is partial autocorrelation?

A

At lag k, it measures the direct linear relationship between Yt and Yt-k after removing the effect of all intermediate lags:
PACF(k) = Corr(Yt, Yt-k | Yt-1, …, Yt-k+1)
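PACF(k) can be computed straight from this definition by regressing Yt on its first k lags and reading off the last coefficient — a sketch using ordinary least squares, checked on a simulated AR(1), where PACF(1) ≈ φ and PACF(2) ≈ 0:

```python
import numpy as np

def pacf_at_lag(x, k):
    """PACF(k): coefficient on x_{t-k} in an OLS regression of x_t on x_{t-1..t-k}."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    # Column j (j = 1..k) holds the lag-j values x_{t-j}, aligned with y = x_t.
    X = np.column_stack([x[k - j : n - j] for j in range(1, k + 1)])
    y = x[k:]
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[k - 1]

# Simulated AR(1) with phi = 0.7 (illustrative): the direct effect stops at lag 1.
rng = np.random.default_rng(7)
phi, n = 0.7, 5_000
eps = rng.normal(size=n)
s = np.zeros(n)
for t in range(1, n):
    s[t] = phi * s[t - 1] + eps[t]

p1, p2 = pacf_at_lag(s, 1), pacf_at_lag(s, 2)  # p1 near 0.7, p2 near 0
```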