F106-Part4 Flashcards

(70 cards)

1
Q

List: Axioms of Coherence (4)

A
  1. Monotonicity - If L1 <= L2 then F(L1) <= F(L2) i.e. If risk portfolio 2 exhibits greater or equal losses under all future scenarios than risk portfolio 1, then a monotonic risk measure will indicate that a greater or equal amount of capital should be held in respect of portfolio 2
  2. Subadditivity - F(L1+L2)<=F(L1)+F(L2) i.e. A merger of risk does not increase the overall level of risk, it may decrease the overall level of risk due to diversification
  3. Positive Homogeneity - F(k x L) = k x F(L) i.e. If we double the size of the loss exposure, then we double the risk
  4. Translation Invariance - F(L + k) = F(L) + k i.e. If we add an amount to the loss, the capital requirement needed to mitigate the impact of the loss increases by the same amount
2
Q

Definition: Convex Risk Measure

A

Diversification will reduce the risk & the amount of capital needed - F(lambda*L1 + (1-lambda)*L2) <= lambda*F(L1) + (1-lambda)*F(L2) for 0 <= lambda <= 1

3
Q

Definition: Deterministic Measures

A

Simplistic measures, giving a broad indication of the level of risk

4
Q

Definition: Probabilistic Measures

A

Involve applying a statistical distribution to a risk & measuring a feature of that distribution

5
Q

List: Disadvantages of the Notional Approach (5)

A
  1. Potential undesirable use of a ‘catch all’ weighting for undefined asset classes
  2. Possible distortions to the market caused by increased demand for asset classes with high weightings
  3. Treating short positions as if they were the exact opposite of the equivalent long positions
  4. No allowance for concentration risk
  5. The probability of the changes is not quantified
6
Q

List: Disadvantages of the Factor Sensitivity Approach (3)

A
  1. Not assessing a wide range of risk, by focusing on a single risk factor
  2. Being difficult to aggregate over different risk factors
  3. The probability of the outcomes is not quantified
7
Q

List: Advantages of Deviation Measures (3)

A
  1. Simplicity of calculation
  2. Applicability to a wide range of financial risks
  3. Can be aggregated, if correlations are known i.e. V(aX+bY) = a^2V(X) + b^2V(Y) + 2ab*Cov(X,Y)
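The aggregation formula in point 3 can be sketched in a few lines of Python (function name hypothetical; assumes the correlation is known):

```python
import math

def aggregate_sd(a, b, sd_x, sd_y, corr):
    """Standard deviation of aX + bY using
    V(aX + bY) = a^2 V(X) + b^2 V(Y) + 2ab Cov(X, Y),
    with Cov(X, Y) = corr * sd_x * sd_y."""
    var = a**2 * sd_x**2 + b**2 * sd_y**2 + 2 * a * b * corr * sd_x * sd_y
    return math.sqrt(var)

# Perfect correlation adds linearly; independence gives a diversification benefit:
print(aggregate_sd(1, 1, 3.0, 4.0, 1.0))  # 7.0
print(aggregate_sd(1, 1, 3.0, 4.0, 0.0))  # 5.0
```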
8
Q

List: Disadvantages of Deviation Measures (5)

A
  1. Difficulty in interpreting comparisons
  2. Potentially misleading if the underlying distribution is skewed
  3. Does not focus on tail risk, specifically, underestimates tail risk if the underlying distribution is leptokurtic
  4. Aggregations of deviations can be misleading
  5. Quantifies severity but not probability
9
Q

Definition: Value at Risk (VaR)

A

The maximum potential loss, with a given probability, a, over a given time period
VaR_a = inf{l in R: P(L > l) <= 1 - a}
NOTE: VaR is NOT subadditive & therefore NOT a coherent risk measure
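The note on non-subadditivity can be illustrated with a toy two-loan example (hypothetical numbers, not from the course material): each loan alone has a 95% VaR of 0, but the pooled portfolio does not.

```python
def var_discrete(dist, alpha):
    """VaR_a = inf{l : P(L > l) <= 1 - a} for a discrete loss
    distribution given as {loss: probability}."""
    for l in sorted(dist):
        if sum(p for x, p in dist.items() if x > l) <= 1 - alpha:
            return l

# Each loan loses 100 with probability 0.04, else 0.
one = {0: 0.96, 100: 0.04}
# Sum of two independent such loans:
both = {0: 0.96**2, 100: 2 * 0.96 * 0.04, 200: 0.04**2}

print(var_discrete(one, 0.95))   # 0
print(var_discrete(both, 0.95))  # 100 > 0 + 0, so VaR is not subadditive
```

Here P(L1 + L2 > 0) = 1 - 0.96^2 ≈ 0.078 > 0.05, so the merged portfolio's 95% VaR jumps to 100 even though each standalone VaR is 0.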

10
Q

Formula: VaR under Normal Distribution

A

VaR_a = mu + sigma * z_a
Key z-values: z_95% = 1.645, z_99% = 2.326, z_99.5% = 2.576
n-day VaR scaling: sigma_n = sqrt(n) * sigma_1
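A minimal sketch of the formula (the z-table is taken from this card; the function name is illustrative):

```python
import math

Z = {0.95: 1.645, 0.99: 2.326, 0.995: 2.576}  # key z-values from the card

def normal_var(mu, sigma, alpha, days=1):
    """VaR_a = mu + sigma * z_a, with sigma scaled by sqrt(n) for n-day VaR."""
    return mu + sigma * math.sqrt(days) * Z[alpha]

# 1-day and 10-day 99% VaR for a loss with mean 0 and daily sd 1:
print(normal_var(0, 1, 0.99))           # 2.326
print(normal_var(0, 1, 0.99, days=10))  # 2.326 * sqrt(10), roughly 7.36
```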

11
Q

List: Advantages of VaR (5)

A
  1. Simplicity of its expression
  2. Intelligibility of its units i.e. money
  3. Applicability over all sources of risk
  4. Allowance for the way in which different risks interact to cause losses
  5. Ease of its translation into a risk benchmark
12
Q

List: Disadvantages of VaR (5)

A
  1. Gives no indication of the distribution of losses greater than the VaR
  2. Underestimates asymmetric & fat-tail risks
  3. Sensitive to the choices of data & parameters
  4. VaR not sub-additive i.e. not a coherent risk measure
  5. May encourage ‘herding’
13
Q

List: Advantages of the Empirical VaR Approach (3)

A
  1. Simplicity
  2. No requirement to specify the distribution of returns
  3. Realism - Focuses on the largest market movements observed
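A sketch of a simple empirical VaR estimator consistent with this approach (an order-statistic version that deliberately ignores interpolation refinements):

```python
import math

def empirical_var(losses, alpha):
    """Empirical VaR: an order-statistic estimate of the alpha-quantile
    of observed losses (no interpolation - a simplification)."""
    s = sorted(losses)
    k = math.ceil(alpha * len(s)) - 1
    return s[k]

# 100 observed losses of 1..100: the 95% VaR is the 95th smallest loss.
print(empirical_var(range(1, 101), 0.95))  # 95
```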
14
Q

List: Disadvantages of the Empirical VaR Approach (4)

A
  1. Reliance on bootstrapping past data
  2. Implication that past data is indicative of future experiences
  3. Doesn’t facilitate stress testing or scenario testing
  4. Practical difficulties & limitations of interpolation
15
Q

List: Advantages of the Parametric VaR Approach (3)

A
  1. Ease of calculation
  2. Reduced dependence on past data
  3. Easy adjustment of parameters
16
Q

List: Disadvantages of the Parametric VaR Approach (6)

A
  1. Difficult to explain
  2. Reliance on past data
  3. Difficulty in ensuring parameters chosen are consistent
  4. Assumes that the parameter values remain constant
  5. Risk of adopting an inappropriate statistical distribution
  6. Difficulty in reflecting complex inter-dependencies
17
Q

List: Advantages of the Stochastic VaR Approach (3)

A
  1. Can capture more complex features of the underlying loss distribution
  2. Can allow for wider ranges of future possibilities than the empirical method
  3. Facilitates sensitivity testing
18
Q

List: Disadvantages of the Stochastic VaR Approach (4)

A
  1. Difficult to explain
  2. Subjective & difficult choice of distributions & parameters
  3. Gives a different answer each time
  4. Potentially high compute time
19
Q

Definition: Probability of Ruin

A

The probability that the net financial position of an org / line of business falls below 0 over a defined time horizon - Complementary perspective to VaR

20
Q

Definition: TVaR / CVaR

A

The expected loss given that a loss exceeding the VaR has occurred - E(L | L > VaR_a)
TVaR IS a coherent risk measure (unlike VaR)

21
Q

Formula: TVaR under Normal Distribution

A

TVaR_a = mu + sigma * phi(z_a) / (1 - a)
Where phi(z) is the standard normal PDF evaluated at z
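The formula can be checked numerically (phi implemented directly; z-values taken from the earlier VaR card; function names hypothetical):

```python
import math

Z = {0.95: 1.645, 0.99: 2.326, 0.995: 2.576}  # z-values from the earlier card

def phi(z):
    """Standard normal PDF."""
    return math.exp(-z * z / 2) / math.sqrt(2 * math.pi)

def normal_tvar(mu, sigma, alpha):
    """TVaR_a = mu + sigma * phi(z_a) / (1 - a) for a Normal loss."""
    return mu + sigma * phi(Z[alpha]) / (1 - alpha)

# TVaR always exceeds the corresponding VaR (mu + sigma * z_a):
print(normal_tvar(0, 1, 0.99))  # roughly 2.67, vs a 99% VaR of 2.326
```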

22
Q

Definition: Expected Shortfall (ES) (2)

A
  1. ES_a = (1-a) * TVaR_a
  2. Key relationship: ES = integral of VaR_u du from a to 1
    Note: ES has less intuitive meaning than TVaR & cannot be readily linked to current valuation
23
Q

List: Advantages of TVaR (2)

A
  1. Considers the losses beyond VaR
  2. Coherent risk measure
24
Q

List: Disadvantages of TVaR (2)

A
  1. Choice of distribution & parameters is subjective & difficult
  2. Highly sensitive to assumptions
25
Q

Definition: Economic Cost of Ruin

A

The amount key stakeholders can be expected to lose in the event of ruin

26
Q

List: Factors Affecting the Choice of a Suitable Time Horizon (4)

A
  1. Contractual / Legal constraints
  2. Liquidity considerations
  3. Time to reinstate risk mitigation
  4. Time to recover from a loss event

27
Q

List: Risk Discount Rate Considerations (4)

A
  1. Sponsor's cost of capital
  2. Inflation rates
  3. Interest rates
  4. Rates of return on investments

28
Q

Definition: Multi-Factor Models

A

Modelling a response variable, Y_t, at time t, in terms of N explanatory variables X_(t,n)

29
Q

Definition: Dynamic Financial Analysis

A

Modelling the risks to which the enterprise is exposed & the relationships between these risks

30
Q

Definition: Financial Condition Reports

A

Reports into the current solvency position of a company & possible future developments

31
Q

Definition: Asset-Liability Modelling

A

Method of projecting both the assets & liabilities of an institution with the same model, using consistent assumptions, to assess how well the assets match the liabilities and to understand the probable evolution of future cash flows

32
Q

List: Disadvantages of Linear Correlation (5)

A
  1. Not invariant under general (non-linear) strictly increasing transformations
  2. Not well defined where V(X) or V(Y) is infinite - Cannot be used on some heavy-tailed distributions
  3. Independent variables are uncorrelated, but not all uncorrelated variables are independent
  4. A valid measure of correlation only if the marginal distributions are jointly elliptical
  5. The marginal distributions & correlation alone are not necessarily enough to construct a joint distribution

33
Q

Definition: Kendall's Tau

A

A rank correlation measure based on the proportion of concordant vs discordant pairs of observations - tau = (concordant - discordant) / total pairs
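The tau formula can be computed directly from its definition (a minimal sketch with no tie handling):

```python
from itertools import combinations

def kendalls_tau(xs, ys):
    """tau = (concordant - discordant) / total pairs.
    No adjustment for ties - a simplification."""
    conc = disc = 0
    for i, j in combinations(range(len(xs)), 2):
        s = (xs[i] - xs[j]) * (ys[i] - ys[j])
        conc += s > 0
        disc += s < 0
    pairs = len(xs) * (len(xs) - 1) // 2
    return (conc - disc) / pairs

print(kendalls_tau([1, 2, 3, 4], [1, 2, 3, 4]))  # 1.0: all pairs concordant
print(kendalls_tau([1, 2, 3, 4], [4, 3, 2, 1]))  # -1.0: all pairs discordant
```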
34
Q

Definition: Spearman's Rho

A

A rank correlation measure equal to the linear correlation of the ranks of the observations

35
Q

Definition: Deterministic Model

A

Uses a set (or sets) of assumptions that are pre-determined

36
Q

List: Advantages of Scenario Analysis (4)

A
  1. Facilitates the evaluation of the potential impact of plausible events on an org
  2. Not restricted to consideration of what has happened
  3. Provides useful additional info to supplement traditional models based on statistical info
  4. Facilitates the production of action plans to deal with possible future catastrophes by assessing the possible impact of a specified mitigation strategy

37
Q

List: Disadvantages of Scenario Analysis (4)

A
  1. Complex
  2. Reliance on successfully generating hypothetical extreme but also plausible events
  3. Uncertainty as to whether the full set of scenarios considered is representative or exhaustive
  4. Absence of any assigned probabilities

38
Q

List: Advantages of Stress Testing (3)

A
  1. Compares the impact of the same stresses on differing orgs
  2. Explicit examination of extreme events
  3. Assesses the suitability of responses, by assessing the expected impact of the stress in the absence of any response, and then the expected impact with the proposed response

39
Q

List: Disadvantages of Stress Testing (3)

A
  1. Subjective as to which assumptions to stress & the degree of stress to consider
  2. Assigns no probabilities to the events
  3. Looks only at extreme situations

40
Q

Definition: Stochastic Model

A

Inputs are uncertain & the model provides a probability distribution for its outputs

41
Q

Definition: Cholesky Decomposition

A

A method to factor a positive-definite correlation matrix C into C = LL' where L is lower triangular - Used to generate correlated random variables from independent standard normals for simulation
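For a 2x2 correlation matrix the decomposition has a closed form, which makes the simulation recipe easy to sketch (function names hypothetical):

```python
import math
import random

def chol2(c):
    """Cholesky factor L of the 2x2 correlation matrix [[1, c], [c, 1]]:
    L = [[1, 0], [c, sqrt(1 - c^2)]], so that L L' recovers the matrix."""
    return [[1.0, 0.0], [c, math.sqrt(1.0 - c * c)]]

def correlated_pair(c, rng=random):
    """Map two independent standard normals to a pair with correlation c."""
    z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
    L = chol2(c)
    return z1, L[1][0] * z1 + L[1][1] * z2

L = chol2(0.6)
# Check that L L' reproduces the correlation matrix:
print(L[1][0] * L[0][0])            # 0.6 (the off-diagonal entry)
print(L[1][0] ** 2 + L[1][1] ** 2)  # approximately 1.0 (second diagonal entry)
```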
42
Q

Definition: Univariate Time Series

A

A sequence of observations of a single process taken at a sequence of different times, {X_t: t = 1, 2, ..., T}

43
Q

Definition: Covariance Stationary (3)

A
  1. Constant mean - E(X_t) = mu
  2. Constant variance - V(X_t) = sigma^2
  3. A covariance that depends only on the difference in time (the lag) between the observations i.e. Cov(X_t, X_(t+k)) depends only on k

44
Q

Definition: White Noise (3)

A

A process {e_t} where:
  1. E(e_t) = 0 for all t
  2. V(e_t) = sigma^2 for all t (constant finite variance)
  3. Cov(e_t, e_s) = 0 for all t != s (uncorrelated)

45
Q

Definition: Autoregressive

A

The current value of the time series, X_t, depends on past values of the time series, together with a single WN term

46
Q

Definition: AR(p) Process

A

Defines the current value of the time series in terms of the p previous terms plus a current WN term - X_t = a_0 + a_1*X_(t-1) + ... + a_p*X_(t-p) + e_t
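The AR(p) recursion can be simulated directly (a minimal sketch; zero starting values and standard normal white noise are simplifying assumptions):

```python
import random

def simulate_ar(a0, coeffs, n, seed=0):
    """AR(p): X_t = a_0 + a_1*X_(t-1) + ... + a_p*X_(t-p) + e_t,
    with e_t standard normal white noise. Starts from zeros."""
    rng = random.Random(seed)
    p = len(coeffs)
    xs = [0.0] * p
    for _ in range(n):
        lagged = xs[-p:][::-1]  # most recent value first, matching a_1..a_p
        xs.append(a0 + sum(a * x for a, x in zip(coeffs, lagged)) + rng.gauss(0, 1))
    return xs[p:]

path = simulate_ar(0.0, [0.7], 500)  # an AR(1) with a_1 = 0.7
print(len(path))  # 500
```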
47
Q

Definition: Moving Average Process

A

Models the current value of the time series as a combination of past & present WN terms

48
Q

Definition: MA(q) Process

A

Defines the current value of the time series in terms of the current & q previous WN terms - X_t = e_t + b_1*e_(t-1) + ... + b_q*e_(t-q)

49
Q

Definition: ARIMA Process

A

An ARIMA(p,d,q) process combines:
  1. AR(p) - autoregressive component of order p
  2. I(d) - integration (differencing) of order d to achieve stationarity
  3. MA(q) - moving average component of order q
A non-stationary process is made stationary by differencing d times

50
Q

Definition: Partial Autocorrelation Function

A

Conditional correlation of X_t with X_(t-h) given that we know the values of the time series in between i.e. X_(t-1), X_(t-2), ..., X_(t-h+1) are known

51
Q

List: Tests To See if Residuals are WN (5)

A
  1. Plot of residuals against time
  2. Turning point test - H_0: the graph of the residuals is pattern-less
  3. Plot of the sample ACF of residuals
  4. Ljung & Box 'Portmanteau' test - H_0: no correlation present between residuals
  5. Durbin-Watson statistic - H_0: no correlation present between residuals

52
Q

Definition: Heteroscedasticity

A

Processes where the variance changes over time

53
Q

Definition: ARCH Models (2)

A

Models which capture:
  1. Volatility clustering
  2. Leptokurtosis

54
Q

Definition: GARCH Models

A

Generalised ARCH model where the variance is now allowed to depend on previous values of the variance as well as previous squared values of the process

55
Q

Formula: GARCH(1,1) Variance

A

sigma^2_t = alpha_0 + alpha_1 * e^2_(t-1) + beta_1 * sigma^2_(t-1)
Stationarity condition: alpha_1 + beta_1 < 1
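A minimal simulation sketch of the recursion (parameter values illustrative; starting at the unconditional variance is a common but optional choice):

```python
import random

def simulate_garch11(alpha0, alpha1, beta1, n, seed=0):
    """GARCH(1,1): e_t = sigma_t * z_t with
    sigma^2_t = alpha_0 + alpha_1 * e^2_(t-1) + beta_1 * sigma^2_(t-1).
    Requires alpha_1 + beta_1 < 1 (the stationarity condition on the card)."""
    assert alpha1 + beta1 < 1, "not covariance stationary"
    rng = random.Random(seed)
    var = alpha0 / (1 - alpha1 - beta1)  # start at the unconditional variance
    e_prev = 0.0
    out = []
    for _ in range(n):
        var = alpha0 + alpha1 * e_prev**2 + beta1 * var
        e_prev = var**0.5 * rng.gauss(0, 1)
        out.append(e_prev)
    return out

returns = simulate_garch11(0.1, 0.1, 0.8, 1000)
print(len(returns))  # 1000
```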
56
Q

Definition: Copula

A

Function that defines the relationship between 2 or more variables, taking the (marginal) probabilities as its arguments rather than particular values of the variables concerned:
P(X<=x, Y<=y) = F_(X,Y)(x,y) = C_(X,Y)[F_X(x), F_Y(y)]

57
Q

Definition: Sklar's Theorem (2)

A

Let F be a joint distribution function with marginal CDF's F_1, ..., F_N. Sklar's theorem states that there exists a copula, C, such that for all x_1, ..., x_N: F(x_1, ..., x_N) = C(F_1(x_1), ..., F_N(x_N))
  1. If the marginal CDF's are continuous, then C is unique
  2. If we have a joint CDF & marginal CDF's, then these can be linked by a copula function

58
Q

Formula: Clayton Copula

A

C(u,v) = (u^(-alpha) + v^(-alpha) - 1)^(-1/alpha), alpha > 0
Kendall's tau to alpha: alpha = 2*tau / (1 - tau)
Lower tail dependence: lambda_L = 2^(-1/alpha)
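The three formulas on this card can be checked numerically (function names hypothetical):

```python
def clayton_alpha_from_tau(tau):
    """Invert tau = alpha / (alpha + 2): alpha = 2*tau / (1 - tau)."""
    return 2 * tau / (1 - tau)

def clayton_cdf(u, v, alpha):
    """Clayton copula C(u,v) = (u^-alpha + v^-alpha - 1)^(-1/alpha), alpha > 0."""
    return (u**-alpha + v**-alpha - 1) ** (-1 / alpha)

alpha = clayton_alpha_from_tau(0.5)
print(alpha)                         # 2.0
print(clayton_cdf(0.3, 1.0, alpha))  # 0.3 up to float error: C(u, 1) = u
print(2 ** (-1 / alpha))             # lower tail dependence, roughly 0.707
```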
59
Q

List: Copula Types & Tail Dependence Properties (5)

A
  1. Clayton: Lower tail dependence ONLY (alpha>0) - suitable for modelling simultaneous negative returns
  2. Gumbel: Upper tail dependence ONLY - suitable for extreme high value associations
  3. Frank: NO tail dependence, symmetric - suitable when no extreme co-movement expected
  4. Gaussian (Normal): NO tail dependence - key limitation, implicated in 2008 CDO mispricing
  5. Student-t: Symmetric tail dependence (both upper & lower) - degree controlled by the degrees of freedom

60
Q

List: Scarsini's Properties of a Good Measure of Concordance (7) | SUC DICC

A
  1. Completeness of Domain - M_XY is defined for all values of X, Y with X, Y being continuous
  2. Symmetry - M_XY = M_YX
  3. Coherence - If C_XY(u1,u2) > C_WZ(u1,u2) for all u1, u2 then M_XY > M_WZ
  4. Unit Range - -1 <= M_XY <= 1 & extremes not feasible
  5. Independence - If X & Y are independent then M_XY = 0
  6. Consistency - If X = -Z then M_XY = -M_ZY
  7. Convergence - If the copulas converge then the measure should also converge

61
Q

Definition: Least Squares Regression

A

Model using N independent explanatory variables - Y = XB + e

62
Q

Definition: Ordinary Least Squares

A

The parameters are selected to minimise the sum of squared error terms - e'e = e_1^2 + ... + e_T^2 - which has solution b = (b_1, ..., b_N)' = (X'X)^-1 * X'Y
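A sketch of the closed-form solution, specialised to one regressor plus an intercept so that b = (X'X)^-1 X'Y reduces to the familiar slope and intercept formulas:

```python
def ols_simple(xs, ys):
    """OLS for y = intercept + slope*x + e, i.e. b = (X'X)^-1 X'y
    with X = [1, x] - the matrix algebra collapses to two scalars."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return intercept, slope

print(ols_simple([0, 1, 2, 3], [1, 3, 5, 7]))  # (1.0, 2.0): fits y = 1 + 2x exactly
```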
63
Q

List: Assumptions of the Closed-form OLS Solution (6)

A
  1. A linear relationship exists between variables
  2. The inverse of the data matrix (X'X) exists
  3. Explanatory variables should not be correlated with error terms
  4. Error terms have a constant & finite variance
  5. Error terms should not be correlated with one another
  6. Error terms are normally distributed

64
Q

Definition: Generalised Least Squares

A

The variance of the error terms is not necessarily assumed to be constant & they are also not necessarily assumed to be uncorrelated with one another

65
Q

List: Qualitative Model Selection Tests (4)

A
  1. QQ plots
  2. Histograms with superimposed fitted density functions
  3. Empirical CDF's with superimposed fitted CDF's
  4. Autocorrelation functions (ACF's) of time series data

66
Q

Definition: GEV Family of Distributions

A

Describes the distribution of the standardised block maximum X_M = max(X_1, ..., X_n) when n is large - Used in the Block Maxima approach to EVT

67
Q

Definition: Generalised Pareto Distribution (GPD)

A

Describes the tail of the distribution above a threshold, P(X > x + u | X > u) for large values of u - Used in the Peaks Over Threshold (POT) approach to EVT

68
Q

List: Block Maxima vs Peaks Over Threshold (2)

A
  1. Block Maxima: Divide data into blocks (e.g. years), take the max of each block, fit a GEV - Simple but wastes data
  2. Peaks Over Threshold (POT): Use all observations above a chosen threshold u, fit a GPD - More efficient use of data but requires careful threshold selection
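The POT bookkeeping can be sketched as follows (function names hypothetical; the GPD fitting step itself is omitted):

```python
def pot_excesses(losses, u):
    """Peaks over threshold: the excess amounts (x - u) for losses above u,
    which the POT approach then fits with a GPD."""
    return [x - u for x in losses if x > u]

def mean_excess(losses, u):
    """Sample mean excess over u - plotted against a range of u values as a
    common aid to threshold selection (roughly linear where a GPD fits well)."""
    ex = pot_excesses(losses, u)
    return sum(ex) / len(ex)

print(pot_excesses([1, 5, 9, 12, 3], 8))  # [1, 4]
print(mean_excess([1, 5, 9, 12, 3], 8))   # 2.5
```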
69
Q

List: Fundamental Portfolio Management Concepts (5)

A
  1. Risk
  2. Reward
  3. Diversification
  4. Leverage
  5. Hedging

70
Q

Definition: Efficient Portfolio

A

The portfolio that gives:
  1. The highest return for a given level of risk OR
  2. The lowest risk for a given level of return