CIA.Models Flashcards

(20 cards)

1
Q

Define ‘model’

A

A simplification of reality using financial, economic, mathematical, and statistical concepts.

(Hint = FEMS)

2
Q

What are the 3 elements of a ‘model’?

A
  1. Specification: A description of the parts of a model and their interactions (data, assumptions, methods, entities, events)
  2. Implementation: The systems that perform the calculations
  3. Run: The inputs/outputs of the implementation

(Hint = SIR)

3
Q

Define ‘model risk’

A

The risk that the user will draw inappropriate conclusions due to shortcomings of the model or its use
Measured by: Severity of model failure & Likelihood of model failure

4
Q

How can model risk be measured? (2)

A
  1. Severity of model failure
  2. Likelihood of model failure
5
Q

Identify 3 considerations in assessing the severity of model failure

A
  1. **Financial significance**
    (Ex: severity is higher if estimating a major balance sheet item)
  2. Importance of model
    (Ex: severity is higher if the results contribute heavily to a key decision; lower if multiple models are being used)
  3. Frequency of use of model
    (Ex: frequent use means errors are repeated; low frequency increases the risk of the model being misunderstood or misused)

Also: non-financial impact, such as reputational damage or jeopardizing standing with regulators.

(Hint = FIF: Financial significance, Importance, Frequency)

6
Q

Identify 4 considerations in assessing the likelihood of model failure

A
  1. Complexity of model
    (Ex: higher complexity means higher likelihood of misuse of model)
  2. Expertise of users
    (Ex: non-expert users may not understand model limitations)
  3. Documentation
    (Ex: poor documentation increases the likelihood of model failure)
  4. Testing
    (Ex: inadequate testing means high likelihood of model failure)
7
Q

Does the actuary have more control over the SEVERITY or LIKELIHOOD of model failure? (justify)

A

More control over likelihood
- CHOOSE a more reliable model (within the actuary’s control)
- TEST the model more thoroughly (within the actuary’s control)

8
Q

Identify the 4 steps an actuary should take before using a new model or substantially changed model

A
  1. Review specification
  2. Validate implementation
  3. Deal with limitations
  4. Keep documentation
9
Q

Describe what an actuary does when reviewing a model’s SPECIFICATIONS (3)

A

Verify DAMs:
- DATA fits the model’s requirements and is reliable and sufficient
- METHODS are sound
- ASSUMPTIONS are appropriate

10
Q

Describe 5 things an actuary does when validating a model’s IMPLEMENTATION

A
  1. Compare with other tested models
  2. Maintain a set of test cases
  3. Backtesting (testing with historical data where you already know the answer)
  4. Run an entire live file through successive versions of the model (for models with a higher risk-rating)
  5. Peer review of testing procedure
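Backtesting (item 3 above) can be sketched in a few lines of Python. This is a minimal illustration, not CIA guidance: the `project_reserve` model, the test cases, and the 5% tolerance are all hypothetical.

```python
# Backtesting sketch: run the model on historical inputs where the
# true outcome is already known, and flag any large errors.

def project_reserve(premium: float, loss_ratio: float) -> float:
    """Toy model: expected losses as premium x loss ratio."""
    return premium * loss_ratio

# Historical cases with known actual outcomes (made-up numbers).
test_cases = [
    {"premium": 1000.0, "loss_ratio": 0.65, "actual": 655.0},
    {"premium": 2000.0, "loss_ratio": 0.70, "actual": 1390.0},
]

TOLERANCE = 0.05  # flag errors above 5% of the actual outcome

def backtest(cases):
    failures = []
    for case in cases:
        predicted = project_reserve(case["premium"], case["loss_ratio"])
        error = abs(predicted - case["actual"]) / case["actual"]
        if error > TOLERANCE:
            failures.append((case, predicted, error))
    return failures

print(backtest(test_cases))  # empty list -> all cases within tolerance
```

Maintaining a set of cases like `test_cases` also covers item 2: rerunning them against successive model versions catches regressions.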
11
Q

Identify 3 things an actuary should include when DOCUMENTING a model

A
  1. How the model was CHOSEN
  2. How it was TESTED
  3. What are its limitations
12
Q

Identify 2 things an actuary should evaluate when looking at an existing model that’s being used in a NEW WAY

A
  1. Check that the initial model was properly validated
  2. Review limitations in the new application that may not have been relevant in the initial application
13
Q

How should an actuary evaluate a model approved for use BY OTHERS?

A

Should review & approve the initial validation report

14
Q

How should an actuary evaluate a model OUTSIDE THE ACTUARY’S EXPERTISE?

A
  1. Rely on recognized experts
  2. Understand basics (inputs/outputs, validation, risk-rating, control framework)
  3. Documentation: Disclose your reliance on other experts
  4. Consider risk-rating

Make a reasonable attempt at understanding the model’s…
- specifications
- validation (extent to which experts were involved)
- risk-rating
- complexity
- control framework

15
Q

What are the 3 purposes of sensitivity testing for models?

A
  1. Validate model
  2. Understand the relationship between inputs/outputs
  3. Develop a sense of comfort with the model
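A minimal sensitivity-testing loop that serves all three purposes can be sketched as follows. The two-input toy model and the 10% shock size are hypothetical illustrations.

```python
# Sensitivity-testing sketch: shock one input at a time and record
# how the output moves, to understand the input/output relationship.

def toy_model(inputs):
    """Toy model: output is a weighted sum of two inputs."""
    return inputs["rate"] * 100.0 + inputs["lapse"] * 50.0

base = {"rate": 0.03, "lapse": 0.10}
SHOCK = 0.10  # shock each input by +10%

def sensitivity(model, base_inputs, shock=SHOCK):
    base_out = model(base_inputs)
    impacts = {}
    for key in base_inputs:
        shocked = dict(base_inputs)   # vary ONE input, hold others fixed
        shocked[key] *= 1.0 + shock
        impacts[key] = model(shocked) - base_out
    return impacts

print(sensitivity(toy_model, base))
# the impacts show which inputs drive the output most
```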
16
Q

Identify 3 types of validation that should be done when USING a model

(Hint: DAR)

A

DAR: validation of Data, Assumptions, Results

  • data should be Reliable & Sufficient
  • validate non-global assumptions that vary by model run
  • results should be “reasonable” relative to input

+ Documentation: Retain model version, inputs, outputs

17
Q

What does it mean for data to be RELIABLE?

A
  • Data RECONCILES to audited sources (Ex: balance sheet)
  • Data is REASONABLE with respect to prior period data (Ex: data from prior quarter shouldn’t be hugely different)
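Both reliability checks can be sketched as simple comparisons. The totals and the tolerances (0.1% for reconciliation, 10% for period-over-period movement) are made-up illustrations, not prescribed thresholds.

```python
# Data reliability sketch:
#   1. RECONCILES: model data ties to an audited source (balance sheet)
#   2. REASONABLE: data is close to the prior period's data

def check_reliability(model_total, audited_total, prior_total,
                      recon_tol=0.001, change_tol=0.10):
    reconciles = abs(model_total - audited_total) <= recon_tol * audited_total
    reasonable = abs(model_total - prior_total) <= change_tol * prior_total
    return reconciles, reasonable

# Example: data matches the balance sheet and is close to last quarter.
print(check_reliability(model_total=5_000_000,
                        audited_total=5_000_000,
                        prior_total=4_800_000))  # (True, True)
```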
18
Q

What does it mean for data to be SUFFICIENT?

A
  • Data FITS model specification
  • Data is available in a CONSISTENT format
19
Q

Identify 2 checks that can be done to validate the results of a model

A
  1. Inputs/outputs should be CONSISTENT
    (input data should match similar fields in output file)
  2. Results should be REASONABLE in both magnitude & direction
    (small change in inputs causes a small change in outputs)
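The two result checks can be sketched together. The toy model, the 1% input bump, and the `max_ratio` bound are all hypothetical choices for illustration.

```python
# Result-validation sketch:
#   1. CONSISTENT: input records should match output records
#   2. REASONABLE: a small input change should cause a small output change

def toy_model(exposure: float) -> float:
    """Toy model: losses at 80% of exposure."""
    return exposure * 0.8

def check_results(model, inputs, outputs, bump=0.01, max_ratio=2.0):
    # 1. Consistency: record counts match between input and output files.
    consistent = len(inputs) == len(outputs)
    # 2. Reasonableness: a 1% input bump should not move any output by
    #    more than max_ratio x 1% (small in -> small out).
    reasonable = all(
        abs(model(x * (1 + bump)) - y) <= max_ratio * bump * abs(y)
        for x, y in zip(inputs, outputs)
    )
    return consistent, reasonable

inputs = [100.0, 250.0]
outputs = [toy_model(x) for x in inputs]
print(check_results(toy_model, inputs, outputs))  # (True, True)
```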
20
Q

Name two approaches for risk rating models

A
  • One-dimensional scoring (size, importance, complexity, expertise, documentation).
  • Two-dimensional severity × likelihood assessment
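The two-dimensional approach can be sketched as a severity × likelihood matrix. The 1–3 scales and the rating bands below are hypothetical, not a prescribed scheme.

```python
# Two-dimensional risk-rating sketch: combine a severity score and a
# likelihood score into an overall model risk rating.

def risk_rating(severity: int, likelihood: int) -> str:
    """Severity and likelihood each scored 1 (low) to 3 (high)."""
    score = severity * likelihood
    if score >= 6:
        return "high"
    if score >= 3:
        return "medium"
    return "low"

print(risk_rating(severity=3, likelihood=2))  # high (3 x 2 = 6)
print(risk_rating(severity=1, likelihood=2))  # low  (1 x 2 = 2)
```

A one-dimensional approach would instead sum scores across the listed factors (size, importance, complexity, expertise, documentation) into a single scale.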