Quantify Flashcards

Modelling, stress testing, scenario analysis (141 cards)

1
Q

Range test

A

Test if the value is in the acceptable range of values. Ex: month 13 would fail the range test
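A minimal sketch of a range test in Python (the function name and bounds are illustrative, not from the source):

```python
def range_test(value, low, high):
    """Flag whether a value falls in the acceptable range [low, high]."""
    return low <= value <= high

# Month numbers must fall in 1..12, so month 13 fails the range test.
print(range_test(13, 1, 12))   # False
print(range_test(7, 1, 12))    # True
```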

2
Q

Ratio control test

A

Test that the ratio of 2 data elements is within a reasonable range (upper and lower limits). Ex: average salary or average cost of a good

3
Q

Zero control test

A

Check that the total collected and the sum of the pieces collected match. Ex: the sum of earnings from 3 channels should equal the total earnings.

4
Q

Internal consistency test

A

A test to check that the values of data elements within a database are consistent (if-then tests and zero control tests).

5
Q

If-then test

A

Test the value of a data element based on the value of a different data element. Ex: If “total of past claims” is 0, then “number of past claims” must also be 0.

6
Q

Goodness of fit tests

A
  1. Adjusted R^2
  2. F test
  3. t test
  4. Likelihood ratio test
7
Q

Adjusted R^2

A

Measures the proportion of the variation in the dependent variable that is predictable from the independent variables. Adjusted for the number of predictors in the model.
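The adjustment can be written as adjusted R^2 = 1 - (1 - R^2)(n - 1)/(n - p - 1), where n is the number of observations and p the number of predictors; a small sketch with hypothetical numbers:

```python
def adjusted_r2(r2, n, p):
    """Adjusted R^2 for a model with p predictors fit to n observations.
    Penalizes R^2 for the number of predictors."""
    return 1 - (1 - r2) * (n - 1) / (n - p - 1)

# Adding predictors raises plain R^2 mechanically; adjusted R^2 only
# rises if the improvement outweighs the extra parameter.
print(round(adjusted_r2(0.80, 50, 3), 4))   # 0.787
```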

8
Q

F test

A

Tests whether the regression coefficients are jointly statistically different from 0 (i.e., whether at least one coefficient is non-zero)

9
Q

t test

A

Tests whether a single coefficient in the regression is statistically different from 0 or not

10
Q

How to select a candidate model

A

Likelihood ratio test, AIC, BIC. Also analyze residuals to gauge goodness of fit.

11
Q

Challenges of likelihood-based model selection criteria

A

1) Selection is relative. We don’t know if any of the candidate models actually provide an adequate fit, just which one fits best out of the candidates.
2) Likelihood is dominated by the fit in the center of the distribution. For risk management, we’re often more interested in the tails.

12
Q

Likelihood ratio test

A
  • Used to compare nested models (one contains all of the independent variables of the other plus one or more additional variables)
  • Null hypothesis is that the additional variables give no significant improvement in the explanatory power of the model (so the coefficients are 0)
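A sketch of the test, assuming hypothetical log-likelihoods for two nested fitted models; the statistic 2(ll_full - ll_reduced) is compared to a chi-square with degrees of freedom equal to the number of added variables:

```python
# Hypothetical maximized log-likelihoods from two nested models.
ll_reduced = -1042.7   # model without the extra variable
ll_full    = -1038.2   # model with one additional variable

lr_stat = 2 * (ll_full - ll_reduced)   # ~ chi-square with df = 1 under H0

CHI2_95_DF1 = 3.841  # 95th percentile of chi-square(1)
reject_h0 = lr_stat > CHI2_95_DF1
print(round(lr_stat, 4), reject_h0)   # 9.0 True -> the extra variable adds explanatory power
```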
13
Q

How to validate a time series model

A

Back-testing. Fit the model to data for one period, then test how well the model performs in a subsequent period
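A toy back-test of the kind described above (data hypothetical; the "model" is just a training-period mean forecast, standing in for a fitted time series model):

```python
import statistics

# Hypothetical annual loss ratios; fit on the first period, test on the next.
history = [0.62, 0.58, 0.65, 0.61, 0.60, 0.64, 0.59, 0.63]
train, test = history[:6], history[6:]

# "Model": forecast every hold-out value with the training-period mean.
forecast = statistics.mean(train)

# Back-test: compare the forecast against the subsequent period.
errors = [abs(x - forecast) for x in test]
print(round(forecast, 4), [round(e, 4) for e in errors])
```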

14
Q

How to validate a cross-sectional model

A

A similar approach to back-testing can be used. The data can be split into 2 groups rather than 2 time periods: a training set and a validation set. Make sure there are no time effects that might make the model appear more accurate than it is.

15
Q

Deterministic scenario

A

Individual scenarios producing individual paths

16
Q

Going concern scenario

A

An adverse scenario that is more likely to occur and/or less severe than a solvency scenario. Used to test the insurer’s ability to maintain operations and fulfill obligations while at least meeting regulatory minimum capital ratios

17
Q

Integrated scenario

A

An adverse scenario that is a combination of at least 2 risk factors. The 2+ risk factors can be correlated or not, extreme or not, but the integrated scenario needs to be plausible and consider ripple effects

18
Q

Solvency scenario

A

A plausible adverse scenario that is credible and has a non-trivial probability of occurring and will test the insurer’s ability to maintain a positive equity position.

19
Q

Standard scenario

A
  • A scenario prescribed by regulators.
  • Regulators have all firms test so that they can gauge possible impacts of systemic risks.
  • The opposite of a standard scenario is an own scenario.
20
Q

Stochastic scenario

A

A weighted average of a range of scenarios with random variation

21
Q

Purpose of stochastic scenarios

A

Economic assumptions are often derived from stochastic scenario generators

22
Q

Challenges of stochastic scenarios

A

1) Correlation of risks must be taken into account when stochastic scenarios are generated.
2) Since they’re a weighted average, they don’t focus on the effect of low frequency, high severity events, so deterministic scenarios are better for things like sensitivity testing and development of management strategies.

23
Q

Stress scenario

A

A scenario with significant or unexpected adverse consequences

24
Q

Historical scenario

A

A scenario based on experience during an observation period, possibly triggered by a certain historical event

25
Pros of historical scenarios
  • Easily understood since the situation actually occurred
  • Impacts: information concerning the short, medium, and long term effects of the scenario might be available
  • Causes: events leading up to the scenario/event might be available
26
Cons of historical scenarios
  • Historical scenarios are constrained to be no more severe than the most severe historical event
  • Historical scenarios may not be comprehensive if they're drawn from time periods during which the financial variables or asset classes didn't exist. In this case, the values of the variables must either be imputed based on available data, or postulated based on financial and economic reasoning.
27
What might make a historical scenario different from current circumstances?
  • Changes in population mix and movement
  • Medical advances
  • New technologies, e.g., computers and the internet
  • Globalized and more closely linked financial markets
  • New asset classes owned
  • Management behavior in response to common incentive schemes
  • Different valuations used, making historical data not directly comparable to current values
  • Effects of media on politics
  • Changes to regulatory and market frameworks
28
How to make a historical scenario
Historical scenarios are defined in terms of the relative changes that occurred in financial risk factors during the period of stress. Take interest rates as an example: don't use the rate at the time, or even the absolute change in the rate, because a 100 bps change represents a very different level of volatility in a high interest rate environment than in a low interest rate environment. Instead, use the relative change or log change.
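The interest rate example can be sketched as follows (all rates hypothetical):

```python
import math

# Hypothetical 10-year rates (%) at the start and end of a historical stress period.
rate_before, rate_after = 5.0, 3.5

# Express the stress as a relative (log) change, not an absolute bps move.
log_change = math.log(rate_after / rate_before)

# Apply the same relative shock to today's (lower) rate environment.
rate_today = 2.0
rate_stressed = rate_today * math.exp(log_change)
print(round(rate_stressed, 4))   # 1.4 -- same proportional move, smaller absolute move
```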
29
Reverse scenario
A scenario that is expected to give rise to a given amount of financial loss
30
Purpose of a reverse scenario
Can be a powerful tool to conduct a reasonableness check on the results of internal models, as well as to assess and set various risk limits
31
How to create a reverse scenario
Look at recent scenario analysis and stress tests. Shocks that result in insolvency are shocks to focus on for reverse scenarios. Then think of a scenario that could lead to those changes in risk factors.
32
Scenario
A scenario is a possible future environment, either at a point in time or over a period in time.
33
Properties of a scenario
  • Comprehensive
  • Extreme
  • Plausible
34
Types of scenarios
  1. Deterministic or stochastic
  2. Historical or synthetic
  3. Standard or own
  4. Stress
  5. Integrated
  6. Reverse
  7. Going concern
  8. Solvency
  9. Single factor or multi factor
  10. Single period or multi period
  11. Single event or multi event
  12. Company-specific or global
35
Components of a scenario
  1. Narrative. Include the set of major factors that contribute; maybe a tipping point.
  2. Initial event. May be global, regional, or company-specific.
  3. Time evolution. Make the sequence of events clear, maybe using a timeline.
36
Purpose of a scenario
  • Used in scenario analysis to project the effects of the scenario over a time period
  • Can be used to assess a particular firm or an entire industry or economy
  • Supports setting risk appetite, developing strategies and contingency plans, and risk management
37
What is required to develop a scenario?
  • An appreciation of long-term historical experience
  • An understanding of current trends
  • A sense of how events may unfold in the future
  • An open mind concerning future possibilities
38
How to create a scenario top-down approach
  1. Start by defining the narrative and initial event (from senior management or collected from the thoughts of managers)
  2. Determine the implications of the event on each risk factor that serves as an input to the quantitative model
  3. Revalue the portfolio and the evolution of the portfolio value over time
39
Pros and cons of a top-down scenario generation approach
  • Pro: senior management should not dismiss the scenario and results if they defined the scenario
  • Pro: if the experiences of managers were gathered, the scenario can capture whether there are any controls in place and the types of controls (preventative, detective, etc.)
  • Con: it's difficult and subjective to translate an event into specific values to be input into the firm's risk models
40
How to create a scenario bottom-up approach
The process of creating a reverse scenario:
  1. Define the loss threshold
  2. Identify movements in specific risk factors that would drive the insurer below the threshold
  3. Evaluate if the degree of change is reasonable. If so, identify what sorts of scenarios would make these changes a reality.
  4. Revalue the portfolio and the evolution of the portfolio value over time
41
Pros and cons of a bottom-up scenario generation approach
  • Pro: simple to apply
  • Con: can miss the underlying cause of 2 separate events that may stress a portfolio simultaneously
  • Con: tends to be less effective for longer time horizons
42
Scenario analysis
A flexible framework to assess the financial effect of events in adequate detail so that their causes can be identified and their effects on the firm can be understood
43
What's the difference between scenario analysis and stress testing?
  • Scenario analysis focuses on the effects of a given situation. Ex: to study the impacts of climate risk, we may expect a mild increase in morbidity risk, a large increase in market risk, etc.
  • Stress testing does not focus on a cause. It involves examining the impacts of a shock to specific risk drivers to assess vulnerabilities. Ex: what is the impact if lapse increases by X at the same time as new business decreases by Y?
44
Purpose of scenario analysis
Not a forecast. Not a prediction. The purpose is to stimulate the firm and management to be prepared in case a similar event occurs.
45
Benefits of scenario analysis
  • Can enhance stakeholders' understanding of the financial vulnerability and viability of the firm
  • Complements the use of economic capital models that apply probabilities to possible future scenarios to determine the appropriate capital needs of a firm
  • Can enhance the risk culture of a firm
  • Can alert decision makers
  • Can enable firms to base their business strategies and risk mitigation activities on a range of forecasts rather than a single best-estimate projected result or an average of stochastic results
46
How to review scenario analysis
Check the data quality, methodology, assumptions, and reasonability of results. Check that the magnitude and direction of impacts are consistent with the underlying assumptions. Compare the impacts of different scenarios. Compare the impacts to recent real events.
47
Synthetic scenario
Describes hypothetical conditions that haven't been observed and that can thus be more easily tailored to a specific situation of interest
48
Pros and cons of synthetic scenarios
  • Pro: with synthetic scenarios, we can explore events that are different from and more extreme than historical events
  • Con: they require more assumptions than historical scenarios do, so they can be more difficult to communicate and understand
49
Reverse stress test
The process used to back-solve the required stress that will produce a specific adverse business outcome
50
Benefits of reverse stress test
  • Uncover hidden risks, concentrations, interactions among risks, and inconsistencies in hedging strategies
  • Identify scenarios that would cause insolvency
51
Sensitivity
The effect of a set of alternative assumptions regarding a future environment
  • Can be the result of a single or several alternative risk factors
  • Can occur over a short or long period of time
52
Standard shock
A prescribed stress to apply to a risk factor.
53
Sensitivity test
The process of applying a shock to a single risk factor and calculating the impact of that change on the portfolio value
54
What's the difference between a sensitivity test and a stress test?
  • Stress testing is a form of sensitivity testing
  • Typically, sensitivity tests are focused on all risk, upside or downside, while stress tests are focused on downside risk
  • Stress tests may involve shocks to multiple risk factors
55
Why should sensitivity test results be analyzed with caution?
Shocking only a single risk factor may not reflect relationships with other factors.
56
Stress test
A test of the impacts of a shock to specific risk drivers to assess vulnerabilities
57
Properties of a stress test
  • Extreme but plausible
  • Usually specified in quantitative terms
  • Usually no probability of occurrence is assigned
  • May be the result of several risk factors over several time periods or just one risk factor in short duration
58
Benefits of stress tests
  • Establish and communicate a firm's risk appetite
  • Challenge the model and assumptions used in quantitative analysis
  • Identify risks and risk concentrations that historical data cannot
59
Challenges of stress tests
  • We have no objective measurement of the likelihood of the shocks
  • Since stress tests are not focused on the underlying events, risk implications can be missed
60
Purposes of stress tests
  1. Risk identification and control
  2. Complement risk management tools like VaR (ex: testing the validity of other tools)
  3. Supporting capital and liquidity management
61
Exploratory data analysis
An approach to data analysis that employs various techniques (primarily graphical) to detect outliers and other anomalies in a dataset
62
Block maxima model
A type of model used in extreme value theory. Models the distribution of the maximum value in a random sample of losses. (As the sample size increases, the max value can be considered an extreme value)
63
How does a block maxima model work
Suppose we have an iid sample of n values with common distribution function F(x). Let Mn denote the maximum of the sample. As the block size, n, increases, there are only 2 possibilities: the observation is the new maximum of the sample or it is not. We can normalize the block maximum to find a limiting distribution for Mn, the block maximum. The limiting distribution can give us an upper limit on the maximum.
64
Types of block maxima models
  1. Fréchet (has a lower bound)
  2. Gumbel (unbounded)
  3. Weibull (has an upper bound)
65
Points over threshold model
A type of model used in extreme value theory. Models the rare, very large losses, defined as those exceeding a threshold.
66
How to fit a points over threshold model
  1. Fit a severity distribution to the data
  2. Order the data sample
  3. Select a threshold such that the excess loss random variable approximately follows a generalized Pareto distribution
  4. Fit the data above the threshold to the GPD
  5. Calculate the desired statistics
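The threshold-selection step can be sketched as follows (loss sample hypothetical); the mean excess over a candidate threshold is the quantity commonly examined when choosing where the GPD approximation starts:

```python
# Hypothetical loss sample; examine the excesses over a candidate threshold.
losses = [1.2, 0.8, 5.4, 2.1, 9.7, 0.5, 3.3, 12.6, 1.9, 7.2]
threshold = 3.0

# Excess loss random variable: amounts by which losses exceed the threshold.
excesses = sorted(x - threshold for x in losses if x > threshold)
mean_excess = sum(excesses) / len(excesses)
print([round(e, 4) for e in excesses], round(mean_excess, 4))
```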
67
Benefits points over threshold models have over block maxima models
May be more useful in risk management because POT models focus on the same tail losses that TVaR measures
68
Deterministic model
Involves assigning a single assumption for each variable for projection. Prudence can be added only through margins in the assumptions used or through changing the assumption
69
Benefits of the deterministic model
  • Regulators find deterministic modelling useful to compare the effect of a range of consistent scenarios on a number of firms
  • Deterministic modelling is more appropriate when there is insufficient information to build a more complex stochastic model (as is common with new risks)
70
Factor model
The simplest form of model that can be used to measure risk. A prescribed factor is multiplied by a known base amount to estimate the amount of risk.
71
Pros and cons of factor models
  • Pro: very easy to implement and interpret
  • Con: no allowance for diversification or concentration
  • Con: can only be used if the item (risk, asset class, etc.) is defined; each item gets a factor
72
Frequency severity model
Models an aggregate loss by assuming it is a random sum of random, identically distributed individual losses
73
How to make a frequency severity model
  1. Fit a frequency distribution to the frequency data
  2. Fit a severity distribution to the severity data
  3. Calculate the desired statistics of the aggregate loss distribution
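The steps above can be sketched as a Monte Carlo simulation of the aggregate loss, here assuming a Poisson frequency and lognormal severity (all parameters hypothetical):

```python
import math, random

random.seed(42)

def poisson(lam):
    """Sample a Poisson count via Knuth's multiplication method."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

def aggregate_loss(freq_mean=2.0, sev_mu=0.0, sev_sigma=1.0):
    """One simulated aggregate loss: a Poisson count of lognormal severities."""
    n = poisson(freq_mean)
    return sum(random.lognormvariate(sev_mu, sev_sigma) for _ in range(n))

sims = [aggregate_loss() for _ in range(10_000)]
mean = sum(sims) / len(sims)
# Theoretical mean: E[N] * E[X] = 2 * exp(0.5) ~= 3.30
print(round(mean, 2))
```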
74
Benefits of frequency severity models
  1. The mean and variance of the aggregate loss can be easily calculated from the moments of the frequency and severity distributions (VaR and TVaR require numerical methods)
  2. Well suited for operational risk management because: a) many types of operational risks can be viewed as arising from independent loss events and b) operational risk has no upside, consistent with the models
75
Cons of frequency severity models
  1. Assumes independence of frequency and severity, but this isn't always true
  2. The main weight of the fitting process focuses on the center of the distribution, so the tails may not fit well
76
Autoregressive process
A discrete time series model where values are predicted using the previous value(s)
77
GARCH
Generalized autoregressive conditionally heteroscedastic model. A discrete time series model with time-varying volatility (the volatility in each period depends on the returns in prior time steps)
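A minimal GARCH(1,1) simulation sketch with hypothetical parameters, showing the variance reacting to past squared returns:

```python
import random

random.seed(1)

# Hypothetical GARCH(1,1) parameters (alpha + beta < 1 for stationarity).
omega, alpha, beta = 0.00001, 0.10, 0.85

var_t = omega / (1 - alpha - beta)   # start at the long-run variance
returns = []
for _ in range(1000):
    r = random.gauss(0, var_t ** 0.5)             # return given current volatility
    returns.append(r)
    var_t = omega + alpha * r * r + beta * var_t  # variance reacts to the squared return

print(round(max(abs(r) for r in returns), 4))
```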
78
Pros and cons of GARCH
  • Pro: stochastic volatility
  • Pro: volatility clustering is incorporated
  • Con: not scalable
  • Con: volatility will trend back to a lower value over some period (regime switching fits better with reality)
  • Con: high volatility periods are equally likely to be instigated by a random jump up as a random jump down
79
Geometric Brownian Motion
A continuous time series model. Commonly used for asset prices, particularly for short-term applications
80
Pros and cons of geometric brownian motion
  • Pro: tractability
  • Pro: GBM asset prices provide the underlying assumption of the Black-Scholes option pricing formulas
  • Con: continuous, so doesn't fit long time horizons well
  • Con: fails to capture extreme market disruptions that are often critical for risk management
  • Con: assumes volatility is constant
81
Independent lognormal model
A discrete time series model. If we observe the continuous time GBM process in discrete time-steps, we would see an independent lognormal (ILN) process. (The process is "independent" because Xt is independent of previous or subsequent values)
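A sketch of an ILN price path under hypothetical parameters, with each log-return drawn independently:

```python
import math, random

random.seed(7)

# Hypothetical annual parameters for the lognormal log-return.
mu, sigma = 0.05, 0.20
s = 100.0
path = [s]
for _ in range(10):
    z = random.gauss(0, 1)
    s *= math.exp(mu + sigma * z)   # independent lognormal step
    path.append(s)

# Scalability: a monthly version of the same process just uses
# mu/12 and sigma/sqrt(12) per step.
print(round(path[-1], 2))
```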
82
Pros and cons of independent lognormal model
All the same pros and cons as GBM, plus the benefit of scalability. (The time series takes the same form with appropriate adjustments to the parameters regardless of how frequently it is observed)
83
Regime switching model
A time series model (discrete or continuous) that assumes the process randomly switches between K different underlying processes, each with different parameters.
84
Pros and cons of regime switching model
  • Pro: 2 or 3 regimes have proven to be quite robust for fitting stock prices
  • Pro: 2 or 3 regimes are relatively tractable
  • Pro: allows for sudden jumps in volatility
  • Con: we don't actually observe the regime; we assign probabilities to each regime conditional on the value of the log-return
85
What are time series models used for
Modelling risks (especially financial) over time, especially stock prices
86
Strictly stationary
A strictly stationary process has characteristics that do not change over time
87
Trend stationary
A series that is stationary when a fixed time trend is removed. Assumed to oscillate around a value other than 0.
88
Difference stationary
Differences in observations are stationary, but the observations themselves are not. Assumed to oscillate around a steadily changing value.
89
White noise process
A stochastic process that oscillates around 0 with a fixed variance and no observation is correlated with any previous observation
90
Inter-temporal link processes
Processes that assume values seen in one period are dependent on values in prior periods. Examples: autoregressive processes, integrated processes, moving average processes
91
Integrated process
A process that is difference stationary
92
Moving average process
Assumes the error term is linked to prior periods (so it doesn't follow a white noise process)
93
How to choose between ILN, GARCH, and regime switching models
  1. ILN is for very short-term problems with high frequency time steps. Has a thinner tail.
  2. RSLN (regime-switching lognormal) needs more data for an adequate fit, is best for long time horizons, and has a fat tail (good for TVaR)
  3. GARCH is flexible for long and short term uses, and has a fat tail (good for TVaR)
94
What benefits do discrete time series models have over continuous?
  1. They tend to provide a better fit for longer time horizons
  2. They're very convenient for Monte Carlo simulations
  3. They're more straightforward to interpret
95
Time aggregation problem
The problem of transforming risk measures from one time horizon to another. Scalability.
96
Bayesian approach
An approach to modelling that incorporates parameter uncertainty by treating parameters as random variables. Useful to measure parameter risk.
97
How to apply the bayesian approach
  1. Assign a prior distribution to the parameters
  2. Determine the posterior distribution for the parameters using the likelihood function, the prior distribution, and the data
  3. Use the posterior distribution to estimate the parameters (including means, standard deviations, and covariances)
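The steps above can be sketched with the conjugate beta-binomial case, one of the few cases where the posterior has a closed analytical form (prior and data are hypothetical):

```python
# Conjugate example: Beta prior on a claim probability, binomial data.
prior_a, prior_b = 2.0, 8.0      # hypothetical Beta(2, 8) prior, mean 0.2
claims, exposures = 7, 20        # observed data

# Posterior is Beta(prior_a + claims, prior_b + exposures - claims).
post_a = prior_a + claims
post_b = prior_b + exposures - claims

post_mean = post_a / (post_a + post_b)
post_var = post_a * post_b / ((post_a + post_b) ** 2 * (post_a + post_b + 1))
print(round(post_mean, 4), round(post_var, 6))   # 0.3 0.006774
```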
98
Benefits of the bayesian approach
  1. The posterior distribution provides the fullest information about parameter uncertainty, including dependencies between parameters
  2. The Bayesian framework can be extended to allow for model uncertainty too
99
Challenges of the bayesian approach
  1. An inappropriately selected prior distribution will generate inappropriate results
  2. It's only possible to derive an analytical form for the posterior distribution in a few special cases
100
MCMC method
Markov chain Monte Carlo method. A method used to quantify parameter uncertainty
101
How does the MCMC method work
Generate random samples of parameter vectors from their joint posterior probability distribution. Use this sample to generate point estimates and standard errors for the individual parameters
102
Method of max likelihood
A method to fit data to a distribution or a model. Choose parameters that give the highest probability given the observations recorded.
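For an exponential distribution the MLE has a closed form, lambda-hat = 1/mean; a small sketch with hypothetical data, checking that the estimate does give a higher likelihood than nearby candidates:

```python
import math

# Hypothetical severity data; fit an exponential distribution by MLE.
data = [1.3, 0.7, 2.9, 0.4, 1.8, 0.9]

# For the exponential, maximizing sum(log(lam) - lam*x) gives lam = 1/mean(x).
lam_hat = len(data) / sum(data)

def log_likelihood(lam):
    return sum(math.log(lam) - lam * x for x in data)

# The MLE beats nearby candidate values:
print(round(lam_hat, 4),
      log_likelihood(lam_hat) > log_likelihood(lam_hat * 1.1),
      log_likelihood(lam_hat) > log_likelihood(lam_hat * 0.9))
```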
103
Benefits of the method of max likelihood
  1. Parameters estimated are always within the acceptable range (unlike method of moments)
  2. As n increases, the bias reduces
  3. Allows use of likelihood ratio test, AIC, and BIC to select among candidate models
104
Method of moments
A method to fit data to a distribution. Set as many moments of the distribution as there are parameters to statistics calculated from the data, and solve for the parameters.
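A sketch for a gamma distribution, whose mean and variance are alpha*theta and alpha*theta^2 (data hypothetical):

```python
import statistics

# Method of moments for a gamma distribution: match the sample mean and
# variance to alpha*theta and alpha*theta^2, then solve for the parameters.
data = [2.1, 3.8, 1.5, 4.2, 2.9, 3.3, 2.4, 3.6]

m = statistics.mean(data)
v = statistics.pvariance(data)   # population variance (divide by n)

theta_hat = v / m                # theta = var / mean
alpha_hat = m * m / v            # alpha = mean^2 / var
print(round(alpha_hat, 4), round(theta_hat, 4))
```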
105
Pros and cons of the method of moments
  • Pro: simple
  • Pro: can be used to fit some copulas
  • Con: can give parameter estimates that are outside the acceptable range
106
Monte Carlo simulation and how it works
A stochastic simulation.
  1. Create a model to generate a large number of separate, random scenarios
  2. The full range of outcomes is an approximation to the distribution of the loss being modelled, so the mean, std dev, etc. can be estimated as if the simulation were a data sample for the loss
107
What are the ways to value an object?
  1. Market approach
  2. Cost approach
  3. Income approach
108
Market approach to valuing an object
Value = value of an identical or similar object traded in the market (market value). Con: market prices aren't available for quality data
109
Cost approach to valuing an object
Value = cost incurred for making or buying an exact copy. Con: Not future-oriented and doesn't consider the benefits the object creates once obtained
110
Income approach to valuing an object
Value = total economic benefit created by the object in the future (NPV). Con: this is difficult to calculate when valuing quality data
111
Bayesian network
Probabilistic causal networks.
  1. If there's 1 parent and 1 child, the connection is linear
  2. If there's 1 child with multiple parents, the connection is converging
  3. If there's 1 parent with multiple children, the connection is diverging
112
What are bayesian networks used for
  1. Use observed/estimated risks from various processes to estimate cumulative probabilities of failure
  2. Useful for large numbers of nodes and layers; assess the impact of complex networks and test scenarios
113
Copula
A multivariate probability distribution function with uniform marginal distributions
114
Purpose of copulas
  • Risks can be correlated with each other. Linear correlations can be used to analyze risk relationships, but risk correlations can behave differently in extreme scenarios. Copulas can be used instead.
  • Often used for default risk modelling since more defaults occur when the market is bearish than when it is bullish. Thus it cannot be modelled by a simple multivariate normal distribution with a correlation matrix.
115
Challenges of copulas
  1. There's no unique way to determine what type of copula to use
  2. There are multiple methods to assess goodness of fit (likelihood functions, graphical comparisons of Monte Carlo simulations)
116
Benefits of copulas
  1. Can be used with any marginal distributions (even ones that aren't in the same family)
  2. The bottom-up nature is handy for risk management to aggregate risks across the firm
  3. The copula function is unchanged when the underlying variables are transformed (as long as the transformation is a strictly increasing function)
117
Fundamental copulas
Derived from fundamental relationships between variables:
  1. Independence
  2. Comonotonic
  3. Countermonotonic
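Sampling sketches for the three fundamental copulas (illustrative only; both components stay Uniform(0,1), only the dependence structure differs):

```python
import random

random.seed(3)

# Bivariate samples (U, V) from the three fundamental copulas.
u = [random.random() for _ in range(5)]

independence     = [(x, random.random()) for x in u]  # V independent of U
comonotonic      = [(x, x) for x in u]                # V = U: perfect positive dependence
countermonotonic = [(x, 1 - x) for x in u]            # V = 1 - U: perfect negative dependence

print(comonotonic[0], countermonotonic[0])
```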
118
Explicit copulas
The copula function has a simple, closed form.
  1. Clayton
  2. Gumbel
119
Implicit copulas
Constructed from multivariate distributions.
  1. Gaussian
  2. Student's t
120
Discriminant analysis
An approach that takes the quantitative characteristics of G groups and weights them so that the results differ as much as possible between groups but as little as possible within groups
121
K nearest neighbors
A non-parametric approach to determine the group to which observations belong
122
GLM
Generalized linear model. Used to link a linear regression model and a dependent variable that can take only a limited range of values.
123
Types of GLMs
Probit and logit
124
Probit
A GLM that uses the standard normal cumulative distribution function. Useful for independent variables with infinite range paired with a dependent variable ranging from 0 to 1.
125
Logit
A GLM that uses the logistic function to ensure that the dependent variable falls between 0 and 1. The logistic function is symmetrical and bell shaped, like the normal distribution, but the tails are heavier.
126
Internal model
Often used to project a firm's financial condition
127
Least squares regression and how it works
A method to fit data to a linear model. Choose coefficients for variables such that the sum of the squared error terms is minimized.
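For a single predictor the least squares solution has a closed form, slope = cov(x, y)/var(x); a sketch with hypothetical data:

```python
# Ordinary least squares for one predictor: the slope minimizing the sum
# of squared errors is cov(x, y) / var(x).
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]

mx = sum(x) / len(x)
my = sum(y) / len(y)

slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
         / sum((a - mx) ** 2 for a in x))
intercept = my - slope * mx
print(round(slope, 4), round(intercept, 4))   # 1.99 0.05
```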
128
How to decide on a least squares regression strategy
The ordinary least squares method assumes the error terms have constant variance and there is no serial correlation in the residuals. If these assumptions do not hold, use generalized least squares instead: a matrix can be incorporated into the fitting process that gives weights to or correlations between observations to provide a more accurate fit.
129
Principal component analysis
Fits a dataset to a number of uncorrelated parameters
130
Pros and cons of principal component analysis
  • Pro: useful for producing stochastic projections, especially if we want to reduce the number of variables projected
  • Con: there's no intuitive meaning to the uncorrelated parameters, so it's not useful for investigating the influence of variables
131
Survival model
Used to model human mortality (time until death), lapses, time until bankruptcy, and other time-dependent variables. Ex: Gompertz model
132
Pros and cons of survival models
  • Pro: the period of survival can be modelled without the need to divide the data into year-long chunks
  • Con: GLMs allow complex relationships between risk factors while survival models require the relationships to be parametric
133
Proxy model
A model of a model. Models are simplified representations of real world processes, and proxy models are simplified representations of a more complex model.
134
Pros and cons of proxy models
  • Pro: useful when the original model is too limited or cumbersome for certain applications
  • Con: increases model risk; make sure to frequently benchmark proxy model results against full model results
135
Stochastic model
A model that incorporates randomness into the results. Different from a deterministic model which will always produce the same output for a given set of inputs. Important for calculating economic capital.
136
What's important to remember about stochastic models?
The result of a stochastic model is usually itself a statistical estimate with its own mean and variance. The variance can be reduced by running more scenarios
137
How to validate a stochastic model
  1. Check that the distributions of inputs/assumptions are reasonable
  2. Check that the correlation between inputs/assumptions is appropriate
  3. Verify the results from a sample of scenarios
  4. Check that the random numbers are replicable, have a long period before repetition, are uniformly distributed, and exhibit no serial correlation
138
Bootstrapping
Also called resampling. Randomly select a slice of data to fit the model. Repeat many times.
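A bootstrap sketch estimating the standard error of a sample mean (data and replication count hypothetical):

```python
import random
import statistics

random.seed(11)

# Small hypothetical sample; bootstrap the standard error of its mean.
data = [3.1, 2.7, 4.5, 3.8, 2.9, 5.2, 3.3, 4.1]

boot_means = []
for _ in range(2000):
    resample = random.choices(data, k=len(data))   # resample with replacement
    boot_means.append(statistics.mean(resample))

se_boot = statistics.stdev(boot_means)   # bootstrap standard error of the mean
print(round(se_boot, 4))
```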
139
Pros of bootstrapping
  1. A small dataset can be used
  2. Underlying characteristics of the data and linkages between data series are captured without the need for parametrization
140
Cons of bootstrapping
  1. Serial correlation in the data is lost
  2. Assumes the future will be similar to the past
  3. Difficult if there is limited history
141
Forward-looking approaches to stochastic modelling
  1. Factor based: model the factors and their relationships to derive the results
  2. Data based: model the data directly