Part 3 Flashcards
(38 cards)
Scenario assessment
=A method used to evaluate the impact of potential risk events on an organisation.
It creates detailed scenarios to understand how different high-severity, low-frequency risk events would affect the business
Severity assessments - part of the scenario assessment
- severity assessments evaluate total financial and non-financial impacts
- They convert non-financial impacts (e.g. service interruption) into financial terms for a complete assessment.
- Assess impact after mitigation efforts (excluding insurance)
- Link impacts to the potential financial loss to the business (e.g. lost clients worth $50m)
- Use peers' loss data to benchmark severity, particularly useful for unfamiliar one-off scenarios
- smaller firms can leverage larger firms' loss data to adjust their risk framework as they grow.
Importance of scenario assessment
- Enhances preparedness for unexpected events
- provides insights into potential impacts/responses
- Supports strategic planning and decision making
- Helps calculate regulatory capital requirements (under the Advanced Measurement Approach)
Steps in conducting a scenario assessment
- Prep/governance - Establish a structured approach with clear governance
- scenario generation - generate a range of scenarios and select the most relevant ones
- assess the impact/likelihood of each scenario
- validate scenarios with relevant stakeholders/experts
- Use scenarios to inform risk mgmt and strategic planning
- aggregate the results and report to senior mgmt/regulators
Frequency assessment
=Evaluates the probability of each scenario occurring in the coming year, aligning with the one-year capital measurement horizon
Attributing probability to rare, low-frequency events is difficult and can lead to inaccurate estimates.
Assessments of internal risks are aligned with results from the RCSA exercise.
Scenario generation and selection techniques - how many scenarios, techniques, purpose.
- Brainstorming - generates a wide range of scenarios and encourages creativity/divergent thinking
- Clearly outline selection criteria to prioritise scenarios by their relevance and potential impact on the firm.
- Consolidate similar scenarios and exclude negligible ones.
- Focus on around 15 relevant scenarios for detailed assessment.
Scenario assessment techniques
- Structured expert analysis - use structured questions/benchmarks to reduce estimation bias based on past events
- Availability and recency bias - recent events seem more likely to occur than older ones; therefore data from further back is used for stable risks (pandemics, fraud) and more recent data for rapidly evolving risks (cybercrime)
- Anchoring, confirmation and group polarisation bias - 'herd mentality' causing bias. Mitigated by holding private votes before results are discussed.
- Group size/dynamics - smaller groups of SMEs are more effective than large groups, which can be inefficient/biased
- Bias awareness and training - estimation biases are deeply ingrained, and understanding them helps individuals avoid being biased.
Delphi method of scenario assessment/generation - 4 stages and calculations
Focuses on pooling expert judgements
1. Silent collection - individuals write down their assessment without influence from others
2. Disclose estimates - all responses are collated and shared with the group for comparison.
3. Optional reassessment - participants are allowed to modify answers after seeing the estimates of others. Significant changes can trigger further rounds, but forcing consensus at this stage is discouraged.
4. Final estimate calculation - calculated using the lowest, highest and average responses (weighted as in the formula below)
Final estimate = [lowest response + (n - 2) × average response + highest response] / number of participants (n)
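A minimal sketch of the final-estimate formula above (the response values are made up for illustration):

```python
def delphi_final_estimate(responses):
    """Final estimate = [lowest + (n - 2) * average + highest] / n."""
    n = len(responses)
    lowest, highest = min(responses), max(responses)
    average = sum(responses) / n
    return (lowest + (n - 2) * average + highest) / n

# e.g. five experts estimate a worst-case loss in $m
print(delphi_final_estimate([5, 8, 10, 12, 30]))  # average 13.0 -> final estimate 14.8
```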
Fault Tree Analysis (FTA)
= breaks down scenarios into the conditions that must all occur to cause a disaster.
This helps banks (classed as high-reliability organisations) layer independent controls to lower risks, e.g. layering 3 independent controls, each with a 10% failure rate, lowers the chance of them all failing to 0.1% (0.1 × 0.1 × 0.1).
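A quick sketch of the layering arithmetic: with independent controls the individual failure probabilities simply multiply (figures are illustrative):

```python
def prob_all_fail(failure_rates):
    """Probability that every control in an independent layer fails."""
    p = 1.0
    for rate in failure_rates:
        p *= rate
    return p

# three independent controls, each with a 10% failure rate
print(prob_all_fail([0.10, 0.10, 0.10]))  # 0.001, i.e. 0.1%
```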
Fault trees and Bayesian models - conditional probability - recommended for what assessment, benefit of breaking scenarios down for accuracy
- Conditional probability in controls - more realistic scenarios often have partially dependent control failures that require conditional probabilities (see the sketch after this list)
- Bayesian models - update likelihood assessments based on new information or expert opinions, using conditional probabilities to refine estimates
- FTA is recommended as a scenario assessment technique.
- Breaking down scenarios into likelihood and impact components enhances the accuracy and transparency of the assessment process.
- Use experts/stakeholders to review scenarios and offer feedback. Scenarios also need to be regularly updated.
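A minimal sketch, with assumed illustrative probabilities, of how partially dependent control failures (conditional probabilities) and a simple Bayesian update change the numbers:

```python
# Partially dependent control failures: later controls are more likely to fail
# once earlier ones have failed, so conditional probabilities replace the flat 10%.
p_a = 0.10            # P(control A fails)
p_b_given_a = 0.30    # P(B fails | A failed) - assumed, higher than 10% due to dependence
p_c_given_ab = 0.50   # P(C fails | A and B failed) - assumed

independent = 0.10 ** 3
dependent = p_a * p_b_given_a * p_c_given_ab
print(independent, dependent)  # 0.001 vs 0.015 - dependence erodes the layering benefit

# Simple Bayesian update of a scenario likelihood after new evidence (e.g. a near miss)
prior = 0.02                       # P(scenario) before the evidence
p_evidence_given_scenario = 0.60   # assumed likelihood of the evidence if the scenario is brewing
p_evidence = 0.10                  # assumed overall probability of the evidence
posterior = p_evidence_given_scenario * prior / p_evidence
print(posterior)                   # 0.12 - revised likelihood of the scenario
```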
Scenario documentation and validation - contents of a scenario summary doc, type of template, who reviews it, relies on what
- The entire scenario analysis process must be documented in detail, with each scenario summarised in a sheet with title, description, rationale, assessment range and relevant incidents.
- Use standardised templates to ensure consistency
- Use independent third parties to review the consistency of the process
- Validation relies on documents from scenario workshops etc.
- Similar scenarios can be combined to assess collective risk.
- The scenario list is then presented to the board for approval
Large firms will typically have around 50 scenarios, mid-sized firms around 15 and small firms 6-10
benefits of scenario assessment
- Enhanced preparedness - improves firm’s readiness for unexpected events
- Improves strategic decision making/planning
- supports regulatory compliance for risk assessments
- Improves risk mgmt and mitigation plans
- Facilitates continuous improvement in risk mgmt practice
Systematic estimation and mitigation of bias
- SMEs create a scenario assessment based on the likelihood of worst-case events occurring over a variety of time frames
This is, however, less popular with regulators due to its lack of structure and reliability
(+) quick and inexpensive
(-) relies on selecting experts, who may be biased
Mgmt lessons from scenario analysis
- puts focus on response and risk mitigation instead of exact probabilities of the risk happening
- scenarios should be grouped by their impact on the firm - this facilitates focused assessments and mitigation efforts
- If risks breach the firm's appetite, further mitigation/escalation should be required and in place
- Risks within the appetite are required to be continuously monitored
- firms must have responses in place for events even if they are unlikely
Regulatory capital
= minimum amount of capital that a fin. institution must hold as required by regulators
It ensures that institutions can absorb a reasonable loss and it protects depositors/clients and the financial system
It helps prevent runs on banks thus enhancing the strength of the fin. system.
3 Basel pillars
- min. regulatory capital to cover mkt, credit and operational risks
- supervisory review process - allows for adjustments to the capital required for pillar 1 based on an institution’s risk exposure.
- market discipline - requires public disclosure of risk and capital information so that market participants can assess the institution
Regulatory capital for operational risk - Basel pillar 1 - % firms must hold, based on what over what time.
- Standard approaches for op risk: regulatory capital is based on average annual gross income over the past 3 years. Under the Basic Indicator Approach (BIA), 15% of this (the alpha factor) must be held, typically by local banks. Under The Standardised Approach (TSA), regulatory capital reflects the bank's business-line profile (beta approach) and the factor varies between 12%, 15% and 18% (see the sketch below).
- Beta values - calculated in the late 1990s from a sample of 29 firms, NOT REPRESENTATIVE OF TODAY'S LANDSCAPE; the focus for banks/regulators has therefore shifted to Pillar 2.
- Sound management should be a key focus, as well as holding regulatory capital, to prevent operational failures.
- Principles for sound operational risk management introduced in 2003 and revised in 2011 and 2014
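A rough sketch of the Pillar 1 calculations above; the gross-income figures and business-line split are made-up assumptions:

```python
# Basic Indicator Approach: 15% (alpha) of average annual gross income over 3 years
gross_income = [100.0, 120.0, 110.0]  # $m, last 3 years (illustrative)
bia_capital = 0.15 * sum(gross_income) / len(gross_income)
print(bia_capital)  # 16.5

# The Standardised Approach: a beta factor of 12%, 15% or 18% per business line
business_lines = {
    "retail banking": (60.0, 0.12),       # (avg gross income $m, beta)
    "commercial banking": (30.0, 0.15),
    "trading and sales": (20.0, 0.18),
}
tsa_capital = sum(income * beta for income, beta in business_lines.values())
print(tsa_capital)  # 15.3
```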
Capital modelling approaches - standard and internal
- Standardised approach: uses predefined risk weightings for certain asset classes. It is very simple but does not effectively consider the institution's risk profile
- Internal ratings based approach (IRB): allows banks to estimate risk using their own risk models. more effectively models the firm’s risk but requires regulatory approval
Advanced modelling techniques
- Value at risk (VaR) - measures the potential loss in a portfolio over a defined period at a set confidence level (e.g. the maximum loss expected over a 10-day period at 95% confidence); see the sketch after this list
- Stress testing: tests extreme scenarios to assess the impact on capital
- Scenario analysis - evaluates the impact of various scenarios on a firm's capital
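A minimal historical-simulation sketch of the VaR idea, using made-up daily P&L figures:

```python
import statistics

# Historical-simulation VaR: the loss level not exceeded at the chosen confidence
daily_pnl = [-4.2, 1.1, -0.8, 2.3, -1.5, 0.4, -3.0, 1.9, -0.6, 0.2]  # $m, illustrative
losses = [-x for x in daily_pnl]  # positive numbers are losses

var_95 = statistics.quantiles(losses, n=20)[-1]  # 95th percentile of the loss distribution
print(f"1-day 95% VaR is roughly {var_95:.2f}m")
```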
- principles of sound op risk mgmt
- Op risk culture
- Op risk mgmt framework
- board of directors
- Op risk appetite and tolerance
- senior mgmt
- risk identification and assessment
- change mgmt
- monitoring and reporting
- control and mitigation
- business resilience and continuity
- role of disclosure
Advanced measurement approach (AMA) criteria - reporting history of…, map risks to what, implications on who, encourage active day to day what
- Incident reporting history of 5 years (now 10)
- mapping of risks and losses to regulatory categories
- operational risk mgmt function
- implication of senior mgmt in risk management
- written policies and procedures
- active day to day op risk mgmt
regulatory capital for op risk - internal and external loss data use
- internal loss data - info on previous losses/trends
- rules on how to map incidents and their data
- external data - data sourced from public/private databases to compare against internal data
- Mixing internal and external data is important: external data can be adjusted to suit the firm's size, and one data set can supplement the other when it does not have enough info.
4 types of models
- Stochastic - part of the loss distribution approach (LDA); purely quantitative and focused on past losses. Extrapolates future loss distributions up to the 99.9th percentile (see the sketch below). Quite common
- Scenario-based - qualitative model for when internal loss data is insufficient. Common in the EU and in insurance firms
- Hybrid - most common and aligns with AMA regulatory expectations. Combines past incident data and scenario-based losses into a loss distribution at the 99.9% confidence level
- Factor models - explain the behaviour of variables based on influencing factors (economy, controls etc.) and are common in equity pricing. Overtaken by stochastic models because they are hard to calibrate
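A toy Monte Carlo sketch of the loss distribution approach mentioned above: simulate each year's loss frequency and severities from assumed distributions, then read off the 99.9th percentile of annual losses (parameters are illustrative, not calibrated):

```python
import numpy as np

rng = np.random.default_rng(1)

# Frequency ~ Poisson, severity ~ lognormal (both assumed for illustration)
n_years = 50_000
frequency = rng.poisson(lam=10, size=n_years)  # number of loss events per simulated year
annual_losses = np.array([rng.lognormal(mean=10.0, sigma=2.0, size=n).sum()
                          for n in frequency])

var_999 = np.percentile(annual_losses, 99.9)  # regulatory-style 99.9th percentile
print(f"99.9% annual loss estimate: {var_999:,.0f}")
```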