Define the basic indicator approach, the standardized approach, and the alternative standardized approach for calculating the operational risk capital charge
Basel II proposed three approaches for determining the operational risk capital requirement (i.e., the amount of capital needed to protect against the possibility of operational risk losses). The basic indicator approach (BIA) and the standardized approach (TSA) determine capital requirements as a multiple of gross income at either the business line or institutional level. The advanced measurement approach (AMA) offers institutions the possibility to lower capital requirements in exchange for investing in risk assessment and management technologies.
Basic Indicator Approach
The BIA is simple to adopt: the capital charge equals a fixed percentage (alpha, set at 15%) of the firm's average annual gross income over the previous three years, with any year of negative or zero gross income excluded from both the numerator and the denominator of the average. Because it uses only gross income as a driver, however, it is an unreliable indication of the true capital needs of a firm. For example, if two firms had the same annual gross income over the last three years, but widely different risk controls, their capital requirements would be the same. Note also that operational risk capital requirements can be greatly affected by a single year's extraordinary income when risk at the firm has not materially changed.
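The BIA calculation can be sketched as follows (the 15% alpha and the exclusion of negative or zero years follow Basel II; the function name and input layout are this sketch's own):

```python
def bia_capital(gross_income_3yr, alpha=0.15):
    """Basic indicator approach: alpha times the average annual gross
    income over the previous three years; years with negative or zero
    gross income are excluded from both numerator and denominator."""
    positive = [gi for gi in gross_income_3yr if gi > 0]
    if not positive:
        return 0.0
    return alpha * sum(positive) / len(positive)

# Two firms with identical gross income histories receive identical
# charges, no matter how different their risk controls are.
print(bia_capital([100.0, 120.0, 110.0]))  # 16.5
```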

The Standardized Approach
For the standardized approach (TSA), the bank divides its activities into eight business lines, each with its own beta factor. The beta factor of each business line is multiplied by that line's annual gross income, averaged over a three-year period, and the results are summed to arrive at the total operational risk capital charge under the standardized approach. The beta factors used in this approach are as follows:

Corporate finance: 18%
Trading and sales: 18%
Retail banking: 12%
Commercial banking: 15%
Payment and settlement: 18%
Agency services: 15%
Asset management: 12%
Retail brokerage: 12%

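The TSA calculation can be sketched as follows (the business line betas and the zero floor on each year's total follow Basel II; the input layout is an assumption of this sketch):

```python
BETAS = {
    "corporate finance": 0.18, "trading and sales": 0.18,
    "retail banking": 0.12, "commercial banking": 0.15,
    "payment and settlement": 0.18, "agency services": 0.15,
    "asset management": 0.12, "retail brokerage": 0.12,
}

def tsa_capital(gi_by_year):
    """gi_by_year: one dict per year mapping business line -> gross income.
    Within a year, a negative charge in one line may offset positive
    charges in others, but the year's total is floored at zero; the
    capital charge is the three-year average of the yearly totals."""
    yearly = [max(sum(BETAS[line] * gi for line, gi in year.items()), 0.0)
              for year in gi_by_year]
    return sum(yearly) / len(yearly)
```

For example, three identical years of 100 in retail banking and 50 in corporate finance give a charge of 0.12 x 100 + 0.18 x 50 = 21.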
Alternative Standardized Approach
With supervisory approval, a bank may use the alternative standardized approach (ASA), which is identical to the TSA except for the retail banking and commercial banking business lines. For these two lines, gross income is replaced by 0.035 times total outstanding loans and advances (averaged over three years) before applying the business line beta. The ASA is intended for banks whose gross income in these lines would otherwise overstate operational risk, for example because of unusually high interest margins.

Unanticipated Results from Negative Gross Income
The BIA and TSA capital charge methodologies can produce inappropriate results when accounting for negative gross income.
The Basel Committee has recognized that capital under Pillar 1 (minimum capital requirements) may be distorted and, therefore, recommends that additional capital should be added under Pillar 2 (supervisory review) if negative gross income leads to unanticipated results.
Describe the modeling requirements for a bank to use the Advanced Measurement Approach (AMA)
The advanced measurement approach (AMA) allows banks to construct their own models for calculating operational risk capital. Although the Basel Committee allows significant flexibility in the use of the AMA, there are three main requirements. A bank must:
- demonstrate that its measure captures potentially severe tail losses (calibrated to a 99.9% confidence level over a one-year horizon),
- incorporate internal loss data, relevant external loss data, scenario analysis, and business environment and internal control factors, and
- allocate capital in a way that creates incentives for business lines to improve their operational risk management.
Under the AMA, capital requirements should be made for all seven risk categories specified by Basel II. Some firms calculate operational risk capital at the firm level and then allocate down to the business lines, while others calculate capital at the business line level. Capital calculations are typically performed by constructing a business line/event type matrix, where capital is allocated based on loss data for each matrix cell.
Additional quantitative requirements under the AMA include the use of four data elements: internal loss data, external loss data, scenario analysis, and business environment and internal control factors (BEICFs).
While the four data elements must be considered in the capital calculations, many banks use some of these elements only to allocate capital or perform stress tests, and then adjust their models, rather than using them as direct inputs into capital calculations. Regulators have accepted many different types of AMA models, such as the loss distribution approach, given the rapid development of modeling operational risk capital.
Describe the loss distribution approach to modeling operational risk capital
The loss distribution approach (LDA) relies on internal losses as the basis of its design. A simple LDA model uses internal losses as direct inputs with the remaining three data elements being used for stressing or allocation purposes. However, according to Basel II, a bank must have at least five years of internal loss data regardless of its model design but can use three years of data when it first moves to the AMA.
Explain how frequency and severity distributions of operational losses are obtained, including commonly used distributions and suitability guidelines for probability distributions
Modeling Frequency
Loss frequency (the number of operational loss events per year) is most commonly modeled with the Poisson distribution, whose single parameter, lambda, equals both the mean and the variance of the annual loss count.
Modeling Severity
Loss severity (the size of each loss) is most commonly modeled with the lognormal distribution, although heavier-tailed distributions may be needed to fit the largest losses. A suitable severity distribution should fit the observed loss data well in both the body and the tail of the distribution.
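As a minimal sketch of how these parameters might be estimated from loss data (the Poisson/lognormal pairing is the common baseline choice; the function names are illustrative):

```python
import math
import statistics

def fit_poisson_lambda(annual_counts):
    # The Poisson MLE for lambda is simply the mean annual loss count
    return statistics.mean(annual_counts)

def fit_lognormal(loss_amounts):
    # Fit lognormal severity by matching moments of the log-losses:
    # mu and sigma are the mean and standard deviation of log(loss)
    logs = [math.log(x) for x in loss_amounts]
    return statistics.mean(logs), statistics.stdev(logs)
```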
Explain how Monte Carlo simulation can be used to generate additional data points to estimate the 99.9th percentile of an operational loss distribution
Once the frequency and severity distributions have been established, the next step is to combine them to generate data points that better estimate the capital required. This is done to ensure that likely losses for the next year will be covered at the 99.9% confidence level. Monte Carlo simulation can be used to combine frequency and severity distributions (a process known as convolution) in order to produce additional data points with the same characteristics as the observed data points.
With this process, we make a random draw from the frequency distribution to determine the number of loss events in a simulated year, and then draw that many loss amounts from the severity distribution. The sum of these amounts is one simulated annual loss, and repeating the process many times builds up the loss distribution.
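The convolution step can be sketched as follows (Poisson frequency and lognormal severity are assumed, and the parameter values are illustrative):

```python
import math
import random

def poisson_draw(lam, rng):
    # Knuth's method: count uniform draws until their product falls below exp(-lam)
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def simulate_annual_losses(lam, mu, sigma, n_years=100_000, seed=42):
    """Convolve the frequency and severity distributions: each simulated
    year draws a loss count, then that many severities, and sums them."""
    rng = random.Random(seed)
    return [sum(rng.lognormvariate(mu, sigma)
                for _ in range(poisson_draw(lam, rng)))
            for _ in range(n_years)]

annual = simulate_annual_losses(lam=3.0, mu=10.0, sigma=2.0)
op_var = sorted(annual)[int(0.999 * len(annual))]  # 99.9th percentile estimate
```

With 100,000 simulated years, roughly 100 observations lie beyond the 99.9th percentile, which is why the simulation must generate far more data points than the historical record contains.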
Explain the use of scenario analysis and the hybrid approach in modeling operational risk capital
Insurance and Its Influence on Operational Risk
A bank using the AMA for calculating operational risk capital requirements can use insurance to reduce its capital charge. However, the recognition of insurance mitigation is limited to 20% of the total operational risk capital required.
Insurance typically lowers the severity but not the frequency.
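The 20% cap can be expressed as a one-line sketch (the function name and amounts are illustrative):

```python
def capital_after_insurance(op_risk_capital, recognized_insurance):
    # Insurance mitigation under the AMA is capped at 20% of the
    # total operational risk capital requirement
    return op_risk_capital - min(recognized_insurance, 0.20 * op_risk_capital)

print(capital_after_insurance(500.0, 150.0))  # cap binds at 100, leaving 400.0
```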
Standardized measurement approach (SMA)
The standardized measurement approach (SMA) represents the combination of a financial statement operational risk exposure proxy (termed the business indicator, or BI) and operational loss data specific for an individual bank. Because using only a financial statement proxy such as the BI would not fully account for the often significant differences in risk profiles between medium to large banks, the historical loss component was added to the SMA to account for future operational risk loss exposure.
The Business Indicator
The business indicator (BI) incorporates most of the same income statement components that are found in the calculation of gross income (GI). A few differences include:
The SMA calculation has evolved over time, as there were several issues with the first calculation that were since remedied with the latest version. These items include:
Business Indicator Calculation
The BI is calculated as the most recent three-year average for each of the following three components:
BI = ILDCavg + SCavg + FCavg
where:
ILDCavg = interest, lease, and dividend component; SCavg = services component; FCavg = financial component (each a three-year average)
The three individual components are calculated as follows, using three years of average data:
interest, lease, dividend component (ILDC) =
min[abs(IIavg - IEavg), 0.035 x IEAavg] + abs(LIavg - LEavg) + DIavg
where:
IIavg = interest income; IEavg = interest expense; IEAavg = interest-earning assets; LIavg = lease income; LEavg = lease expense; DIavg = dividend income (each averaged over three years)
services component (SC) =
max(OOIavg, OOEavg) + max{abs(FIavg - FEavg), min[max(FIavg, FEavg), 0.5 x uBI + 0.1 x (max(FIavg, FEavg) - 0.5 x uBI)]}
where:
OOIavg = other operating income; OOEavg = other operating expense; FIavg = fee income; FEavg = fee expense (each averaged over three years); uBI = unadjusted business indicator
financial component (FC) = abs(net P&L TBavg) + abs(net P&L BBavg)
where:
net P&L TBavg = net profit and loss of the trading book; net P&L BBavg = net profit and loss of the banking book (each averaged over three years)
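The three component formulas can be transcribed directly (the inputs are the three-year averages; the function and argument names are this sketch's own):

```python
def ildc(ii, ie, iea, li, le, di):
    # interest, lease, and dividend component
    return min(abs(ii - ie), 0.035 * iea) + abs(li - le) + di

def services_component(ooi, ooe, fi, fe, ubi):
    fee = max(fi, fe)
    return max(ooi, ooe) + max(abs(fi - fe),
                               min(fee, 0.5 * ubi + 0.1 * (fee - 0.5 * ubi)))

def financial_component(net_pl_tb, net_pl_bb):
    # absolute net P&L of the trading book plus that of the banking book
    return abs(net_pl_tb) + abs(net_pl_bb)

def business_indicator(ildc_avg, sc_avg, fc_avg):
    return ildc_avg + sc_avg + fc_avg
```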
For the purposes of calculating the SMA, banks (based on their size for the BI component) are divided into five buckets as shown in Figure 1.
The BI component calculation should exclude all of the following P&L items:

Internal Loss Multiplier Calculation
Ideally, a bank will have 10 years of quality data to calculate the averages that go into the loss component calculation. If 10 years are not available, then during the transition to the SMA calculation, banks may use 5 years and add more years as time progresses until they reach the 10-year requirement. If a bank does not have 5 years of data, then the BI component becomes the only component of the SMA calculation.

SMA Capital Requirement Calculation
The SMA is used to determine the operational risk capital requirement and is calculated as follows:
For BI bucket 1 banks:
SMA capital = BI component
For BI bucket 2-5 banks:
SMA capital = 110M + (BI component - 110M) x internal loss multiplier
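In code (amounts are taken to be in EUR millions, an assumption of this sketch; the internal loss multiplier is treated as a given input):

```python
def sma_capital(bi_component, internal_loss_multiplier, bucket):
    """Bucket 1 banks hold the BI component itself; bucket 2-5 banks
    scale the excess over EUR 110M by the internal loss multiplier."""
    if bucket == 1:
        return bi_component
    return 110.0 + (bi_component - 110.0) * internal_loss_multiplier
```

For example, a bucket 2 bank with a BI component of 300 and a multiplier of 1.5 holds 110 + 190 x 1.5 = 395.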
For banks that are part of a consolidated entity, the SMA calculations will incorporate fully consolidated BI amounts (netting all intragroup income and expenses). At a subconsolidated level, the SMA uses BI amounts for the banks that are consolidated at that particular level. At the subsidiary level, the SMA calculations will use the BI amounts from the specific subsidiary. If the BI amounts for a subsidiary or subconsolidated level reach the bucket 2 level, the banks must incorporate their own loss experiences (not those of other members of the group). If a subsidiary of a bank in buckets 2-5 does not meet the qualitative standards associated with using the loss component, the SMA capital requirement is calculated using 100% of the BI component.
It is possible that the Committee will consider an alternative to the calculation of the internal loss multiplier shown earlier, which would replace the logarithmic function with a maximum multiple for the loss component. The formula for the internal loss multiplier would then be updated as:

Compare the SMA to earlier methods of calculating operational risk capital, including the Advanced Measurement Approach (AMA), and explain the rationale for the proposal to replace them.
The advanced measurement approach, which was introduced as part of the Basel II framework in 2006, allowed for the estimation of regulatory capital based on a range of internal modeling practices. This approach was a principles-based framework allowing for significant flexibility. Although the Basel Committee hoped that best practices would emerge over time so that flexibility could be narrowed, this never happened; challenges remained around comparability among banks (due to a wide range of modeling practices) and overly complex calculations.
Given these challenges, the Basel Committee set a goal of creating a new measure to allow for greater comparability and less complexity relative to prior methods. The SMA was created as this measure, with the intent of providing a means of assessing operational risk that would include both a standardized measure of operational risk and bank-specific loss data. Unlike AMA, the SMA is a single, non-model-based method used to estimate operational risk capital that combines financial statement information with the internal loss experience of a specific bank. The SMA is to be applied to internationally active banks on a consolidated basis, whereas it is optional for non-internationally active institutions. Although it is a relatively new measure, the SMA combines key elements of the standardized approach along with an internal loss experience component that was central to older approaches.
Describe general criteria recommended by the Basel Committee for the identification, collection, and treatment of operational loss data.
Banks that incorporate the loss component into the SMA calculation must meet the following general criteria:
Describe specific criteria recommended by the Basel Committee for the identification, collection, and treatment of operational loss data.
In addition to the general criteria noted previously, the following specific criteria must also be met:
Explain the importance and challenges of extreme values in risk management
Describe extreme value theory (EVT) and its use in risk management
Extreme value theory (EVT) is a branch of applied statistics that has been developed to address problems associated with extreme outcomes. EVT focuses on the unique aspects of extreme values and is different from "central tendency" statistics, in which the central limit theorem plays an important role. Extreme value theorems provide a template for estimating the parameters used to describe extreme movements.

Three general cases of the GEV distribution
Depending on the value of the tail (shape) parameter, the GEV distribution reduces to one of three cases: the Fréchet distribution (shape parameter greater than zero, heavy tails, the most relevant case for operational losses), the Gumbel distribution (shape parameter equal to zero, light tails), and the Weibull distribution (shape parameter less than zero, a bounded upper tail).

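The three cases, Fréchet (shape xi > 0), Gumbel (xi = 0), and Weibull (xi < 0), share a single CDF distinguished only by the shape parameter, which can be sketched as:

```python
import math

def gev_cdf(x, mu, sigma, xi):
    """Generalized extreme value CDF with location mu, scale sigma,
    and shape xi. xi > 0 gives the Frechet case, xi = 0 the Gumbel
    case, and xi < 0 the Weibull case."""
    z = (x - mu) / sigma
    if xi == 0.0:
        return math.exp(-math.exp(-z))   # Gumbel limit
    t = 1.0 + xi * z
    if t <= 0.0:
        return 0.0 if xi > 0 else 1.0    # outside the support
    return math.exp(-t ** (-1.0 / xi))
```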
Describe the peaks-over-threshold (POT) approach
The peaks-over-threshold (POT) approach is an application of extreme value theory to the distribution of excess losses over a high threshold. The POT approach generally requires fewer parameters than approaches based on extreme value theorems. The POT approach provides the natural way to model values that exceed a high threshold, and in this way it complements GEV theory, which models the maxima or minima of large samples.
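As an illustrative sketch, the excesses over a threshold can be fitted to a generalized Pareto distribution by the method of moments (the GP mean is beta/(1 - xi) and its variance is beta^2/((1 - xi)^2 (1 - 2 xi)), which invert to the estimators below; maximum likelihood is the more common choice in practice):

```python
import statistics

def fit_gpd_mom(losses, threshold):
    """Method-of-moments estimates (xi, beta) of the generalized Pareto
    distribution fitted to excesses over a high threshold (POT)."""
    excesses = [x - threshold for x in losses if x > threshold]
    m = statistics.mean(excesses)
    v = statistics.variance(excesses)
    ratio = m * m / v
    xi = 0.5 * (1.0 - ratio)        # shape (tail) parameter
    beta = 0.5 * m * (1.0 + ratio)  # scale parameter
    return xi, beta
```

A quick sanity check: exponential excesses correspond to a GP distribution with xi = 0, so fitting exponential data should return a shape estimate near zero.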

Evaluate the tradeoffs involved in setting the threshold level when applying the GP distribution
The choice of threshold involves a bias-variance tradeoff: the threshold must be set high enough that the GP distribution is a good approximation to the distribution of excess losses (too low a threshold introduces bias), but low enough that a sufficient number of exceedances remain with which to estimate the parameters reliably (too high a threshold leaves few observations and inflates the variance of the estimates).
