Robertson Flashcards

1
Q

Calculate Excess Losses at limit L

A

Expected XS Losses = E(L)*ELF(L)
ELF = XS Ratio = LER
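
As a minimal sketch with made-up numbers (the function name and inputs are illustrative, not from the source):

```python
def excess_losses(expected_losses: float, elf: float) -> float:
    """Expected excess losses above limit L = E(L) * ELF(L)."""
    return expected_losses * elf

# Hypothetical inputs: $1M expected losses, ELF at the limit = 0.125
print(excess_losses(1_000_000, 0.125))  # 125000.0
```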

2
Q

Briefly describe the 1993 NCCI procedure to map classes into HGs

A
  1. NCCI defined 7 variables indicative of XS loss potential
  2. Because of correlation between them, the 7 variables were grouped into 3 subsets based on their partial correlations
  3. Principal components analysis was performed, and a linear combination of the first 2 and the last one was used to best explain the variance
  4. Since there were 4 HGs prior to the study, NCCI decided to keep 4 and thus continued to map each class to 1 of 4 HGs.
3
Q

Briefly describe the 2 variables used by the WC Insurance Rating Bureau of California (WCIRB) in mapping classes to HGs in its 2001 analysis

A
  1. % claims in XS of $150K
  2. Diff between class loss distribution & avg loss distribution across all classes
4
Q

Define Hazard Group (HG)

A

Collection of WC classifications that have relatively similar ELFs over a broad range of limits

5
Q

Calculate XS Ratio (2007 analysis)

A

r = L/E(X) (entry ratio: limit divided by average claim cost)
S(r) = (E(y) - E(y;r)) / E(y), where y = X/E(X) is the normalized loss and E(y;r) is y limited at r
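
A quick empirical sketch of this ratio, assuming a small made-up sample of claim sizes:

```python
def excess_ratio(losses, limit):
    """Empirical excess ratio S(r) = (E(y) - E(y;r)) / E(y), where
    y = X/E(X) is the normalized loss and r = limit/E(X) is the entry ratio."""
    n = len(losses)
    mean = sum(losses) / n                   # E(X)
    r = limit / mean                         # entry ratio
    y = [x / mean for x in losses]           # normalized losses
    e_y = sum(y) / n                         # = 1 by construction
    e_y_r = sum(min(v, r) for v in y) / n    # E(y; r): y limited at r
    return (e_y - e_y_r) / e_y

# Made-up claims: losses above 250 are 0, 0, 50, 150, so S = 200/1000 = 0.2
print(excess_ratio([100, 200, 300, 400], 250))
```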

6
Q

Calculate credibility-weighted class XS ratio

A

Rc(final) = ZRc + (1-Z)Rhg

Rhg is the XS ratio of the HG that contained the class prior to the analysis

Z varies by class but not by limit
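
A one-line illustration with hypothetical values (the numbers are invented):

```python
def cred_weighted_xs_ratio(z, r_class, r_hg):
    """Rc(final) = Z*Rc + (1-Z)*Rhg, where Rhg is the XS ratio of the
    HG that contained the class prior to the analysis."""
    return z * r_class + (1 - z) * r_hg

# Hypothetical: Z = 0.6, class XS ratio 0.10, HG XS ratio 0.14 -> 0.116
print(cred_weighted_xs_ratio(0.6, 0.10, 0.14))
```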

7
Q

How was the credibility standard selected in 2007 analysis

A

Several alternatives were considered and none significantly impacted end results

Decided to stick with same credibility formula used in prior review:
Z = min(1.5n/(n+k), 1)

n is the number of claims in class
k is avg number of claims per class
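
The formula in code, with made-up claim counts:

```python
def credibility(n, k):
    """Z = min(1.5n/(n+k), 1); n = number of claims in the class,
    k = average number of claims per class."""
    return min(1.5 * n / (n + k), 1.0)

# A class with twice the average claim count reaches full credibility:
print(credibility(100, 50))  # 1.0
print(credibility(10, 50))   # 0.25
```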

8
Q

Discuss 5 other credibility options considered

A
  1. Using median instead of average for k
  2. Excluding med only claims from calculation of n and k (no impact on ELF)
  3. Including only serious claims in calculation of n and k
  4. Requiring min number of claims for classes used in calc of k
  5. Various square root formulas (ex: Z = (n/384)^0.5)
9
Q

Discuss why NCCI used 5 limits for cluster analysis

A

Prior to 2005, NCCI used and published ELFs for 17 different limits

Ultimately, they chose to use 5 of those limits for cluster analysis ($100K, $200K, $500K, $1M and $5M) for 2 reasons:
1. XS ratios at any pair of limits are highly correlated across classes (especially for limits closer together)
2. Limits below $100K were heavily represented among the 17 limits

Using fewer than 5 was also considered: while 1 limit would not have captured the full variability in XS ratios, the first 2 principal components of the 5 limits still explained 99% of the variation in the data.

Results using 5 limits were not very different from those using all 17 limits.

Ultimate driver of the decision: they wanted to cover the range of limits commonly used for retro rating

10
Q

Calculate Euclidean distance (L2)

A

L2 = square root of sum of (xi - yi)^2
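
In code (the vectors here would be per-class XS ratios; the example values are arbitrary):

```python
import math

def l2_distance(x, y):
    """Euclidean (L2) distance between two vectors."""
    return math.sqrt(sum((xi - yi) ** 2 for xi, yi in zip(x, y)))

print(l2_distance([0, 0], [3, 4]))  # 5.0
```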

11
Q

Explain why NCCI decided to use L2 over L1

A

L1 = sum of absolute values of (xi - yi) was also considered; this would have minimized the relative error in estimating XS premiums.

L2 has 2 advantages:
1. Penalizes large deviations: one big deviation is worse than many small ones (outliers have more impact)
2. Minimizes squared error

Ultimate driver: analysis was not sensitive to distance measure so NCCI used traditional L2.

12
Q

Discuss how XS ratios could have been standardized before clustering

A

XS ratios at lower limits are higher, so without standardization the lower limits would naturally end up with more weight in the clustering procedure

Standardization ensures each variable has similar impact on clusters.

NCCI explored 2 methods:
1. Zi = (xi - xbar)/s
2. Zi = (xi - min(xi))/(max(xi) - min(xi))
Zi is the standardized value of xi
s is the sample std dev
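
Both methods sketched in code (the sample values are made up; actual XS ratios would lie in [0, 1]):

```python
import statistics

def z_score(xs):
    """Method 1: Zi = (xi - xbar) / s, with s the sample std dev."""
    xbar, s = statistics.mean(xs), statistics.stdev(xs)
    return [(x - xbar) / s for x in xs]

def min_max(xs):
    """Method 2: Zi = (xi - min) / (max - min), maps values into [0, 1]."""
    lo, hi = min(xs), max(xs)
    return [(x - lo) / (hi - lo) for x in xs]

print(z_score([1, 2, 3]))   # [-1.0, 0.0, 1.0]
print(min_max([1, 2, 5]))   # [0.0, 0.25, 1.0]
```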

13
Q

Discuss why NCCI did not standardize XS ratios

A
  1. Resulting HGs did not differ much from those obtained without standardization
  2. XS ratios at different limits have a similar unit of measure, which is $ of XS loss per $ of total loss
  3. All XS ratios are between 0 and 1, while standardization could have led to results outside this range
  4. There is a greater range of XS ratios at lower limits, and this is a good thing since it is based on actual data

Standardization is appropriate when spread of values in data is due to normal random variation, but quite inappropriate if spread is due to presence of subclasses.

14
Q

Describe the steps in using cluster analysis with k-means

A

Goal is to group classes with similar vectors of XS ratios as measured by L2 distance into HGs using premium weights

  1. Decide on k number of clusters (potential HGs)
  2. Start with arbitrary initial assignment of classes to k clusters
  3. Compute centroid of each cluster:
    sumprod(prem, R(L))/sum(prem)
  4. For each class, find closest centroid using L2 distance and assign class to that cluster

If any class is reassigned during step 4, go back to step 3 (iterative process)

Continue until no class is reassigned
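
The steps above can be sketched as follows (a simplified illustration with invented data; it assumes no cluster ever becomes empty, which a real implementation would need to handle):

```python
import math
import random

def l2(x, y):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))

def kmeans_hg(xs_ratios, premiums, k, seed=0):
    """Premium-weighted k-means on per-class vectors of XS ratios.
    Returns (assignments, centroids)."""
    dims = len(xs_ratios[0])
    # Step 2: arbitrary initial assignment of classes to k clusters
    assign = [i % k for i in range(len(xs_ratios))]
    random.Random(seed).shuffle(assign)
    while True:
        # Step 3: premium-weighted centroid of each cluster
        centroids = []
        for c in range(k):
            members = [i for i, a in enumerate(assign) if a == c]
            w = sum(premiums[i] for i in members)
            centroids.append([sum(premiums[i] * xs_ratios[i][d]
                                  for i in members) / w
                              for d in range(dims)])
        # Step 4: reassign each class to the nearest centroid (L2 distance)
        new_assign = [min(range(k), key=lambda c: l2(x, centroids[c]))
                      for x in xs_ratios]
        if new_assign == assign:   # stop when no class is reassigned
            return assign, centroids
        assign = new_assign

# Two made-up "low XS" classes and two "high XS" classes, equal premium:
groups, _ = kmeans_hg([[0.10, 0.05], [0.12, 0.06],
                       [0.50, 0.40], [0.52, 0.42]],
                      [1, 1, 1, 1], k=2)
```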

15
Q

What is the purpose of cluster analysis

A

Minimize variance within clusters and maximize variance between clusters

Means HGs will be homogeneous and well separated

16
Q

Describe the 2 statistics NCCI used to decide on number of HGs (k)

A
  1. Calinski and Harabasz (C-H) statistic
    [Trace(B)/(k-1)] / [Trace(W)/(n-k)], n = number of classes
    Higher numerator = greater separation between clusters
    Lower denominator = greater homogeneity within clusters
    Higher value indicates a better number of clusters
  2. Cubic clustering criterion (CCC) statistic
    Compares the amount of variance explained by a given set of clusters to that expected when clusters are formed at random based on a multi-dimensional uniform distribution
    High value indicates better performance
    Less reliable when data is highly correlated

7 groups was the best choice under C-H and 2nd best under CCC (whose 1st choice was 9); NCCI gave more weight to the C-H statistic

To make the final choice, they re-ran the analysis using only classes with at least 50% credibility, and then only those with 100% credibility. Both showed 7 groups as the optimal number.
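
The C-H statistic can be sketched like this (illustrative only; `points` are per-class XS-ratio vectors, `labels` their cluster assignments, and the sample data is invented):

```python
def ch_statistic(points, labels, k):
    """Calinski-Harabasz: [Trace(B)/(k-1)] / [Trace(W)/(n-k)].
    Trace(B) = between-cluster sum of squares (separation),
    Trace(W) = within-cluster sum of squares (homogeneity)."""
    n, dims = len(points), len(points[0])
    grand = [sum(p[d] for p in points) / n for d in range(dims)]
    trace_b = trace_w = 0.0
    for c in range(k):
        members = [p for p, lab in zip(points, labels) if lab == c]
        cent = [sum(p[d] for p in members) / len(members)
                for d in range(dims)]
        trace_b += len(members) * sum((cent[d] - grand[d]) ** 2
                                      for d in range(dims))
        trace_w += sum((p[d] - cent[d]) ** 2
                       for p in members for d in range(dims))
    return (trace_b / (k - 1)) / (trace_w / (n - k))

# Tight, well-separated clusters score high:
print(ch_statistic([[0.0], [0.2], [10.0], [10.2]], [0, 0, 1, 1], 2))
```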

17
Q

Identify 4 reasons NCCI did not use 9 groups

A
  1. Separate study (Milligan and Cooper) found C-H statistic outperformed CCC statistic
  2. CCC statistic deserves less weight when correlation is present
  3. Crossover in XS ratios between HGs using 9 groups (not appealing in practice)
  4. Selection of number of HGs ought to be driven by large classes where most of experience was concentrated.
18
Q

Describe how segmentation was improved with the analysis

A

The 7 new HGs had a more even spread of classes and premium across groups compared to the prior groups

New groups showed much better separation between groups (higher between var relative to tot var)

Way less overlap

60% of premium did not move after change because old groups were used as complement of cred

Distribution is closer to uniform across classes

19
Q

Describe 3 things considered by underwriters in reviewing HG definitions

A
  1. Similarity between class codes that were in different HGs
  2. Degree of exposure to automobile accidents in given class
  3. Extent heavy machinery is used in given class

NCCI made some changes in HG mapping based on feedback

When deciding whether reassignment was appropriate, authors considered consistency of feedback, credibility of each class and XS ratios of nearest HGs.

20
Q

State 3 key ideas Robertson gives for remapping of HGs

A
  1. Computing XS ratios by class
  2. Sorting classes based on XS ratios
  3. Cluster analysis
21
Q

Calculate Guaranteed Cost Premium

A

GCP = Std P * (1 - Discount) + Expense Fee
Std P = Mod P * Schedule Mod
Mod P = Manual P * Experience Mod
Manual P = (Payroll/100)*Manual Rate
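
The chain of formulas in code, with a hypothetical risk (all inputs are made up):

```python
def guaranteed_cost_premium(payroll, manual_rate, exp_mod, sched_mod,
                            discount, expense_fee):
    """GCP = Std P * (1 - discount) + expense fee."""
    manual_p = payroll / 100 * manual_rate   # Manual Premium
    mod_p = manual_p * exp_mod               # Modified Premium
    std_p = mod_p * sched_mod                # Standard Premium
    return std_p * (1 - discount) + expense_fee

# Hypothetical: $1M payroll, rate 2.00, exp mod 1.1, sched mod 0.9,
# 10% discount, $200 expense fee -> 20000 * 1.1 * 0.9 * 0.9 + 200
print(guaranteed_cost_premium(1_000_000, 2.00, 1.1, 0.9, 0.10, 200))
```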

22
Q

Summarize updated NCCI approach to develop new factors (4 steps)

A
  1. Sort classes into groups based on ELFs
  2. Cred weight indicated ELFs with those corresponding to current HG
  3. Use cluster analysis to group classes with similar ELFs (determine optimal number of groups)
  4. Compare results with prior assignments and revise them if needed