CS2 - Part 3 Flashcards

(90 cards)

1
Q

General formula for Cox proportional hazard (PH) model

A
2
Q

Ratio of hazards of lives with covariate vectors z1 and z2 (Cox PH model)

A
3
Q

Proportional hazards model: Likelihood estimator for beta vector

A
4
Q

Aims of graduation

A
  • Produce smooth set of rates that are suitable for a particular purpose
  • Remove random sampling errors
  • Use the information available from adjacent ages
5
Q

Desirable features of graduation

A
  • Smoothness
  • Adherence to data
  • Suitability to purpose to hand
6
Q

Degrees of freedom for chi-squared test

A
  • Start with the number of groups
  • If the groups form a set of mutually exclusive and exhaustive categories (probabilities add up to 1), subtract 1
  • Subtract a further 1 for each parameter that has been estimated
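
The recipe above can be written as a tiny helper (a sketch in Python; the function name and the example figures are illustrative, not from the syllabus):

```python
def chi2_degrees_of_freedom(n_groups, exhaustive=True, n_estimated=0):
    """Degrees of freedom for the chi-squared goodness-of-fit test:
    start from the number of groups, subtract 1 if the groups are
    mutually exclusive and exhaustive, then subtract 1 per
    estimated parameter."""
    df = n_groups
    if exhaustive:  # probabilities add up to 1
        df -= 1
    return df - n_estimated

# e.g. 10 age groups, exhaustive categories, 2 fitted parameters -> 7
```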
7
Q

Distributions of D_x and mu~x

A
8
Q

Mortality experience: Deviation

A
9
Q

Mortality experience: Standardised deviation

A
10
Q

Degrees of freedom when comparing an experience with a standard table

A

Degrees of freedom = number of age groups

11
Q

Chi-squared failures: Standardised deviations test

A

To detect a few large deviations that the chi-squared test did not detect

Check whether the standardised deviations of mortality follow the standard normal distribution, using a chi-squared test

12
Q

Chi-squared failures: Signs test

A

To detect imbalance between negative and positive deviations

Binomial distribution

N number of negative deviations:

Check that 2*P(N <= x) > 5%

P number of positive deviations:

Check that 2*P(P >= x) > 5%
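
The two-sided binomial check above can be computed exactly (a sketch in Python; the function name and the example counts are illustrative, not from the syllabus):

```python
from math import comb

def signs_test_p_value(n_pos: int, n_neg: int) -> float:
    """Two-sided p-value for the signs test: under the null
    hypothesis each deviation is positive with probability 1/2,
    so the number of positive deviations is Binomial(m, 0.5)."""
    m = n_pos + n_neg
    upper = sum(comb(m, k) for k in range(n_pos, m + 1)) / 2 ** m  # P(P >= n_pos)
    lower = sum(comb(m, k) for k in range(0, n_pos + 1)) / 2 ** m  # P(P <= n_pos)
    return min(1.0, 2 * min(upper, lower))

# 3 positive vs 12 negative deviations gives p ~ 0.035 < 5%,
# so the graduation fails the signs test
print(round(signs_test_p_value(3, 12), 3))
```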

13
Q

Chi-squared failures: Cumulative deviations

A
14
Q

Chi-squared failures: Grouping of signs test

A

Detects ‘clumping’ of deviations with the same sign.

Check the ‘Grouping of signs test’ table in the Tables.

If the number of groups of positive (or negative) runs is less than or equal to the tabulated critical value, we can reject the null hypothesis.

15
Q

Testing smoothness of graduation

A

Third difference (change in curvature) of the graduated quantities should

  • Be small in magnitude compared with the quantities themselves
  • Progress regularly
16
Q

Methods of graduation

A
  • Graduation by parametric formula
    • a1 + a2 exp(a3x + a4x^2+…)
    • well-suited to the production of standard tables from large amounts of data
  • Graduation by reference to standard table
    • (a+bx) mu_x^s
    • Can be used to fit relatively small data sets where a suitable standard table exists
  • Graduation using spline functions
    • Method is suitable for quite small experiences as well as very large experiences.
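
As an illustration of the standard-table approach, an unweighted least-squares fit of mu_x = (a + bx) mu_x^s can be sketched as below (in practice weighted or maximum-likelihood fitting would be used; all names and figures here are illustrative):

```python
def fit_standard_table_link(ages, crude_mu, standard_mu):
    """Fit mu_x = (a + b*x) * mu_x^s by ordinary least squares,
    i.e. regress the ratio crude_mu / standard_mu on age."""
    y = [c / s for c, s in zip(crude_mu, standard_mu)]
    n = len(ages)
    xbar = sum(ages) / n
    ybar = sum(y) / n
    b = (sum((x - xbar) * (v - ybar) for x, v in zip(ages, y))
         / sum((x - xbar) ** 2 for x in ages))
    a = ybar - b * xbar
    return a, b

# crude rates built as exactly (0.5 + 0.01x) times the standard
# rates are recovered with a ~ 0.5 and b ~ 0.01
ages = list(range(40, 60))
standard = [0.001 * 1.1 ** (x - 40) for x in ages]
crude = [(0.5 + 0.01 * x) * s for x, s in zip(ages, standard)]
a, b = fit_standard_table_link(ages, crude, standard)
```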
17
Q

Mortality projection - Method based on expectation

A
18
Q

Autocovariance function

A
19
Q

Simplify:

A
20
Q

Autocorrelation function

A
21
Q

Correlation formula

A
22
Q

Autoregressive process of order p

AR(p)

A
23
Q

Moving average process of order q

MA(q)

A
24
Q

Autoregressive moving average

ARMA(p,q)

A
25
Q

Condition for stationarity of AR(p) process

A
26
Q

Conditions for invertibility of MA processes

A

Invertibility: the white noise process e can be written explicitly in terms of the X process

27
Q

Moving average model MA(q), in backward shift notation

A
28
Q

ARMA(p,q) process defined in backward shift notation

A
29
Q

Definition of an ARIMA process

A
30
Q

Features of MA(q) process

A
31
Q

Features of AR(p) process

A
32
Q

Features of ARMA(p,q) process

A
33
Q

Three possible causes of non-stationarity

A

1. Deterministic trend (e.g. exponential or linear growth)
2. Deterministic cycle (e.g. seasonal effect)
3. Time series is integrated

34
Q

Methods for compensating for trend/seasonality (6)

A
  • Least squares trend removal (Tables p.24)
  • Differencing
    • Differencing d times will not only make an I(d) series stationary but will also remove a polynomial trend of degree d
  • Seasonal differencing
    • E.g. differencing at lag 12 for annual seasonality in monthly data
  • Method of moving averages
    • Create a transformation such that the transformed time series is a moving average of the original time series
  • Method of seasonal means
  • Transformation of the data
    • E.g. take logs
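
Differencing is easy to sketch in code (a plain-Python illustration; the function name is mine, not from the syllabus):

```python
def difference(x, lag=1):
    """Lagged difference y_t = x_t - x_{t-lag}. With lag=1 this
    makes an I(1) series stationary and removes a linear trend;
    with lag=12 it removes annual seasonality in monthly data."""
    return [x[t] - x[t - lag] for t in range(lag, len(x))]

# one difference of a linear trend leaves a constant series
trend = [3 * t + 5 for t in range(10)]
```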
35
Q

Check if observed time series is stationary

A

Autocorrelation function should converge to 0 exponentially

36
Q

Identification of white noise

A

Option 1:
  • Check whether values of the SACF or SPACF fall outside the range +-2/sqrt(n) (an approximation to +-1.96/sqrt(n))
  • Note that there is a 1/20 chance that any one value falls outside the range by chance (95% interval)

Option 2:
  • Portmanteau test (Tables p.42)
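
Option 1 can be sketched as follows (plain-Python SACF; function names are illustrative):

```python
import math

def sacf(x, k):
    """Sample autocorrelation of the series x at lag k."""
    n = len(x)
    mean = sum(x) / n
    c0 = sum((v - mean) ** 2 for v in x) / n
    ck = sum((x[t] - mean) * (x[t + k] - mean) for t in range(n - k)) / n
    return ck / c0

def lags_outside_band(x, max_lag=10):
    """Lags 1..max_lag whose SACF falls outside +-1.96/sqrt(n);
    for white noise roughly 1 lag in 20 breaches by chance."""
    bound = 1.96 / math.sqrt(len(x))
    return [k for k in range(1, max_lag + 1) if abs(sacf(x, k)) > bound]

# a trending series is clearly not white noise: every lag breaches
trend = [float(t) for t in range(100)]
```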
37
Q

Identification of MA(q)

A
38
Q

Identification of AR(p)

A
39
Q

Identification of appropriate order of differencing (d) of sample data

A
  • A slowly decaying sample autocorrelation function indicates the time series needs to be differenced
  • Look for the smallest sample variance over d = 1, 2, 3, ...
40
Q

Diagnostic checking for fitted ARIMA model

A
41
Q

Condition for stationarity of vector autoregressive process

A
42
Q

Calculate eigenvalues of matrix A

A

Values lambda such that det(A - lambda*I) = 0
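
For a 2x2 matrix the characteristic equation is a quadratic, so the eigenvalues can be computed directly (a sketch; the function name and example matrix are illustrative). This is the calculation needed, for instance, to check that the eigenvalues of the coefficient matrix of a VAR(1) process lie strictly inside the unit circle for stationarity:

```python
import cmath

def eigenvalues_2x2(a, b, c, d):
    """Roots of det(A - lambda*I) = lambda^2 - tr(A)*lambda + det(A) = 0
    for A = [[a, b], [c, d]]."""
    tr, det = a + d, a * d - b * c
    disc = cmath.sqrt(tr * tr - 4 * det)
    return (tr + disc) / 2, (tr - disc) / 2

# A = [[0.5, 0.2], [0.1, 0.4]] has eigenvalues 0.6 and 0.3, both
# inside the unit circle, so X_t = A X_{t-1} + e_t is stationary
l1, l2 = eigenvalues_2x2(0.5, 0.2, 0.1, 0.4)
```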
43
Q

Two time series processes X and Y are called cointegrated if:

A
  • X and Y are I(1) random processes
  • there exists a non-zero vector (a, b) such that aX + bY is stationary

The vector (a, b) is called the cointegration vector.

44
Q

Moment generating function (formula)

A
45
Q

Cumulant generating function

A
46
Q

Coefficient of skewness

A
47
Q

Kurtosis

A
  • Fourth standardised moment
  • kurtosis = 3: mesokurtic (normal distribution)
  • kurtosis > 3: leptokurtic
    • more peaked, fatter tails
  • kurtosis < 3: platykurtic
    • broader peak, more slender tails
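
The fourth standardised moment can be computed from a sample as below (a sketch using population-style divisors by n; names and data are illustrative):

```python
def kurtosis(x):
    """Fourth standardised moment: m4 / m2^2, where mk is the
    k-th central moment. Equals 3 for the normal distribution."""
    n = len(x)
    mean = sum(x) / n
    m2 = sum((v - mean) ** 2 for v in x) / n
    m4 = sum((v - mean) ** 4 for v in x) / n
    return m4 / m2 ** 2

# a symmetric two-point sample is platykurtic: kurtosis = 1 < 3
flat = [-1.0, 1.0] * 50
```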
48
Q

Standardised moment

A
49
Q

Varying volatility over time

A

Heteroscedasticity

50
Q

Central limit theorem

A
51
Q

Generalised extreme value distribution

A
52
Q

GEV distributions: Different values of shape parameter gamma

A
53
Q

Rough criteria to choose family of GEV distributions

A
54
Q

Distribution of excess above u

A
55
Q

kth moment of a continuous positive-valued distribution with density function f(x)

A
56
Q

Measures of tail weight

A
57
Q

Coefficient of upper tail dependence

A
58
Q

Coefficient of lower tail dependence in terms of the copula function

A
59
Q

Coefficient of upper tail dependence in terms of the copula function

A
60
Q

Fundamental copulas

A
61
Q

Graphical representation of independence copula

A
62
Q

Graphical representation of comonotonic copula

A
63
Q

Graphical representation of counter-monotonic copula

A
64
Q

Gumbel copula

A
  • Upper tail dependence determined by parameter alpha
  • No lower tail dependence
65
Q

Clayton copula

A
  • Lower tail dependence determined by alpha
  • No upper tail dependence
66
Q

Frank copula

A
  • Interdependence structure in which there is no upper or lower tail dependence
67
Q

Gaussian copula

A
68
Q

Archimedean copula

A
69
Q

Student's t copula

A
70
Q

Tail dependence of all copulas

A
71
Q

PDF of the reinsurer's claim amount under XOL with retention M

A
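
The reinsurer's payment itself is simple to state (the density then follows by combining a point mass at 0 with the shifted gross-claim density). A sketch, with illustrative names and figures:

```python
def reinsurer_claim(x, retention):
    """Under individual excess-of-loss (XOL) reinsurance with
    retention M, the reinsurer pays Z = max(X - M, 0) on a gross
    claim X; the insurer retains min(X, M)."""
    return max(x - retention, 0.0)

# gross claims of 80 and 130 against a retention of 100:
# the reinsurer pays 0 and 30 respectively
```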
72
Q

Variance, mean and skewness of compound Poisson process with parameter lambda

A
73
Q

Coefficient of skewness of compound Poisson distribution

A
74
Q

Sum of independent compound Poisson random variables

A
75
Q

n choose k

A
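
In code, binomial coefficients are available directly from the standard library (Python shown as an illustration):

```python
from math import comb

# n choose k = n! / (k! * (n - k)!)
print(comb(5, 2))  # 10 ways to choose 2 items from 5
```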
76
Q

Machine Learning: Confusion matrices

A
77
Q

Machine Learning: Hyperparameters

A

Variables external to the model whose values are set in advance by the user. They are chosen based on the user’s knowledge and experience in order to produce a model that works well.

78
Q

Machine Learning: Parameters

A

Variables internal to the model whose values are estimated from the data and are used to calculate predictions using the model.

79
Q

Machine Learning: Regularisation or penalisation

A
80
Q

Branches of Machine Learning

A
81
Q

Machine Learning: Stages of analysis

A
82
Q

Machine Learning: Data Types

A
83
Q

Machine Learning: Train-Validate-Test approach

A

Split the data into
  • data for training (60%)
  • data for validation (20%)
  • data for testing (20%)
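
A minimal sketch of such a split (the 60/20/20 proportions follow the card; the function name and seed are illustrative):

```python
import random

def train_validate_test_split(data, seed=42):
    """Shuffle, then split 60% / 20% / 20% into training,
    validation and test sets."""
    rows = list(data)
    random.Random(seed).shuffle(rows)
    n = len(rows)
    i, j = int(0.6 * n), int(0.8 * n)
    return rows[:i], rows[i:j], rows[j:]

train, val, test = train_validate_test_split(range(100))
```

Fixing the seed keeps the split reproducible, which ties in with the reproducibility requirements on the next card.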
84
Q

Machine Learning: Requirements for analysis to be reproducible

A
  • Data used should be fully described and available to other researchers
  • Any modifications to the data should be clearly described
  • Selection of the algorithm and development of the model should be described (including parameters and why they were chosen)
  • Ideally, provide the computer code used
  • Specify the seed value
85
Q

Machine Learning: Penalised general linear models

A

Maximise the penalised likelihood

86
Q

Machine Learning: Naive Bayes Classification

A
87
Q

Machine Learning: Gini index of a final node in a decision tree

A
88
Q

Machine Learning: Gini index of a decision tree

A
89
Q

Machine Learning: K-means clustering advantages and disadvantages

A
90