Chapter 8 Flashcards

1
Q

3 broad categories of goals of time series analysis

A

time series analysis:

  1. understanding periodicity and trends
  2. forecasting
  3. control

> change the course of a temporal pattern

2
Q

3 components a time series can be decomposed into

A
  1. periodic variations

> daily, weekly, monthly seasonality

  2. trend

> how its mean evolves over time

  3. irregular variations

> what remains after removing the periodic variations and the trend

> residuals

3
Q

what does the concept of stationarity mean in the context of time series analysis?

A

stationarity:

we call a time series stationary if, once its trend and periodic variations are removed, the statistical properties of the residuals are constant over time

> both the expected mean and the variance (as well as the autocorrelation) of the series are constant over time

4
Q

time series analysis:

> what is autocorrelation?

A

autocorrelation:

autocorrelation with lag lambda measures the correlation between a time series and a copy of itself shifted by lambda steps
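
A minimal numpy sketch of how the lag-lambda autocorrelation could be computed (the toy series is an assumption for illustration):

```python
import numpy as np

def autocorrelation(x, lag):
    # correlation between the series and a copy of itself shifted by `lag` steps
    x = np.asarray(x, dtype=float) - np.mean(x)
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

x = np.sin(np.linspace(0, 20, 200))   # toy periodic series
print(autocorrelation(x, lag=1))      # close to 1: neighbouring values are very similar
```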

5
Q

why can we not use ARMA models to model real time series?

> solution?

A

real time series (e.g. the mood of Bruce) are often not stationary

> they do not meet the assumptions of ARMA

>>> solution: apply differencing to remove e.g. a drift in the mean

> this is what the ARIMA model does
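
A minimal sketch of first-order differencing, the extra "I" step that ARIMA adds on top of ARMA (the drifting toy series is an assumption; statsmodels is one library that implements ARIMA):

```python
import numpy as np

# toy non-stationary series: an upward drift in the mean plus noise
t = np.arange(200)
y = 0.05 * t + np.random.normal(scale=0.5, size=200)

# first-order differencing removes the linear drift in the mean
y_diff = np.diff(y)
print(y[:100].mean(), y[100:].mean())            # the mean drifts over time
print(y_diff[:100].mean(), y_diff[100:].mean())  # after differencing it stays roughly constant

# with statsmodels (if installed), differencing is handled by the d in order=(p, d, q):
# from statsmodels.tsa.arima.model import ARIMA
# model = ARIMA(y, order=(1, 1, 1)).fit()
```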

6
Q

what is often a disadvantage when removing the mean from the data?

A

it increases the variance

7
Q

what's the problem with backpropagation in recurrent neural networks?

> solution?

A

backpropagation does not take time into account

> when time is included in the setup, the predictions no longer depend only on the inputs but also on the values of the neurons at the previous timestep

>>> solution: unfold the network through time: create an instance of the network for each previous timestep we consider (backpropagation through time)
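
A minimal numpy sketch of unfolding a recurrent network through time: the same weights are applied once per timestep, and BPTT would then backpropagate the error through this chain of copies (all sizes and weights are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden = 3, 5
W_in = rng.normal(size=(n_hidden, n_in))       # input -> hidden weights, shared across timesteps
W_rec = rng.normal(size=(n_hidden, n_hidden))  # hidden -> hidden (recurrent) weights, shared across timesteps

def unfold(inputs):
    """Forward pass over a sequence: one 'instance' of the network per timestep."""
    h = np.zeros(n_hidden)
    states = []
    for x in inputs:
        h = np.tanh(W_in @ x + W_rec @ h)      # new state depends on the input and the previous state
        states.append(h)
    return states                              # BPTT runs the backward pass over this unrolled chain

states = unfold(rng.normal(size=(10, n_in)))   # toy sequence of 10 timesteps
print(len(states), states[-1].shape)           # 10 hidden states of size 5
```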

8
Q

what is the underlying principle of reservoir computing?

A

reservoir computing:

> a huge reservoir of fully connected hidden neurons with randomly assigned weights

> weights from input to reservoir are also randomly assigned

> only the weights from the reservoir to the output layer are learned
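
A minimal numpy sketch of this principle: input and reservoir weights are drawn once and kept fixed, and only the readout weights are fitted with a least-squares solve (the sizes, toy task, spectral-radius scaling, and washout choices are illustrative assumptions; the latter two anticipate the next cards):

```python
import numpy as np

rng = np.random.default_rng(1)
n_in, n_res = 1, 100
W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))    # random and fixed
W_res = rng.uniform(-0.5, 0.5, size=(n_res, n_res))  # random and fixed
W_res *= 0.9 / max(abs(np.linalg.eigvals(W_res)))    # spectral radius < 1 (echo state property)

def run_reservoir(u):
    r = np.zeros(n_res)
    states = []
    for x in u:
        r = np.tanh(W_in @ x + W_res @ r)
        states.append(r)
    return np.array(states)

# toy task: predict the next value of a sine wave
u = np.sin(np.linspace(0, 30, 300)).reshape(-1, 1)
states, targets = run_reservoir(u[:-1]), u[1:, 0]

washout = 50                                         # drop the initialization period ("washout time")
W_out, *_ = np.linalg.lstsq(states[washout:], targets[washout:], rcond=None)
print(states[-1] @ W_out)                            # only W_out was learned
```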

9
Q

why are echo state networks a special case of reservoir computing?

A

in ESN the connections within the reservoir can be cyclic

> this allows them to model temporal patterns

10
Q

ESN: what is the “washout time”?

A

washout time:

> an initialization period that is used neither for training nor for testing

> the reservoir/network needs to stabilize first

11
Q

explain the echo state property

A

echo state property:

> the effect of a previous state r_i and a previous input x_i on a future state r_i+k should vanish gradually as time passes (k -> infinity) and not persist or even get amplified

12
Q

explain dynamic systems models

> in which cases do we use them?

A

dynamic systems models:

> knowledge based models

> represent temporal relationships between attributes and targets by means of differential equations

> assume only numerical states

>>> use these if we have domain knowledge we want to incorporate
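
A minimal sketch of the idea: a hypothetical differential equation encoding domain knowledge (mood drifting toward the level of support), integrated with a simple Euler step; the equation and parameter are illustrative assumptions, not the book's model:

```python
# hypothetical knowledge-based model: d(mood)/dt = gamma * (support - mood)
def simulate(mood0, support, gamma, dt=1.0, steps=10):
    mood, trajectory = mood0, [mood0]
    for _ in range(steps):
        mood = mood + dt * gamma * (support - mood)   # Euler integration of the differential equation
        trajectory.append(mood)
    return trajectory

print(simulate(mood0=2.0, support=8.0, gamma=0.3))    # mood gradually drifts toward the support level
```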

13
Q

how does simulated annealing work?

> when to use it?

A

simulated annealing:

> used to tune the parameters of e.g. a dynamic systems model

> make random steps in the parameter space and see whether performance improves

> moves that do not result in better performance can, however, still be accepted

>>> this helps explore the whole parameter space

14
Q

simulated annealing:

> what does the “temperature” refer to?

A

simulated annealing: temperature

> moves in the parameter space that do not have a positive impact on the error are still accepted with a certain probability

> this probability decreases with running time

>>> “temperature” decreases with search time: the lower the temperature the less we explore the search space
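
A minimal sketch of simulated annealing with such a temperature schedule: worse moves are accepted with probability exp(-delta_error / temperature), and the temperature decays over the run (the toy error surface, step size, and schedule are assumptions):

```python
import math
import random

def simulated_annealing(error, theta, step=0.1, temp=1.0, cooling=0.99, iters=1000):
    current = error(theta)
    for _ in range(iters):
        candidate = [p + random.uniform(-step, step) for p in theta]   # random move in parameter space
        delta = error(candidate) - current
        # always accept improvements; accept worse moves with a temperature-dependent probability
        if delta < 0 or random.random() < math.exp(-delta / temp):
            theta, current = candidate, current + delta
        temp *= cooling   # lower temperature -> less exploration as the search proceeds
    return theta, current

best, err = simulated_annealing(lambda p: (p[0] - 1) ** 2 + (p[1] + 2) ** 2, theta=[0.0, 0.0])
print(best, err)   # should end up near (1, -2)
```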

15
Q

genetic algorithms:

> what is the basic starting point of a genetic algorithm?

A

genetic algorithm:

a population of candidate solutions, e.g. parameter vectors

> represented by means of binary strings (genotype)

16
Q

genetic algorithm:

how does parent selection work?

A

parent selection:

  1. assign a fitness value to each of the individuals (e.g. the error of that specific parameter vector)
  2. assign probabilities of being selected to the individuals based on their fitness
  3. sample parents according to these probabilities
17
Q

genetic algorithm: how does crossover work?

A

crossover:

  1. select pairs of individuals
  2. perform crossover with probability p_c

> randomly select one point in the bit string and create two children

> take one part from individual 1 and the rest from individual 2

  3. if we do not perform crossover, the two children are identical to mother and father
18
Q

genetic algorithm: how does mutation work?

A

mutation:

> for each individual resulting from crossover: flip each bit of the individual with probability p_m
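
A minimal sketch of the three steps from the previous cards (fitness-proportional parent selection, single-point crossover with probability p_c, bit-flip mutation with probability p_m); the fitness function and probabilities are illustrative assumptions:

```python
import random

def select_parent(population, fitness):
    # fitness-proportional ("roulette wheel") selection
    return random.choices(population, weights=[fitness(ind) for ind in population], k=1)[0]

def crossover(a, b, p_c=0.7):
    if random.random() > p_c:
        return a[:], b[:]                     # no crossover: children are copies of the parents
    point = random.randrange(1, len(a))       # single randomly chosen crossover point
    return a[:point] + b[point:], b[:point] + a[point:]

def mutate(individual, p_m=0.01):
    return [bit ^ 1 if random.random() < p_m else bit for bit in individual]

population = [[random.randint(0, 1) for _ in range(8)] for _ in range(10)]
fitness = lambda ind: 1 + sum(ind)            # toy fitness: prefer individuals with many 1-bits
mum, dad = (select_parent(population, fitness) for _ in range(2))
children = [mutate(c) for c in crossover(mum, dad)]
print(children)
```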

19
Q

multi-criteria optimization problem:

> definition of pareto dominance

A

a model instance a is dominated by another model instance b when the mean squared error obtained by model instance b is lower for at least one target and not higher for any of the other targets
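
A minimal sketch of this dominance check, where each model instance is represented by its per-target mean squared errors (the error vectors are illustrative assumptions):

```python
def dominates(errors_b, errors_a):
    """True if model b dominates model a: b's MSE is not higher on any target
    and strictly lower on at least one target."""
    return (all(eb <= ea for eb, ea in zip(errors_b, errors_a))
            and any(eb < ea for eb, ea in zip(errors_b, errors_a)))

print(dominates((0.4, 0.4), (0.5, 0.4)))   # True: lower on target 1, not higher on target 2
print(dominates((0.4, 0.9), (0.5, 0.4)))   # False: neither instance dominates the other
```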

20
Q

multi-criteria optimization problem:

> explain NSGA-II

A

NSGA-II:

  1. run all of your models (parent and children) and find the pareto front
  2. form new population by adding models, starting with the first pareto front, then the second, etc until new population is full
  3. generate offspring using crossover and mutation
  4. repeat
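
A minimal sketch of step 1, the non-dominated sorting into successive pareto fronts from which NSGA-II fills the new population (the dominance check and toy error vectors are illustrative assumptions):

```python
def dominates(b, a):
    # b dominates a: not worse on any target, strictly better on at least one
    return all(x <= y for x, y in zip(b, a)) and any(x < y for x, y in zip(b, a))

def pareto_fronts(solutions):
    """Repeatedly peel off the current pareto front (NSGA-II's non-dominated sorting)."""
    remaining, fronts = list(solutions), []
    while remaining:
        front = [s for s in remaining if not any(dominates(o, s) for o in remaining if o is not s)]
        fronts.append(front)
        remaining = [s for s in remaining if s not in front]
    return fronts

population = [(0.3, 0.8), (0.4, 0.4), (0.5, 0.5), (0.2, 0.9), (0.6, 0.6)]   # per-target MSEs
for i, front in enumerate(pareto_fronts(population), start=1):
    print(f"front {i}: {front}")   # the new population is filled front by front
```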