Time-Series Flashcards

1
Q

What is a time series?

A

a set of time-ordered observations of a process

2
Q

how is the time organised?

A

the intervals between observations remain constant (minutes / years / anything)

3
Q

what is univariate time-series?

A

Univariate time-series = many observations originating from one source

4
Q

what is multivariate time-series?

A

Multi-variate time-series = many observations originating from multiple different sources

5
Q

What is the goal of time-series?

A

predicting and explaining the properties of time-series

6
Q

what are the key properties of time-series data?

A

- Variation
- Autocorrelation
- Stationarity

7
Q

what are the types of variation?

A

(trends, seasonality, cycles, irregular variation)

8
Q

what are the types of Forecasting & Prediction?

A

- Predicting the evolution of a process (but also using the past)
- Directionality analysis: how time series influence / predict each other.

9
Q

what is a trend?

A

Any systematic change in the level of a series: its long-term direction or effect (increases or decreases).

10
Q

what do we have to do with trends?

A

- Model it explicitly

- Detrend (remove it)

11
Q

why model a trend ?

A

Various characteristics of time-series data can be of theoretical interest, in which case they should be modelled.

12
Q

why detrend?

A

If trends are of no theoretical interest, they should be removed so that the aspects that are of interest can be more easily analyzed.

13
Q

what is SEASONALITY?

A

A repeating pattern of increase / decrease in the series that occurs consistently throughout its duration.

14
Q

give an example of seasonality….

A

- E.g., restaurant attendance may exhibit a weekly seasonal pattern: the weekends routinely display the highest levels within each week (the time period), and the first several weekdays are consistently the lowest.
- Or: a naturally occurring time period gives rise to 'seasonal' factors (monthly or weekly event changes).

In seasonality the underlying pattern remains fixed, though the magnitude of its effect may vary.

15
Q

Once a systematic component has been identified in a time series, it is often…

A

modelled or removed

16
Q

If seasonality is of no interest, we would thus…

A

Remove it.

This is called seasonal adjustment.

17
Q

what is a cycle?

A

A cyclical component in a time series is conceptually similar to a seasonal component: It is a pattern of fluctuation (i.e., increase or decrease) that reoccurs across periods of time.

18
Q

how is a cycle unlike seasonal effects?

A

Unlike seasonal effects, whose duration is fixed across occurrences and which are associated with some aspect of the calendar (e.g., days, months), the patterns represented by cyclical effects are not of fixed duration (i.e., their length often varies from cycle to cycle) and are not attributable to any naturally occurring time period.

19
Q

WHAT IS IRREGULAR VARIATION?

A

Randomness: any variation remaining in a time series after the systematic changes (trend, seasonality, cycles) have been removed.

20
Q

what is irregular variation also referred to as?

A

white noise

It constitutes any remaining variation in a time series after these three systematic components have been partitioned out. In time series parlance, when this component is completely random (i.e., not autocorrelated), it is referred to as white noise, which plays an important role in both the theory and practice of time series modeling.

Equivalent to the error term in a statistical model. Residual time series left after fitting a model to the data.

21
Q

After a model has been fit to the data, the residuals form…

A

After a model has been fit to the data, the residuals form a time series of their own, called the residual error series. If the statistical model has been successful in accounting for all the patterns in the data (e.g., systematic components such as trend and seasonality), the residual error series should be nothing more than unrelated white noise error terms with a mean of zero and some constant variance

22
Q

what is STATIONARITY?

A

Stationarity is the most important assumption when making predictions based on past observations.

A time series is stationary if its mean and variance do not change over time.

A complication with time-series data is that its mean, variance, or autocorrelation structure can vary over time. A time series is said to be stationary when these properties remain constant. Thus, there are many ways in which a series can be non-stationary (e.g., an increasing variance over time), but it can only be stationary in one way (viz., when all of these features do not change).

Stationarity is a pivotal concept in time-series analysis because descriptive statistics of a series (e.g., its mean and variance) are only accurate population estimates if they remain constant throughout the series. With a stationary series, it will not matter when the variable is observed: "The properties of one section of the data are much like those of any other". As a result, a stationary series is easy to predict: its future values will be similar to those in the past.

23
Q

what is the alternative assumption to stationarity?

A

weak stationarity

24
Q

what is weak stationarity?

A

Requires the mean to be stable (constant) across time, and the autocovariance to depend only on the time difference between two time points.
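
The two conditions can be written compactly (standard notation; the symbols below are mine, not from the deck):

```latex
% Weak (covariance) stationarity of a series X_t:
% 1) constant mean:        the expectation does not depend on t
% 2) lag-only covariance:  the autocovariance depends only on the lag h
\mathbb{E}[X_t] = \mu \quad \text{for all } t,
\qquad
\operatorname{Cov}(X_t, X_{t+h}) = \gamma(h) \quad \text{for all } t .
```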

25
Q

What can we do if the time-series exhibit non-stationarity?

A

transform with differencing

26
Q

what is differencing (the basic idea)?

A

- Take every single point and subtract the previous point.
- That is, take a point of X and subtract the value of X at the previous time point: this is differencing the time series.
- The result may then be a stationary time series (applied iteratively; you can stop at some point).

27
Q

any further details about differencing ?

A

- One or two differences are usually enough. If the series gets closer to stationarity, the variance will be reduced; difference again and, if it improves again, the variance is reduced further. But once the variance starts to increase with further differencing, you have gone too far and the previous step was enough.
- A common mistake in time-series modelling is to "overdifference" the series, performing more orders of differencing than are required to achieve stationarity. This can complicate the process of building an adequate and parsimonious model.
- A statistical test is the augmented Dickey–Fuller test (ADF; Said and Dickey, 1984), which tests the null hypothesis that the series is non-stationary. Thus, rejection of the null provides evidence for a stationary series.
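
A minimal sketch of first differencing with NumPy (the random-walk example and variable names are mine, not from the deck):

```python
import numpy as np

rng = np.random.default_rng(42)

# A random walk is non-stationary: its variance grows with time.
steps = rng.normal(size=500)
walk = np.cumsum(steps)

# First difference: y(t) = x(t) - x(t-1); np.diff does exactly this.
diff = np.diff(walk)

# Differencing a random walk recovers its (stationary) steps;
# the very first step is lost because differencing shortens the series by one.
print(np.allclose(diff, steps[1:]))  # True
```

To check stationarity formally, one would run the augmented Dickey–Fuller test mentioned above (available, for example, as `statsmodels.tsa.stattools.adfuller`).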

28
Q

what is AUTO-CORRELATION (AC)?

A

It is a tool to find patterns in the data

- Specifically, the autocorrelation function tells you the correlation between points separated by various time lags.

In psychological research, the current state of a variable may partially depend on prior states. That is, many psychological variables exhibit autocorrelation: a variable is correlated with itself across different time points. Time-series designs capture the effect of previous states and incorporate this potentially significant source of variance within their corresponding statistical models.
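
A minimal sketch of a lag-k autocorrelation with NumPy (the function name `acf_at_lag` and the alternating example are mine):

```python
import numpy as np

def acf_at_lag(x, lag):
    """Correlation between the series and itself shifted by `lag` points."""
    x = np.asarray(x, dtype=float)
    return np.corrcoef(x[:-lag], x[lag:])[0, 1]

# A perfectly alternating series is maximally anti-correlated at lag 1
# (each value is the negative of the previous one) and maximally
# correlated at lag 2 (the pattern repeats every two points).
x = np.array([1.0, -1.0] * 50)
print(round(acf_at_lag(x, 1), 6))  # -1.0
print(round(acf_at_lag(x, 2), 6))  # 1.0
```

In practice, libraries such as statsmodels provide an `acf` function that computes all lags at once.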

29
Q

why perform autocorrelation ?

A

To determine how past and future data points are related in a time series.

30
Q

what is cross-correlation?

A

Cross-correlation: correlation in time with another, different time series (or with the past of that other time series). The time lag is important.
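
A sketch of finding the lag that maximises the cross-correlation between two series (the 3-step delay and all names are my own illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

x1 = rng.normal(size=300)
x2 = np.empty_like(x1)
x2[3:] = x1[:-3]               # x2 is x1 delayed by exactly 3 samples
x2[:3] = rng.normal(size=3)    # fill the start with unrelated noise

def xcorr_at_lag(a, b, lag):
    """Correlation between a(t - lag) and b(t)."""
    return np.corrcoef(a[:-lag], b[lag:])[0, 1]

# Scan candidate lags; the true delay gives the highest correlation.
best = max(range(1, 11), key=lambda k: xcorr_at_lag(x1, x2, k))
print(best)  # 3
```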

31
Q

what are the Three types of autocorrelations?

A

1) Long-range correlations
2) Anti-correlated (short-term and opposite)
3) Uncorrelated

32
Q

Name two types of pattern analysis in time-series?

A

Fourier

Wavelet

33
Q

what is fourier analysis?

A

Fourier analysis looks at oscillations through sine and cosine (sinusoidal basis functions).
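
A minimal Fourier sketch with NumPy: recover the dominant oscillation frequency of a sine wave (the sampling rate and the 5 Hz signal are my own example):

```python
import numpy as np

fs = 100.0                            # sampling rate (Hz)
t = np.arange(0, 2, 1 / fs)           # 2 seconds of data
x = np.sin(2 * np.pi * 5 * t)         # a pure 5 Hz oscillation

spectrum = np.abs(np.fft.rfft(x))     # magnitude of each sinusoidal component
freqs = np.fft.rfftfreq(len(x), 1 / fs)

# The frequency bin with the largest magnitude is the dominant oscillation.
dominant = freqs[np.argmax(spectrum)]
print(dominant)  # 5.0
```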

34
Q

what is wavelet analysis?

A

- Instead of oscillations (sinusoids), we look at different types of waves (wavelets).

- Not how much sinusoid is in the data, but how much of a given pattern appears in the data in small time windows.
- Wavelet basis functions tile the time–frequency plane in boxes.

35
Q

There is a trade-off in wavelet analysis?

A

- Good time resolution at high frequencies.
- Good frequency resolution at low frequencies.

- A kind of trade-off between the two.

36
Q

what is Autoregressive Modelling?

A

Autoregressive modelling: an important aspect of time-series analysis is forecasting, i.e. assessing the variation in a time-series variable as a function of predictors and some stochastic term (error term).

an autoregressive (AR) model is a representation of a type of random process; as such, it is used to describe certain time-varying processes in nature, economics, etc. The autoregressive model specifies that the output variable depends linearly on its own previous values and on a stochastic term (an imperfectly predictable term); thus the model is in the form of a stochastic difference equation.

37
Q

what is the basic theory behind how Autoregressive Model (AR) works?

A

Autoregressive model (AR): the values of a time series x(t) are described by the sum of the linearly weighted values of the time series at previous (lagged) time points. Usually we only need a certain number of previous time lags: the model order p. Finding p is an important task in time-series analysis.
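
A sketch of an AR(1) process and a least-squares estimate of its coefficient (the 0.7 coefficient, seed, and series length are my own choices):

```python
import numpy as np

rng = np.random.default_rng(1)
phi, n = 0.7, 2000

# Simulate x(t) = phi * x(t-1) + white noise.
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal()

# Least-squares estimate of the AR(1) weight: regress x(t) on x(t-1).
phi_hat = np.sum(x[1:] * x[:-1]) / np.sum(x[:-1] ** 2)
print(abs(phi_hat - phi) < 0.1)  # True: the estimate is close to 0.7
```

For higher orders p, one stacks p lagged columns into the design matrix and solves the same regression; p itself is commonly chosen with information criteria such as AIC or BIC.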

38
Q

what can Autoregressive Models be used for?

A

- Forecasting: predict the future of a process using past instances of different signals.
- Estimating spectral quantities (e.g., power, coherence): coefficients transformed to the frequency space.
- Causality analysis: directional influence between time series.

39
Q

what is granger-causality?

A

Granger-Causality is a measure of causal or directional influence from one time series to another and is based on linear predictions of time series.

Granger-causality is not real causality, it reflects predictability.

If adding past values of signal x2 helps in predicting the evolution of x1 more than not adding those values (i.e. leads to smaller variance of the error term), then x2 Granger-causes x1.

We can assess whether having additional signals can help in predicting the evolution of a specific process we are interested in.
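
A toy numerical sketch of this idea (not a full Granger test; the simulated coupling x2 → x1 and all names are mine): compare the residual variance of predicting x1 from its own past versus its own past plus x2's past.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1000

x2 = rng.normal(size=n)
x1 = np.zeros(n)
for t in range(1, n):
    # x2's past genuinely drives x1 in this simulation.
    x1[t] = 0.5 * x1[t - 1] + 0.8 * x2[t - 1] + rng.normal()

y = x1[1:]                                   # values to predict
own = x1[:-1].reshape(-1, 1)                 # restricted model: own past only
both = np.column_stack([x1[:-1], x2[:-1]])   # full model: own past + x2's past

def resid_var(X, y):
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.var(y - X @ beta)

# Smaller error variance when x2's past is added => x2 Granger-causes x1.
print(resid_var(both, y) < resid_var(own, y))  # True
```

A proper Granger analysis would additionally test whether the variance reduction is statistically significant (e.g., with an F-test) rather than just comparing the two numbers.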

40
Q

what is the problem with granger causality?

A

We still have an issue with MEDIATION effects and discriminating between influential signals. In EEG, everything influences everything.

41
Q

what was developed to address this?

A

To resolve such ambiguity, conditional Granger causality was developed.

Also, partial directed coherence (PDC) is a popular index derived from Granger causality, used in multi-channel systems to disentangle direct and indirect connections.

42
Q

what is a long-range temporal correlation?

A

Time series can present persistent temporal correlations that extend over many time scales: long-term memory processes.

This means that if there is an increase over some time scale, it will very probably be followed by an increase at the next time scale, as if the series had memory. The same is true of decreases: a decrease will tend to follow a decrease. If something is growing at one point in time, it will tend to keep growing in the same pattern elsewhere; it is not changing randomly but has a persistent pattern.

43
Q

example of long-range temporal correlation?

A

E.g., persistent patterns in walking gait in Parkinson's disease.

44
Q

what is the Hurst exponent?

A

Hurst exponent (H): is a measure of long-term memory of time series.

45
Q

what is Detrended Fluctuation Analysis (DFA) ?

A

Detrended Fluctuation Analysis (DFA)
An influential method to analyse long-range correlations in time series with non-stationarity and noise (Peng et al. 1995).
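
A compact DFA sketch (NumPy only; the window sizes and implementation details are my own simplification of Peng et al.'s method). For an uncorrelated white-noise input, the scaling exponent should come out near 0.5:

```python
import numpy as np

def dfa_alpha(x, scales):
    """Detrended fluctuation analysis: slope of log F(s) vs log s."""
    y = np.cumsum(x - np.mean(x))            # integrated profile of the series
    fluct = []
    for s in scales:
        n_seg = len(y) // s
        rms = []
        for i in range(n_seg):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            # Remove the local linear trend within each window.
            trend = np.polyval(np.polyfit(t, seg, 1), t)
            rms.append(np.sqrt(np.mean((seg - trend) ** 2)))
        fluct.append(np.mean(rms))           # average fluctuation at scale s
    # Scaling exponent alpha: slope in log-log coordinates.
    return np.polyfit(np.log(scales), np.log(fluct), 1)[0]

rng = np.random.default_rng(3)
noise = rng.normal(size=4096)                # uncorrelated series
alpha = dfa_alpha(noise, [16, 32, 64, 128, 256])
print(alpha)  # close to 0.5 for uncorrelated noise (exact value depends on the seed)
```

A long-memory series would instead push the exponent toward 1, and an anti-correlated one below 0.5.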

46
Q

what are the values associated with the Hurst exponent?

A

- 0.5 < H ≤ 1: long-range dependency (long-range temporal correlations).
- H = 0.5: uncorrelated series.
- 0 ≤ H < 0.5: anti-correlation of values.

47
Q

again, we transform a non-stationary time series to a stationary time series via xxxxx.
In this exercise we will generate a non-stationary time series as a fractional xxxxx process with xxxxx exponent:

A

differencing.

Brownian process

with Hurst exponent:

48
Q

The higher Hurst parameter is, the…….

A

the smoother the curve will be.

49
Q

again… what do these values mean?
 H = 0.5
 or H = 0.2
 or H =0.8

A

H = 0.5 (true random process; uncorrelated time series; Brownian time series),
or H = 0.2 ("anti-persistent" behaviour, negative autocorrelation),
or H = 0.8 (long-range correlations, long-memory process).

50
Q

if the auto-correlation function exhibits a positive value at time lag 1, a negative value at time lag 2, and is zero for the remaining time lags, the time series presents a…

A

anticorrelation: an increase in an interval is followed by a decrease in the subsequent interval.