Time Series Analysis Flashcards

1
Q

Time Series Analysis

A

A quantitative forecasting method to predict future values

Uses numerical data obtained at regular time intervals

Projections are based on past and present observations

2
Q

Components of time series analysis

A

trend, cyclical, seasonal, and random components

3
Q

Cyclical vs Seasonal

A

Cyclical: upward and downward swings of varying length, typically spanning multiple years

VS

Seasonal: regular patterns that repeat at fixed intervals within a year

4
Q

Random Component

A

Erratic, nonsystematic, random, or residual fluctuations

Short duration and non-repeating

Due to natural events or accidents

5
Q

Univariate vs Multivariate Time Series Models

A

Univariate
Uses only one variable
Cannot use external data
Based only on relationships between past and present

Multivariate
Uses multiple variables
Can use external data
Based on relationships between past and present AND between variables

6
Q

Time Series Decomposition

A

A technique to extract multiple types of variation from your dataset.

There are 3 important components in the temporal data of a time series - seasonality, trend, and noise (variation that cannot be explained by either season or trend)

It results in a graph of each component
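A minimal sketch of an additive decomposition on a synthetic monthly series (hand-rolled for illustration; in practice a library routine such as statsmodels' `seasonal_decompose` does this more robustly):

```python
import numpy as np
import pandas as pd

# Synthetic monthly series: trend + yearly seasonality + noise
rng = np.random.default_rng(0)
t = np.arange(48)
series = pd.Series(0.5 * t + 10 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 1, 48))

# Trend: centered 12-month moving average
trend = series.rolling(window=12, center=True).mean()

# Seasonality: average detrended value for each month of the year
detrended = series - trend
seasonal = detrended.groupby(t % 12).transform("mean")

# Noise: whatever neither trend nor season explains
residual = series - trend - seasonal
```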

7
Q

Trend

A

a long-term upward or downward pattern in your data

8
Q

Autocorrelation

A

the correlation between a time series’ current value and its past values. If a correlation exists, you can use present values to better predict future values

Positive autocorrelation - a high value now is likely to yield a high value in the future, and vice versa

Negative autocorrelation - an inverse relationship. A high value today implies a low value tomorrow, and vice versa. Think about population control via competition in the wild.

ACF = autocorrelation function
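The idea can be sketched by computing the sample autocorrelation directly (a hand-rolled version for illustration; `pandas.Series.autocorr` and statsmodels' `acf` are the practical tools):

```python
import numpy as np

def autocorr(x, lag):
    """Sample correlation between x[t] and x[t - lag]."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    return np.dot(x[lag:], x[:-lag]) / np.dot(x, x)

# A sine wave: strongly positive at small lags, negative half a period away
x = np.sin(np.linspace(0, 20 * np.pi, 400))
lag1 = autocorr(x, 1)    # close to 1: a high value now implies a high value next step
lag20 = autocorr(x, 20)  # roughly half a period back: strongly negative
```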

9
Q

PACF

A

An alternative to ACF. Rather than giving the full autocorrelations, it gives you PARTIAL autocorrelations: with each step back into the past, only the additional autocorrelation contributed by that lag is listed.

ACF contains duplicate correlations when variability can be explained by multiple points in time.

For example, if today’s value matches yesterday’s, and yesterday’s matches the day before’s, the ACF would show two highly correlated lags. The PACF would show only yesterday’s correlation and remove the ones further in the past.
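One way to see what "partial" means: the PACF at lag k is the coefficient on x[t-k] in a regression on all lags 1 through k. A hand-rolled sketch (statsmodels' `pacf` is the practical choice):

```python
import numpy as np

def pacf(x, k):
    """Partial autocorrelation at lag k: the coefficient on x[t-k] in a
    regression of x[t] on x[t-1], ..., x[t-k]."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    X = np.column_stack([x[k - j : len(x) - j] for j in range(1, k + 1)])
    coef, *_ = np.linalg.lstsq(X, x[k:], rcond=None)
    return coef[-1]

# AR(1) data: only lag 1 carries new information
rng = np.random.default_rng(1)
x = np.zeros(2000)
for t in range(1, 2000):
    x[t] = 0.8 * x[t - 1] + rng.normal()

lag1 = pacf(x, 1)  # near the true coefficient, 0.8
lag2 = pacf(x, 2)  # near 0: lag 2 duplicates what lag 1 already explains
```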

10
Q

Stationarity

A

A time series that has no trend. Some time series models are not able to deal with trends. You can detect non-stationarity using the Dickey-Fuller test, and you can remove it using differencing

The mean and variance do not change over time!

11
Q

Dickey Fuller Test

A

ADF = augmented Dickey-Fuller test

p-value smaller than .05 - reject the null hypothesis (a unit root, i.e. non-stationarity) and conclude the series is stationary

12
Q

Differencing

A

Removing the trend from your time series; the goal is to have only seasonal variation left.

This allows you to use models that can handle seasonality but not trend
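A sketch with pandas: first-differencing a series with a linear trend leaves a constant mean plus the seasonal swings:

```python
import numpy as np
import pandas as pd

t = np.arange(36)
# Linear trend (slope 2) plus yearly seasonality
series = pd.Series(2.0 * t + 5 * np.sin(2 * np.pi * t / 12))

# First difference: y[t] - y[t-1] removes the linear trend
differenced = series.diff().dropna()
# The result hovers around the old slope (2), with seasonal variation left over
```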

13
Q

One Step Time Series models

A

Designed to forecast only one step into the future
Can generate multistep forecasts by windowing over their own predictions
Can perform worse on multistep forecasts than dedicated multistep models
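A sketch of the windowing idea, using a hypothetical one-step forecaster (here a toy rule that predicts 0.9 times the last value); note how predictions are fed back in as if they were observations, which is why errors compound:

```python
def recursive_forecast(one_step, history, steps):
    """Roll a one-step model forward: each prediction is appended to the
    history and treated as an observation for the next step."""
    history = list(history)
    predictions = []
    for _ in range(steps):
        pred = one_step(history)
        predictions.append(pred)
        history.append(pred)
    return predictions

# Toy one-step model: predict 0.9 times the most recent value
preds = recursive_forecast(lambda h: 0.9 * h[-1], [10.0], steps=3)
```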

14
Q

Multistep Forecasts

A

designed to forecast multiple steps into the future

no need to window over predictions

more appropriate for multistep forecasts

15
Q

Classical Time Series Models

A

These models are strongly based on temporal variation inside a time series, and they work well with univariate time series

The ARIMA family falls into this category

16
Q

ARIMA Family

A

Autoregression (AR) - Uses a regression model that explains a variable’s future based on its past values. An AR(1) model uses only the previous step’s value to predict the current value; AR(p) uses the previous p values

Moving Average (MA) - Same concept as AR - uses the past to predict the future. BUT the past values used are not the values of the variable itself; instead it uses the prediction errors from previous steps to predict the future
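The AR idea can be sketched as a least-squares fit on lagged values (a hand-rolled AR(1) for illustration; real work uses a library such as statsmodels):

```python
import numpy as np

# Simulate an AR(1) process: x[t] = 0.7 * x[t-1] + noise
rng = np.random.default_rng(42)
x = np.zeros(1000)
for t in range(1, 1000):
    x[t] = 0.7 * x[t - 1] + rng.normal()

# Autoregression: regress the current value on the previous value
phi = np.dot(x[1:], x[:-1]) / np.dot(x[:-1], x[:-1])

# A one-step-ahead forecast uses only the last observed value
forecast = phi * x[-1]
```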

17
Q

Autoregressive moving average (ARMA)

A

Combines both AR and MA into one. It uses both previous values and prediction errors from the past to predict the future.

Requires stationary data!! You must remove the trend via differencing before using it

18
Q

Autoregressive Integrated Moving Average

A

ARIMA adds differencing (the “Integrated” part) right into your model, so you can feed it non-stationary data

19
Q

Smoothing

A

smoothing is a basic statistical technique used to smooth out a time series. Time series often show long-term variability but also short-term variability (noise)

Smoothing reduces the short-term variability so you can see the long-term trends more easily

20
Q

Simple moving average

A

the simplest smoothing technique. It replaces each value with a local average of itself and a few preceding and following values
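With pandas, a centered rolling mean does exactly this:

```python
import pandas as pd

series = pd.Series([3, 5, 4, 6, 8, 7, 9])

# Each point becomes the average of itself and one neighbour on each side
smoothed = series.rolling(window=3, center=True).mean()
```

The endpoints have no full window, so they come out as NaN.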

21
Q

Simple Exponential smoothing

A

an adaptation of the moving average. Rather than taking a simple average, it takes a weighted average: a value further back counts less, and a more recent value counts more

The weights (set by the smoothing parameter) are often chosen subjectively

When trends are present (non-stationary data), you should avoid using this technique.
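The recursion behind it is short enough to write by hand (pandas' `Series.ewm` is the library equivalent); alpha here is an illustrative choice:

```python
def exponential_smooth(values, alpha):
    """Weighted average where recent values count more: the weight on a
    value k steps back decays like alpha * (1 - alpha)**k."""
    smoothed = [values[0]]
    for x in values[1:]:
        smoothed.append(alpha * x + (1 - alpha) * smoothed[-1])
    return smoothed

result = exponential_smooth([10, 12, 11, 13, 12], alpha=0.5)
```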

22
Q

Double Exponential smoothing

A

An extension of exponential smoothing that also tracks a trend component, so you can use it to smooth non-stationary (trending) data.
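A sketch of Holt's linear (double exponential) smoothing: a second recursion tracks the trend, which is why trending data can be followed; alpha and beta are illustrative choices:

```python
def double_exponential_smooth(values, alpha, beta):
    """Track a level and a trend; the trend term lets the smoother
    follow non-stationary (trending) data."""
    level, trend = values[0], values[1] - values[0]
    smoothed = [level]
    for x in values[1:]:
        previous_level = level
        level = alpha * x + (1 - alpha) * (level + trend)
        trend = beta * (level - previous_level) + (1 - beta) * trend
        smoothed.append(level)
    return smoothed

# A perfectly linear series is tracked exactly
result = double_exponential_smooth([1, 2, 3, 4, 5], alpha=0.5, beta=0.5)
```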