Time Series Models Flashcards

1
Q

exponential smoothing equation

A

S_t: the baseline estimate at time period t
x_t: the observed value (response) at time t

S_t = α x_t + (1 - α) S_{t-1}

0 < α < 1
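
A minimal Python sketch of this update; the function name and sample data are illustrative, not from the source:

def exponential_smoothing(x, alpha):
    # initial condition: S_1 = x_1
    s = [x[0]]
    for t in range(1, len(x)):
        # S_t = alpha * x_t + (1 - alpha) * S_{t-1}
        s.append(alpha * x[t] + (1 - alpha) * s[-1])
    return s

baseline = exponential_smoothing([10, 12, 11, 15, 14], alpha=0.3)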

2
Q

exponential smoothing tradeoff

A

α → 0: there is a lot of randomness in the system; fluctuations are mostly noise, so yesterday's baseline is probably a good indicator of today's baseline (weight the previous baseline heavily).
α → 1: there is not much randomness in the system; fluctuations are probably due to real changes in the baseline (weight the latest observation heavily).

3
Q

How to start exponential smoothing?

A

Set the initial baseline to the first recorded value: S_1 = x_1.

4
Q

What does exponential smoothing not deal with?

A

- It doesn't handle trends or cyclical (seasonal) variations on its own.

5
Q

trends

A

the value is increasing or decreasing over time

6
Q

What is time series data?

A

data in which the same response is recorded over many time periods

7
Q

cyclical patterns

A

Repeating patterns with a fixed period, e.g.:
- annual temperature cycles
- weekly sales cycles
- daily blood pressure cycles

8
Q

How do you add trend to the exponential smoothing calculation?

A

T_t: the trend at time period t

S_t = α x_t + (1 - α)(S_{t-1} + T_{t-1})

9
Q

How do you calculate trend?

A

Just like the baseline, with its own smoothing parameter β (0 < β < 1):

T_t = β (S_t - S_{t-1}) + (1 - β) T_{t-1}

10
Q

Trend initial condition

A

T_1 (initial trend) = 0
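
Putting the baseline update, trend update, and initial conditions together, a minimal Python sketch (function name and sample data are illustrative):

def double_exponential_smoothing(x, alpha, beta):
    s, trend = [x[0]], [0.0]    # initial conditions: S_1 = x_1, T_1 = 0
    for t in range(1, len(x)):
        # S_t = alpha * x_t + (1 - alpha) * (S_{t-1} + T_{t-1})
        s.append(alpha * x[t] + (1 - alpha) * (s[-1] + trend[-1]))
        # T_t = beta * (S_t - S_{t-1}) + (1 - beta) * T_{t-1}
        trend.append(beta * (s[-1] - s[-2]) + (1 - beta) * trend[-1])
    return s, trend

s, tr = double_exponential_smoothing([10, 12, 13, 15, 18, 20], alpha=0.5, beta=0.3)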

11
Q

What two methods can you use to deal with cyclical patterns?

A

additive and multiplicative

12
Q

Seasonality's additional variables

A

L: the length of the cycle
C_t: the multiplicative seasonality factor for time t
(it inflates or deflates the observed value)

13
Q

Baseline formula w/ trend and seasonality (multiplicative)

A

S_t = α x_t / C_{t-L} + (1 - α)(S_{t-1} + T_{t-1})

14
Q

How do we update seasonality? What is the initial value?

A

C_t = γ (x_t / S_t) + (1 - γ) C_{t-L}

- There is no initial cyclic effect, because it can't be measured until the end of the first cycle.

15
Q

How would you interpret a seasonality value of 1.1 for weekly cyclical data? How would this change your results?

A

On that day, the value is 10% higher just because it is that day.

For example, if you sold 550 items, 500 was your baseline and the extra 50 was because it was Sunday (500 × 1.1 = 550).

16
Q

Multiplicative seasonality starting condition

A

Start by setting the first L values of C to 1 (no seasonality / cyclic effect).
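
A minimal Python sketch combining the baseline, trend, and multiplicative seasonality updates with these starting conditions (the function name, 0-based indexing, and sample values are illustrative assumptions):

def holt_winters_multiplicative(x, alpha, beta, gamma, L):
    n = len(x)
    s, trend, c = [0.0] * n, [0.0] * n, [1.0] * n   # first L seasonal factors start at 1
    s[0] = x[0]                                     # S_1 = x_1, T_1 = 0
    for t in range(1, n):
        c_last = c[t - L] if t >= L else 1.0        # seasonal factor from the previous cycle
        # S_t = alpha * x_t / C_{t-L} + (1 - alpha) * (S_{t-1} + T_{t-1})
        s[t] = alpha * x[t] / c_last + (1 - alpha) * (s[t - 1] + trend[t - 1])
        # T_t = beta * (S_t - S_{t-1}) + (1 - beta) * T_{t-1}
        trend[t] = beta * (s[t] - s[t - 1]) + (1 - beta) * trend[t - 1]
        if t >= L:
            # C_t = gamma * (x_t / S_t) + (1 - gamma) * C_{t-L}
            c[t] = gamma * x[t] / s[t] + (1 - gamma) * c_last
    return s, trend, c

s, tr, c = holt_winters_multiplicative([10, 12, 14, 11, 13, 15, 12, 14], 0.5, 0.3, 0.2, L=4)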

17
Q

exponential smoothing also sometimes called

A

single, double, or triple exponential smoothing, depending on how many components you include (baseline only; baseline plus trend; baseline plus trend plus seasonality)

18
Q

Triple exponential smoothing (the base equation plus trend and seasonality) is also called?

A

Winters' method or Holt-Winters
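
In practice this is usually fit with a library. A sketch using statsmodels' Holt-Winters implementation; the synthetic series and parameter choices are illustrative assumptions:

import numpy as np
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# illustrative series with an upward trend and a 12-period cycle
t = np.arange(48)
y = 100 + 2 * t + 10 * np.sin(2 * np.pi * t / 12)

model = ExponentialSmoothing(y, trend="add", seasonal="mul", seasonal_periods=12)
fit = model.fit()              # alpha, beta, gamma are estimated from the data
forecast = fit.forecast(6)     # forecast the next 6 periods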

19
Q

What does exponential smoothing do?

A

Peaks and valleys are smoothed out.

S_t = α x_t + (1 - α) S_{t-1}

Example: α = 1/2

When x_t is high, S_t is not as high; it is pulled down by (1 - α) S_{t-1}.
When x_t is low, S_t is not as low; it is pulled up by (1 - α) S_{t-1}.

20
Q

Why is exponential smoothing exponential?

A

S_{t-1} can be rewritten as

S_{t-1} = α x_{t-1} + (1 - α) S_{t-2}

When you substitute this in, the (1 - α) factor multiplies the whole substitution, so you get α x_t + (1 - α) α x_{t-1} + (1 - α)^2 S_{t-2},
and so on, all the way back to the first value in the time series. Each step back in time picks up another factor of (1 - α), so the weights shrink exponentially.
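
Written out fully (a sketch assuming the initial condition S_1 = x_1), the weights decay exponentially with age:

S_t = α x_t + α(1 - α) x_{t-1} + α(1 - α)^2 x_{t-2} + … + α(1 - α)^{t-2} x_2 + (1 - α)^{t-1} x_1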

21
Q

Which time periods contribute to current baseline estimate? Which time periods contribute the most?

A

Every past observation contributes; all the previous data is baked into S_{t-1}.

The more recent time periods contribute the most, because each observation is weighted by (1 - α) raised to a power that increases by 1 with each step back in time.

22
Q

exponential smoothing forecasting baseline

A

Original equation: S_t = α x_t + (1 - α) S_{t-1}

Prediction:

S_{t+1} = α x_{t+1} + (1 - α) S_t

x_{t+1} is unknown, so our best guess for it is S_t, the most recent baseline.

Doing that calculation gives our forecast:

F_{t+1} = α S_t + (1 - α) S_t, so F_{t+1} = S_t

F_{t+k} = S_t for k = 1, 2, …; the estimate remains the same for all future time periods.

The forecast error grows the farther into the future we predict.
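
A tiny Python sketch of that flat forecast (the sample data, α = 0.3, and horizon k = 5 are illustrative):

x, alpha, k = [10, 12, 11, 15, 14], 0.3, 5
s = x[0]                                  # S_1 = x_1
for value in x[1:]:
    s = alpha * value + (1 - alpha) * s   # roll the baseline forward to S_t
forecast = [s] * k                        # F_{t+1} = ... = F_{t+k} = S_t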

23
Q

exponential smoothing forecasting including trend

A

We just include the trend.

The best estimate of the next baseline is our current baseline estimate, and the best estimate of the trend is our most recent trend estimate.

Therefore our forecast for time t+1 is

F_{t+1} = S_t + T_t

The trend estimate stays the same going forward, so k periods ahead the forecast is F_{t+k} = S_t + k T_t.

24
Q

exponential smoothing forecasting including trend and multiplicative seasonality

A

The best estimate of the next time period's seasonal factor is

C_{t+1} = C_{(t+1)-L}, i.e., the multiplicative seasonality from the last cycle at this point in the cycle.

Forecast:

F_{t+1} = (S_t + T_t) C_{(t+1)-L}, and the baseline, trend, and seasonal estimates are held fixed going forward.

25
Q

3 key parts of ARIMA

A

1. Differencing: if the data is not stationary (it has a trend or seasonality), the differences of the data might be stationary.
2. Autoregression: predicting the current value from previous time periods' values.
3. Moving average: using previous errors as predictors.

26
Q

What does ARIMA stand for?

A

Autoregressive integrated moving average

27
Q

What does it mean for data to be stationary?

A

The mean, variance, and other measures are all expected to be constant over time.

28
Q

Types of differences

A

- First-order differences: differences of consecutive observations
- Second-order differences: differences of the first-order differences
- Third-order differences: differences of the differences of the differences
- ...and in general, the dth-order difference: d nested layers of differencing

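
A quick sketch of first- and second-order differences with NumPy (the series is illustrative):

import numpy as np

x = np.array([3.0, 5.0, 8.0, 12.0, 17.0])   # illustrative series with a trend
d1 = np.diff(x)          # first-order differences: x_t - x_{t-1}
d2 = np.diff(x, n=2)     # second-order: differences of the first-order differences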
29
Q

Types of autoregression

A

- Order-infinity autoregressive model: exponential smoothing is effectively this kind of autoregression, because we make predictions from earlier values of the same quantity, using data as far back as we have.
- Order-p autoregressive model: go back p time periods.

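
A sketch of an order-p autoregression fit with statsmodels' AutoReg; the simulated series and the choice p = 3 are illustrative assumptions:

import numpy as np
from statsmodels.tsa.ar_model import AutoReg

y = np.random.default_rng(0).normal(size=200).cumsum()   # illustrative series
res = AutoReg(y, lags=3).fit()                            # AR(3): regress y_t on y_{t-1}, y_{t-2}, y_{t-3}
pred = res.predict(start=len(y), end=len(y) + 4)          # forecast 5 steps ahead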
30
Q

Autoregression: meaning breakdown

A

- Regression: predicting a value based on other factors.
- Auto: instead of using other factors to predict, we use earlier values of the same quantity we are measuring.
- It only works with time series data.

31
Q

What does ARIMA do?

A

It combines autoregression and differencing: autoregression on the differences. We use p previous time periods of observations to predict the dth-order differences.

32
Q

Moving average part of the ARIMA components

A

Previous errors are used as predictors: ε_t = x̂_t - x_t.
An order-q moving average goes back q time periods of errors.

33
Q

The ARIMA(p, d, q) model

A

The model predicts the dth-order difference D_t as an average plus a pth-order autoregression on the dth-order differences minus a qth-order moving average of previous errors:

D_t = μ + Σ_{i=1..p} φ_i D_{t-i} - Σ_{j=1..q} θ_j ε_{t-j}

- d: order of differencing
- p: order of the autoregression (on the differences)
- q: order of the moving average (on previous errors)

Statistical software can find suitable values of p, d, and q.

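
A sketch of fitting this with statsmodels' ARIMA; the simulated series and the choice (p, d, q) = (1, 1, 1) are illustrative assumptions:

import numpy as np
from statsmodels.tsa.arima.model import ARIMA

y = np.random.default_rng(1).normal(size=120).cumsum()   # illustrative non-stationary series
model = ARIMA(y, order=(1, 1, 1))    # p = 1 AR term, d = 1 difference, q = 1 MA term
res = model.fit()
forecast = res.forecast(steps=10)    # forecast the next 10 periods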
34
Q

ARIMA model equivalences

A

Specific values of (p, d, q) give other, more basic models:
- ARIMA(0,0,0): white noise (no patterns)
- ARIMA(0,1,0): random walk
- ARIMA(p,0,0): autoregressive model
- ARIMA(0,0,q): moving average model
- ARIMA(0,1,1): basic exponential smoothing

35
Q

Can ARIMA be used for short-term forecasting like exponential smoothing?

A

Yes. For short-term forecasting it tends to do better than exponential smoothing when the data is more stable, with fewer peaks, valleys, and outliers. It usually needs about 40 past data points to work well.

36
Q

What does GARCH stand for?

A

Generalized autoregressive conditional heteroscedasticity

37
Q

What does GARCH do?

A

It estimates or forecasts the variance of something.

38
Q

What is the variance used for here?

A

It estimates the amount of error (uncertainty) in our estimate.

39
Q

Why is variance estimation important in finance?

A

In investment, the traditional portfolio optimization model balances the expected return of a set of investments against the amount of volatility: a tradeoff between return and risk. Variance is a proxy for volatility, i.e., risk.

40
Q

Differences between GARCH and ARIMA

A

GARCH uses:
- variances / squared errors, not observations / linear errors
- raw variances, not differences of variances (unlike the differencing used in ARIMA)

Otherwise the two models are very similar in structure.

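
A sketch of fitting a GARCH(1,1) variance model with the third-party arch package (assuming it is installed; the simulated return series is illustrative):

import numpy as np
from arch import arch_model

returns = np.random.default_rng(2).normal(scale=1.0, size=500)   # illustrative return series
am = arch_model(returns, p=1, q=1)       # GARCH(1,1) is the default volatility model
res = am.fit(disp="off")
variance_forecast = res.forecast(horizon=5).variance             # variance forecast 5 steps ahead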
41
Q

3 methods for analyzing time series data

A

- Exponential smoothing
- ARIMA
- GARCH
42