Week 4 Flashcards

(28 cards)

1
Q

How to choose alpha for exponential smoothing?

A

Trade-off between trusting x_t when alpha is large and trusting S_{t-1} when alpha is small.

More randomness - trust the previous estimate S_{t-1}
Less randomness - trust what you see, x_t
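The smoothing update can be sketched in a few lines of Python (a minimal sketch; the function name and sample data are made up for illustration):

```python
# Simple exponential smoothing: S_t = alpha * x_t + (1 - alpha) * S_{t-1}.
# A larger alpha trusts the new observation x_t; a smaller alpha trusts S_{t-1}.
def exp_smooth(xs, alpha):
    s = xs[0]                          # initialize with the first observation
    out = [s]
    for x in xs[1:]:
        s = alpha * x + (1 - alpha) * s
        out.append(s)
    return out

data = [10, 12, 11, 20, 12, 11]
print(exp_smooth(data, 0.2))  # heavy smoothing: damps the spike at 20
print(exp_smooth(data, 0.9))  # light smoothing: tracks the spike closely
```

Comparing the two printouts shows the trade-off: with alpha = 0.2 the estimate barely moves at the outlier, while with alpha = 0.9 it jumps with it.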

2
Q

Time series complexities

A

Trends - increasing and decreasing
Cyclical patterns - annual temperature cycles, weekly sales cycles, daily blood pressure cycles

3
Q

How to deal with cyclic patterns

A

Like trend - as an additive component of the formula
Or as a multiplicative component

4
Q

Multiplicative

A

L: length of a cycle
C_t: the multiplicative seasonality factor for time t; it inflates or deflates the observation

5
Q

If C is 1.1 on a Sunday, then sales were ____ higher just because it was a Sunday

A

10%

If 550 were sold on a Sunday, then 500 is the baseline value and 50 is the 10% extra
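The arithmetic above can be checked directly (a sketch; the variable names are made up for illustration):

```python
# Multiplicative seasonality: observation = baseline * C_t.
# With C = 1.1 on Sunday, an observed 550 deseasonalizes to a 500 baseline.
observed = 550          # units sold on Sunday
c_sunday = 1.1          # hypothetical multiplicative seasonality factor
baseline = observed / c_sunday
print(round(baseline))             # 500: the deseasonalized baseline
print(round(observed - baseline))  # 50: the 10% Sunday extra
```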

6
Q

Starting condition for Trend in exponential smoothing

A

Trend: T1 = 0, showing no initial trend

Multiplicative seasonality:
First L values of C set to 1
Multiplying by 1 shows no initial cyclic effect

7
Q

Triple exponential smoothing is called

A

Winters' method, or Holt-Winters

8
Q

In exponential smoothing, more recent observations are more important. T/F

A

T. Newer observations are weighted more

9
Q

Forecasting with trend

A

The best estimate of the next baseline is the most current baseline estimate
The best estimate of the trend is the most current trend estimate
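Combining the two statements above gives the trend forecast: k steps ahead, add k copies of the latest trend estimate to the latest baseline estimate (a sketch; the function name and the sample values are made up):

```python
# Forecasting with trend: the k-step-ahead forecast is
# (most current baseline estimate) + k * (most current trend estimate).
def forecast(baseline, trend, k):
    return baseline + k * trend

s_t, t_t = 500.0, 3.0   # hypothetical current baseline and trend estimates
print([forecast(s_t, t_t, k) for k in (1, 2, 3)])  # [503.0, 506.0, 509.0]
```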

10
Q

How to find good values of alpha, beta, and gamma

A

Optimization: minimize the sum of squared errors across the dataset
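For the single-parameter case, this can be sketched as a coarse grid search over alpha (a sketch only; a real fit would use a proper optimizer, and the sample data is made up):

```python
# Choose alpha by minimizing the sum of squared one-step-ahead errors.
def sse(xs, alpha):
    s, total = xs[0], 0.0
    for x in xs[1:]:
        total += (x - s) ** 2            # the forecast for x is the previous estimate
        s = alpha * x + (1 - alpha) * s  # smoothing update
    return total

data = [10, 12, 11, 13, 12, 14, 13, 15]
best = min((a / 100 for a in range(1, 100)), key=lambda a: sse(data, a))
print(best, sse(data, best))
```

The same idea extends to alpha, beta, and gamma together, just over a three-dimensional search space.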

11
Q

ARIMA

A

Autoregressive integrated moving average (ARIMA)

12
Q

Three key parts to ARIMA

A

1) Differences: sometimes the differences in the data can be stationary
2) Autoregression: predicting the current value based on previous time periods
3) Moving average: previous errors as predictors

13
Q

Stationary process

A

if the mean, variance, and other measures are expected to be constant over time

14
Q

Regression

A

Predicting a value based on other factors

15
Q

auto

A

using earlier values to predict
This only works with time series data

16
Q

ARIMA combines autoregression and differencing

A

Autoregression on the differences:
use p time periods of previous observations to predict dth-order differences
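The differencing step (the "I" in ARIMA) can be sketched on a toy series (the function name and data are made up for illustration):

```python
# First-order differencing: a series with a linear trend is not stationary,
# but its first differences are constant, hence stationary.
def diff(xs):
    return [b - a for a, b in zip(xs, xs[1:])]

trend = [2, 4, 6, 8, 10]   # linear trend: not stationary
print(diff(trend))         # first differences: constant
print(diff(diff(trend)))   # second-order differences: all zero
```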

17
Q

ARIMA(0,0,0)

A

White noise

18
Q

ARIMA(0,1,0)

A

Random walk

19
Q

ARIMA(p,0,0)

A

Autoregressive model of order p: AR(p)

20
Q

ARIMA(0,0,q)

A

Moving average model

21
Q

ARIMA(0,1,1)

A

basic exponential smoothing model

22
Q

When does ARIMA work better than exponential smoothing?

A

When the data is stable, with fewer peaks, valleys, and outliers

23
Q

How many past data points do you need for ARIMA to work well?

A

A common rule of thumb is at least about 40 observations

24
Q

GARCH

A

Generalized Autoregressive Conditional Heteroscedasticity
Estimate or forecast variance of something for which we have time series data
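The variance recursion can be sketched for a GARCH(1,1) model (a sketch only; the function name and the omega/alpha/beta parameter values are made up for illustration):

```python
# GARCH(1,1): sigma2_t = omega + alpha * e_{t-1}^2 + beta * sigma2_{t-1},
# i.e. forecast the next variance from the last squared error and the
# last variance estimate.
def garch_variance(errors, omega=0.1, alpha=0.2, beta=0.7):
    sigma2 = [omega / (1 - alpha - beta)]   # start at the long-run variance
    for e in errors:
        sigma2.append(omega + alpha * e ** 2 + beta * sigma2[-1])
    return sigma2

print(garch_variance([0.5, -1.2, 0.3]))
```

Note the recursion runs on squared errors and variances, not on the raw observations; that is the contrast with ARIMA drawn in the next card.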

25
Q

Variance

A

Estimate the amount of error

26
Q

Difference between ARIMA and GARCH

A

Variance/squared errors instead of observations/linear errors
Raw variances: not differences of variances

27
Q

GARCH needs a d parameter. T/F

A

False. GARCH doesn't use differences

28
Q

Three most common methods for analyzing time series

A

Exponential smoothing, ARIMA, and GARCH