Markov PCA Flashcards

1
Q

Markov Chain

A
  • Finite number of states
  • A transition matrix gives the probability of moving from one state to another.
  • “Ability to forget the past”: the next state depends only on the current state.
  • An x-th order Markov chain conditions on the previous x states (including the current one).
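The bullets above can be sketched as a tiny simulation. The 2-state weather chain and its transition matrix below are assumed example numbers, not from the cards:

```python
import numpy as np

# Hypothetical 2-state chain (0 = Sunny, 1 = Rainy); T is an assumed example.
T = np.array([[0.9, 0.1],   # P(next state | current = Sunny)
              [0.5, 0.5]])  # P(next state | current = Rainy)

# Each row of the transition matrix is a distribution over next states.
assert np.allclose(T.sum(axis=1), 1.0)

def simulate(T, x0, steps, rng):
    """Sample a path: the next state depends only on the current state
    (the chain "forgets the past")."""
    path = [x0]
    for _ in range(steps):
        path.append(int(rng.choice(len(T), p=T[path[-1]])))
    return path

path = simulate(T, 0, 10, np.random.default_rng(0))
```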
2
Q

Markov Chain 1-step

A

P(x1, x0) = P(x1 | x0) * P(x0)
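One way to read the 1-step formula: summing the joint P(x1, x0) over x0 propagates the state distribution one step forward. A minimal numeric sketch with assumed values:

```python
import numpy as np

# Assumed transition matrix and initial state distribution (toy numbers).
T = np.array([[0.9, 0.1],
              [0.5, 0.5]])
pi0 = np.array([0.8, 0.2])   # P(x0)

# Joint probability for one pair: P(x1=1, x0=0) = P(x1=1 | x0=0) * P(x0=0)
p_joint = T[0, 1] * pi0[0]

# Marginal after one step: P(x1) = sum_x0 P(x1 | x0) * P(x0) = pi0 @ T
pi1 = pi0 @ T
```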

3
Q

Hidden Markov Model

A
  • The current state is hidden.
  • Each state emits a symbol with some probability.
  • The exact state is never known; it can only be inferred from the emitted symbols.
  • The underlying hidden states of an HMM form a Markov chain.
4
Q

HMM 1-step sequence

A

P(x) = Σ_S P(x | S) * P(S)
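The sum over hidden states S can be checked numerically; the prior and emission probabilities below are assumptions for illustration:

```python
import numpy as np

# Two hidden states with an assumed prior P(S) and an assumed emission
# probability P(x | S) for one observed symbol x.
p_S = np.array([0.6, 0.4])
p_x_given_S = np.array([0.2, 0.9])

# Marginalise out the hidden state: P(x) = sum_S P(x | S) * P(S)
p_x = float(np.sum(p_x_given_S * p_S))
```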

5
Q

Maximum Likelihood

A

Estimate the transition matrix T by maximising the probability of the data, P(D; T), as a function of T.
Gives a good estimate: asymptotically consistent and efficient.

6
Q

Max Likelihood equation

A

P(D; T) = P(x1) * Π_{i,j} T(i→j)^n(i→j)

where n(i→j) is the number of transitions from state i to state j in the data.
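Maximising that likelihood over a row-stochastic T gives normalised transition counts. A sketch on a toy observed sequence (made-up data):

```python
import numpy as np

# Toy observed state sequence D (assumed data, 2 states).
seq = [0, 0, 1, 0, 1, 1, 0, 0]

# Count n(i -> j), the number of transitions from state i to state j.
n = np.zeros((2, 2))
for i, j in zip(seq, seq[1:]):
    n[i, j] += 1

# The maximum-likelihood estimate normalises each row of the counts:
# T_hat[i, j] = n(i -> j) / sum_k n(i -> k)
T_hat = n / n.sum(axis=1, keepdims=True)
```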

7
Q

PCA Description

A

Reduce the dimensionality of the data matrix by keeping only the principal components. Key principle: retain as much of the variance in the data as possible while reducing dimensionality.

8
Q

PCA Method:

A

Centre the data around the origin.
Calculate the covariance matrix.
Take the eigenvectors of the covariance matrix with the largest eigenvalues; these directions capture the most variance.
Keep the top d eigenvectors such that their eigenvalues sum to at least 95% of the total variance.
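The steps above, sketched with NumPy on synthetic data (the data and the resulting value of d are artifacts of this assumed example, not from the cards):

```python
import numpy as np

# Synthetic data: 3 features with very different variances (about 9, 1, 0.01).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3)) * np.array([3.0, 1.0, 0.1])

# 1. Centre the data around the origin.
Xc = X - X.mean(axis=0)

# 2. Calculate the covariance matrix.
C = np.cov(Xc, rowvar=False)

# 3. Eigendecomposition; sort eigenvalues (variances) in descending order.
vals, vecs = np.linalg.eigh(C)
order = np.argsort(vals)[::-1]
vals, vecs = vals[order], vecs[:, order]

# 4. Keep the smallest d whose eigenvalues cover >= 95% of total variance.
ratio = np.cumsum(vals) / vals.sum()
d = int(np.searchsorted(ratio, 0.95) + 1)

# Project the centred data onto the top-d principal components.
Z = Xc @ vecs[:, :d]
```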

9
Q

Eigenvector / Eigenvalue

A

An eigenvector of a matrix is a vector that, when multiplied by the matrix, comes out parallel to its original direction; the scale factor is the eigenvalue.
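A quick numeric check of the definition, using an assumed example matrix:

```python
import numpy as np

# Assumed symmetric example matrix.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# v = [1, 1] is an eigenvector of A: multiplying by A leaves it parallel
# to its original direction, scaled by the eigenvalue lambda = 3.
v = np.array([1.0, 1.0])
Av = A @ v
lam = 3.0
```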
