9.2 Sequential Models Flashcards

1
Q

What is the forward algorithm?

A

The forward algorithm, in the context of a hidden Markov model (HMM), is used to calculate a ‘belief state’: the probability of a state at a certain time, given the history of evidence. The process is also known as filtering. The forward algorithm is closely related to, but distinct from, the Viterbi algorithm.
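The idea can be sketched in a few lines. This is a minimal, illustrative implementation for a generic HMM; the parameter names (`pi`, `A`, `B`) and the example model in the usage note are assumptions for demonstration, not from the card.

```python
def forward(obs, pi, A, B):
    """Filtering with the forward algorithm.

    Returns the belief state P(state_t | o_1..o_t) after the last observation.
    pi[i]: initial probability of state i; A[i][j]: transition prob i -> j;
    B[i][o]: probability that state i emits observation o.
    """
    n = len(pi)
    # Initialization: alpha_1(i) = pi(i) * B[i][o_1]
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    for o in obs[1:]:
        # Recursion: alpha_{t+1}(j) = sum_i alpha_t(i) * A[i][j] * B[j][o]
        alpha = [sum(alpha[i] * A[i][j] for i in range(n)) * B[j][o]
                 for j in range(n)]
    # Normalize the joint probabilities to get the belief state
    total = sum(alpha)
    return [a / total for a in alpha]

# Toy two-state model (invented numbers, for illustration only)
belief = forward([0, 0],
                 pi=[0.5, 0.5],
                 A=[[0.7, 0.3], [0.4, 0.6]],
                 B=[[0.9, 0.1], [0.2, 0.8]])
```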

2
Q

What is the Viterbi algorithm?

A

The Viterbi algorithm is a dynamic programming algorithm for obtaining the maximum a posteriori probability estimate of the most likely sequence of hidden states—called the Viterbi path—that results in a sequence of observed events, especially in the context of Markov information sources and hidden Markov models (HMMs).
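A minimal sketch of the dynamic program, using a toy two-state HMM whose parameters are invented for illustration:

```python
def viterbi(obs, pi, A, B):
    """Return the most likely hidden-state sequence (the Viterbi path)."""
    n = len(pi)
    # delta[i]: probability of the best path ending in state i at time t
    delta = [pi[i] * B[i][obs[0]] for i in range(n)]
    psi = []  # backpointers: best predecessor of each state at each step
    for o in obs[1:]:
        prev = delta
        best = [max(range(n), key=lambda i: prev[i] * A[i][j])
                for j in range(n)]
        delta = [prev[best[j]] * A[best[j]][j] * B[j][o] for j in range(n)]
        psi.append(best)
    # Backtrack from the most probable final state
    last = max(range(n), key=lambda i: delta[i])
    path = [last]
    for step in reversed(psi):
        path.append(step[path[-1]])
    return path[::-1]

# Example run with invented parameters
path = viterbi([0, 1, 1],
               pi=[0.6, 0.4],
               A=[[0.7, 0.3], [0.4, 0.6]],
               B=[[0.9, 0.1], [0.2, 0.8]])
```

Unlike the forward algorithm, which sums over all paths, Viterbi takes a max at each step and keeps backpointers so the single best path can be recovered.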

3
Q

Given a Markov chain in the following figure, what is the transition matrix (A) of this Markov chain?

A

Element (i, j) of the transition matrix is the probability of transitioning from state i to state j, and the elements in each row sum to 1.
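The figure itself is not reproduced here, but a matrix consistent with the probabilities quoted in the next card (P(A|A) = 0.2, P(A|B) = 0.4, with the remaining entries forced by rows summing to 1) can be written as follows; the state ordering [A, B] is an assumption:

```python
import numpy as np

# State order: [A, B]; element (i, j) = P(next state j | current state i).
# Entries inferred from the next card's answer; rows must sum to 1.
A = np.array([[0.2, 0.8],
              [0.4, 0.6]])

# Sanity check: every row of a valid transition matrix sums to 1
assert np.allclose(A.sum(axis=1), 1.0)
```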

4
Q

Given the Markov chain in the figure above, what is the probability that a process beginning at state B is at state A after 2 moves?

A

0.32

Beginning at state B, we can arrive at state A in two moves in two ways:

(1) State B to State B to State A; or (2) State B to State A to State A.

The probability of being at state A after 2 moves is therefore:

P(State = B | State = B) P(State = A | State = B) + P(State = A | State = B) P(State = A | State = A) = 0.6 * 0.4 + 0.4 * 0.2 = 0.32.
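The same number falls out of squaring the transition matrix, since the (i, j) entry of A² is the two-step transition probability. The matrix below is reconstructed from the probabilities in the answer, with state order [A, B] assumed:

```python
import numpy as np

# Reconstructed from the card's answer: P(A|A)=0.2, P(A|B)=0.4,
# remaining entries forced by rows summing to 1. State order: [A, B].
A = np.array([[0.2, 0.8],
              [0.4, 0.6]])

# Two-step transition probabilities = matrix square
A2 = np.linalg.matrix_power(A, 2)
p_B_to_A = A2[1, 0]  # row B, column A
print(round(p_B_to_A, 2))  # -> 0.32
```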

5
Q

Which of the following tasks are fundamental problems associated with HMM?

A
  • Decoding
  • Evaluation

The fundamental problems associated with HMMs are evaluation, decoding, and learning. Evaluation is to estimate the likelihood of an observation sequence. Decoding is to find the most probable state sequence. Learning is to estimate the parameters of the HMM.

6
Q

Given an HMM with N states, what is the definition of the forward probability αt(i) in the Forward Algorithm? How many times will a variable αt(i) be reused at time step t+1?

A

The forward probability αt(i) is the joint probability of the partial observation sequence o1, o2, …, ot and being in state si at time t. Each αt(i) is reused N times, once for each of the N terms αt+1(j). Reusing the forward probabilities saves computation time.
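The reuse can be made explicit by instrumenting one step of the recursion and counting reads; the numbers below (N = 3, uniform transitions, arbitrary emission probabilities) are placeholders for illustration:

```python
# One step of the forward recursion, written to make the reuse visible.
# a[i][j]: transition probability; b[j]: emission probability of the next
# observation from state j. All values here are illustrative placeholders.
N = 3
alpha_t = [0.2, 0.5, 0.3]
a = [[1 / N] * N for _ in range(N)]   # placeholder uniform transitions
b = [0.5, 0.3, 0.2]                   # placeholder emission probs

reads = [0] * N  # how many times each alpha_t(i) is read
alpha_next = []
for j in range(N):
    s = 0.0
    for i in range(N):
        s += alpha_t[i] * a[i][j]  # alpha_t(i) reused for every j
        reads[i] += 1
    alpha_next.append(s * b[j])

print(reads)  # -> [3, 3, 3]: each alpha_t(i) is used N times
```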

7
Q

What is a Markov chain?

A

A Markov chain is a stochastic model describing a sequence of possible events in which the probability of each event depends only on the state attained in the previous event.
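As a small sketch of this definition, a two-state chain can be simulated as follows; the transition probabilities reuse the numbers from the earlier cards, and the state names are an assumption:

```python
import random

# transitions[s] maps a state to its next-state distribution.
# Numbers taken from the two-state chain in the earlier cards.
transitions = {"A": {"A": 0.2, "B": 0.8},
               "B": {"A": 0.4, "B": 0.6}}

def step(state, rng):
    # The next state depends only on the current state (Markov property)
    states, probs = zip(*transitions[state].items())
    return rng.choices(states, weights=probs)[0]

def simulate(start, n, seed=0):
    rng = random.Random(seed)
    chain = [start]
    for _ in range(n):
        chain.append(step(chain[-1], rng))
    return chain

chain = simulate("B", 10)
```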

8
Q

What is a hidden Markov model (HMM)?

A

In a hidden Markov model (HMM), there is an underlying Markov chain whose states we cannot observe directly, and each hidden state randomly generates one of k possible observations, which are visible to us.
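A sketch of sampling from such a model makes the two layers concrete: a hidden chain we never see, plus a visible emission at each step. The state and observation names and all probabilities here are invented for illustration:

```python
import random

# Hidden chain (never observed) and visible emissions; toy numbers only.
trans = {"Rainy": {"Rainy": 0.7, "Sunny": 0.3},
         "Sunny": {"Rainy": 0.4, "Sunny": 0.6}}
emit = {"Rainy": {"umbrella": 0.9, "no_umbrella": 0.1},
        "Sunny": {"umbrella": 0.2, "no_umbrella": 0.8}}

def sample_hmm(start, n, seed=0):
    rng = random.Random(seed)
    state, observations = start, []
    for _ in range(n):
        # Each hidden state emits one of k visible observations at random
        obs, ow = zip(*emit[state].items())
        observations.append(rng.choices(obs, weights=ow)[0])
        nxt, tw = zip(*trans[state].items())
        state = rng.choices(nxt, weights=tw)[0]
    return observations  # only the emissions are visible to us

obs = sample_hmm("Rainy", 5)
```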
