10 and 11 - Hidden Markov Chains Flashcards

1
Q

What are Hidden Markov Models?

A

Hidden Markov Models (HMMs) are stochastic processes in which the distribution that generates an observation depends on the state of an underlying and unobserved Markov chain.

2
Q

Why HMMs?

A

HMMs can be used to model non-stationary time series, and they also allow discrete-valued or continuous-valued observations to be modelled in a natural way.

3
Q

Definition of HMMs?

A
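A sketch of the standard definition (the notation is mine and may differ from the course's): an HMM is a bivariate process $\{(S_t, Y_t)\}$ in which the hidden process $(S_t)$ is a Markov chain on $\{1, \dots, k\}$ and, given the states, each observation depends only on the current state:

$$\Pr(S_t \mid S_{t-1}, \dots, S_1) = \Pr(S_t \mid S_{t-1}), \qquad t = 2, 3, \dots$$

$$\Pr(Y_t \mid Y_{t-1}, \dots, Y_1, S_t, \dots, S_1) = \Pr(Y_t \mid S_t), \qquad t = 1, 2, \dots$$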
4
Q

Assumptions of an HMM

A

The last assumption means that the distribution of a given observation is conditioned only on the current state: the probability of observing something is a function of the current state alone, not of past states or past observations.
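In symbols, assuming the usual two assumptions (the hidden process $(S_t)$ is a Markov chain, and the observations are conditionally independent given the states) and writing $f$ for a generic density or pmf, this last one reads

$$f(y_t \mid s_1, \dots, s_t, y_1, \dots, y_{t-1}) = f(y_t \mid s_t).$$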

5
Q

How is the probability law of a hidden Markov process specified?

A
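A hedged sketch, assuming the standard parametrization: the probability law of a hidden Markov process is fully specified by

  1. the initial distribution $\delta$ of the hidden chain, with $\delta_j = \Pr(S_1 = j)$;
  2. the transition probability matrix $\Gamma = (\gamma_{ij})$, with $\gamma_{ij} = \Pr(S_t = j \mid S_{t-1} = i)$;
  3. the emission (state-dependent) distributions of $Y_t$ given $S_t = j$, for $j = 1, \dots, k$.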
6
Q

General form of the emission distribution in state j, in the discrete case. Furthermore, the compact notation.

A
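A sketch using common notation (the course's symbols may differ): in the discrete case the emission distribution in state $j$ is

$$p_j(y) = \Pr(Y_t = y \mid S_t = j), \qquad j = 1, \dots, k,$$

and the compact notation collects these into the diagonal matrix $P(y) = \mathrm{diag}\big(p_1(y), \dots, p_k(y)\big)$.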
7
Q

General form of the emission distribution in state j, in the other cases. Furthermore, an example with the Poisson distribution.

A
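A sketch under the same notation: for continuous-valued observations, $p_j(y)$ is replaced by the conditional density $f_j(y)$ of $Y_t$ given $S_t = j$. The Poisson HMM is the classical example for count data, with a state-specific mean $\lambda_j$:

$$p_j(y) = \frac{e^{-\lambda_j} \lambda_j^{\,y}}{y!}, \qquad y = 0, 1, 2, \dots$$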
8
Q

Conditional independence structure of an HMM, with DAG

A
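The answer is normally drawn as a graph; a verbal sketch: the hidden states form a chain $S_1 \to S_2 \to \cdots \to S_T$, and there is one edge $S_t \to Y_t$ for each $t$. Hence, given the states, the observations $Y_1, \dots, Y_T$ are mutually independent, and each $Y_t$ depends only on its own $S_t$.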
9
Q

What are the typical problems when using HMMs?

A
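A hedged summary of the standard tasks: (i) model specification, i.e. choosing the number of hidden states $k$ and the emission distributions; (ii) estimation of the unknown parameters, typically by maximum likelihood; (iii) decoding, i.e. reconstructing the hidden state sequence with a local or a global criterion.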
10
Q

What are the two questions in model specification?

A
  1. Choosing the number k of hidden states is a first, natural question. In some applications, k is suggested by knowledge of the phenomenon under study. In other cases, one could try a few different values of k and compare the performance of the models. Another solution is to take a Bayesian approach and assign a prior distribution on k, then make inference on k through its conditional distribution given the data; or use recent developments in machine learning and Bayesian nonparametrics, namely infinite HMMs, which do not require fixing the number of states in advance.
  2. The appropriate emission distributions are instead often fairly naturally suggested by the nature of the data in the problem under study.
11
Q

Estimate of the unknown parameters in the discrete case

A

The idea is that we must find the likelihood, which is the probability of the observed realizations of Y_t given phi. Thanks to the conditional independence structure of the HMM, this probability can be written as the probability of the realization of Y_t given S_t, multiplied by the probability of S_t itself, and then summed over all possible state sequences.
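In formulas, using the compact notation $P(y) = \mathrm{diag}(p_1(y), \dots, p_k(y))$ (a sketch; $\phi$ collects all unknown parameters):

$$L(\phi) = \Pr(Y_{1:T} = y_{1:T} \mid \phi) = \sum_{s_1, \dots, s_T} \delta_{s_1} \prod_{t=2}^{T} \gamma_{s_{t-1} s_t} \prod_{t=1}^{T} p_{s_t}(y_t) = \delta\, P(y_1)\, \Gamma P(y_2) \cdots \Gamma P(y_T)\, \mathbf{1}^\top,$$

which can be evaluated recursively (forward recursion) in $O(T k^2)$ operations instead of summing over all $k^T$ state sequences.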

12
Q

Decoding with the local criterion: how to and problems

A
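A hedged sketch of the standard answer: the local criterion chooses, separately for each time $t$, the state with the largest smoothing probability,

$$\hat{s}_t = \arg\max_{j \in \{1, \dots, k\}} \Pr(S_t = j \mid Y_{1:T} = y_{1:T}),$$

computed from the forward and backward probabilities. The problem is that the criterion is applied one time point at a time: the resulting sequence $\hat{s}_1, \dots, \hat{s}_T$ need not be the most plausible sequence as a whole, and it may even contain transitions that have probability zero under $\Gamma$.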
13
Q

Decoding with the global criterion: how to and problems

A
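A hedged sketch of the standard answer: the global criterion maximizes the conditional probability of the whole state sequence,

$$(\hat{s}_1, \dots, \hat{s}_T) = \arg\max_{s_1, \dots, s_T} \Pr(S_{1:T} = s_{1:T} \mid Y_{1:T} = y_{1:T}).$$

Enumerating all $k^T$ sequences is infeasible; the Viterbi algorithm finds the maximizer by dynamic programming in $O(T k^2)$ operations. As an illustration only (names and structure are mine, not the course's), a minimal NumPy sketch of Viterbi in log space:

import numpy as np

def viterbi(delta, Gamma, log_p):
    # delta: (k,) initial distribution of the hidden chain
    # Gamma: (k, k) transition matrix, Gamma[i, j] = P(S_t = j | S_{t-1} = i)
    # log_p: (T, k) array with log_p[t, j] = log p_j(y_t)
    T, k = log_p.shape
    log_Gamma = np.log(Gamma)
    xi = np.log(delta) + log_p[0]          # best log-probability of paths ending in each state at t = 0
    back = np.zeros((T, k), dtype=int)     # backpointers
    for t in range(1, T):
        scores = xi[:, None] + log_Gamma   # scores[i, j]: best path ending in state i, then step i -> j
        back[t] = scores.argmax(axis=0)
        xi = scores.max(axis=0) + log_p[t]
    states = np.empty(T, dtype=int)
    states[-1] = xi.argmax()               # backtrack the most probable sequence
    for t in range(T - 2, -1, -1):
        states[t] = back[t + 1, states[t + 1]]
    return states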