318 test 2 Flashcards

1
Q

What is probabilistic modeling? (3)

A
  • It utilizes the effects of random occurrences or actions to forecast future possibilities.
  • This method is quantitative, focusing on projecting various possible outcomes.
  • It can predict outcomes that may extend beyond recent events.
2
Q

What does nondeterminism in algorithms mean in computer science?

A

In computer science, nondeterminism refers to an algorithm that can exhibit different behaviors on different runs with the same input, as opposed to a deterministic algorithm.

3
Q

What is the Markov property?

A
  • The conditional probability distribution of future states of the process depends only on the present state, not on the sequence of states that preceded it.
  • It implies a lack of memory in the process: past events do not influence future states once the present state is known (see the formula below).
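Written out in standard notation (not taken from the slides), for a discrete-time process X_1, X_2, ... the Markov property reads:

P(X_{n+1} = x | X_n = x_n, X_{n-1} = x_{n-1}, ..., X_1 = x_1) = P(X_{n+1} = x | X_n = x_n)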
4
Q

What is a Markov chain?

A
  • A Markov chain is a discrete-time stochastic process.
  • It satisfies the Markov property.
5
Q

How does a Markov chain differ from a continuous-time stochastic process?

A

Unlike continuous-time stochastic processes, a Markov chain specifically operates in discrete time intervals.

6
Q

Time Series

A
  • a series of data points ordered in time (discrete-time data)
7
Q

Univariate

A

one variable is varying over time

8
Q

Multivariate

A

multiple variables are varying over time

9
Q

examples of Markov chains

A
  • stock market
  • Random walk on a grid (see slide)
  • Weather forecast (see the simulation sketch below)
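As a concrete illustration of the weather example, here is a minimal simulation sketch in Python; the states and transition probabilities are illustrative assumptions, not values from the slides.

import random

# Illustrative 3-state weather Markov chain (made-up probabilities).
states = ["sunny", "cloudy", "rainy"]
# transition[i][j] = P(next state j | current state i); each row sums to 1.
transition = [
    [0.7, 0.2, 0.1],  # from sunny
    [0.3, 0.4, 0.3],  # from cloudy
    [0.2, 0.4, 0.4],  # from rainy
]

def simulate(start_state, num_steps):
    # The next state depends only on the current state (Markov property).
    sequence = [start_state]
    current = states.index(start_state)
    for _ in range(num_steps):
        current = random.choices(range(len(states)), weights=transition[current])[0]
        sequence.append(states[current])
    return sequence

print(simulate("sunny", 10))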
10
Q

How can one characterize real signals in terms of signal models?

A
  • provide a basis for a theoretical description of signal processing systems
  • enhance understanding of the signal source even if the source is unavailable
    (through simulation of the real-world process)
  • enable building prediction systems, recognition systems, identification systems
11
Q

what are the 2 approaches to signal modeling?

A

  • Deterministic models
  • Statistical models

12
Q

Deterministic models:

A

use known specific properties of a signal (amplitude, frequency)

13
Q
Statistical models:
A

characterize only statistical signal properties (Gaussian, Poisson)

14
Q

What are the three fundamental problems in Hidden Markov Model (HMM) design and analysis, and what does each mean?

A

likelihood, best sequence, adjust parameters to account for signals

1. Evaluation of the probability (or likelihood) of a sequence of observations generated by a given HMM
2. Determination of a “best” sequence of model states
3. Adjustment of model parameters so as to best account for the observed signals

15
Q

what is a stochastic model

A

A stochastic model is a type of mathematical or computational model that incorporates randomness and unpredictability as intrinsic elements.

16
Q

How do simpler Markov models differ from HMMs?

A

In simpler Markov models, states are directly observable.
In an HMM, the state sequence the model passes through remains hidden.

17
Q

What are some applications of Hidden Markov Models in temporal pattern recognition?

A
  • speech
  • handwriting
  • gesture recognition
  • part-of-speech tagging
  • musical score following
  • partial discharges
  • bioinformatics
18
Q

understand the coin-toss example for HMM

A
19
Q

is a 3-state model for a coin good? how many unknown parameters does a 3-state model have? explain why or why not

A
  • a 3-state model has 9 unknown parameters: 6 free state-transition probabilities plus 3 per-state coin-bias probabilities
  • if the actual experiments use only a single coin, the 3-state model would be highly inappropriate, as the model would not correspond to the actual physical event
20
Q

understand the ball and urn problem

A
21
Q

HMM state

A
  • These represent the different conditions or configurations that can produce the observations but are not directly visible.
  • Example: In weather modeling, the states might represent “sunny”, “cloudy”, and “rainy”. In the urn example, they represent different urns with specific distributions of colored balls.
22
Q

state transition probabilities

A
  • these indicate the probability of transitioning from one state to another.
  • Example: In the weather example, a_ij could represent the probability that a sunny day follows a cloudy day. In the urn example, a_ij would represent the probability of choosing urn j after urn i.
  • This creates a total of N^2 transition probabilities, because there are N probabilities for each of the N states (see the small example below).
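A minimal sketch of what that N x N structure looks like in code (the numbers are illustrative, not from the slides):

import numpy as np

# Illustrative 3-state transition matrix A (N = 3, so N^2 = 9 entries).
# A[i, j] = probability of moving from state i to state j.
A = np.array([
    [0.7, 0.2, 0.1],
    [0.3, 0.4, 0.3],
    [0.2, 0.4, 0.4],
])

print(A.shape)        # (3, 3) -> 9 transition probabilities
print(A.sum(axis=1))  # each row is a distribution over next states, so rows sum to 1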
23
Q

urn example: initial state probability:

A
  • This is the probability distribution over the initial state, i.e., where the process is likely to start.
  • Example: For the weather, it might be the historical likelihood of the weather being sunny, cloudy, or rainy at the start of the observation period. In the urn example, it would be the likelihood of starting from each specific urn.
24
Q

Likelihood of the observed sequence:

A

Given the model, how likely is the sequence of observations to occur?

25
Q

what is “hidden” in the urn problem

A
  • the process of picking the ball
  • The Markov process itself cannot be observed, only the sequence of labeled balls
26
Q

what are the two parameters of HMM

A
  • (i) TRANSITION PROBABILITIES and INITIAL STATE PROBABILITIES
  • (ii) OUTPUT PROBABILITIES
27
Q

TRANSITION PROBABILITIES

A

The transition probabilities control the way the hidden state at time t is chosen given the hidden state at time t-1.

28
Q

what are the 5 HMM characteristics

A
  • N number of states
  • M number of distinct observation symbols
  • state transition probability distribution
  • observation symbol probability distribution
  • initial state distribution
29
Q

what are the 3 things you need to fully define an HMM model

A
  • Two parameters: the number of states (N) and the number of observation symbols (M).
  • The observation symbols themselves, which can be either discrete or continuous.
  • Three sets of probabilities: the transition probabilities (A), the observation probabilities (B), and the initial state probabilities (π).

compact notation: λ = (A, B, π) (see the sketch below)
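As a concrete sketch, the full parameter set λ = (A, B, π) for a small urn-style HMM might look like this in Python; all numbers are illustrative assumptions, not values from the slides.

import numpy as np

# N = 2 hidden states (e.g., two urns), M = 3 observation symbols (e.g., ball colors).
N, M = 2, 3

# A: N x N state transition probabilities; rows sum to 1.
A = np.array([[0.8, 0.2],
              [0.3, 0.7]])

# B: N x M observation (emission) probabilities; B[i, k] = P(symbol k | state i).
B = np.array([[0.6, 0.3, 0.1],
              [0.1, 0.4, 0.5]])

# pi: initial state distribution.
pi = np.array([0.5, 0.5])

# Compact notation: lambda = (A, B, pi)
hmm = (A, B, pi)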

30
Q

what are the 3 basic problems in HMM:

A

Efficiency, state, optimize

  1. Given an observation sequence, how do we efficiently compute the probability of that observation sequence given the model? Aka scoring:
    - how well a given model matches a given observation sequence.
  2. How do we choose an optimal state sequence?
    - aka attempting to uncover the hidden part of the model.
    - there is no single “correct” state sequence
    - the chosen state sequence depends on the situation (the optimality criterion used)
  3. How do we optimize the model parameters?
    - train the model so that it best fits the observed data
31
Q

what are the ways you can compute the probability of an observation sequence given the model?

A

Brute force: enumerate each and every possible state sequence of length T and sum their probabilities. This quickly becomes computationally infeasible, though (see the time-complexity card and the sketch below).
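For contrast, here is a minimal sketch of the forward procedure, the standard dynamic-programming way to compute the same likelihood efficiently; the example model values are illustrative assumptions, not from the slides.

import numpy as np

def forward_probability(A, B, pi, observations):
    # Compute P(O | lambda) with the forward algorithm: about N^2 * T operations.
    N, T = A.shape[0], len(observations)
    alpha = np.zeros((T, N))
    alpha[0] = pi * B[:, observations[0]]                      # initialization
    for t in range(1, T):                                      # induction
        alpha[t] = (alpha[t - 1] @ A) * B[:, observations[t]]
    return alpha[-1].sum()                                     # termination

# Tiny illustrative model (same shapes as the lambda = (A, B, pi) sketch above).
A = np.array([[0.8, 0.2], [0.3, 0.7]])
B = np.array([[0.6, 0.3, 0.1], [0.1, 0.4, 0.5]])
pi = np.array([0.5, 0.5])
print(forward_probability(A, B, pi, [0, 2, 1]))                # likelihood of symbols 0, 2, 1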

32
Q

how does an HMM generate a sequence of observations?

A

First, select the initial state according to the initial state distribution π.

In that state, emit an observation symbol according to the state's observation probability distribution; then, using that state and the transition matrix, move to state 2.

Rinse and repeat until the desired number of observations has been generated (see the sketch below).
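A minimal generation sketch under the same illustrative λ = (A, B, π) as in the earlier sketches (values are assumptions, not from the slides):

import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters.
A = np.array([[0.8, 0.2], [0.3, 0.7]])              # state transition probabilities
B = np.array([[0.6, 0.3, 0.1], [0.1, 0.4, 0.5]])    # observation (emission) probabilities
pi = np.array([0.5, 0.5])                           # initial state distribution

def generate(T):
    # Pick a start state from pi, then alternately emit a symbol and transition.
    observations = []
    state = rng.choice(len(pi), p=pi)
    for _ in range(T):
        observations.append(int(rng.choice(B.shape[1], p=B[state])))
        state = rng.choice(len(pi), p=A[state])
    return observations

print(generate(10))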

33
Q

what is the time complexity of brute-force HMM evaluation?

A

On the order of 2T * N^T calculations for brute-force (direct) evaluation of the observation likelihood (worked example below).
- T represents the length of the observation sequence
- N represents the number of possible states
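For a sense of scale, with illustrative numbers (not from the cards): N = 5 states and T = 100 observations give roughly 2 * 100 * 5^100 ≈ 1.6 * 10^72 calculations by brute force, whereas the forward procedure needs only on the order of N^2 * T = 25 * 100 = 2,500 operations.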
