318 test 2 Flashcards
(34 cards)
What is probabilistic modeling? (3)
- It utilizes the effects of random occurrences or actions to forecast future possibilities.
- This method is quantitative, focusing on projecting various possible outcomes.
- It can predict outcomes that may extend beyond recent events.
What does nondeterminism in algorithms mean in computer science?
In computer science, nondeterminism refers to an algorithm that can exhibit different behaviors on different runs with the same input, as opposed to a deterministic algorithm.
What is the Markov property?
- The conditional probability distribution of future states of the process depends only on the present state.
- It implies a lack of memory in the process: past events do not influence future states.
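As a formula (a common discrete-time statement of the property; the notation below is my own, not from the cards):

```latex
P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \dots, X_0 = x_0)
  = P(X_{n+1} = x \mid X_n = x_n)
```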
What is a Markov chain?
- A Markov chain is a discrete-time stochastic process.
- It satisfies the Markov property.
How does a Markov chain differ from a continuous-time stochastic process?
Unlike continuous-time stochastic processes, a Markov chain specifically operates in discrete time intervals.
Time Series
- a series of data points ordered in time (discrete-time data)
Univariate
- one variable is varying over time
Multivariate
- multiple variables are varying over time
examples of Markov chains
- stock market
- Random walk on a grid (see slide and the sketch after this list)
- Weather forecast
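A minimal simulation of the random-walk example (my own sketch, not code from the slide); each step depends only on the current position, which is exactly the Markov property:

```python
import random

def random_walk_2d(steps, seed=None):
    """Simulate a simple random walk on an integer grid (a Markov chain)."""
    rng = random.Random(seed)
    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]  # right, left, up, down
    x, y = 0, 0
    path = [(x, y)]
    for _ in range(steps):
        dx, dy = rng.choice(moves)  # next move ignores how we got here
        x, y = x + dx, y + dy
        path.append((x, y))
    return path

print(random_walk_2d(10, seed=42))
```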
How can one characterize real signals in terms of signal models?
- provide a basis for a theoretical description of signal processing systems
- enhance understanding of the signal source even if the source is unavailable (through simulation of the real-world process)
- enable building prediction systems, recognition systems, and identification systems
What are the 2 approaches to signal modeling?
- Deterministic models: use known specific properties of a signal (amplitude, frequency)
- Statistical models: characterize only statistical signal properties (Gaussian, Poisson)
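A toy contrast between the two approaches (a sketch only; the 5 Hz frequency, sample rate, and noise standard deviation are arbitrary assumptions):

```python
import math
import random

# Deterministic model: the signal is fully described by known properties
# (assumed amplitude, frequency, and sample rate).
AMPLITUDE, FREQ_HZ, FS_HZ = 1.0, 5.0, 100.0
deterministic = [AMPLITUDE * math.sin(2 * math.pi * FREQ_HZ * n / FS_HZ)
                 for n in range(100)]

# Statistical model: only statistical properties are specified
# (zero-mean Gaussian samples with an assumed standard deviation).
rng = random.Random(0)
statistical = [rng.gauss(0.0, 0.5) for _ in range(100)]

print(deterministic[:5])
print(statistical[:5])
```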
What are the three fundamental problems in Hidden Markov model (HMM) design and analysis, and what does each mean?
In short: likelihood, best state sequence, and adjusting parameters to account for the observed signals.
1. Evaluation of the probability (or likelihood) of a sequence of observations generated by a given HMM
2. Determination of a "best" sequence of model states
3. Adjustment of model parameters so as to best account for the observed signals
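A hedged numpy sketch of the standard solution to problem 1 (the forward algorithm); the matrices A, B and vector pi below are made-up illustrative values, not numbers from the cards. Problem 2 is usually solved with the Viterbi algorithm and problem 3 with Baum-Welch (EM) re-estimation.

```python
import numpy as np

def forward_likelihood(A, B, pi, obs):
    """Problem 1 (evaluation): P(observations | model) via the forward algorithm.

    A[i, j]: probability of moving from state i to state j
    B[i, k]: probability of emitting symbol k while in state i
    pi[i]  : probability of starting in state i
    obs    : list of observed symbol indices
    """
    alpha = pi * B[:, obs[0]]          # initialize with the first observation
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]  # propagate one step, weight by the emission
    return alpha.sum()

# Tiny made-up 2-state, 2-symbol model.
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])
B = np.array([[0.9, 0.1],
              [0.2, 0.8]])
pi = np.array([0.6, 0.4])
print(forward_likelihood(A, B, pi, [0, 1, 0]))
```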
What is a stochastic model?
A stochastic model is a type of mathematical or computational model that incorporates randomness and unpredictability as intrinsic elements.
How do simpler Markov models differ from HMMs?
In simpler Markov models, states are directly observable.
In an HMM, the state sequence the model passes through remains hidden.
What are some applications of Hidden Markov Models in temporal pattern recognition?
- speech
- handwriting
- gesture recognition
- part-of-speech tagging
- musical score following
- partial discharges
- bioinformatics
understand the coin-toss example for HMM
Is a 3-state model for a coin good? How many unknown parameters does a 3-state model have? Explain why or why not.
- A 3-state (3-coin) model has 9 unknown parameters.
- If the actual experiments use only a single coin, the 3-state model would be highly inappropriate, as the model would not correspond to the actual physical event.
understand the ball and urn problem
HMM state
- These represent the different conditions or configurations that can produce the observations but are not directly visible.
- Example: In weather modeling, the states might represent “sunny”, “cloudy”, and “rainy”. In the urn example, they represent different urns with specific distributions of colored balls.
state transition probabilities
- these indicate the probability of transitioning from one state to another.
- Example: In the weather example, a_ij could represent the probability that a sunny day follows a cloudy day. In the urn example, a_ij would represent the probability of choosing urn j after urn i.
- This creates a total of N^2 transition probabilities, because there are N probabilities for each of the N states.
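A small sketch of a transition matrix and initial distribution for the weather example (the specific probabilities are assumptions for illustration); note the 3 x 3 = N^2 entries and that each row sums to 1:

```python
import numpy as np

states = ["sunny", "cloudy", "rainy"]
A = np.array([            # A[i, j] = P(next state j | current state i)
    [0.7, 0.2, 0.1],      # from sunny
    [0.3, 0.4, 0.3],      # from cloudy
    [0.2, 0.4, 0.4],      # from rainy
])
pi = np.array([0.5, 0.3, 0.2])  # initial state probabilities

rng = np.random.default_rng(0)
state = rng.choice(len(states), p=pi)            # draw the starting state from pi
sequence = [states[state]]
for _ in range(6):
    state = rng.choice(len(states), p=A[state])  # next state depends only on the current one
    sequence.append(states[state])
print(sequence)
```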
urn example: initial state probability:
- This is the probability distribution over the initial state, i.e., where the process is likely to start.
- Example: For the weather, it might be the historical likelihood of the weather being sunny, cloudy, or rainy at the start of the observation period. In the urn example, it would be the likelihood of starting from each specific urn.
Likelihood of the observed sequence:
Given the model, how likely is the sequence of observations to occur?