Naive Bayes Flashcards

(30 cards)

1
Q

What does Bayes’ Theorem compute?

A

The probability of a hypothesis given observed evidence.

2
Q

What is the formula for Bayes’ Theorem?

A

P(A|B) = P(B|A) * P(A) / P(B)
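
A minimal numeric sketch of the formula in Python (the quantities and values below are made up for illustration, not taken from the cards):

```python
# Hypothetical quantities: A = "email is spam", B = "email contains the word 'offer'".
p_a = 0.20          # prior P(A): assumed fraction of email that is spam
p_b_given_a = 0.50  # likelihood P(B|A): assumed rate of "offer" inside spam
p_b = 0.14          # evidence P(B): assumed overall rate of "offer" in all email

# Bayes' Theorem: P(A|B) = P(B|A) * P(A) / P(B)
p_a_given_b = p_b_given_a * p_a / p_b
print(round(p_a_given_b, 3))  # posterior, roughly 0.714
```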

3
Q

What is P(A) in Bayes’ Theorem?

A

The prior probability of the hypothesis.

4
Q

What is P(B|A) in Bayes’ Theorem?

A

The likelihood of the evidence given the hypothesis.

5
Q

What is P(B) in Bayes’ Theorem?

A

The total probability of the evidence.

6
Q

What is P(A|B) in Bayes’ Theorem?

A

The posterior probability of the hypothesis after seeing the evidence.

7
Q

What does the Thomas (librarian vs. farmer) example illustrate?

A

The importance of including base rates (priors) in probability judgments.

8
Q

What does the Naïve Bayes classifier assume?

A

That all features are conditionally independent given the class label.
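
In the notation of the formula card above, with class label C and features x1, …, xn, this assumption lets the joint likelihood factor into per-feature terms:

P(x1, …, xn | C) = P(x1|C) × P(x2|C) × … × P(xn|C)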

9
Q

What is the main benefit of the independence assumption in Naïve Bayes?

A

It simplifies the computation of class probabilities.

10
Q

What is the goal of the Naïve Bayes classifier?

A

To find the most probable class label given a set of features.

11
Q

Does Naïve Bayes calculate the denominator P(x1,…,xn) for classification?

A

No, it’s the same for all classes and can be ignored when comparing.

12
Q

What type of output does Naïve Bayes provide?

A

Probabilistic class predictions.

13
Q

What does ‘naïve’ refer to in Naïve Bayes?

A

The assumption that all features are independent given the class.

14
Q

In the one-feature walking example, what was the predicted probability of walking on a sunny day?

A

0.75
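
The card records only the final value; the original counts behind it are not reproduced here, but one hypothetical set of probabilities consistent with a 0.75 posterior is sketched below (all numbers are illustrative):

```python
# Hypothetical one-feature setup: classes "walk" / "drive", evidence "sunny".
p_walk = 0.5               # prior P(walk), assumed
p_drive = 0.5              # prior P(drive), assumed
p_sunny_given_walk = 0.6   # assumed likelihood
p_sunny_given_drive = 0.2  # assumed likelihood

# Posterior P(walk | sunny) via Bayes' Theorem
numerator = p_sunny_given_walk * p_walk
evidence = numerator + p_sunny_given_drive * p_drive
print(round(numerator / evidence, 2))  # 0.75
```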

15
Q

What happens when Naïve Bayes is extended to multiple features?

A

It multiplies the conditional probabilities of all features for each class.
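
A minimal sketch of that multi-feature extension for categorical features (class names, feature values, and probabilities are all hypothetical, and the shared denominator is dropped, as card 11 notes):

```python
# Hypothetical class priors and per-feature conditional probabilities.
priors = {"walk": 0.5, "drive": 0.5}
cond = {
    "walk":  {"weather=sunny": 0.6, "tired=no": 0.7},
    "drive": {"weather=sunny": 0.2, "tired=no": 0.4},
}

observation = ["weather=sunny", "tired=no"]

# Score each class as prior * product of per-feature conditionals.
# The shared denominator P(x1, ..., xn) is omitted: it is the same for
# every class, so it does not change which class scores highest.
scores = {}
for label, prior in priors.items():
    score = prior
    for feature_value in observation:
        score *= cond[label][feature_value]
    scores[label] = score

print(scores)                       # roughly {'walk': 0.21, 'drive': 0.04}
print(max(scores, key=scores.get))  # 'walk'
```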

16
Q

What is Gaussian Naïve Bayes used for?

A

Modeling continuous input features using normal distributions.

17
Q

What distribution is assumed in Gaussian Naïve Bayes?

A

The normal (Gaussian) distribution.

18
Q

What is the formula for the Gaussian probability density function?

A

P(x|μ,σ) = (1 / √(2πσ²)) * e^{-(x - μ)² / (2σ²)}
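
A direct Python translation of that density (a sketch for illustration, not a library function):

```python
import math

def gaussian_pdf(x, mu, sigma):
    """P(x | mu, sigma) for a normal distribution, as in the formula above."""
    coeff = 1.0 / math.sqrt(2 * math.pi * sigma ** 2)
    exponent = -((x - mu) ** 2) / (2 * sigma ** 2)
    return coeff * math.exp(exponent)

print(gaussian_pdf(0.0, mu=0.0, sigma=1.0))  # about 0.3989, the standard normal peak
```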

19
Q

What parameters are estimated in Gaussian Naïve Bayes for each class?

A

Mean (μ) and standard deviation (σ) for each feature.
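
A small sketch of that estimation step with NumPy (the data and variable names are hypothetical):

```python
import numpy as np

# Hypothetical continuous features (e.g. temperature, humidity) and class labels.
X = np.array([[21.0, 40.0], [24.0, 35.0], [10.0, 80.0], [8.0, 85.0]])
y = np.array(["walk", "walk", "drive", "drive"])

# For each class, estimate the mean and standard deviation of every feature.
params = {}
for label in np.unique(y):
    X_c = X[y == label]
    params[label] = {"mu": X_c.mean(axis=0), "sigma": X_c.std(axis=0)}

print(params["walk"]["mu"])     # per-feature means for the "walk" class
print(params["walk"]["sigma"])  # per-feature standard deviations
```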

20
Q

What is the role of priors in Naïve Bayes classification?

A

They weight each class's likelihood by how common that class is in the training data.

21
Q

How does Naïve Bayes handle continuous and categorical data?

A

It models continuous features with Gaussians and categorical ones with frequencies.

22
Q

What kind of features does Naïve Bayes assume are conditionally independent?

A

All features given the class label.

23
Q

What is one limitation of Naïve Bayes?

A

It assumes all features are independent, which is often not true.

24
Q

What is Laplace smoothing used for in Naïve Bayes?

A

To prevent zero probabilities for unseen feature-class combinations.
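
A sketch of add-one (Laplace) smoothing for a single categorical feature, assuming hypothetical counts:

```python
# Hypothetical counts of the feature value "weather=snowy" within each class.
count_value_in_class = {"walk": 0, "drive": 3}   # "snowy" never seen with "walk"
count_class_total = {"walk": 10, "drive": 10}
num_feature_values = 3  # e.g. sunny / rainy / snowy
alpha = 1               # Laplace (add-one) smoothing constant

for label in count_value_in_class:
    unsmoothed = count_value_in_class[label] / count_class_total[label]
    smoothed = (count_value_in_class[label] + alpha) / (
        count_class_total[label] + alpha * num_feature_values
    )
    print(label, unsmoothed, round(smoothed, 3))
# walk  0.0  0.077  <- no longer zero, so it cannot wipe out the whole product
# drive 0.3  0.308
```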

25
Q

Why is Naïve Bayes fast to train?

A

It only needs to calculate class priors and conditional probabilities.

26
Q

Can Naïve Bayes handle high-dimensional data?

A

Yes, it scales well with many features.

27
Q

What is a strength of Naïve Bayes in terms of generalisation?

A

It is robust to irrelevant features.

28
Q

What is a limitation of Naïve Bayes with correlated features?

A

It can double-count evidence, leading to poor estimates.

29
Q

Is Naïve Bayes useful for real-time applications?

A

Yes, due to its speed and simplicity.

30
Q

How does Gaussian Naïve Bayes predict class membership?

A

By computing the probability of each feature value under a normal distribution per class.
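
As a usage illustration (not part of the cards), scikit-learn's GaussianNB implements this scheme: it estimates per-class means and standard deviations at fit time and combines the per-feature densities with the class priors at prediction time. The data below is made up:

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

# Hypothetical continuous features and class labels, as in the earlier cards.
X = np.array([[21.0, 40.0], [24.0, 35.0], [10.0, 80.0], [8.0, 85.0]])
y = np.array(["walk", "walk", "drive", "drive"])

model = GaussianNB()
model.fit(X, y)                             # estimates per-class mu and sigma
print(model.predict([[22.0, 38.0]]))        # most probable class label
print(model.predict_proba([[22.0, 38.0]]))  # posterior probability per class
```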