Alternative Classification Techniques Flashcards

1
Q

6 ALTERNATIVE CLASSIFICATION TECHNIQUES

A
  1. Rule-Based Classifier
  2. Nearest Neighbor Classifier
  3. Naïve Bayes Classifier
  4. Artificial Neural Network
  5. Support Vector Machines
  6. Ensemble Method
2
Q

classifies records by using a collection of “if…then…” rules.

A

Rule-Based Classifier
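
To make the idea concrete, here is a minimal sketch of a rule-based classifier in Python (the rules, attribute names, and records are hypothetical, not from the deck). Rules are checked in priority order and the first matching antecedent wins, i.e. an ordered rule set:

    # Minimal rule-based classifier: each rule is (antecedent, class label).
    # The antecedent maps attribute -> required value; rules are checked in
    # priority order and the first match wins (an ordered rule set).
    rules = [
        ({"gives_birth": "no", "aquatic": "yes"}, "fish"),   # hypothetical rules
        ({"gives_birth": "yes"}, "mammal"),
        ({}, "reptile"),                                     # default rule
    ]

    def classify(record, rules):
        for antecedent, label in rules:
            if all(record.get(attr) == val for attr, val in antecedent.items()):
                return label
        return None  # no rule fired (rule set not exhaustive)

    print(classify({"gives_birth": "no", "aquatic": "yes"}, rules))  # -> fish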

3
Q

a rule set in which the rules are ranked according to their priority.

A

Ordered Rule Set

4
Q

another name for an ordered rule set.

A

Decision List

5
Q

2 RULE ORDERING SCHEMES

A
  1. Rule-based Ordering
  2. Class-based Ordering
6
Q

an ordering scheme where individual rules are ranked based on their quality.

A

Rule-based Ordering

7
Q

an ordering scheme where rules that belong to the same class appear together.

A

Class-based Ordering

8
Q

fraction of records that satisfy the antecedent of a rule.

A

Coverage of a Rule

9
Q

fraction of records that satisfy the antecedent that also satisfy the consequent of a rule.

A

Accuracy of a Rule
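
A small worked sketch of both measures (toy records and a hypothetical rule "if gives_birth = yes then mammal"): coverage is computed over all records, accuracy over the covered ones:

    # Toy data set: (gives_birth, class) pairs -- hypothetical values.
    records = [("yes", "mammal"), ("yes", "mammal"), ("yes", "bird"),
               ("no", "bird"), ("no", "reptile")]

    # Rule: if gives_birth = yes then mammal
    covered = [r for r in records if r[0] == "yes"]      # antecedent holds
    correct = [r for r in covered if r[1] == "mammal"]   # consequent also holds

    coverage = len(covered) / len(records)   # 3/5 = 0.6
    accuracy = len(correct) / len(covered)   # 2/3 ≈ 0.67
    print(coverage, accuracy)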

10
Q

2 Characteristics of a Rule-Based Classifier

A
  • Mutually Exclusive Rules
  • Exhaustive Rules
11
Q

2 Effects of Rule Simplification

A
  • Rules are no longer mutually exclusive
  • Rules are no longer exhaustive
12
Q

5 Advantages of Rule-Based Classifier

A
  1. As highly expressive as a decision tree
  2. Easy to interpret
  3. Easy to generate
  4. Can classify new instances rapidly
  5. Performance comparable to decision trees
13
Q

are lazy learners: they do not build a model explicitly, need to store all the training data, and classifying unknown records is expensive.

A

Nearest Neighbor Classifier

14
Q

3 Requirements for Nearest Neighbor Classifier

A
  1. Set of stored records
  2. Distance Metric to compute distance between records
  3. The value of k, the number of nearest neighbors to retrieve.
15
Q

3 Steps to Classify an Unknown Record with the Nearest Neighbor Classifier

A

  1. Compute distance to the other records
  2. Identify the k nearest neighbors
  3. Use the class labels of the nearest neighbors (majority vote)
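
The three steps map directly onto code. A minimal sketch, assuming Euclidean distance, toy 2-D points, and k = 3 (all illustrative choices, not from the deck):

    import math
    from collections import Counter

    # Stored training records: (features, class label) -- hypothetical data.
    train = [((1.0, 1.0), "A"), ((1.5, 2.0), "A"),
             ((5.0, 5.0), "B"), ((6.0, 5.5), "B"), ((5.5, 4.5), "B")]

    def knn_classify(x, train, k=3):
        # 1. Compute distance to the other records.
        dists = [(math.dist(x, feats), label) for feats, label in train]
        # 2. Identify the k nearest neighbors.
        neighbors = sorted(dists)[:k]
        # 3. Use the class labels of the nearest neighbors (majority vote).
        votes = Counter(label for _, label in neighbors)
        return votes.most_common(1)[0][0]

    print(knn_classify((5.2, 5.0), train))  # -> "B"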
16
Q

is robust to isolated noise points, handles missing values by ignoring the instance during probability estimate calculation, and is robust to irrelevant attributes.

A

Naïve Bayes Classifier
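
A minimal sketch of a Naïve Bayes classifier over categorical attributes (toy data; None marks a hypothetical missing value). Note how a missing value is simply skipped when estimating the conditional probabilities, matching the card above:

    # Toy categorical training data -- hypothetical; None marks a missing
    # value, which is skipped when estimating P(attribute value | class).
    train = [({"outlook": "sunny", "windy": "no"}, "play"),
             ({"outlook": "sunny", "windy": None}, "play"),
             ({"outlook": "rain",  "windy": "yes"}, "stay"),
             ({"outlook": "rain",  "windy": "no"}, "stay")]

    def naive_bayes(record, train):
        classes = {label for _, label in train}
        best, best_score = None, 0.0
        for c in classes:
            rows = [feats for feats, label in train if label == c]
            score = len(rows) / len(train)   # prior P(c)
            for attr, val in record.items():
                known = [r[attr] for r in rows if r.get(attr) is not None]
                # conditional independence: multiply per-attribute estimates
                score *= known.count(val) / len(known) if known else 1.0
            if score > best_score:
                best, best_score = c, score
        return best

    print(naive_bayes({"outlook": "sunny", "windy": "no"}, train))  # -> "play"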

17
Q

involves learning the weights of the neurons.

A

Artificial Neural Networks

18
Q

Algorithm for learning Artificial Neural Networks (3):

A
  1. Initialize the weights
  2. Adjust the weights in such a way that the output of ANN is consistent with class labels of training examples.
  3. Find the weights that minimize this error function.
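
A minimal perceptron-style sketch of the three steps on a toy AND data set (single neuron, learning rate 0.1, random initialization -- all illustrative assumptions). Adjusting the weights whenever the output disagrees with the label is what drives the error toward its minimum:

    import random

    # Toy linearly separable data: (inputs, target in {0, 1}) -- hypothetical.
    train = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]  # AND function

    # 1. Initialize the weights (and bias) randomly.
    random.seed(0)
    w = [random.uniform(-0.5, 0.5) for _ in range(2)]
    b = random.uniform(-0.5, 0.5)
    lr = 0.1  # learning rate (assumption)

    # 2./3. Adjust the weights to reduce the error between output and label.
    for _ in range(50):
        for x, target in train:
            out = 1 if w[0]*x[0] + w[1]*x[1] + b > 0 else 0
            err = target - out
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err

    print([(x, 1 if w[0]*x[0] + w[1]*x[1] + b > 0 else 0) for x, _ in train])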
19
Q

uses a hyperplane (decision boundary) to separate the data.

A

Support Vector Machines

20
Q

In SVM, this means the decision boundary is more robust and less susceptible to generalization error.

A

Larger Margins
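
A minimal sketch using scikit-learn (an assumed library choice; the deck names none). It fits a linear SVM on toy data and reports the margin width 2/||w||, the quantity a larger-margin solution maximizes:

    import numpy as np
    from sklearn.svm import SVC

    # Toy linearly separable data -- hypothetical.
    X = np.array([[1, 1], [2, 1], [1, 2], [5, 5], [6, 5], [5, 6]])
    y = np.array([0, 0, 0, 1, 1, 1])

    clf = SVC(kernel="linear", C=1e6)   # large C approximates a hard margin
    clf.fit(X, y)

    w = clf.coef_[0]                    # normal vector of the hyperplane
    margin = 2 / np.linalg.norm(w)      # larger margin -> lower generalization error
    print(clf.predict([[2, 2], [5.5, 5.5]]), margin)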

21
Q

2 Steps of the Ensemble Method

A
  1. Construct a set of (possibly weak) classifiers from the training data.
  2. Predict class label of previously unseen records by aggregating predictions made by multiple classifiers.
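
A minimal sketch of the two steps with hypothetical hand-written weak classifiers standing in for trained models, and majority vote as the aggregation rule:

    from collections import Counter

    # Step 1: a set of (possibly weak) classifiers built from training data.
    # Here they are hypothetical stubs standing in for trained models.
    classifiers = [
        lambda x: "spam" if "win" in x else "ham",
        lambda x: "spam" if "free" in x else "ham",
        lambda x: "spam" if len(x) > 40 else "ham",
    ]

    # Step 2: predict an unseen record by aggregating members' predictions.
    def ensemble_predict(record, classifiers):
        votes = Counter(c(record) for c in classifiers)
        return votes.most_common(1)[0][0]  # majority vote

    print(ensemble_predict("win a free prize now", classifiers))  # -> "spam"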
22
Q

3 Advantages of Ensemble Method

A
  • Improves stability and often also the accuracy of classifiers
  • Reduces variance in prediction
  • Reduces overfitting
23
Q

What is the general idea of the Ensemble Method?

A
  1. Create multiple data sets
  2. Build multiple classifiers
  3. Combine classifiers
24
Q

3 Examples of Ensemble Method

A
  1. Bagging
  2. Boosting
  3. Random Forests
25
Q

3 Steps of Bagging

A
  • Sampling with replacement (bootstrap sampling)
  • Build Classifier
  • Aggregate the classifiers’ results by averaging or voting.
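
A minimal sketch of the three steps (a hypothetical one-feature "decision stump" as the base learner; voting as the aggregation step):

    import random
    from collections import Counter

    def bagging(train, learn, n_models=5, seed=0):
        rng = random.Random(seed)
        models = []
        for _ in range(n_models):
            # 1. Sampling with replacement (bootstrap sampling).
            sample = [rng.choice(train) for _ in train]
            # 2. Build a classifier on the bootstrap sample.
            models.append(learn(sample))
        # 3. Aggregate the classifiers' results by voting.
        return lambda x: Counter(m(x) for m in models).most_common(1)[0][0]

    # Hypothetical weak learner: threshold at the sample mean.
    def learn_stump(sample):
        t = sum(x for x, _ in sample) / len(sample)
        return lambda x: "high" if x > t else "low"

    train = [(1, "low"), (2, "low"), (3, "low"),
             (7, "high"), (8, "high"), (9, "high")]
    predict = bagging(train, learn_stump)
    print(predict(2), predict(8))  # -> low high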
26
Q

an example of an Ensemble Method where records that are incorrectly classified in one round have their weights increased in the next.

A

Boosting

27
Q

is a popular boosting algorithm that typically uses a decision tree as the weak learner.

A

AdaBoost
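
A minimal sketch of one round of the boosting weight update, following the usual AdaBoost formulas (the toy labels and predictions are illustrative assumptions):

    import math

    # Toy records with per-record weights, initially uniform -- hypothetical.
    labels      = [+1, +1, -1, -1]
    predictions = [+1, -1, -1, -1]     # one record misclassified this round
    weights     = [0.25, 0.25, 0.25, 0.25]

    # Weighted error of this round's weak learner.
    err = sum(w for w, y, p in zip(weights, labels, predictions) if y != p)
    alpha = 0.5 * math.log((1 - err) / err)   # classifier's say in the final vote

    # Misclassified records get their weights increased for the next round.
    weights = [w * math.exp(-alpha * y * p)
               for w, y, p in zip(weights, labels, predictions)]
    total = sum(weights)
    weights = [w / total for w in weights]    # renormalize
    print(weights)  # the misclassified record now carries more weight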

28
Q

an example of an Ensemble Method that introduces two sources of randomness: Bagging and a Random Input Vector.

A

Random Forests

29
Q

each tree is grown using a bootstrap sample of the training data.

A

Bagging Method

30
Q

at each node, the best split is chosen only from a random sample of the attributes.

A

Random Vector Method
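
A minimal sketch combining both sources of randomness from the last three cards: each tree is grown on a bootstrap sample (bagging), and each split considers only a random subset of the attributes (random input vector). Trees are reduced to single-split stumps, and all data and parameters are illustrative assumptions:

    import random
    from collections import Counter

    # Toy records: (attribute dict, class label) -- hypothetical.
    train = [({"a": 1, "b": 9}, "x"), ({"a": 2, "b": 8}, "x"),
             ({"a": 8, "b": 2}, "y"), ({"a": 9, "b": 1}, "y")]

    def grow_stump(sample, rng, n_attrs=1):
        # Random input vector: pick the split attribute from a random subset.
        attr = rng.sample(["a", "b"], n_attrs)[0]  # only candidate when n_attrs=1
        t = sum(feats[attr] for feats, _ in sample) / len(sample)
        left = Counter(lbl for feats, lbl in sample if feats[attr] <= t)
        right = Counter(lbl for feats, lbl in sample if feats[attr] > t)
        maj = lambda c: c.most_common(1)[0][0] if c else sample[0][1]
        return lambda f: maj(left) if f[attr] <= t else maj(right)

    rng = random.Random(0)
    forest = []
    for _ in range(7):
        # Bagging: each tree is grown on a bootstrap sample of the training data.
        boot = [rng.choice(train) for _ in train]
        forest.append(grow_stump(boot, rng))

    vote = Counter(tree({"a": 1.5, "b": 8.5}) for tree in forest)
    print(vote.most_common(1)[0][0])  # -> "x" (majority over the forest)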