Final Flashcards

1
Q

Machine learning

A

Machine learning is the study of algorithms that:
improve their performance P
at some task T
with experience E

Instead of writing the program directly, you feed the data and the desired output into the computer, and it comes out with a program.

Examples of tasks best solved by learning from examples: recognizing patterns, generating patterns, recognizing anomalies, prediction/recommendation

2
Q

Machine learning problems

A

Categorization: Discrete supervised learning
Clustering: Discrete unsupervised learning
Regression: Continuous supervised learning (function that predicts y given x)
Dimensionality reduction: Continuous unsupervised learning

3
Q

Test-set method

A

To avoid the problem of overfitting (fitting the noise in the data with a model that is too precise).
Randomly choose 30% of the data to be the test set, keep the remainder as the training set, perform your regression on the training set, and estimate your future performance with the test set.
It imposes a penalty for unnecessary complexity (in the example comparing fits, the quadratic model performed best on the test set).
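
A minimal sketch of the test-set method in Python; the 70/30 split, the toy quadratic data, and the polynomial degrees compared are illustrative assumptions, not from the card:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: a noisy quadratic relationship (illustrative only)
x = rng.uniform(-3, 3, 200)
y = 1.0 + 0.5 * x - 0.8 * x**2 + rng.normal(0, 1.0, size=x.shape)

# Randomly hold out ~30% of the data as the test set
idx = rng.permutation(len(x))
n_test = int(0.3 * len(x))
test, train = idx[:n_test], idx[n_test:]

# Fit polynomials of increasing complexity on the training set only
for degree in (1, 2, 6):
    coeffs = np.polyfit(x[train], y[train], degree)
    pred = np.polyval(coeffs, x[test])
    mse = np.mean((pred - y[test]) ** 2)  # estimate of future performance
    print(f"degree {degree}: test MSE = {mse:.2f}")
```

An overly complex fit (here degree 6) can chase noise in the training data; the held-out test error is what exposes that unnecessary complexity.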

4
Q

KNN

A

K-nearest neighbour
Assign a label to an item based on the class of its 1, 3, or however many nearest neighbours; the item is classified by a plurality vote of its neighbours
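
A minimal k-nearest-neighbour sketch in plain Python/NumPy; the Euclidean distance metric and the tiny 2-D example points are assumptions for illustration:

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x, k=3):
    """Classify point x by a plurality vote of its k nearest training neighbours."""
    # Euclidean distance from x to every training point
    dists = np.linalg.norm(X_train - x, axis=1)
    nearest = np.argsort(dists)[:k]            # indices of the k closest points
    votes = Counter(y_train[i] for i in nearest)
    return votes.most_common(1)[0][0]          # plurality vote

# Tiny illustrative example
X_train = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y_train = np.array(["red", "red", "blue", "blue"])
print(knn_predict(X_train, y_train, np.array([0.2, 0.1]), k=3))  # -> "red"
```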

5
Q

Risk attitudes

A

§ You’re risk-averse if you have two alternatives with the same EV and you choose the alternative with less variation in outcomes (typical for gains).
§ You’re risk-neutral if you have two alternatives with different variations in outcomes and the same EV and you’re indifferent between those alternatives.
§ You’re risk-seeking if two alternatives have the same expected monetary value and you choose the one with the highest variability (typical for losses).
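
A tiny worked example in Python (the payoff numbers are made up): both gambles have the same expected value, so which one a person prefers reveals their risk attitude.

```python
# Gamble A: 100% chance of 50.  Gamble B: 50% chance of 0, 50% chance of 100.
outcomes_a, probs_a = [50], [1.0]
outcomes_b, probs_b = [0, 100], [0.5, 0.5]

def ev(outcomes, probs):
    # Expected (monetary) value
    return sum(o * p for o, p in zip(outcomes, probs))

def variance(outcomes, probs):
    # Spread of outcomes around the expected value
    m = ev(outcomes, probs)
    return sum(p * (o - m) ** 2 for o, p in zip(outcomes, probs))

print(ev(outcomes_a, probs_a), variance(outcomes_a, probs_a))  # 50.0 0.0
print(ev(outcomes_b, probs_b), variance(outcomes_b, probs_b))  # 50.0 2500.0
# Same EV of 50: risk-averse -> prefer A, risk-seeking -> prefer B, risk-neutral -> indifferent
```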

6
Q

Heuristics

A

§ Availability
The easier it is to consider instances of class Y, the more frequent we think it is
§ Representativeness
The more object X is similar to class Y, the more likely we think X belongs to Y
§ Anchoring
Initial estimated values affect the final estimates, even after considerable adjustments

7
Q

Gambler’s fallacy

A

Assuming that a departure from what occurs on average will be corrected in the short run.
In other words: because an event has not happened recently, it has become “overdue” and is more likely to occur.

8
Q

Design principles for software

A

User familiarity
* The interface should be based on user-oriented terms and concepts rather than computer concepts, e.g. letters, documents, folders, etc. rather than directories, file identifiers, etc.
Consistency
* Information, commands, and menus should have the same format.
Minimal surprise
* If an action operates in a known way, the user should be able to predict the operation of comparable commands.

9
Q

Modality

A

A human-machine interface is modal with respect to a given gesture when (1) the current state of the interface is not the user’s locus of attention, and (2) the interface will execute one among several different responses to the gesture, depending on the system’s current state.

10
Q

Affordances

A
• The perceived and actual fundamental properties of the object that suggest how it could be used
• As electronic devices grow in complexity and functionality, understanding an object’s functions from its appearance grows harder
• Just by looking, the user should know: state of the system + possible actions