Lecture 3 - Processing II Flashcards

1
Q

Symbolic vs. connectionist representation

A

Physical symbol system: discrete symbols and explicit relations between them
-has the necessary and sufficient means for general intelligent action (Newell and Simon)… very idealistic
Connectionist: fuzzy, and a concept is distributed across many units (see image)
-no need to associate a single concept with a single thing

(see images)
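A toy contrast between the two styles (purely illustrative; the concepts, triples, and activation values are made up, not from the lecture): a symbolic representation matches all-or-nothing, while a distributed one supports graded similarity.

```python
# Symbolic: one discrete symbol per concept, explicit relations between symbols.
symbolic_kb = {
    ("bird", "has", "wings"),
    ("bird", "can", "fly"),
    ("robin", "is_a", "bird"),
}

def knows(kb, triple):
    """A symbolic fact either matches exactly or it doesn't (all-or-nothing)."""
    return triple in kb

# Connectionist: a concept is a pattern of activation over many units;
# no single unit 'is' the concept, and similar concepts overlap (made-up values).
bird    = [0.9, 0.1, 0.8, 0.7, 0.0]
robin   = [0.8, 0.2, 0.9, 0.6, 0.1]   # close to 'bird' -> similar pattern
penguin = [0.7, 0.1, 0.2, 0.8, 0.9]   # still bird-ish, but the 'flight' unit is low

def similarity(a, b):
    """Dot product: graded, fuzzy resemblance instead of exact match."""
    return sum(x * y for x, y in zip(a, b))

print(knows(symbolic_kb, ("robin", "is_a", "bird")))        # True
print(similarity(bird, robin) > similarity(bird, penguin))  # True
```

The point of the contrast: the symbolic lookup has no notion of "almost a bird", while the distributed code ranks robin as more bird-like than penguin without any explicit rule saying so.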

2
Q

Fodor’s Modularity of Mind (’83)

A

Features:
-Encapsulation and inaccessibility: a module has no access to information inside other modules
-Mandatoriness: processing cannot be switched off (e.g., an English speaker cannot NOT process English)
-Speed: modules operate very quickly
-Superficiality: shallow processing (see lecture)… shallow processing is about the structure/features of the thing being processed, whereas deep processing is semantic (the meaning of the thing). Shallow processing of “dog” would be “brown”, “four legs”, etc., but deep processing could be “that’s MY dog, not just an instance of dog”
-Dissociability: if one module fails, the others are unaffected
-Localizability: each module has a pre-defined location
-Domain specificity: each module does only its own thing and cannot take on another’s function (not true)
-Innateness: we are born with certain modules taking on certain processes

Criticism of Fodor
-in favour: with localized brain damage, other domains remain functional
-against: after brain damage, other areas can take over (plasticity); damage can also cause general, not module-specific, deficits
-against: bilaterality: many areas are duplicated (two brain hemispheres, so doubled areas)

Current view: some things are innate/modular, some are general

3
Q

Minsky’s society of mind (‘86)

A

The mind is made of many different expert subsystems (“agents”); intelligence emerges from their interaction
Organized like a bureaucracy

4
Q

From processing to perception: Template matching theory

A

For each concept you learn, store copies of it (templates); when you see something, check whether it matches a stored template
Storage would grow out of proportion… too many instances to store ☹
No universal matching process in the brain (there is no “difference operator”) ☹
Retrieval time ☹
Applications: can work in very structured settings, like video games 😊
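A minimal template-matching sketch (toy 3x3 letter grids I made up, not a brain model): store one pattern per learned letter, then classify a new pattern by counting mismatching pixels against every stored template.

```python
# One stored binary "template" per learned letter (toy examples).
templates = {
    "T": ["###",
          ".#.",
          ".#."],
    "L": ["#..",
          "#..",
          "###"],
}

def mismatch(a, b):
    """Pixel-by-pixel difference count — exactly the 'difference operator'
    the theory quietly assumes, which has no known neural counterpart."""
    return sum(pa != pb for ra, rb in zip(a, b) for pa, pb in zip(ra, rb))

def classify(pattern):
    """Pick the stored template with the fewest mismatching pixels."""
    return min(templates, key=lambda name: mismatch(templates[name], pattern))

noisy_t = ["###",
           ".#.",
           "##."]          # a T with one corrupted pixel
print(classify(noisy_t))   # "T"
```

The card's criticisms show up directly in the code: to handle every size, font, and rotation you would need yet another template (storage blows up), and the whole scheme stands on the `mismatch` function existing in the first place.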

5
Q

From processing to perception: Feature matching (FMT) / Pandemonium theory

A

A whole bunch of little “demons” in your head: an image demon, feature demons, cognitive demons, and a decision demon, each passing its output up to the next
Ex: recognizing the letter R
Evidence and criticism (see slides):
CNNs (convolutional neural networks) work on the same hierarchical-feature principle 😊
Neurophysiological experiments (single-cell recordings) find feature-detector neurons 😊
Purely bottom-up, so it cannot explain top-down effects ☹

Relisten to explanations about these slides:
Deep neural networks learn hierarchical feature representations
Single-cell recordings from cat visual cortex
The “ABC / 12 13 14” demo: the same ambiguous shape is read as B or 13 depending on context
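A rough Pandemonium sketch (the feature sets are toy assumptions, not from the lecture): feature demons report which features they see; each cognitive demon "shouts" in proportion to how many of its letter's features are present; the decision demon picks whoever shouts loudest.

```python
# Which low-level features each letter's cognitive demon listens for (toy data).
LETTER_FEATURES = {
    "R": {"vertical_line", "curve", "diagonal"},
    "P": {"vertical_line", "curve"},
    "E": {"vertical_line", "horizontal_line"},
}

def decide(image_features):
    """Decision demon: return the letter whose cognitive demon shouts loudest.

    Each cognitive demon's 'volume' is the number of its features that the
    feature demons reported present in the image.
    """
    shouts = {
        letter: len(wanted & image_features)
        for letter, wanted in LETTER_FEATURES.items()
    }
    return max(shouts, key=shouts.get)

# The feature demons have reported these features in the image:
print(decide({"vertical_line", "curve", "diagonal"}))  # "R"
```

Note that information only flows upward here: nothing lets context change what the demons report, which is exactly the top-down criticism (the same shouts would decide B vs 13 identically regardless of whether "A, C" or "12, 14" surrounds the shape).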

6
Q

From processing to perception: Component theory

A

Objects are recognized as arrangements of geons: primitive 3D geometrical shapes

Leads into Marr’s theory of vision

7
Q

Marr’s 3-level framework

A

1- Computational:
-What is the task?
-What are the inputs and outputs?
-Why is it computed?

2- Algorithmic:
-The program, or “symbol transformations”
-The representations used

3- Implementation:
-The neural substrate

(rewatch, I don’t get it yet)
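One standard way to make the three levels concrete (a textbook-style analogy using sorting, not from the lecture): the computational level fixes one input-output task, the algorithmic level offers several different procedures and representations for it, and the implementation level is whatever physical substrate runs the chosen procedure.

```python
# Computational level: WHAT is computed, and why.
#   Task: map a sequence to the same elements in ascending order.

# Algorithmic level: HOW — two different procedures for the same task.

def selection_sort(xs):
    """Repeatedly move the smallest remaining element to the front."""
    xs = list(xs)
    for i in range(len(xs)):
        j = min(range(i, len(xs)), key=xs.__getitem__)
        xs[i], xs[j] = xs[j], xs[i]
    return xs

def merge_sort(xs):
    """Split the sequence, recursively sort the halves, merge them."""
    if len(xs) <= 1:
        return list(xs)
    mid = len(xs) // 2
    left, right = merge_sort(xs[:mid]), merge_sort(xs[mid:])
    out = []
    while left and right:
        out.append(left.pop(0) if left[0] <= right[0] else right.pop(0))
    return out + left + right

# Implementation level: the physical substrate running either algorithm
# (silicon here, neurons in a brain) — invisible from the levels above.
print(selection_sort([3, 1, 2]) == merge_sort([3, 1, 2]))  # True
```

The takeaway for Marr: both functions satisfy the same computational-level description, so observing only input-output behavior cannot tell you which algorithm (let alone which substrate) is being used.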

8
Q

Summary

A

● Representation can be symbolic/connectionist
● Fodor/Minsky → mind = modules (specific, autonomous, etc.)
● Template matching: learn copies of each thing
● Feature Detection/Pandemonium: have dedicated pattern recognizer “daemons”
● Component theory: everything is made out of (3D) primitives (geons)
● Marrʼs tri-levels: computation, algorithm, implementation
