Connectionism Flashcards

(49 cards)

1
Q

Neural network

A

Consists of an input layer, hidden layers, and an output layer
Overall the network transforms any input pattern into a corresponding output pattern, as dictated by the arrangement and strength of the many connections between neurons

2
Q

Chinese gym

A

Many Chinese rooms working together might solve the symbol grounding problem (but we would need very many, since one person corresponds to one neuron)

3
Q

Gradient descent

A

An optimization algorithm for finding a local minimum of a differentiable function
Used in back-propagation
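As a minimal illustrative sketch (the quadratic function, step size, and step count here are hypothetical choices, not from the cards), gradient descent repeatedly steps against the gradient until it settles near a minimum:

```python
# Minimal gradient descent sketch: find a local minimum of the differentiable
# function f(x) = (x - 3)^2 by repeatedly stepping against its gradient.

def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Step opposite the gradient to descend toward a local minimum."""
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# f'(x) = 2 * (x - 3); the minimum is at x = 3.
minimum = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
```

Back-propagation (card 9) applies the same idea, with the gradient taken over all the network's weights at once.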

4
Q

Solution to symbol grounding problem

A

Arises from complexity.
How we assign meaning lies beyond language and humans.
One must look at how neurons code and transform sensory signals, and at the relations between the weights and activations of the neurons of the trained model

5
Q

Weights

A

How strongly one neuron's firing influences another (the lines that connect the units)

6
Q

Activation

A

Whether a neuron/unit is firing (or not)

7
Q

Analogue

A

A continuous range of values (e.g. a firing frequency mapped to a value between 0 and 1), as opposed to a discrete on/off

8
Q

Parallel processing

A

Instead of serial processing (one operation at a time), many operations are performed simultaneously

9
Q

Back-propagation

A

An algorithm for training neural networks by gradient estimation (an optimization strategy).
The strategy exploits the calculated error between the actual values of the processing units in the output layer and the desired values, which are provided by a training signal. The resulting error signal is propagated from the output layer backward to the input layer and used to adjust each weight in the network. As the weights are changed, the network learns to minimize the mean squared error over the training set
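A toy sketch of the idea (the 1-1-1 network, sigmoid units, learning rate, and training target are all hypothetical choices for illustration): the output error is computed, propagated backward via the chain rule, and each weight is adjusted to reduce the squared error.

```python
import math

# Toy back-propagation sketch: a 1-1-1 network learns to map input 1.0
# to target 0.0. The output error is propagated backward to adjust
# both weights, reducing the squared error over the training steps.

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

w1, w2 = 0.5, 0.5              # input->hidden and hidden->output weights
x, target, lr = 1.0, 0.0, 1.0

for _ in range(500):
    h = sigmoid(w1 * x)            # forward pass: hidden activation
    y = sigmoid(w2 * h)            # forward pass: output activation
    err = y - target               # error at the output layer
    # backward pass: chain rule gives the gradient for each weight
    d_y = err * y * (1 - y)        # delta at the output (sigmoid derivative)
    d_h = d_y * w2 * h * (1 - h)   # delta propagated back to the hidden layer
    w2 -= lr * d_y * h             # adjust each weight down its gradient
    w1 -= lr * d_h * x
```

Over the 500 steps the output moves from about 0.58 toward the target of 0.0.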

10
Q

Fundamental research program of classical AI

A

Identify the undoubtedly complex function that governs the human pattern of response to the environment and then write the program (the set of recursively applicable rules) by which the SM machine will compute it.

11
Q

Classical AI

A

It’s like a programmed robot following a strict set of human-made rules

12
Q

Anatomic points in the brain that inspired connectionism

A

Nervous system: Parallel machines
The neurons are comparatively simple: Analog response
- They are somewhat digital - it is firing or not
- We look at the firing frequency and translate it into a value between 0 and 1
- This contrasts with a computer's discrete 0s and 1s
In the brain, axons projecting from one neuronal population to another are often matched by axons returning from their target population

13
Q

Function (connectionism)

A

Any vector-to-vector transformation
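For illustration (the weight values here are made up), a single layer of weighted connections implements exactly such a vector-to-vector transformation:

```python
# Sketch: one network layer as a vector-to-vector transformation.
# The input vector is multiplied by a weight matrix; each output component
# is the weighted sum of all inputs (activation functions omitted for brevity).

def transform(weights, vector):
    """Map an input vector to an output vector via a weight matrix."""
    return [sum(w * v for w, v in zip(row, vector)) for row in weights]

weights = [[1.0, 0.0], [0.5, 0.5]]      # 2x2 weight matrix (illustrative values)
output = transform(weights, [2.0, 4.0])  # -> [2.0, 3.0]
```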

14
Q

Materialism

A

Everything is physical

15
Q

Dualism

A

Distinction between body and mind

16
Q

NETtalk

A

Converts English text to speech sounds.
1. Receives as input a letter in a word
○ Local representation
2. Performs the transformation
3. Yields the elementary speech sounds
Has 309 processing units and 18,629 connection strengths (weights) that must be specified. The network does not have any initial or built-in organization for processing the input or (more exactly) mapping letters onto sounds; all the structure emerges during the training period. The values of the weights are determined by using the “back-propagation” learning algorithm developed by Rumelhart, Hinton, and Williams (1986)
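As a simplified sketch of a local representation (NETtalk's actual input encoding over a window of letters is more elaborate than this): each letter activates exactly one input unit.

```python
import string

# Simplified "local" (one-hot) representation: each letter activates
# exactly one of 26 input units.

def one_hot(letter):
    """Return a 26-element vector with a single 1 at the letter's index."""
    vec = [0] * 26
    vec[string.ascii_lowercase.index(letter)] = 1
    return vec

encoding = one_hot("c")   # third unit active: [0, 0, 1, 0, ...]
```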

17
Q

SM machine

A

Symbol manipulating machine

18
Q

Church’s thesis

A

Every effectively computable function is recursively computable

19
Q

Effectively computable

A

There is a “rote” procedure for determining, in finite time, the output of the function for a given input

20
Q

Recursively computable

A

There is a finite set of operations that can be applied to a given input, and then applied again and again to the successive results of such applications, to yield the function’s output in finite time

21
Q

The Luminous room (Maxwell)

A

Shows that an argument can be valid without being sound.
A parody of the Chinese room argument, applied to Maxwell's (theoretical) account of light:
Axiom 1. Electricity and magnetism are forces.
Axiom 2. The essential property of light is luminance.
Axiom 3. Forces by themselves are neither constitutive of nor sufficient for luminance.
Conclusion 1. Electricity and magnetism are neither constitutive of nor sufficient for light.
Problem: If you stand in a dark room with a magnet there will be no light
Maxwell’s responses:
The “luminous room” experiment is a misleading display of the phenomenon of luminance because the frequency of oscillation of the magnet is absurdly low, too low by a factor of 10^15.
But the room already contains everything essential to light according to Maxwell’s own theory

22
Q

Main problems with the Chinese room

A

What people can or cannot imagine often has nothing to do with what is or is not the case, even where the people involved are highly intelligent
Axiom 3 is false

23
Q

The failure of classical AI

A

The functional architecture of classical SM machines is simply the wrong architecture for the very demanding jobs required

24
Q

Process of training up a network

A

There are various procedures for adjusting the weights so as to yield a network that computes almost any function (that is, any vector-to-vector transformation) one might desire. In fact, one can even impose on it a function one is unable to specify, so long as one can supply a set of examples of the desired input-output pairs.
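A sketch of this idea using the simplest such procedure (the perceptron rule on the AND function; the learning rate and epoch count are arbitrary illustrative choices): the function is imposed on the network purely by supplying example input-output pairs.

```python
# Sketch: adjusting weights from example input-output pairs (here with the
# simple perceptron rule; back-propagation plays this role in multilayer nets).
# The AND function is never specified; it is imposed purely via examples.

examples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w = [0.0, 0.0]
bias, lr = 0.0, 0.1

def predict(x):
    return 1 if w[0] * x[0] + w[1] * x[1] + bias > 0 else 0

for _ in range(25):                      # repeat over the training set
    for x, target in examples:
        error = target - predict(x)      # compare output to the desired value
        w[0] += lr * error * x[0]        # nudge each weight toward the target
        w[1] += lr * error * x[1]
        bias += lr * error
```

After training, the network reproduces all four input-output pairs.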

25
Q

Why is it important that the system is parallel?

A

- A parallel architecture provides a dramatic speed advantage over a conventional computer, for the many synapses at each level perform many small computations simultaneously instead of in laborious sequence.
- Massive parallelism means that the system is fault-tolerant and functionally persistent; the loss of a few connections, even quite a few, has a negligible effect on the character of the overall transformation performed by the surviving network.
- A parallel system stores large amounts of information in a distributed fashion, any part of which can be accessed in milliseconds.
26
Q

Weight space

A

The space of all possible weights

27
Q

Naturalism

A

The idea that only natural laws and forces operate in the universe

28
Q

Mind-body problem

A

Are mental phenomena actually phenomena of the physical brain?

29
Q

Identity theory

A

Mental states are identical to states in the brain

30
Q

Reductions

A

Explanations of phenomena described by one theory in terms of the phenomena described by a more basic theory

31
Q

Reductionist

A

Integration of psychological theory and neurobiological theory

32
Q

Sententialism

A

The research paradigm holding that it is the task of cognitive science to figure out what programs the brain runs, and that neuroscience can then check these top-down hypotheses against the wetware to see if they are generally possible
33
Q

Connectionist model

A

Characterized by connections and differential strengths of connection between processing units.
Designed to perform a task by specifying the architecture: the number of units, their arrangement in layers and columns, the patterns of connectivity, and the weight or strength of each connection
34
Q

Processing units (connectionist model)

A

Neurons that communicate with one another by signal (firing rate)

35
Q

Computational level (connectionist model)

A

The task is specified

36
Q

Implementation level (connectionist model)

A

The task is physically instantiated
37
Q

Boolean dream

A

(Old idea) All cognition is symbol manipulation according to the rules of logic

38
Q

Neurobiologists' dream

A

(Old idea) The faith that the answers we seek will be manifest once the fine-grained details of each neuron are revealed
39
Q

Marr's dream of three levels of explanation

A

In Marr's view, a higher level was independent of the levels below it; hence computational problems could be analyzed independently of an understanding of the algorithm that executes the computation, and the algorithmic problem could be solved independently of an understanding of the physical implementation.
Marr's assessment of the relations between levels has been reevaluated, and the dependence of higher levels on lower levels has come to be recognized.
Marr's three levels of analysis and the brain's levels of organization do not appear to mesh in a very useful or satisfying manner.
40
Q

Connectionist

A

Network models are not independent of either the computational level or the implementational level; maybe there are more than three levels (more than one implementational level)
41
Q

Tensions and shortcomings in computationalism

A

It is bound up with the research program of AI.
Example: the robot Shakey. Why is it so slow, when computers usually are so fast? Signal propagation in a computer is much faster than in the brain (~1e6 times faster).
The context problem
42
Q

The context problem (Dreyfus' argument)

A

Humans have inarticulate background knowledge, which can be applied in a flexible manner to changing contexts.
In classical artificial intelligence, background knowledge needs to be stored explicitly. Access then faces a double problem: search time is long, and what should be searched for (what is the context)?
43
Q

Why neural networks?

A

Reverse engineering the brain - and hopefully we will get the mind as well

44
Q

Redundancy (Fault tolerance)

A

The network doesn't break down if we remove one piece

45
Q

Flexible storage

A

All knowledge is stored in the system

46
Q

Advantages of parallelism

A

Speed
Redundancy
Flexible storage

47
Q

Representation

A

Emerges from the structure of the neural network
48
Q

Emergent properties

A

A property of a complex system is said to be ‘emergent’ just in case, although it arises out of the properties and relations characterizing its simpler constituents, it is neither predictable from, nor reducible to, these lower-level characteristics.
49
Q

Begging-the-question fallacy

A

An attempt to prove something is true while simultaneously taking that same thing for granted