Lab 1 Questions Flashcards

1
Q

In what year did Alan Turing propose the Turing Test?

A

1950

2
Q

In what year was the term “artificial intelligence” first used?

A

1956

3
Q

While at the Cornell Aeronautical Laboratory, Frank Rosenblatt develops the perceptron,
the first artificial neuron, setting the stage for the development of networks of
artificial neurons, a.k.a. neural networks.

A

1957

4
Q

Joseph Weizenbaum develops Eliza, the first chatbot. It is based on a set of rewrite rules
written by hand.

A

1966

5
Q

Marvin Minsky & Seymour Papert publish a book showing the limits of perceptrons
and arguing for more work in symbolic computation (a.k.a. Good Old-Fashioned AI, GOFAI).
The book is often cited as the main reason for the abandonment of research on neural networks.

A

1969

6
Q

The Lighthill and ALPAC reports, which report that little progress has been made in AI,
kill research funding and lead to the first AI Winter.

A

early 1970s

7
Q

The robot Shakey, programmed in LISP, resulted in the development of the A* search
algorithm.

A

1972

8
Q

Alain Colmerauer, who was a professor at the University of Montreal for a few years,
develops Prolog, a programming language based on logic that is very popular in AI for
writing rule-based systems.

A

1972

9
Q

Marvin Minsky develops frames to reason with world knowledge. Years later, frames
turn out to be the basis of object-oriented programming.

A

1974

10
Q

The expert system MYCIN is developed to recognise bacterial infections and recommend
antibiotics. Its recommendations are often better than those of human experts. It is
based on a knowledge base of ≈ 600 hand-written rules (written in Lisp) and developed
in collaboration with medical doctors.

A

1975

11
Q

The METEO rule-based machine translation system, developed at the University of Montreal, is deployed at Environment Canada to translate weather forecasts from English to
French.

A

1975

12
Q

Expert systems, such as MYCIN, and other types of systems made of hand-written rules
are considered too expensive to maintain and to adapt to new domains. The industry
drops research on such systems. It is the 2nd AI Winter.

A

early 1980s to early 1990s

13
Q

Corinna Cortes and Vladimir Vapnik develop an approach to machine learning called soft
margin Support Vector Machines (SVM), which quickly becomes one of the most popular
machine learning algorithms.

A

1993

14
Q

After finishing his PhD on handwriting recognition, Yann LeCun releases the MNIST
dataset. The dataset contains 70,000 images of handwritten digits and becomes a standard
benchmark for evaluating machine learning algorithms.

A

1998

15
Q

Google launches its Google Translate service based on Statistical Machine Translation.
Translation rules are found automatically based on a statistical analysis of parallel texts
in different languages.

A

2006

16
Q

Netflix initiates a machine learning competition to beat its own film recommendation
system. It provides a dataset of about 100 million movie ratings to learn recommendations automatically.

A

2006

17
Q

The AlexNet system, developed at the University of Toronto by Alex Krizhevsky, a PhD
student of Geoffrey Hinton, wins the ImageNet Challenge and shows that deep learning
techniques can achieve significantly better results than classical machine learning techniques in image processing. This is considered to be a defining moment in the history of
AI.

A

2012

18
Q

Google launches its Neural Machine Translation system based on recent advances in Deep
Neural Networks, and gradually drops the Statistical Machine Translation approach
of the 2000s.

A

2016

19
Q

While having a beer at Les 3 Brasseurs at the corner of McGill College and Ste-Catherine,
Ian Goodfellow, a PhD student of Yoshua Bengio, comes up with an idea for a neural network that could generate realistic images. He calls the approach Generative Adversarial Networks,
a.k.a. GANs.

A

2014

20
Q

AlphaGo beats the world champion at the game of Go. AlphaGo's strategy is learned
automatically by playing a large number of games against itself.

A

2016

21
Q

Canada invests massively in three AI research institutes: Amii in Alberta, Mila in Montréal,
and the Vector Institute in Toronto.

A

2017

22
Q

The painting Portrait of Edmond Belamy, created by a Generative Adversarial Network (GAN), sells
for $432,500 at a Christie's auction in New York City.

A

2018

23
Q

The Association for Computing Machinery (ACM) names Yoshua Bengio (from MILA,
Montreal), Geoffrey Hinton (from the Vector Institute, Toronto), and Yann LeCun (from
Facebook) recipients of the Turing Award for their contributions to the field of AI and
Deep Learning.

A

2018

24
Q

OpenAI publishes its GPT-2 model, astounding both the public and experts with its text
generation capabilities, such as the Unicorn story generated from a prompt.

A

2019

25
Q

OpenAI releases ChatGPT, a large language model (LLM) based on GPT-3, which has
a huge impact on the adoption of AI tools worldwide.

A

2022
