Lesson 9 - Why the Future Doesn’t Need Us Flashcards

(53 cards)

1
Q

“Why the Future Doesn’t Need Us” Author:

A

William Nelson Joy

2
Q
A
3
Q

Fears of Future Technology

A

❖Imagine a future without the human race
❖Robots and machines replace humans
❖Human existence at the mercy of robots and machines

4
Q

William Nelson Joy is also known as

A

Bill Joy

5
Q

William Nelson Joy is an

A

American Computer Scientist

6
Q

He co-founded Sun Microsystems in

A

William Nelson Joy, 1982

7
Q

He served as Chief Scientist of Sun Microsystems until

A

William Nelson Joy, 2003

8
Q

Famous Essay: “Why the Future Doesn’t Need Us” year published

A

2000

9
Q

Famous Essay: “Why the Future Doesn’t Need Us” (2000) was published in

A

Wired Magazine

10
Q

Expresses concerns over modern technological developments

A

Famous Essay: “Why the Future Doesn’t Need Us” (2000)

11
Q

Bill Joy’s Perspective
(As a computer scientist and inventor)

A

• Urges scientists and society to consider the unintended consequences of technology
• Warns about potentially fatal outcomes
• Advocates for technology regulation
• Raises multiple reasons why regulation is necessary

12
Q

• Joy’s worries focus on the transforming technologies of the _____ century
• GNR (genetics, nanotechnology, and robotics)

A

21st

13
Q

powerful enough to create new classes of accidents, threats, and abuses

A

Emerging Technologies

14
Q

over-reliance on technologies and their products (e.g., antibiotics, dichlorodiphenyltrichloroethane or DDT)

A

Unintended Effects

15
Q

are more dangerous than 20th-century technologies (i.e., nuclear, biological, and chemical weapons or NBC)

A

Inherent Dangers of GNR

16
Q

Less expensive to build and require more common raw materials

A

Inherent Dangers of GNR

17
Q

What Makes GNR Dangerous?

A

• Self-Replication
• Overdependence on Machines
• Possibility of Autonomous Decision-Making

18
Q

Self-Replication

A

Potentially Disastrous
Risk of Knowledge Misuse
Knowledge-Enabled Mass Destruction (KMD)

19
Q

Could lead to uncontrollable self-replication (e.g., nanobots)

A

Potentially Disastrous

20
Q

➢GNR technology requires only knowledge to create
➢Strong fear of information falling into terrorist hands

A

Risk of Knowledge Misuse

21
Q

Possibility of not just weapons of mass destruction but also

A

Knowledge-Enabled Mass Destruction (KMD)

22
Q

Overdependence on Machines

A

Technological Innovation Trends
Machines Operating Independently

23
Q

Rate and direction of innovation may create a world where humans are unnecessary

A

Technological Innovation Trends

24
Q

Potential for machines to function without human intervention

A

Machines Operating Independently


25
Q

Possibility of Autonomous Decision-Making

A

Relinquishing Responsibility
Machines Thinking for Us

26
Q

Society may unwittingly give up the responsibility for important decisions

A

Relinquishing Responsibility

27
Q

Potential for machines to make decisions on our behalf

A

Machines Thinking for Us

28
Q

Careless or intentional acts (e.g., terrorism) could selectively eradicate a group of people through genetic manipulation

A

Creation of a “White Plague”

29
Q

Possibility of humans manipulating genes to merge with other species

A

Re-engineering with Different Species

30
Q

Similar potential to engineer technologies that target and eradicate specific groups

A

Nanotechnology Risks

31
Q

Bill Joy raises concerns about enabling computers and robots to think and make decisions independently

A

Questioning artificial intelligence development

32
Q

Potential for robots to self-replicate, posing a threat to human existence

A

Self-replication risks

33
Q

Possibility of humans evolving into robots to achieve a form of immortality

A

Human evolution into robots

34
Q

Concerns regarding robotics:

A

Questioning artificial intelligence development
Self-replication risks
Human evolution into robots

35
Q

Concerns about Rapid Increase of Computer Power

A

Intelligence Exceeding Humans
Concerns of Robot Rebellion

36
Q

computers may become more intelligent than humans

A

Intelligence Exceeding Humans

37
Q

potential for rebellion against humans as a result of increased intelligence

A

Concerns of Robot Rebellion

38
Q

Unabomber Manifesto author:

A

Theodore Kaczynski

39
Q

unintended consequences of the design and use of technology are related to

A

Murphy’s Law: “Anything that can go wrong will go wrong”

40
Q

over-reliance on antibiotics led to

A

the emergence of antibiotic-resistant strains

41
Q

His article encourages society to take responsibility for its technology.

A

William Nelson Joy / Bill Joy

42
Q

He advocates for preventative measures to decrease the chances of a potential disaster.

A

William Nelson Joy / Bill Joy

43
Q

He states that technological dangers continue to grow because people, though reasonable and concerned, can be erratic and unaware of the consequences of their actions.

A

William Nelson Joy / Bill Joy

44

45
Q

“Why the Future Doesn’t Need Us” was critiqued by

A

John Seely Brown & Paul Duguid (2001)

46
Q

Critique by John Seely Brown & Paul Duguid (2001):

A

“A Response to Bill Joy and the Doom-and-Gloom Technofuturist”

47
Q

Joy failed to consider the _____

A

social factors

48
Q

Joy was accused of being a

A

neo-Luddite

49
Q

is someone who rejects new technologies and exhibits technophobic attitudes

A

neo-Luddite

50
Q

argues that Joy’s pessimism is flawed and that the potential for human adaptation and ethical governance of technology exists

A

Messerly, 2003

51
Q

it is not the technologies themselves that are inherently dangerous, but rather the misuse of these technologies by humans

A

Bringsjord, 2008

52
Q

His article addresses the uncomfortable possibilities of a careless approach to scientific and technological advancements.

A

Bill Joy

53
Q

The _____ and ______ of all stakeholders are crucial in establishing safeguards against the potential dangers of science and technology.

A

roles and responsibilities