W3: Knowledge Clips Flashcards

(31 cards)

1
Q

Triple bottom line

A

Measuring success not only by financial gains, but also by social and environmental impact

2
Q

Moral and ethical dilemmas

A

Involve deeply ingrained societal norms, cultural differences, and philosophical debates. These complexities make it difficult for AI to provide meaningful solutions

3
Q

Behavioural ethics

A

Refers to individual behaviour that is judged according to socially accepted moral norms

4
Q

Trolley problem

A

A runaway trolley is heading down the tracks: do nothing and it hits 5 people, or divert it so that it hits 1 person instead

5
Q

Utilitarianism

A

Consequences are what matters; the greatest good for the greatest number of people

6
Q

Deontology

A

Judges morality based on whether somebody adheres to different rules, duties, or expectations. It looks at motives

7
Q

Footbridge dilemma

A

A trolley is coming down the tracks towards 5 people standing on the track, and 1 person is standing on a footbridge above. If you push that person off the bridge, they will stop the trolley and save the 5 people. People often become more uncomfortable here and start to question utilitarianism, which shows how important context is in guiding our behaviour under a given ethical or moral approach

8
Q

Machine learning

A

Helps us get from place to place, gives us suggestions, translates text, and even understands what we say to it. Computers learn the solution by finding patterns in the data. However, just because something is automatic doesn’t mean it’s neutral: it’s impossible to separate ourselves from our human biases, so they become part of the technology we create

9
Q

Traditional programming

A

People hand code the solution to a problem step by step
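
To make the contrast with the previous card concrete, here is a minimal, hypothetical Python sketch (not from the clips; the function names and data are made up for illustration): the first function is a solution hand-coded step by step, while the second recovers the same rule by finding the pattern (a slope and an intercept) in example data.

# Hypothetical sketch: the same Celsius-to-Fahrenheit task solved two ways.

# Traditional programming: a person hand-codes the solution step by step.
def fahrenheit_rule(celsius: float) -> float:
    return celsius * 9 / 5 + 32  # the rule is written by a human

# Machine learning (simple linear regression): the computer finds the pattern in the data.
data = [(0.0, 32.0), (10.0, 50.0), (20.0, 68.0), (30.0, 86.0)]  # (celsius, fahrenheit) examples
n = len(data)
mean_x = sum(x for x, _ in data) / n
mean_y = sum(y for _, y in data) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in data) / sum((x - mean_x) ** 2 for x, _ in data)
intercept = mean_y - slope * mean_x

def fahrenheit_learned(celsius: float) -> float:
    return slope * celsius + intercept  # the rule is recovered from the examples

print(fahrenheit_rule(25.0), fahrenheit_learned(25.0))  # both ≈ 77.0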

10
Q

Interaction bias

A

Bias introduced through the way people interact with a system. E.g. people were asked to draw shoes: as more people interacted with the game, the computer learned only the kinds of shoes they tended to draw and stopped recognising other types of shoes

11
Q

Latent bias

A

E.g. training a computer to recognise what a physicist looks like using pictures from the past leads to a latent bias skewed towards men

12
Q

Selection bias

A

E.g. you’re training a model to recognise faces. Are you making sure to select photos that represent everyone?
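
As a purely hypothetical sketch of that check in Python (the groups, shares, and tolerance below are made-up assumptions, not from the clips), one could compare how each group is represented in the selected photos against the population the model is meant to serve:

from collections import Counter

# Made-up metadata for the photos selected for training (illustrative only).
training_photos = [
    {"id": 1, "group": "A"}, {"id": 2, "group": "A"}, {"id": 3, "group": "A"},
    {"id": 4, "group": "A"}, {"id": 5, "group": "B"},
]
target_share = {"A": 0.5, "B": 0.5}  # assumed share of each group in the population served

counts = Counter(photo["group"] for photo in training_photos)
total = sum(counts.values())

for group, share in target_share.items():
    actual = counts.get(group, 0) / total
    if abs(actual - share) > 0.1:  # crude tolerance, purely illustrative
        print(f"Possible selection bias: group {group} is {actual:.0%} of the photos, expected ~{share:.0%}")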

13
Q

Robustness

A

Can you assure end users that nobody can hack the AI model in a way that disadvantages some people or benefits one person over another?

14
Q

“Social”

A

Meaning people

15
Q

“Holistically”

A

There are three major things to think about: people, process or governance, and tooling

16
Q

People

A

The culture of the organisation, the diversity of your teams

17
Q

“Wisdom of crowds”

A

The more diverse your group of people, the lower the chance of error
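
One statistical side of this idea can be sketched in a few lines of Python (a hypothetical simulation, not from the clips; the numbers are arbitrary): when estimates are independent rather than sharing the same blind spots, their average tends to land much closer to the truth than a typical individual guess.

import random

random.seed(0)
truth = 100.0
estimates = [random.gauss(truth, 20.0) for _ in range(500)]  # 500 independent, noisy guesses

individual_error = sum(abs(e - truth) for e in estimates) / len(estimates)  # typical single-person error
crowd_error = abs(sum(estimates) / len(estimates) - truth)                  # error of the group average

print(f"average individual error: {individual_error:.1f}")
print(f"error of the crowd's average: {crowd_error:.1f}")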

18
Q

Tooling

A

What are the tools, engineering methods, or frameworks that you can use to ensure the five pillars?

19
Q

Process or governance

A

What are you going to promise employees and the market in terms of standards for the AI model's fairness, explainability, accountability, etc.?

20
Q

Moral machine experiment

A

Looks at who autonomous cars should save. The goal was to open the discussion to the public. How do people’s culture and background affect the decisions they make? It became a viral website

21
Q

Anthropomorphism

A

When humans attribute human characteristics, behaviours, and emotions to non-human entities. It can be a way of making sense of and relating to the world around us. It is often used in storytelling, design, and interaction with technology, as it helps to explain certain phenomena. It also serves as a form of psychological projection, where humans project their own feelings and experiences

22
Q

Artificial General Intelligence (AGI)

A

Can potentially become as intelligent as humans. It would possess the ability to understand, learn, and apply knowledge to solve unfamiliar problems in diverse contexts. This approach is anthropomorphic, taking human intelligence as the gold standard of ultimate intelligence

23
Q

Intelligence

A

Broadly defined as the ability to acquire and apply knowledge, solve problems, adapt to new situations, and learn from experience. This includes thinking of creative solutions, flexibly using contextual and background information, being able to think and understand, and taking emotions into account in ethical considerations

24
Q

Spatial awareness

A

Involves understanding and mentally manipulating objects in space

25
Q

Motoric skills

A

Involve physical coordination and dexterity

26
Q

Emotional intelligence

A

The ability to recognise, understand, and manage one's own emotions, as well as perceive and influence the emotions of others. It is the product of long-term biological evolution and still carries a lot of biological limitations

27
Q

Intelligence (author definition)

A

The capacity to realise complex goals

28
Q

General artificial intelligence (author definition)

A

The non-biological capacity to autonomously and efficiently achieve complex goals in a wide range of environments. It takes away the humanlike aspect of intelligence as the only possible intelligence

29
Q

Moravec's paradox

A

What might be easy for humans is not necessarily easy for AI, and vice versa. Tasks that require high-level cognitive functions and perceptual-motor skills are easy for humans but difficult for AI. On the other hand, computers excel at performing repetitive tasks with high accuracy and speed

30
Q

Task difficulty

A

Subjective, meaning that something is difficult for humans to do

31
Q

Task complexity

A

More objective; it means that a task has many aspects to take into account or to apply solutions to