Deception in Social Robotics Flashcards

1
Q

Paper’s Main Questions

A

What constitutes deception? When is it wrong? Who should be held responsible? Can it be prevented or avoided?

2
Q

Social robot

A

A physically embodied robot capable of socially interacting with people

3
Q

Perspectives on Deception in Social Robotics

A

P1: Techniques enabling robots to detect human social gestures and respond with human-like social cues constitute deception.
- the robot pretends to have mental or emotional capabilities it does not actually have

P2: Deception occurs only when people are misled into thinking that the robot is something it actually is not (e.g. a human or an animal).

4
Q

Deception without intention

A

Intention is not necessary for deception; a robot can deceive even without any intent to deceive.

5
Q

When is deception wrong? Perspectives:

A

P1: Deception involved in a relationship with a robot is inherently wrong and violates a duty to see the world as it is.

P2: Deception in robotics is wrong only when the deceiver deceives for their own interests.

P3 (author's opinion): Whether deception is wrong depends on its actual impact on the deceived.

6
Q

Risks emerging from development and presentation of social robots:

A

Those stemming from the deception involved in robots appearing to have emotions and to care for us (especially significant for children, babies, and elderly people)

Those originating in overestimation of robots' ability to understand human behaviour and social situations

7
Q

Can we prevent deception in social robotics?

A

It is impossible, and even undesirable, to prevent all deception in robotics.

8
Q

Who is responsible for deception in social robotics?

A

The robot itself cannot be responsible, as it only does what it is programmed to do.

Users are partially, but not entirely, responsible, since they did not create the robot and may be vulnerable to different degrees.

Robot manufacturers and marketers bear most of the responsibility, since they build and market the product, the two factors most important in whether deception takes place.

9
Q

Suggestions to minimise the negative effects of deception

A

Legally requiring a robot to continuously remind the user that it is only a machine and has no emotions

Equipping the robot with an emotional system or developing robots with a sense of morality

Requiring manufacturers and sellers to provide evidence, before release, that a given robot application would not cause psychological harm or derogate from any human rights

Preventing robots that masquerade as friends or companions from using users’ data to manipulate them

Requiring any sharing of information obtained by the robot to be made explicit and transparent

Assessing and limiting promotional descriptions of robots that exaggerate their functionality and benefits

10
Q

Pilot Assessment Framework for a Quality Mark for AI-based robotics products:
8 principles:

A

Security
Safety
Privacy
Fairness
Sustainability
Accountability
Transparency
Well-being
