Topic 6: Privacy Flashcards
(19 cards)
Technological Privacy Protection Strategies
- Audience Segmentation: Choose post visibility
- Un-tag: Remove yourself from tagged posts
- Delete: Remove posts, including ephemeral content e.g. stories
- Block: Prevent contact with specific individuals
- Deactivate: Terminate accounts
Social Privacy Protection Strategies
- Social Steganography: Embedding meaning in shared social context so that only certain readers can see the underlying message of a post
- Self-Censorship: Reveal only what you want to reveal online
- Multiple Spaces: Segment audience with multiple accounts
- White Walling: Post no content
Information Asymmetry
Social media companies hold more information about users than users explicitly disclose; e.g. observed behavioural data, such as which content is liked, is collected to infer details of gender, ethnicity, and sexuality
Digital Self
An online representation of an individual, shaped through their interactions and behaviour, consisting of:
1. A Digital Mosaic
2. A Digital Persona
Digital Mosaic
A multifaceted collection of a user’s digital presence built from digital footprints:
- Digital Footprints: Traces left behind from online activity, such as from watching a video
- Third-Party Footprints: Traces left behind by a user's connections/friends
Digital Persona
A digital persona is an aggregation of multiple data sources (including digital mosaics and third-party data). A user can have multiple digital personas, which are all encapsulated under one digital self
Categories in Westin’s Attitudes to Privacy
- Fundamentalists: Highly distrustful of data-collecting organisations, they prioritise privacy rights and legal protections over consumer benefits
- Pragmatists: Balance privacy concerns with consumer advantages, they expect organisations to earn trust and provide opt-out choices
- Unconcerned: Trust organisations easily and are willing to trade privacy for consumer benefits, they oppose additional privacy regulations
Article 12 of the UN’s Universal Declaration of Human Rights (UDHR)
Safeguards privacy and dignity to ensure individuals are not unfairly subject to surveillance, defamation, or intrusion without cause
Article 7 of the EU’s Charter of Fundamental Rights (CFR)
Mirrors Article 12 of the UDHR, protecting respect for private and family life, home, and communications
Article 8 of the EU’s Charter of Fundamental Rights (CFR)
Establishes protection of personal data, requiring data to be processed fairly, for specified purposes and with consent or legal justification. (Basis of GDPR)
What does Intellectual Privacy safeguard?
The ability to freely think, explore ideas and form beliefs without fear of external pressure, crucial for independent thought and creativity
Why is privacy important?
It acts as a barrier against excessive state control, it protects democracy by ensuring individuals can think critically, express dissent and contribute to democratic processes without self-censoring
Privacy Trade-off
A balance between what information is shared and what is kept private e.g. a company receives information about you and you receive free services / tailored content
Privacy Calculus
The process of weighing perceived costs/risks (loss of privacy, data misuse) against perceived benefits (convenience, personalisation). Users are unlikely to disclose data if perceived risks outweigh perceived benefits
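As a rough illustration (not a formal model from the source), the privacy calculus can be sketched as a simple weighing of perceived benefits against perceived risks; the function name and the numeric scores below are hypothetical assumptions:

```python
# Illustrative sketch of the privacy calculus: a user discloses data
# only when perceived benefits outweigh perceived costs/risks.
# The scores here are hypothetical, not from the source.

def will_disclose(perceived_benefits: float, perceived_risks: float) -> bool:
    """Return True if the user is likely to disclose data."""
    return perceived_benefits > perceived_risks

# Example: convenience/personalisation valued at 0.7,
# perceived risk of data misuse at 0.4 -> user discloses.
print(will_disclose(0.7, 0.4))  # True
print(will_disclose(0.3, 0.6))  # False
```

Note this is a deliberately minimal decision rule; real users weigh many factors, and (per the privacy paradox below) often deviate from it in practice.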
The Privacy Paradox
The discrepancy between individuals’ expressed attitudes towards privacy and their actual behaviour (people report intending to share little personal data but in practice share more than anticipated), due to:
- Bounded Rationality
- Behavioural Anomalies
Bounded Rationality
The capability of individuals to make fully rational decisions is limited by:
- Incomplete Information
- Asymmetric Information
- Rational Ignorance: The cost of learning something outweighs the perceived benefit of knowing it
Behavioural Anomalies
A set of behavioural patterns that deviate from true rationality
- Valence Effect: Tendency to overestimate the likelihood of favourable events
- Overconfidence Bias: Tendency to overestimate abilities and knowledge
- Status Quo Bias: Tendency to prefer things to remain unchanged
- Reciprocity and Fairness: Innate desire to act fairly in transactions with others
- Inequity Aversion: Preference for fairness; individuals dislike when others receive resources they do not deserve
Structuration
The process by which social structures (rules, norms, etc.) are both created and reinforced by the actions of individuals, while those structures in turn shape and constrain those actions
8 Approaches to Solving Privacy Issues
- Transparency: Through laws (e.g. privacy notice)
- Purpose Limitation: Data must be collected for a defined purpose, e.g. under the GDPR
- Self-management: Give users more control over their data but requires digital literacy to be effective
- Right to be Forgotten: Inadequate or irrelevant data should be deleted
- Data Amnesia: Add expiry dates to data so no intervention is required from individuals. Challenging to implement as it requires new policy and technical design
- Anonymity: Remove or encrypt identifying information
- Safe Haven: Keep data under strict access controls, e.g. census data
- Privacy by Design: Incorporate privacy into every step of the design process
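The "Data Amnesia" approach above could be sketched as records carrying an expiry timestamp, after which they are filtered out without any intervention from the individual; the class, field names, and retention period below are illustrative assumptions, not part of any real policy or system:

```python
# Illustrative sketch of data amnesia: each stored record carries an
# expiry date, and expired records are treated as deleted automatically.
# Names and the 30-day retention period are hypothetical assumptions.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Record:
    value: str
    expires_at: datetime

def live_records(records: list[Record], now: datetime) -> list[Record]:
    """Return only records that have not yet expired."""
    return [r for r in records if r.expires_at > now]

now = datetime(2024, 1, 31)
records = [
    Record("liked post", now - timedelta(days=1)),    # already expired
    Record("profile bio", now + timedelta(days=30)),  # still live
]
print([r.value for r in live_records(records, now)])  # ['profile bio']
```

As the flashcard notes, the hard part in practice is not this filtering logic but the policy and technical design needed to attach and enforce expiry dates across all systems that copy the data.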