Week 12 - Privacy Flashcards
(15 cards)
Privacy
The desire of people to choose freely under what circumstances and to what extent they will expose themselves, their attitudes, and their behaviour to others.
Privacy paradox
When people disclose personal info in ways that are inconsistent with the high value they claim to place on privacy.
E.g., a TL;DR approach to privacy policies (rational ignorance); overload of complicated details (transparency paradox).
Is there actually choice or an illusion of choice?
EU Artificial Intelligence Act (EU AI Act)
- An act that aims to address the risks of unregulated AI applications and promote trustworthy AI aligned with ethical principles. It was proposed in 2021, with enforcement expected to phase in from 2025.
- It will classify AI systems by risk - unacceptable, high, limited and minimal risk.
- It has been criticised for having the potential to stifle innovation and having unclear scope for SMEs.
UK privacy regulation
The UK retained the GDPR as UK data protection law post-Brexit. The Data Protection Act 2018 complements the GDPR.
There is a focus on balancing data-driven innovation with privacy.
Autonomous system
A system that can operate and make decisions independently, without direct human control or constant supervision.
Why do autonomous systems collect lots of data
Autonomous systems collect data constantly because they need it for real-time decision making.
e.g., navigation, Human Computer Interaction (HCI).
Privacy risks with autonomous systems
- Data breaches/unauthorised access: because autonomous systems collect so much data, a single breach can expose a large amount of it.
- The black-box problem: a lack of transparency in how machine learning models arrive at their conclusions.
- Bias in AI leading to unfair outcomes.
- Ethical concerns in surveillance applications.
Privacy preserving technologies
- Federated learning
- Differential privacy
- Encryption
Federated learning
- Training machine learning models on decentralised devices so that raw data never has to be transferred to a centralised server; only model updates are shared.
- Processing data locally is more secure - it removes the need for central raw-data storage and reduces the risk of large-scale data breaches.
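The idea above can be sketched with a toy federated-averaging round (function names are hypothetical; the "model" here is just a mean, standing in for real model parameters):

```python
# Toy federated averaging: each client "trains" locally and only its
# model parameter plus a sample count leaves the device - never raw data.

def local_update(data):
    """Train locally on one client's data; return (parameter, n_samples)."""
    return sum(data) / len(data), len(data)

def federated_average(updates):
    """Server aggregates client parameters, weighted by sample counts."""
    total = sum(n for _, n in updates)
    return sum(param * n for param, n in updates) / total

clients = [[1.0, 2.0, 3.0], [10.0], [4.0, 6.0]]   # raw data stays on-device
updates = [local_update(d) for d in clients]       # only summaries are sent
global_param = federated_average(updates)
```

With count-weighted averaging, the aggregated parameter matches what central training on the pooled data would give, without the server ever seeing the raw data.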
Differential privacy
- Protects individual data points by adding statistical noise to query outputs.
- The noise preserves the aggregate trend, so outputs still give useful insights, while making it statistically implausible to trace a result back to any specific individual.
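A minimal sketch of the Laplace mechanism, one standard way to achieve differential privacy for a count query (the function names and parameter choices are illustrative):

```python
import math
import random

def laplace_noise(scale):
    """Sample Laplace(0, scale) noise via inverse-transform sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(true_count, epsilon, sensitivity=1.0):
    """Release a count with epsilon-differential privacy.

    One person joining or leaving changes a count by at most 1
    (the sensitivity), so noise of scale sensitivity/epsilon hides
    any individual's contribution.
    """
    return true_count + laplace_noise(sensitivity / epsilon)

random.seed(0)
noisy = private_count(1000, epsilon=0.5)  # noise scale = 1 / 0.5 = 2
```

Each release is randomised, so no single output pins down an individual, yet averages over many queries stay close to the true count - the "same aggregate trend" property from the card.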
Encryption
- Converting data (plaintext) into an unreadable format (ciphertext) that can only be accessed with a decryption key.
- During transmission data is encrypted so it cannot be intercepted and read (eavesdropping) or changed (tampering).
- Data is encrypted into ciphertext when sending and decrypted when receiving, so both ends can read it.
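The round trip above can be sketched with a toy XOR cipher. This is illustrative only and NOT secure encryption; real systems use vetted algorithms such as AES (e.g., via the third-party `cryptography` library):

```python
# Toy XOR "cipher" purely to show the plaintext -> ciphertext -> plaintext
# round trip with a shared key. Do not use for real security.

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """XOR each byte with the repeating key; applying it twice decrypts."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = b"secret-key"
plaintext = b"meet me at noon"
ciphertext = xor_cipher(plaintext, key)  # unreadable without the key
recovered = xor_cipher(ciphertext, key)  # same key recovers the plaintext
```

Because XOR is its own inverse, the same function encrypts and decrypts, which mirrors the card's point: both ends share a key, and anything intercepted in between is ciphertext.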
New privacy concerns
- Generative AI and deepfakes: risk of identity theft and spreading misinformation.
- Biometric data: challenges from face/voice/gait recognition technologies.
- Neurotechnology: brain-computer interfaces (BCIs) raise questions about mental privacy.
- Consent fatigue: endless pop-ups result in users ignoring privacy notices.
- Rapid tech evolution: laws struggle to keep up with advancements.
Nothing to hide argument
People claim “if you have nothing to hide, you have nothing to fear”.
Counterpoints of the nothing to hide argument
- Aggregation: Multiple individually harmless data points can be combined to reveal a sensitive conclusion.
- Distortion: Data taken out of context can be misleading/ harmful.
- Exploitation: Collected data can be weaponised for manipulation.
Hopeful trust
- People keep trusting systems even when their privacy is violated.
- Trust is justified by the large number of other users, or by the idea that “if it was really bad, someone would step in”.