Week 13 - Surveillance and Censorship Flashcards

(31 cards)

1
Q

Surveillance

A

Systematic observation or data collection concerning people with the aim of influencing or managing their behaviour.

2
Q

Key concepts of surveillance

A

Consent: are we aware of and okay with being watched?
Power: who has the authority to watch or access the data, and can consent be withdrawn?
Data: what is being collected and how is it used?

3
Q

Types of surveillance

A
  • State/Government surveillance.
  • Corporate surveillance.
  • Personal surveillance.
  • Self surveillance.
4
Q

Covert vs overt surveillance

A

Covert surveillance: techniques used discreetly so the subject is unaware of being monitored.
Overt surveillance: visible and recognisable monitoring methods.

5
Q

State/government surveillance uses

A
  • national security
  • law enforcement and crime prevention
  • public safety (e.g., counterterrorism)
6
Q

State/government surveillance issues

A
  • Privacy violations: lots of data collected, often without informed consent.
  • Power imbalance: government holds vast data, citizens have little oversight.
  • Overreach and abuse: risk of targeting dissidents, indefinite data retention.
7
Q

Corporate surveillance uses

A
  • Profit motive: selling behavioural data, optimising ads.
  • Consumer profiling: predicting preferences and tailoring marketing.
  • Productivity oversight: monitoring employees for efficiency.
8
Q

Corporate surveillance issues

A
  • Lack of consent/transparency: users rarely realise how much is tracked.
  • Data monetisation: data may be sold to third parties.
  • Ethical and legal concerns: biased analytics and manipulative recommendation systems.
9
Q

Personal surveillance uses

A
  • Safety: child protection, home security.
  • Personal convenience: home deliveries, letting family know whereabouts.
  • Peace of mind: tracking personal belongings.
10
Q

Personal surveillance use issues

A
  • Consent and boundaries: monitoring someone else (spouse/child) can erode trust.
  • Misuse and abuse: stalkerware and controlling behaviour in domestic contexts.
  • Data security: Personal devices are susceptible to hacking or data leaks.
11
Q

Self surveillance uses

A
  • Self improvement: health goals, productivity.
  • Personal insight: tracking habits, measuring performance.
  • Sharing achievements: gamification (elements of games being used in other contexts), social bragging rights.
12
Q

Self surveillance issues

A
  • Data privacy: personal health metrics stored on corporate servers.
  • Over-monitoring: obsession with metrics may create anxiety or skew behaviour.
  • Commercial exploitation: collected data can be resold or used for advertising.
13
Q

USA Patriot Act (2001)

A

Signed into law after the 9/11 terror attacks; allowed searches of emails and telephone records without a warrant and openly conducted surveillance on US and foreign citizens.
Criticised for lacking informed consent and for violating the principle of “innocent until proven guilty”.

14
Q

Mass surveillance

A

The practice of spying on a significant part of the population.
E.g., GCHQ’s Karma Police programme collected website metadata, stored it in a repository called Black Hole, and provided a tool called Mutant Broth for searching Black Hole. This violated the legal principle of probable cause.

15
Q

UK surveillance legislation

A
  • Anti-Terrorism, Crime and Security Act, 2001: enabled voluntary retention of communications data, but not the content of communications.
  • Communications Data Bill, 2012 (Snooper’s Charter): Requires all ISPs to store user data for 12 months “to catch criminals and protect children”.
  • Investigatory Powers Bill, 2016 (Snooper’s Charter 2.0): Enables bulk collection of data and requires companies to assist in bypassing encryption.
16
Q

Big data surveillance

A

The systematic collection, analysis, and use of massive datasets for monitoring and control.
Enables predictive policing, counterterrorism strategies, and broader control of populations through pattern recognition.

17
Q

Application areas for big data surveillance

A
  • National security: predictive models for identifying potential threats.
  • Law enforcement: real-time data from internet of things (IoT), CCTV and AI-driven analytics.
  • Corporate security: protecting assets and monitoring employees.
18
Q

Processing techniques for big data surveillance

A
  • Machine Learning (ML): algorithms for behavioural analysis.
  • Natural Language Processing (NLP): used for monitoring communications.
  • Graph theory: used to map social networks e.g., identifying key influencers in a network.
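
The graph-theory bullet can be made concrete with a minimal sketch: represent a social network as an adjacency list and rank members by degree centrality to spot a “key influencer”. All names and connections below are invented for illustration.

```python
# Minimal sketch: a social network as an undirected adjacency list,
# ranked by degree centrality to spot a "key influencer".
# All names and edges are invented for illustration.
network = {
    "alice": {"bob", "carol", "dave", "erin"},
    "bob":   {"alice", "carol"},
    "carol": {"alice", "bob"},
    "dave":  {"alice"},
    "erin":  {"alice"},
}

def degree_centrality(graph):
    """Fraction of the other nodes each node is directly connected to."""
    n = len(graph) - 1  # possible neighbours per node
    return {node: len(neighbours) / n for node, neighbours in graph.items()}

scores = degree_centrality(network)
influencer = max(scores, key=scores.get)
print(influencer)  # alice - connected to every other member
```

Real analyses use richer centrality measures (betweenness, PageRank) over far larger graphs, but the principle of mapping relationships to rank individuals is the same.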
19
Q

Predictive intelligence

A
  • Big data analytics can anticipate events, such as potential crimes or terrorist attacks, using historical data.
  • E.g., PredPol is a predictive policing tool that analyses crime patterns and deploys resources proactively.
  • Can be used for network anomaly detection, fraud detection in financial systems and insider threat detection within organisations.
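
As a toy illustration of the anomaly-detection idea (not how PredPol itself works), the sketch below flags values far from the mean of a series, e.g. an unusually large daily network transfer. The figures and the 2-standard-deviation threshold are invented; with only a few data points an extreme value inflates the standard deviation, so a low threshold is used here.

```python
import statistics

# Toy z-score anomaly detector: flag values far from the mean.
# The traffic figures below are invented for illustration.
def anomalies(history, threshold=2.0):
    """Return values more than `threshold` population stdevs from the mean."""
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history)
    return [x for x in history if abs(x - mean) > threshold * stdev]

daily_mb = [100, 110, 95, 105, 98, 102, 990]  # last value is suspicious
print(anomalies(daily_mb))  # [990]
```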
20
Q

Sousveillance

A
  • The practice of individuals monitoring those in power, such as governments, corporations, or other authorities.
  • E.g., recording police during protests or public events, whistleblowing to expose misuses of power (Edward Snowden).
  • Sousveillance is intended to empower individuals to hold authorities accountable and challenge abuses of surveillance systems.
21
Q

Counterarguments to the “nothing to hide” argument for surveillance

A

Even an innocent person can be made to appear guilty of something.
Possible through:
- Distortion: surveillance can make someone seem guilty through misinterpretation of data or by framing innocent behaviours as suspicious.
- Exclusion: surveillance systems often prevent people from knowing how their data is being used or correcting inaccuracies - errors can misrepresent individuals as criminals.

22
Q

Traditional vs Digital censorship

A
  • Traditional censorship: blocking books, banning films or controlling media broadcasts.
  • Digital censorship: automated systems filtering content, blocking websites or suppressing dissenting opinions online.
23
Q

Actors of censorship

A
  • State actors: governments imposing restrictions to control public discourse (e.g., China’s Great Firewall).
  • Corporate actors: platforms like YouTube, Facebook etc. censor misinformation, hate speech and politically sensitive topics.
  • Algorithmic moderators: AI systems tasked with removing harmful content. Can often result in unintended censorship due to biases.
24
Q

Types of censorship

A
  • Network level censorship: blocking websites or services through DNS tampering, IP blocking or deep packet inspection.
  • Platform level censorship: content moderation on platforms like Twitter, Facebook, YouTube by using algorithms to detect and remove flagged content.
  • Self-censorship: Individuals modify behaviour knowing they are being monitored or flagged (linked to surveillance).
  • Algorithmic censorship: AI filters unintentionally remove content due to training bias or lack of contextual understanding.
25
Q

Network level controls (techniques for censorship)

A
  • Deep packet inspection (DPI): scans packet data in real time to block or restrict specific types of content, such as keywords or URLs.
  • Firewalls: centralised systems for blocking access to domains or IP addresses.
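
The DPI idea can be sketched as a toy payload filter: scan a packet payload for blocked keywords and decide whether to drop it. The keyword list and payloads below are invented; real DPI operates on raw traffic at line rate and must handle encryption, fragmentation, and evasion.

```python
# Toy sketch in the spirit of deep packet inspection: scan a packet
# payload for blocked keywords and decide whether to drop it.
# The keyword list and payloads are invented for illustration.
BLOCKED_KEYWORDS = {"forbidden-site.example", "banned-topic"}

def should_block(payload: bytes) -> bool:
    """Return True if the payload contains any blocked keyword."""
    text = payload.decode("utf-8", errors="ignore").lower()
    return any(keyword in text for keyword in BLOCKED_KEYWORDS)

print(should_block(b"GET / HTTP/1.1\r\nHost: forbidden-site.example"))  # True
print(should_block(b"GET / HTTP/1.1\r\nHost: harmless.example"))        # False
```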
26
Q

Automated content moderation (techniques for censorship)

A
  • AI moderators: natural language processing (NLP) used to detect inappropriate or harmful content.
27
Q

Data manipulation (techniques for censorship)

A
  • Search engine filtering: algorithms prioritising or suppressing search results based on political/corporate interests.
  • Social media echo chambers: algorithms amplify specific content while suppressing opposing views, limiting information diversity.
28
Q

IoT and censorship (techniques for censorship)

A
  • Smart devices: IoT sensors can restrict or block access to specific functionalities (e.g., disabling internet access in certain areas during protests).
29
Q

Ethical considerations in censorship

A
  • Algorithmic transparency: how content moderation algorithms make decisions, and whether those decisions are explainable and justifiable.
  • Bias in AI: training data often includes societal biases, leading to over-censorship of marginalised voices; bias should be reduced or eliminated as far as possible.
  • Balancing free speech and harm reduction: the right balance must be found between free expression and preventing harm.
  • Government vs corporate power: debates over who should decide what content is censored - government, private companies, or the public.
30
Q

Ethical frameworks for privacy in surveillance systems

A
  • Privacy-by-Design: embed privacy features into technology at the design stage.
  • Transparency: clear data usage policies for users.
  • Anonymisation: differential privacy in datasets; limit identifiability in analytics data.
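
The differential-privacy bullet can be illustrated with a minimal sketch of the Laplace mechanism: release a noisy count so that no single individual’s presence measurably changes the output. The dataset and epsilon value below are invented for illustration.

```python
import random

def laplace_noise(scale: float) -> float:
    # The difference of two exponential variables is Laplace-distributed.
    return random.expovariate(1 / scale) - random.expovariate(1 / scale)

def private_count(records, predicate, epsilon=0.5):
    """Noisy count of matching records. The sensitivity of a count is 1,
    so Laplace noise with scale 1/epsilon gives epsilon-differential privacy."""
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(scale=1 / epsilon)

# Invented example: how many users visited a sensitive site.
visits = ["siteA", "siteB", "siteB", "siteA", "siteB"]
noisy = private_count(visits, lambda s: s == "siteB", epsilon=0.5)
print(round(noisy, 2))  # true count is 3, plus random noise
```

Smaller epsilon means more noise and stronger privacy; the output varies between runs by design.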
31
Q

Three core barriers to protecting privacy

A
  • Ignorance: is it realistic to expect to understand every app, device, or platform we use?
  • Futility: the idea that there is no point trying to resist.
  • Foreclosure of alternatives: Big Tech has a near-monopoly.