Justis International Law & Technology Writing Competition 2020 Essay

My article The ‘Personal’ in Personal Data: Who is responsible for our data and how do we get it back? won best essay in the Social Media, Data and Privacy category as part of the Justis International Law & Technology Writing Competition, with entries from students at 98 universities in 30 countries around the world. In the piece, I outline how data subjects may feel disempowered from the data protection process, why data controllers only represent part of the problem, and the necessity for a multidisciplinary, collaborative solution – which is what I'm working on for my PhD!

The ‘Personal’ in Personal Data: Who is responsible for our data and how do we get it back?

In our data-driven society, every piece of technology that connects us to the internet collects our personal data (any information relating to an identified or identifiable natural person), building elaborate profiles on what we are doing, where we are, and even who we are.[1] As data subjects (those about whom personal data are collected), we can no longer hide from data controllers (those who collect these data and determine what they are used for). With every data breach and data sharing revelation, from Cambridge Analytica[2] to Google’s Project Nightingale,[3] our personal data are becoming less personal: data attached to our identity are no longer in our control, and it becomes harder for us to identify who is responsible.

The data subject’s struggle

Recognising the need to protect privacy as an individual’s right, data protection attempts to rebalance power between data subjects and data controllers. The European General Data Protection Regulation (GDPR)[4] grants data subject rights such as the right of access,[5] right to be forgotten,[6] and right not to be subject to a decision based solely on automated processing.[7] Data controllers must also follow the principles of data protection by design and by default.[8] However, even with the GDPR, data subjects still lack the extra hours and cognitive capacity to exercise these rights.[9] Only 15% of EU citizens feel completely in control of their personal data.[10] Additionally, while there are multiple means for lawful processing of personal data,[11] data controllers have weaponised consent by using privacy policies written in legalese and dark patterns to hide privacy-protecting options, obfuscating how data subjects’ data are reused, aggregated, and anonymised to make decisions about them.[12]

Everyone is a data controller

Responsibility over personal data is further complicated by judgements that adopt expansive interpretations of who could be considered a data controller. A user who administers a Facebook Group or Page,[13] a website operator who embeds a Facebook ‘like’ button or other social plug-ins,[14] and a religious community whose congregation conducted preaching activities and collected personal data[15] are all ‘joint controllers’, each liable if one controller breaches requirements on those data. This significantly increases the number of data controllers and people responsible for personal data, as joint controllership can arise even where not all joint controllers have access to the data. While these judgements introduce more responsibility, they also disperse where data responsibility lies, increasing the ambiguity over who can share, reuse, and repurpose data.

From my data to our data to your data

Beyond the individual, initiatives such as Decode encourage public institutions to be more responsible with their citizens’ data.[16] However, governments continue to watch over their people through social credit scoring,[17] criminal sentencing,[18] and partnerships with privately-owned, pervasive technologies.[19] In the age of surveillance capitalism,[20] where personal experiences are translated into free raw material for behavioural data, our personal and derived data are collectively used against us. Although data protection and information rights enable some forms of transparency and accountability, our data are still often used without our knowledge and without legal recourse, as decisions are made using unexplainable black-box algorithms.[21]

Reclaiming our personal data

In order to better understand how our personal data are being used and abused, we need to look beyond data protection on an individual level. Instead, privacy should represent an ecosystem that requires legal and socio-technical collaboration between lawyers, technologists, policy makers, and most importantly, us as data subjects.

Firstly, stronger regulation beyond data protection is required to fully realise the responsibility data controllers have over our personal data. While the European Data Protection Board has established guidelines to clarify the GDPR,[22] further regulatory guidance has so far been provided only by academics and has yet to be codified.[23] Regulators should do more to prevent ‘ethics washing’, whereby data companies use ethics boards and policies to limit regulation.[24] Competition law in particular may help us escape the grasp of digital behemoths. Looking beyond fines, Margrethe Vestager, the EU’s competition commissioner, plans to regulate industries such as artificial intelligence and gig economy companies to return the ethos of ‘consumer is king’ back to data subjects.[25] Other mechanisms include legal data trusts, which empower data subjects by facilitating access to pre-authorised, aggregated data and removing key obstacles to realising the potential underlying large datasets.[26]

Secondly, although many of the challenges described are driven by the business models of data controllers, technology should be considered part of, and not excluded from, solutions that help data subjects better understand how their data are processed and managed. Tools such as Databox,[27] Jumbo Privacy,[28] and DoNotPay[29] are already beginning to challenge the data protection practices of Big Tech companies, providing alternatives to existing services and mechanisms for control.

Finally, in considering how personal data should be best protected, data protection must be considered beyond the individual. Data protection should look beyond privacy as control and be expanded to include the ability to participate and engage with other individuals and groups, crowdsourcing information and solutions to personal data challenges. Philosophical discussions surrounding group privacy can be put into practice. By developing a data protection public sphere and commons, regulators, lawyers, and technologists can support data subjects in minimising the risks involved in the public use of anonymised personal data[30] and establish the necessity for collective rights[31] before and after data are collected. The protection of data subjects with regard to the processing of personal data can only be achieved where legal frameworks and technological mechanisms include input from data subjects to respect their data protection requirements.

The responsibility over our personal data should not burden data subjects. As data protection matures, this responsibility should be shared with all stakeholders that benefit from the personal data, not only with those about whom personal data are collected. It is only with legal and technical collaboration that data subjects can be collectively protected, governing the data protection landscape for the benefit of our current and our future selves.

[1] Surya Mattu and Kashmir Hill, ‘The House that Spied on Me’ (Wired, 2 February 2018) accessed 30 November 2019.
[2] Carole Cadwalladr and Emma Graham-Harrison, ‘Revealed: 50 million Facebook profiles harvested for Cambridge Analytica in major data breach’ (The Guardian, 17 March 2018) accessed 30 November 2019.
[3] Anonymous, ‘I’m the Google whistleblower. The medical data of millions of Americans is at risk’ (The Guardian, 14 November 2019) accessed 30 November 2019.
[4] Regulation 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) [2016] OJ L119/1.
[5] ibid art 15.
[6] ibid art 17.
[7] ibid art 22.
[8] ibid art 25.
[9] Rachel Coldicutt, ‘Better than ethics’ (doteveryone, 28 November 2019) accessed 30 November 2019.
[10] Bart Custers, Alan M. Sears, Francien Dechesne, Ilina Georgieva, Tommaso Tani, and Simone van der Hof, ‘Conclusions’ in Bart Custers, Alan M. Sears, Francien Dechesne, Ilina Georgieva, Tommaso Tani, and Simone van der Hof (eds), EU Personal Data Protection in Policy and Practice (T.M.C. Asser Press 2019).
[11] General Data Protection Regulation, art 6.
[12] Christine Utz, Martin Degeling, Sascha Fahl, Florian Schaub, and Thorsten Holz, ‘(Un)informed Consent: Studying GDPR Consent Notices in the Field’ (2019) ACM SIGSAC Conference on Computer and Communications Security (CCS ’19) https://doi.org/10.1145/3319535.3354212 accessed 30 November 2019.
[13] Case C‑210/16 Unabhängiges Landeszentrum für Datenschutz Schleswig-Holstein v Wirtschaftsakademie Schleswig-Holstein GmbH ECLI:EU:C:2018:388.
[14] Case C-40/17 Fashion ID GmbH & Co. KG v Verbraucherzentrale NRW eV ECLI:EU:C:2019:629.
[15] Case C-25/17 Tietosuojavaltuutettu v Jehovan todistajat — uskonnollinen yhdyskunta ECLI:EU:C:2018:551.
[16] DECODE, ‘Reclaiming the Smart City: Personal data, trust, and the new commons’ (July 2018) accessed 30 November 2019.
[17] VICE News, ‘China’s “Social Credit System” Has Caused More Than Just Public Shaming (HBO)’ (12 December 2018) accessed 30 November 2019.
[18] Julia Angwin, Jeff Larson, Surya Mattu, and Lauren Kirchner, ‘Machine Bias’ (Propublica, 23 May 2016) accessed 30 November 2019.
[19] Sam Biddle, ‘Amazon’s Ring Planned Neighborhood “Watch Lists” Built on Facial Recognition’ (The Intercept, 27 November 2019) accessed 30 November 2019.
[20] Shoshana Zuboff, ‘Big Other: Surveillance Capitalism and the Prospects of an Information Civilization’ (2015) 30 Journal of Information Technology 75 https://doi.org/10.1057/jit.2015.5 accessed 30 November 2019.
[21] Frank Pasquale, The Black Box Society: The Secret Algorithms That Control Money and Information (Harvard University Press 2015).
[22] European Data Protection Board, ‘Guidelines’ (25 May 2018) https://edpb.europa.eu/our-work-tools/our-documents/publication-type/guidelines_en accessed 30 November 2019.
[23] Jef Ausloos, René Mahieu, and Michael Veale, ‘Getting Data Subject Rights Right’ (25 November 2019) osf.io/preprints/lawarxiv/e2thg accessed 30 November 2019.
[24] James Vincent, ‘The Problem with AI Ethics’ (The Verge, 3 April 2019) accessed 30 November 2019.
[25] Adam Satariano and Matina Stevis-Gridneff ‘Big Tech’s Toughest Opponent Says She’s Just Getting Started’ (New York Times, 19 November 2019) accessed 30 November 2019.
[26] Sylvie Delacroix and Neil D Lawrence, ‘Bottom-up data Trusts: disturbing the “one size fits all” approach to data governance’ (2019) International Data Privacy Law https://doi.org/10.1093/idpl/ipz014 accessed 30 November 2019.
[27] Databox accessed 30 November 2019.
[28] Jumbo Privacy accessed 30 November 2019.
[29] DoNotPay accessed 30 November 2019.
[30] Luciano Floridi, ‘Group Privacy: A Defence and an Interpretation’ in Linnet Taylor, Luciano Floridi, and Bart van der Sloot (eds), Group Privacy: New Challenges of Data Technologies (Springer International Publishing 2016).
[31] Joseph Raz, The Morality of Freedom (Oxford University Press 1986).
Janis Wong
PhD researcher

Janis Wong is an interdisciplinary PhD researcher in Computer Science at the Centre for Research into Information, Surveillance and Privacy (CRISP), University of St Andrews. Janis is interested in the legal and technological applications in data protection, privacy, and data ethics.