The risk-based approach under the new EU data protection regulation: a critical perspective

Pages 139-152 | Received 05 Mar 2018, Accepted 15 Aug 2018, Published online: 12 Jan 2019
 

Abstract

The first broad reform of personal data protection legislation in the European Union entered into force in May 2018 (Regulation (EU) 2016/679, the General Data Protection Regulation). Remarkably, with this reform a risk-based approach has been introduced as the core data protection enforcement model, while data protection authorities see their regulatory role significantly weakened. The risk-based approach is to be implemented by the data controllers (i.e. the operators) via data protection impact assessments (evoking the established environmental impact assessment procedure) and notification of breaches, among other procedures. The scope of the concepts of risk and risk regulation thus spreads beyond the conventional domains of the environment, public health and safety, i.e. physical risks, to encompass risks to intangible values, i.e. individual rights and freedoms, which are presumably harder to assess and manage. Strikingly, the reform has been accompanied by a confident discourse from the EU institutions, and their avowed belief in the reform’s ability to safeguard the fundamental right to data protection in the face of evolving data processing techniques, specifically big data, the Internet of Things, and related algorithmic decision-making. One may wonder, however, whether there is cause for concern in view of the way the risk-based approach has been designed in the data protection legislation. In this article, the risk-based approach to data protection is analysed in the light of the reform’s underlying rationality. A comparison with the risk regulatory experience in environmental law, particularly the environmental impact assessment procedure, is drawn upon to help weigh the shortcomings, as well as the opportunities, of the novel risk-based approach.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Notes

1 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation), Official Journal of the European Union L 119/1, 4 May 2016.

2 Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data, Official Journal L 281, 23 November 1995.

3 On the definitions of data controller and data processor see Art. 29 DPWP (2010a).

4 Big data rely not only on the increasing ability of technology to support the collection and storage of large amounts of data, but also on its power, using algorithms, to assist in analysing, understanding and taking advantage of the value of the data to inform decisions by enabling identification of patterns among different sources and datasets. Data analytics has been defined as the practice of using algorithms to make sense of streams of data. Analytics identifies relationships and patterns across vast and distributed datasets (Mittelstadt et al. 2016).

5 ‘Insurers ask their clients to bare all. Generali launches a “behavioural” health insurance. A first in France’, and ‘Insurance: your privacy is well worth a discount’ (‘Les assureurs demandent à leurs clients de se mettre à nu’ and ‘Assurance: votre vie privée vaut bien une ristourne’), Le Monde, 7 September 2016.

6 The recent scandal involving Cambridge Analytica illustrates this point (Shultz 2016; Gibney 2018).

7 Note that the term “risk” appears 76 times in the text of the Regulation, whereas it appeared 8 times in the text of the Directive.

8 This is ultimately in line with the world trend towards shifting the focus of privacy protection from informed consent at the point of collecting personal data to accountable and responsible uses of personal data (ITU 2014).

9 The information to be submitted by the developer to the competent authority in the EIA report comprises: a description of the project; a description of the likely significant effects of the project on the environment; a description of the measures envisaged in order to avoid, prevent or reduce and, if possible, offset likely significant adverse effects on the environment; a description of the reasonable alternatives studied by the developer; and an indication of the main reasons for the option chosen (Article 3(5) of Directive 2011/92/EU as amended by Directive 2014/52/EU).

10 Scientific uncertainties have been categorised as: (1) large uncertainties in outcomes relative to the expected values; (2) a poor knowledge basis for the assigned probabilities; (3) large uncertainties about frequentist probabilities (chances); (4) difficulty in specifying a set of possible consequences; (5) lack of understanding of how the consequences (outcomes) are influenced by underlying factors (Aven 2011, 31).

11 In the USA, an initiative by the Federal Trade Commission, named ‘Reclaim Your Name’, is meant to empower the consumer to find out how brokers are collecting and using data; give her access to information that data brokers have amassed about her; allow her to opt-out if she learns a data broker is selling her information for marketing purposes; and provide her the opportunity to correct errors in information used for substantive decisions – like credit, insurance, employment, and other benefits. https://www.ftc.gov/sites/default/files/documents/public_statements/reclaim-your-name/130626computersfreedom.pdf.

12 Some everyday examples where the logic of decision-making should be disclosed include a personalised car insurance scheme (using car sensor data to judge driving habits); credit scoring services; and a pricing and marketing system that determines how much discount an individual will receive, or what media content to recommend to an individual. Transparency should include, for example, informing people about re-identification risks stemming from data collected about them (Narayanan, Huey and Felten 2016).

13 Suggestions in the direction of more transparency and user control have come from Member States’ supervisory authorities as well. The UK’s ICO, for example, underlined the need for prior consent before an organisation starts analysing data collected for one purpose for a different purpose that is not apparent to the individuals concerned (Information Commissioner’s Office, ‘Big Data and Data Protection’, paragraph 60, https://ico.org.uk/media/for-organisations/documents/1541/big-data-and-data-protection.pdf). And, the ICO adds, ‘the apparent complexity of big data analytics should not become an excuse for failing to seek consent where it is required. Organisations must find the point at which to explain the benefits of the analytics and present users with a meaningful choice – and then respect that choice.’ Also, ‘if, for example, information that people have put on the social media is going to be used to assess their health risks or their credit worthiness, or to market certain products to them, then unless they are informed of this and asked to give their consent, it is unlikely to be either fair or compatible’ (Ibid., paragraph 69).

14 In a report published in 2013, the World Economic Forum emphasised the importance of ensuring understanding beyond transparency, in the following terms: ‘People need to understand how data is being collected, whether with their consent or without – through observations and tracking mechanisms given the low cost of gathering and analyzing data’ (World Economic Forum 2013).

15 Examples of real-world systems to which such techniques have been applied include price discrimination in popular e-commerce retailers and Uber’s surge-pricing algorithm (Le Chen and Mislove 2015).

16 An example being Microsoft’s privacy policies, see https://www.microsoft.com/en-us/TrustCenter/Transparency. This is, however, a case where customer data are declared to be used only for purposes compatible with providing the services, such as troubleshooting or improving features (e.g. protection from malware).
