On October 18, 2017, the Article 29 Working Party (the “WP29”) published Guidelines clarifying the new profiling and automated decision-making provisions of the General Data Protection Regulation (“GDPR”). European Union regulatory authorities and the WP29 consider that technological developments that facilitate the creation of individual profiles, such as big data analytics, AI and machine learning, have created new risks to data protection. As many industries (insurance, marketing and finance, and even healthcare) already apply these techniques today, the WP29 Guidelines are a welcome aid in understanding the applicable legal framework in the EU.
The first clarification in the Guidelines is the distinction between “profiling” and “automated decision-making”. “Automated decision-making” concerns the scenario where a decision is based purely on automated processing techniques – in simple terms, processing carried out with no human intervention. The decision must also have a legal or similarly “significant” effect. For instance, a person on whom a speeding fine is imposed purely on the basis of results generated by speed cameras with number-plate recognition technology would be subject to automated decision-making.
“Profiling”, on the other hand, concerns (i) some form of automated processing of personal data that (ii) allows companies to analyze personal aspects of a specific individual. Essentially, profiling means analyzing an individual’s personal data with the aim of categorizing the individual and/or predicting his or her characteristics or preferences. The WP29 rightly points out that profiling and automated decision-making can overlap, and that the former can evolve into the latter. Returning to the speeding ticket example, the WP29 specifies that there would be profiling where, for instance, the driving habits of individuals are analyzed over time and then used to decide on the amount of a specific person’s speeding ticket (e.g. was this driver involved in any recent traffic violations, is the driver a repeat offender, etc.).
In the GDPR, profiling is subject to a right “to object” (or opt-out), whereas the deployment of automated decision-making is subject to a principled prohibition. WP29 provided the following insights into understanding the scope of the legal provision applicable only to automated decision-making:
It confirms that “human intervention” must be interpreted as an intervention in the decision-making process by a person who is able to exercise a real influence over the decision (meaning he or she considers all input and data, and has the authority and competence to change the decision);
It – rather obscurely – elaborates on the residual category of decisions “similarly significantly affecting an individual”, which in essence amounts to a case-by-case assessment. To illustrate, some of the WP29’s examples include the following: a customer of a credit card company who sees his credit card limit automatically reduced based on profiling (comparing him to customers with similar spending habits) is deprived of specific opportunities and is considered to be significantly affected. By contrast, targeted advertising is generally not considered to significantly affect individuals, except in specific situations. For example, targeted advertising in which an individual is shown advertisements for online gambling based on particular vulnerabilities of that individual, and which could as a result cause the individual to incur further debts, would be considered to have a similarly significant effect. In sum, it will be difficult for companies to determine with absolute certainty whether their business falls under this provision;
The WP29 provides the heavily discussed confirmation that the legal framework, as a rule, contains a prohibition by default on solely automated decision-making. In doing so, it settles a debate that began under Directive 95/46, which contained similar language on automated decision-making that could be read as implying either a principled prohibition or a (more lenient) opt-out; and
It provides limited guidance on the three specific exceptions that can lift the prohibition on automated decision-making: necessity for entering into or performing a contract, authorization by Union or Member State law, and the individual’s explicit consent. One example of permitted use of automated decision-making concerns solutions authorized by law to prevent fraud and tax evasion, or to ensure the security and reliability of a service provided by the controller. Automated decision-making will also still be allowed when companies obtain explicit consent from the individual(s) concerned.
In sum, questions remain with regard to the specific scope of the legal framework, but the WP29 has recognized the need to clarify these provisions for businesses and has provided these Guidelines on specific aspects in response, which is a good start.
The new WP29 Guidelines are available here.
Alston & Bird is closely following significant developments in the fields of privacy, data protection and technology. For more information, contact Jim Harvey or David Keating.