The GDPR entered into force on May 25, 2018. One of the GDPR’s core ongoing obligations is the duty to conduct Data Protection Impact Assessments (DPIAs) for processing activities that create a “high risk” to individuals’ privacy. DPIAs constitute an important aspect of GDPR compliance, as they arguably replace the notifications of processing systems and activities to European Data Protection Authorities (DPAs) which pre-GDPR privacy law often obligated companies to make. Instead of notifying DPAs, the GDPR now requires companies to internally conduct DPIAs that document “high risk” processing activities and the safeguards they have implemented to protect individuals’ privacy.
The GDPR grants DPAs certain flexibility to determine when companies under their jurisdiction must – or need not – conduct a DPIA. Article 35(4) permits DPAs to issue “blacklists”, i.e. lists of processing activities that always require a DPIA. At the same time, Article 35(5) GDPR permits DPAs to issue “whitelists”, i.e. lists of processing activities that can be conducted without a DPIA.
The Article 29 Working Party (WP29) began the discussion on when DPIAs should be conducted by providing general guidance back in October 2017 (available here). More recently, as we have reported, the Belgian DPA issued its own proposal for black- and whitelists which largely followed the WP29 guidance, and the Austrian DPA issued the first binding whitelist on GDPR Day.
Now, the German DPAs have begun to issue blacklists binding on companies within their jurisdiction. Of course, Germany has 16 state-run DPAs with general jurisdiction over private companies and a federal DPA with jurisdiction over telecom and postal-service companies. As of the date of this post, nine of the 16 state-run DPAs and the federal DPA have issued blacklists. These blacklists are not identical, but are largely harmonized on key points. More are anticipated. We will continue to monitor the release of blacklists by German DPAs and update this post accordingly.
Since Germany’s state-run DPAs exercise general jurisdiction over private companies, their DPIA blacklists are likely of most interest to companies. As stated above, these state-DPA-issued blacklists are not identical, but they evince general agreement that a number of activities should only be conducted in conjunction with a DPIA, including among the more salient:
• Large-scale processing of location data relating to individuals. DPAs provide examples such as (a) using automobile data to improve algorithms for self-driving cars, or (b) retailers using GPS and wireless data from customer mobile devices to track in-store behavior.
• General big data analytics, defined as “combining personal data from different sources for further processing,” when (a) the combination or further processing is “large scale,” (b) at least some data is being processed for purposes that are different than the original collection purposes, and (c) algorithms are used that are not comprehensible to data subjects. As an example, DPAs identify customer analytics in which companies combine their customer data with additional data – e.g. combining the company’s own data about “customer behavior” and “website use” with “third-party credit reporting data,” “data from social media advertising” and social media profile data in an effort to increase sales.
• Large-scale processing of HR data with potential for significant effects on employees, defined as processing personal data of employees that can be used to evaluate their job performance on a large scale, and in a manner which can result in legal effects, or other similarly significant effects, for employees. Examples include (a) Data Loss Prevention (DLP) systems that create employee profiles, and (b) telematics systems or other systems that capture employee movements.
• Creating large-scale individual profiles, defined as “creating large-scale profiles about an individual’s interests, network of personal relationships, or personality.” Examples include (a) “large social networks”, or (b) dating platforms.
• AI-based interactions with individuals. DPAs indicate that any use of artificial intelligence applications to interact with individuals – such as AI-based customer support – requires a DPIA.
• Analytics with significant effects for individuals, defined as “combining personal data from different sources for further processing,” when (a) the combination or further processing is “large scale,” (b) at least some data is being processed for purposes that are different than the original collection purposes, (c) algorithms are used that are not comprehensible to data subjects, and (d) processing generates a “foundation of data” that can be used to make decisions having legal effects, or similarly significant effects, for data subjects. As examples, DPAs name fraud prevention systems used for online payments, or scoring processes used by banks, insurance companies, or credit reporting bureaus.
• Video/audio analysis tools. DPAs further indicate that automated analysis of video or audio recordings requires a DPIA. As an example, DPAs identify call centers employing applications that ‘read’ callers’ tone of voice.
• Reward programs that generate profiles. Some blacklists indicate that reward programs that provide points, rebates, refunds, or the like in exchange for purchases, and in the process generate “large scale” profiles of customers, require a DPIA. As an example, DPAs state that a retailer that provides “customer cards” that customers can show at the register to collect “loyalty points” requires a DPIA if the retailer is using the customer’s purchase data to generate customer profiles.
• Fitness wearables and apps. DPAs indicate that when “providers of new technology” that “evaluates the performance of data subjects” process sensitive data (under Art. 9 GDPR), a DPIA is required. As examples, DPAs state that “centrally storing data from sensors in fitness armbands or smartphones” – e.g. by an app that is used to “improve one’s training” – requires a DPIA. This seems to imply that German DPAs consider fitness and training data to constitute “health data” under Art. 9 GDPR.
• Engagement of Non-EU Vendors. Lastly, in some German states, the DPAs indicate that they require a DPIA to be conducted whenever a company within their jurisdiction engages a non-EU vendor who will process sensitive data (under Art. 9 GDPR) or criminal history data (under Art. 10 GDPR). An example would be a health care provider engaging a vendor that is located outside the EU. German DPAs appear to base the requirement for a DPIA in this situation on the reasoning that courts or regulatory agencies in the vendor’s (non-EU) home jurisdiction could require the vendor to disclose the sensitive and/or criminal history data “in violation of Art. 48 GDPR.” This may indicate that at least some German DPAs read Art. 48 GDPR as requiring international agreements (such as Mutual Legal Assistance Treaties) to be in place in order for EU personal data to be produced as evidence in non-EU jurisdictions.
As stated above, the German DPAs’ DPIA blacklists are largely harmonized but not identical, and more are anticipated. The blacklists that have been issued to date provide an excellent overview of the areas that German DPAs consider to constitute “high risk” processing activities, and which could conceivably be areas of enforcement focus during the GDPR’s first years. We will continue to monitor the blacklists issued by German DPAs and to update this post on an ongoing basis.
The blacklists issued by German DPAs can be downloaded (in German) here:
- Federal DPA
- DPA of Baden-Württemberg
- DPA of Berlin
- DPA of Brandenburg
- DPA of Hamburg
- DPA of Lower Saxony
- DPA of Rhineland-Palatinate
- DPA of Thuringia
- DPA of the Saarland
- DPA of Schleswig-Holstein
* * * *
Alston & Bird and its Brussels-based EU Privacy Team is closely following DPA action and enforcement in the EU Member States. For more information, contact Jim Harvey, David Keating, Jan Dhont, or Daniel Felz.