Date: pending publication
Publication: pending
Authors: Tytti Rintamäki, Delaram Golpayegani, Dave Lewis, Edoardo Celeste, Harshvardhan Pandit
1. Abstract
Under the EU General Data Protection Regulation (GDPR), the processing of personal data with new technologies (including Artificial Intelligence (AI)) requires conducting a Data Protection Impact Assessment (DPIA) to evaluate the potential risks to the rights and freedoms of individuals. In addition to defining categories of processing which require a DPIA, the GDPR also empowers Data Protection Authorities (DPAs) to define additional categories where a DPIA must be conducted, which has led to a fragmented implementation landscape across the EU. In 2024, the EU adopted the AI Act, which classifies AI technologies according to their level of risk for fundamental rights, democracy, and society, and requires conducting a Fundamental Rights Impact Assessment (FRIA). The compelling question thus emerges of how and where DPIAs are required and what their relationship is vis-à-vis the risk assessment required by the AI Act. This paper first presents an analysis of DPIA requirements collected from the guidelines of all 27 EU member states and 3 EEA countries, and then compares them with the ‘high-risk’ areas defined in the EU AI Act’s Annex III. We show the overlaps, gaps, and divergence among EU member states in applying DPIAs to AI. We also discuss how such assessments require coherence and cooperation throughout the AI lifecycle and supply chain, based on ISO/IEC 5338:2023, to efficiently identify and resolve risks and impacts. Our findings are significant for the implementation of the GDPR and the AI Act and the cooperation between their respective authorities, and highlight the necessity of harmonising the application of DPIAs with the AI Act’s high-risk areas.
Keywords: GDPR, impact assessment, DPIA, FRIA, high-risk, EU AI Act, rights, AI value chain, AI lifecycle.
To address this important yet under-explored overlap between the GDPR and the AI Act, we investigate the intersections in the categorisation of high-risk technologies across the GDPR and the AI Act, as well as the implications of potential overlaps and divergences. To achieve this, we pursue the following research objectives:
RO1: We identify the key concepts that determine high-risk processing activities in the GDPR and its national implementing legislation (ADD link to Section);
RO2: We analyse high-risk AI systems in the AI Act Annex III to identify the potential applicability of the GDPR DPIA based on the key concepts identified in RO1 (ADD link to Section);
RO3: We compare high-risk categorisations in the GDPR and the AI Act to identify overlaps, gaps, and variance (ADD link to Section); and finally
RO4: We assess the implications of the findings in RO3 on the AI value chain.
To identify the conditions where a DPIA is necessary, we utilised the criteria defined in GDPR Art.35(3), the lists of processing activities requiring a DPIA published by the DPAs of all 27 EU and 3 EEA member states implementing the GDPR, and the Art. 29 Working Party (WP29) guidelines on DPIAs endorsed by the European Data Protection Board (EDPB). In consolidating these, we differentiated between pan-EU legally binding requirements (mentioned in the GDPR or by the EDPB) and those limited to specific countries through their respective DPIA lists. Where guidelines were not available in English, we translated the documents using the eTranslation service provided by the European Commission. We expressed each DPIA-required condition as a set of ’key concepts’ (further described in Section: Annex III (#Annex-III-and-High-Risk-Processing-Activities)), based on prior work applying similar techniques to the GDPR’s Records of Processing Activities (ROPA) (Source) and the AI Act’s Annex III cases (AI Act).
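To make the consolidation step concrete, the following minimal sketch (in Python, our choice for illustration) shows one way each DPIA-required condition could be represented as a record carrying its sources and key concepts. The field names and concept labels are our own illustrative assumptions, not a schema used in the study.

```python
from dataclasses import dataclass, field

@dataclass
class DPIACondition:
    """One DPIA-required condition, consolidated from the GDPR, the EDPB
    guidelines, and member state DPA lists (illustrative data model)."""
    condition_id: str        # e.g. "C1"
    description: str         # text of the condition
    sources: set = field(default_factory=set)       # "GDPR", "EDPB", or country codes
    key_concepts: set = field(default_factory=set)  # our shorthand concept labels

# Example: GDPR Art.35(3)(b), large-scale processing of special category data.
c1 = DPIACondition(
    condition_id="C1",
    description="Large scale processing of special category personal data",
    sources={"GDPR", "EDPB", "AT", "FR"},  # truncated; the full row lists 32 sources
    key_concepts={"large-scale", "special-category-data"},
)
print(c1.condition_id, sorted(c1.key_concepts))
```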
Through this exercise, we compiled a list of 106 distinct activities that represent all DPIA-required conditions from the GDPR, the EDPB’s guidelines, and member states’ lists. Each member state is referred to by its ISO 3166-1 alpha-2 code, for example AT for Austria or FR for France. The processing activities are listed vertically (Y-axis), and their presence in the GDPR, the EDPB guidelines, or the jurisdictions of specific member states is indicated horizontally. ‘Y’ denotes that the processing activity is explicitly listed in that source’s list of high-risk processing activities.
| ID | Activity | GDPR | EDPB | AT | BE | BG | HR | CY | CZ | DK | EE | FI | FR | DE | GR | HU | IS | IE | IT | LV | LI | LT | LU | MT | NL | NO | PL | PT | RO | SK | SI | ES | SE | Total |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| C1 | Large Scale processing of Special category personal data (Art.35-3b) | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | 32 |
| C2 | Processing of Special Category of personal data for decision- making (Art.35-3b) | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | 32 |
| C3 | Large scale purposes (Recital 91) | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | 32 |
| C4 | Profiling and/or processing of vulnerable persons data (Art.35-3b) | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | 32 |
| C5 | Large scale Systematic monitoring of a publicly accessible area (Art.35-3c) | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | 32 |
| C6 | Processing resulting in legal effects (Art.35-3a) | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | 32 |
| C7 | (Large scale) profiling (Art.35-3a) | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | 32 |
| C8 | Automated decision making and/or automated processing with legal or similar effect (Art.35-3a) | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | 32 |
| C9 | Use of new technology or innovative use (Art.35-1) | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | 32 |
| C10 | Large scale Processing of personal data relating to criminal offences or unlawful or bad conduct (Art.35-3b) | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | 32 |
| C11 | Processing of Biometric data | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | 25 | |||||||
| C12 | Processing of Genetic data | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | 24 | ||||||||
| C13 | (Large-scale) Processing of communication and location data | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | 22 | ||||||||||
| C14 | Evaluation or scoring of individuals (including profiling or predicting) | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | 21 | |||||||||||
| C15 | Matching or Combining separate data sets/ registers | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | 19 | |||||||||||||
| C16 | (Large scale) processing of employee activities | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | 19 | |||||||||||||
| C17 | Processing resulting in Access to or exclusion of services | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | 12 | ||||||||||||||||||||
| C18 | Processing of data generated by devices connected to the Internet of things | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | 12 | ||||||||||||||||||||
| C19 | Processing / Preventing a data subject from exercising a right | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | 11 | |||||||||||||||||||||
| C20 | profiling resulting in exclusion/suspension/rupture from a contract | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | 11 | |||||||||||||||||||||
| C21 | Processing data concerning asylum seekers | Y | Y | Y | Y | Y | Y | Y | Y | 8 | ||||||||||||||||||||||||
| C22 | Processing of data revealing political opinions | Y | Y | Y | Y | Y | Y | Y | 7 | |||||||||||||||||||||||||
| C23 | The purpose of data processing is the application of ‘smart meters’ set up by public utilities providers (the monitoring of consumption customs). | Y | Y | Y | Y | Y | Y | Y | 7 | |||||||||||||||||||||||||
| C24 | Combining and/or matching data sets from two or more processing operations carried out for different purposes and/or by different controllers in the context of data processing that goes beyond the processing normally expected by a data subject, (provided that the use of algorithms can make decisions that significantly affect the data subject.) | Y | Y | Y | Y | Y | Y | Y | 7 | |||||||||||||||||||||||||
| C25 | Processing operations aimed at observing, monitoring or controlling data subjects, in particular by means of image or video and associated acoustic data processing | Y | Y | Y | Y | Y | Y | 6 |
| C26 | Processing of data related to minors | Y | Y | Y | Y | Y | Y | 6 | ||||||||||||||||||||||||||
| C27 | Profiling | Y | Y | Y | Y | Y | Y | 6 | ||||||||||||||||||||||||||
| C28 | Processing concerns personal data that has not been obtained from the data subject, where providing this information would prove difficult or impossible | Y | Y | Y | Y | Y | Y | 6 |
| C29 | (Credit score) The purpose of data processing is to assess the creditworthiness of the data subject by evaluating personal data on a large scale or systematically | Y | Y | Y | Y | Y | Y | 6 |
| C30 | Processing of personal data for scientific or historical purposes where it is carried out without the data subject’s consent and together with at least one of the criteria | Y | Y | Y | Y | Y | Y | 6 |
| C31 | Processing concerned with evaluating individuals for various insurance purposes | Y | Y | Y | Y | Y | 5 |
| C32 | Use of AI in processing | Y | Y | Y | Y | Y | 5 | |||||||||||||||||||||||||||
| C33 | Large scale processing in the context of fraud prevention | Y | Y | Y | Y | Y | 5 | |||||||||||||||||||||||||||
| C34 | Large scale processing of financial data | Y | Y | Y | Y | Y | 5 | |||||||||||||||||||||||||||
| C35 | The processing of children’s personal data for profiling or automated decision-making purposes or for marketing purposes, or for direct offering of services intended for them; | Y | Y | Y | Y | Y | 5 | |||||||||||||||||||||||||||
| C36 | Use of facial recognition technology as part of the monitoring of a publicly accessible area | Y | Y | Y | Y | Y | 5 |
| C37 | The use of new technologies or technological solutions for the processing of personal data or with the possibility of processing personal data to analyse or predict the economic situation, health, personal preferences or interests, reliability or behaviour, location or movements of natural persons; | Y | Y | Y | Y | 4 | ||||||||||||||||||||||||||||
| C38 | The processing of personal data by linking, comparing or verifying matches from multiple sources | Y | Y | Y | Y | 4 | ||||||||||||||||||||||||||||
| C39 | Large scale systematic processing of personal data concerning health and public health for public interest purposes as is the introduction and use of electronic prescription systems and the introduction and use of electronic health records or electronic health cards. | Y | Y | Y | Y | 4 | ||||||||||||||||||||||||||||
| C40 | Large scale data collection from third parties | Y | Y | Y | Y | 4 | ||||||||||||||||||||||||||||
| C41 | Processing of data used to assess the behaviour and other personal aspects of natural persons | Y | Y | Y | Y | 4 | ||||||||||||||||||||||||||||
| C42 | Health data collected automatically by implantable medical device | Y | Y | Y | Y | 4 | ||||||||||||||||||||||||||||
| C43 | Processing operations aimed at observing, monitoring or controlling data subjects, in particular roads with public transport which can be used by everyone | Y | Y | Y | Y | 4 |
| C44 | The processing of considerable amounts of personal data for law enforcement purposes. | Y | Y | Y | 3 | |||||||||||||||||||||||||||||
| C45 | Processing of personal data for the purpose of systematic assessment of skills, competences, outcomes of tests, mental health or development. (Sensitive personal data or other information reveals a sensitive nature and systematic monitoring). | Y | Y | Y | 3 | |||||||||||||||||||||||||||||
| C46 | Extensive processing of sensitive personal data for the purpose of developing algorithms | Y | Y | Y | 3 | |||||||||||||||||||||||||||||
| C47 | Extensive processing of data subject to social, professional or special official secrecy, even if it is not data pursuant to Articles 9(1) and 10 GDPR | Y | Y | Y | 3 |
| C48 | Processing that involves an assessment or classification of natural persons | Y | Y | Y | 3 | |||||||||||||||||||||||||||||
| C49 | Use of a video recording system for monitoring road behaviour on motorways. The controller intends to use a smart video analysis system to isolate vehicles and automatically recognise their plates. | Y | Y | Y | 3 | |||||||||||||||||||||||||||||
| C50 | Electronic monitoring at a school or preschool during school or daycare hours (systematic monitoring of vulnerable data subjects) | Y | Y | 2 |
| C51 | The purpose of data processing is to assess the solvency of the data subject by evaluating personal data on a large scale or systematically | Y | Y | 2 |
| C52 | where personal data are collected from third parties in order to be subsequently taken into account in the decision to refuse or terminate a specific service contract with a natural person; | Y | Y | 2 | ||||||||||||||||||||||||||||||
| C53 | Processing of personal data of children in direct offering of information society services. | Y | Y | 2 | ||||||||||||||||||||||||||||||
| C54 | Migration of data from existing to new technologies where this is related to large-scale data processing. | Y | Y | 2 | ||||||||||||||||||||||||||||||
| C55 | (Systematic) transfer of special data categories between controllers | Y | Y | 2 | ||||||||||||||||||||||||||||||
| C56 | Large-scale processing of behavioural data | Y | Y | 2 | ||||||||||||||||||||||||||||||
| C57 | Supervision of the data subject, which is carried out in the following cases: a. if it is carried out on a large scale; b. if it is carried out at the workplace; c. if it applies to specially protected data subjects (e.g. health care, social care, prison, educational institution, workplace). | Y | Y | 2 |
| C58 | Large-scale tracking of data subjects | Y | Y | 2 | ||||||||||||||||||||||||||||||
| C59 | Large-scale processing of health data | Y | Y | 2 | ||||||||||||||||||||||||||||||
| C60 | Processing of personal data where data subjects have limited abilities to enforce their rights | Y | Y | 2 | ||||||||||||||||||||||||||||||
| C61 | Processing personal data with the purpose of providing services or developing products for commercial use that involve predicting working capacity, economic status, health, personal preferences or interests, trustworthiness, behavior, location or route (Sensitive data or data of highly personal nature and evaluation/scoring) | Y | Y | 2 | ||||||||||||||||||||||||||||||
| C62 | Innovative sensor- or mobile-based, centralised data collection | Y | Y | 2 |
| C63 | Collection of public data from social media networks for profiling | Y | Y | 2 |
| C64 | Processing operations including capturing locations which may be entered by anyone due to a contractual obligation | Y | 1 | |||||||||||||||||||||||||||||||
| C65 | Processing operations including recording places which may be entered by anyone on the basis of the public interest; | Y | 1 | |||||||||||||||||||||||||||||||
| C66 | Processing operations including image processing using mobile cameras for the purpose of preventing or averting dangerous attacks or criminal connections in public and non-public spaces; | Y | 1 |
| C67 | Processing operations including image and acoustic processing for the preventive protection of persons or property on private residential properties not exclusively used by the person responsible and by all persons living in the common household | Y | 1 | |||||||||||||||||||||||||||||||
| C68 | Processing operations including monitoring of churches, houses of prayer, as far as they are not already covered by lit. b and lit. e, and other institutions that serve the practice of religion in the community. | Y | 1 | |||||||||||||||||||||||||||||||
| C69 | Camera Surveillance | Y | 1 | |||||||||||||||||||||||||||||||
| C70 | Camera surveillance in schools or kindergartens during opening hours. | Y | 1 | |||||||||||||||||||||||||||||||
| C71 | Processing operations carried out pursuant to Article 14 of the General Data Protection Regulation, where the information that should be provided to the data subject is subject to an exception under Art. 14 | Y | 1 |
| C72 | Processing of personal data with a link to other controllers or processors | Y | 1 |
| C73 | Large scale systematic processing of data of high significance or of a highly personal nature | Y | 1 | |||||||||||||||||||||||||||||||
| C74 | The use of the personal data of pupils and students for assessment. | Y | 1 | |||||||||||||||||||||||||||||||
| C75 | Processing operations establishing profiles of natural persons for human resources management purposes | Y | 1 | |||||||||||||||||||||||||||||||
| C76 | Processing of health data implemented by health institutions or social medical institutions for the care of persons. | Y | 1 | |||||||||||||||||||||||||||||||
| C77 | Investigation of applications and management of social housing | Y | 1 | |||||||||||||||||||||||||||||||
| C78 | Processing for the purpose of social or medico-social support of persons | Y | 1 |
| C79 | File processing operations that may contain personal data of the entire national population | Y | 1 |
| C80 | Insufficient protection against unauthorised reversal of pseudonymisation. | Y | 1 | |||||||||||||||||||||||||||||||
| C81 | When the data controller is planning to set up an application, tool, or platform for use by an entire sector that also processes special categories of personal data. | Y | 1 |
| C82 | Large scale systematic processing of personal data with the purpose of introducing, organizing, providing and monitoring the use of electronic government services | Y | 1 | |||||||||||||||||||||||||||||||
| C83 | Processing of location data for the execution of decisions in the area of judicial enforcement | Y | 1 |
| C84 | Processing of personal data in the context of the use of digital twins | Y | 1 | |||||||||||||||||||||||||||||||
| C85 | Processing of personal data using neurotechnology | Y | 1 | |||||||||||||||||||||||||||||||
| C86 | The processing of personal data using devices and technologies where the incident may endanger the health of an individual or more persons | Y | 1 | |||||||||||||||||||||||||||||||
| C87 | where large-scale data is collected from third parties in order to analyse or predict the economic situation, health, personal preferences or interests, reliability or behaviour, location or displacement of natural persons | Y | 1 | |||||||||||||||||||||||||||||||
| C88 | Processing of personal data carried out by a controller with a main establishment outside the EU | Y | 1 | |||||||||||||||||||||||||||||||
| C89 | Regular and systematic processing where the provision of information under Article 19 of Regulation (EU) 2016/679 | Y | 1 | |||||||||||||||||||||||||||||||
| C90 | Processing operations in the personal area of persons, even if the processing is based on consent. | Y | 1 | |||||||||||||||||||||||||||||||
| C91 | (Large-scale) Managing alerts and social and health reports or professional reports e.g. COVID-19 | Y | 1 | |||||||||||||||||||||||||||||||
| C92 | Processing using data from external sources | Y | 1 | |||||||||||||||||||||||||||||||
| C93 | Anonymisation of personal data | Y | 1 | |||||||||||||||||||||||||||||||
| C94 | Acquiring personal data where source is unknown | Y | 1 | |||||||||||||||||||||||||||||||
| C95 | Processing of location data, including matching or combining datasets | Y | 1 | |||||||||||||||||||||||||||||||
| C96 | Processing of location data concerning vulnerable data subjects | Y | 1 | |||||||||||||||||||||||||||||||
| C97 | Processing of location data using systematic monitoring of data subjects | Y | 1 | |||||||||||||||||||||||||||||||
| C98 | Processing of location data aimed at automated-decision making with legal or similar significant effect | Y | 1 | |||||||||||||||||||||||||||||||
| C99 | Processing of location data when it prevents data subjects from exercising a right or using a service or a contract | Y | 1 |
| C100 | Processing of personal data in whistleblower systems | Y | 1 |
| C101 | Large scale processing that might pose a risk of property loss (particularly in banking and credit card services) | Y | 1 |
| C102 | Large scale processing that might pose a risk of violation of secrecy of correspondence (particularly in communication services). | Y | 1 |
| C103 | Large scale processing that might pose a risk of identity theft or fraud (particularly in digital trust services and in comparable identity management services). | Y | 1 |
| C104 | Large scale processing that might pose a risk of disclosure of personal economic standing (particularly taxation data, banking data, credit ranking data – publicly available data is not taken into account). | Y | 1 |
| C105 | Large scale processing that might pose a risk of discrimination with legal consequences or with similar impact (particularly in labor broking services and in assessment/evaluation services that have impact on salaries and career). | Y | 1 |
| C106 | Large scale processing that might pose a risk of loss of statutory confidentiality of information (restricted information, professional secrecy). | Y | 1 |
| | Conditions applicable (total) | 10 | 13 | 28 | 21 | 17 | 24 | 29 | 23 | 20 | 24 | 24 | 30 | 26 | 30 | 29 | 25 | 21 | 22 | 26 | 22 | 19 | 20 | 15 | 25 | 25 | 31 | 18 | 17 | 19 | 26 | 19 | 15 | 32 |
To show the variance between the member states and EEA countries, we compiled a bar chart of each country and the number of processing activities it considers high-risk and requiring a DPIA. We also included how many activities the GDPR and the EDPB list, to show how much guidance exists at the EU level and to allow a comparison of how many additional activities individual countries have added to their lists. Below is a summarised representation of the variance in the number of DPIA-required conditions across the mentioned sources.
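As an illustration of how the per-country counts behind the bar chart can be derived from the condition matrix, the following sketch tallies, for each source, how many conditions list it. The excerpt of the condition-to-source mapping and the use of matplotlib are assumptions for illustration, not the actual analysis pipeline.

```python
from collections import Counter

import matplotlib.pyplot as plt

# Tiny excerpt of the condition-to-source matrix (the full study has 106 rows).
# Each condition ID maps to the set of sources that explicitly list it.
condition_sources = {
    "C1": {"GDPR", "EDPB", "AT", "BE", "FR"},   # truncated for illustration
    "C23": {"HU", "PL", "RO"},                  # smart meters (truncated)
    "C32": {"AT", "CZ", "DK", "DE", "GR"},      # use of AI in processing
}

# Tally how many conditions each source (GDPR, EDPB, or country) lists.
counts = Counter()
for sources in condition_sources.values():
    counts.update(sources)

labels, values = zip(*sorted(counts.items()))
plt.bar(labels, values)
plt.ylabel("Number of DPIA-required conditions")
plt.title("DPIA-required conditions per source")
plt.show()
```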
Poland has the most conditions for a DPIA (n=31), France and Greece have the second most (n=30), and Malta and Sweden have the least (n=15). Of note, the bulk of DPIA-required conditions in our list come from country-specific lists (93 out of 106). Among these, notable activities include (large-scale) processing of communication and location data (22 countries), (large-scale) processing of employee activities (19 countries), and processing with legal effects such as access to or exclusion of services (12 countries). The use of AI requires a DPIA in Austria, Denmark, Germany, Greece, and the Czech Republic.
The variance in the number of processing activities across the EU is also visualised in the following image: a map of the EU colour-coded to show which countries have made the fewest additions and which the most. The gradient, from dark red through red, orange, and yellow to pale yellow, conveys the disparity in regulatory clarity across the EU. Dark red marks countries that list the fewest processing activities beyond those mentioned in the GDPR and by the EDPB, signalling concern over the lack of clarification of which activities require a DPIA; pale yellow marks countries that have made the most additions, whose guidance on when to conduct a DPIA is correspondingly clearer.
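A hypothetical sketch of the colour-coding step is given below; the band boundaries are illustrative assumptions chosen to span the observed range (15 to 31 conditions) and are not taken from the actual map.

```python
# Hypothetical banding of per-country condition counts into the map's colour
# gradient; the thresholds below are illustrative assumptions only.
def colour_band(n_conditions: int) -> str:
    """Map a country's number of DPIA-required conditions to a colour band,
    from dark red (fewest additions) to pale yellow (most additions)."""
    if n_conditions <= 17:
        return "dark red"
    if n_conditions <= 21:
        return "red"
    if n_conditions <= 25:
        return "orange"
    if n_conditions <= 28:
        return "yellow"
    return "pale yellow"

print(colour_band(15))  # Malta or Sweden -> "dark red"
print(colour_band(31))  # Poland -> "pale yellow"
```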
We then proceeded to ascertain whether any of the high-risk AI systems listed in Annex III of the EU AI Act utilise personal data or may do so. We found that personal data is likely involved in 23 of the 25 Annex III clauses, and that it may be involved in the remaining two under certain conditions. This led us to ask the question: ‘when do high-risk AI systems involving risky (personal) data processing require a DPIA?’. To answer this, we utilised the key concepts in the following manner: (1) the purpose of the AI system; (2) the involvement of personal data, especially whether it is sensitive or special category; (3) the subject of the AI system, especially whether they are vulnerable; (4) the possible involvement of specific technologies that trigger a DPIA, such as the use of smart meters in Annex III-2a; and (5) the processing context, such as the involvement of automated decision-making, profiling, and the production of legal effects. In addition, we also had to codify certain exemptions mentioned in Annex III, such as III-5b, where the purpose is detecting financial fraud. The output from this activity can be found in Appendix 4; it allowed us to understand which of the GDPR’s DPIA-required conditions could be applicable to each of the AI Act Annex III clauses.
First, we looked at the information within the 25 high-risk descriptions (8 clauses and their subclauses) listed in Annex III and assessed whether they involve personal data explicitly (i.e. it can reasonably be inferred from the description) or conditionally (i.e. it may potentially be involved in a particular application), in order to assess the applicability of the GDPR to the Annex III clauses. Similarly, for each of the key concepts, we identified whether it is explicitly or conditionally applicable in each of the Annex III clauses. Where a conditional applicability was identified, we added an identifier to distinguish the conditional clause from the main or explicit clause present in Annex III. Finally, we identified whether the combination of key concepts matched any of the DPIA-required conditions identified from the analysis in Section 3. In doing so, we distinguished whether each condition came from the GDPR (i.e. GDPR Art.35 or the EDPB) or a specific member state list, through which we determined whether the applicability of a DPIA-required condition was uniform across the EU (i.e. it came from the GDPR) or varied (i.e. it is only present in one or more countries).
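The matching step described above can be sketched as a simple set-containment check: an Annex III clause, together with its explicit and conditional key concepts, matches a DPIA-required condition when it covers all of that condition's key concepts. The concept labels and condition excerpts below are illustrative assumptions, not the study's actual encoding.

```python
# Illustrative key-concept sets; the labels are our own shorthand.
dpia_conditions = {
    "C23 smart meters (HU, PL, RO, ...)": {"smart-meters"},
    "C1 large-scale special category (GDPR)": {"large-scale", "special-category-data"},
}

def matching_conditions(clause_concepts: set) -> list:
    """Return the DPIA-required conditions whose key concepts are all covered
    by the clause's explicit or conditional key concepts."""
    return [
        cond_id
        for cond_id, concepts in dpia_conditions.items()
        if concepts <= clause_concepts
    ]

# Annex III-2a.1 variation (critical infrastructure using smart meters):
annex_iii_2a_1 = {"smart-meters", "large-scale", "monitoring"}
print(matching_conditions(annex_iii_2a_1))  # -> ["C23 smart meters (HU, PL, RO, ...)"]
```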
From this exercise, we found that personal data is explicitly involved, and hence the GDPR is applicable, in 23 out of the 25 Annex III clauses, as seen in the following table. The only clauses where the GDPR is not always applicable are Annex III clause 2a and clause 8a, though the GDPR may be conditionally applicable based on the involvement of personal data in a particular use case. To capture such conditional applicabilities, we created additional identifiers to distinguish between each variation; for example, in Annex III-2a, related to critical infrastructure, we identified 3 variations based on the involvement of: III-2a.1 smart meters, III-2a.2 road traffic video analysis, and III-2a.3 public transport monitoring systems. Each such variation involves the conditional applicability of a concept (a technology, in the case of III-2a) and allows us to match the Annex III clause to a DPIA-required condition; e.g. smart meters in III-2a.1 match DPIA-required conditions from Hungary, Poland, and Romania. We found 36 such additional use cases based on the conditional applicability of key concepts in DPIA-required conditions. In sum, there are 61 total (sub)clauses and use cases, for which a DPIA is always required in 21 cases, conditionally required in 38, and not required in 2, as seen below.
The table described above, which shows the applicability of the GDPR to the various AI systems listed in Annex III, can be viewed here. It is an interactive table: clicking on an entry in the first column, e.g. ‘1a. Biometrics’, reveals an explanation describing the data we found to require a DPIA, either according to the GDPR or according to individual member state data protection authorities.
A snapshot of the table behind the link is displayed below, without the interactive explanations.
| AI Act Annex III Clause | Purpose | Produces Legal Effects | Processing Context | Special categories of personal data | AI/Data subject | Organisation/Agent | DPIA required by GDPR | DPIA required by EDPB | DPIA required by member states |
|---|---|---|---|---|---|---|---|---|---|
| 1a. Biometrics | identification of people | MAY | remote processing* | biometrics* | natural persons | - | YES | YES | AT,BE,BG,HR,CZ,DK,EE,FR,DE,GR,HU,IS,IE,LV,LI,LT,LU,MT,NL,NO,PL,PT,SK,SI,ES (Biometrics) |
| 1b. Biometrics | - | MAY | infer data*, profiling(categorisation*) | biometrics*, protected attributes or characteristics* | natural persons | - | YES | YES | AT,BE,BG,HR,CZ,DK,EE,FR,DE,GR,HU,IS,IE,LV,LI,LT,LU,MT,NL,NO,PL,PT,SK,SI,ES (Biometrics) |
| 1c. Biometrics | - | MAY | - | biometrics* | natural persons | - | YES | YES | AT,BE,BG,HR,CZ,DK,EE,FR,DE,GR,HU,IS,IE,LV,LI,LT,LU,MT,NL,NO,PL,PT,SK,SI,ES (Biometrics) |
| 2a. Critical infrastructure | controlling safety of critical digital infrastructure/ road traffic/ supply of water/gas/heating/electricity | MAY | - | - | - | critical infrastructure operations/ management | NO | NO | AT,CZ,DK,DE,GR (Use of AI in processing) |
| 2a.1. | controlling safety of critical digital infrastructure/ road traffic/ supply of water/gas/heating/electricity | MAY | Large-scale processing of personal data | - | - | critical infrastructure operations/ management | YES | YES | HU, PL, RO (smart meters), AT,CZ,DK,DE,GR (Use of AI in processing) |
| 2a.2. | controlling safety of critical digital infrastructure/ road traffic/ supply of water/gas/heating/electricity | MAY | Large-scale processing of personal data | - | - | critical infrastructure operations/ management | YES | YES | AT,CY (road traffic analysis) AT,CZ,DK,DE,GR (Use of AI in processing) |
| 2a.3. | controlling safety of critical digital infrastructure/ road traffic/ supply of water/gas/heating/electricity | MAY | Large-scale processing of personal data | - | - | critical infrastructure operations/ management | YES | YES | AT,CZ,DK,DE,GR (Use of AI in processing) RO,SK (public transport) |