
High Risk Categorisations in GDPR vs AI Act: Overlaps and Implications

Date: pending publication
Publication: pending
Authors: Tytti Rintamäki, Delaram Golpayegani, Dave Lewis, Edoardo Celeste, Harshvardhan Pandit

Table of Contents

1. Abstract
2. Methodology
3. High Risk Processing Activities requiring a Data Protection Impact Assessment
   1. Processing Activities Requiring a DPIA
4. Annex III and High Risk Processing Activities
5. References

Abstract:

Under the EU General Data Protection Regulation (GDPR), the processing of personal data with new technologies (including Artificial Intelligence) requires conducting a Data Protection Impact Assessment (DPIA) to evaluate the potential risks to the rights and freedoms of individuals. In addition to defining categories of processing which require a DPIA, the GDPR also empowers Data Protection Authorities (DPAs) to define additional categories where a DPIA must be conducted, which has led to a fragmented implementation landscape across the EU. In 2024, the EU adopted the AI Act, which classifies Artificial Intelligence (AI) technologies according to their level of risk for fundamental rights, democracy, and society, and requires conducting a Fundamental Rights Impact Assessment (FRIA). A compelling question thus emerges: how and where are DPIAs required, and what is their relationship vis-a-vis the risk assessment required by the AI Act? This paper first presents an analysis of DPIA requirements collected from the guidelines of all 27 EU member states and 3 EEA countries, and then compares them with the 'high-risk' areas defined in the EU AI Act's Annex III. We show the overlaps, gaps, and divergence among EU member states regarding the application of DPIAs to AI. We also discuss how such assessments require coherence and cooperation throughout the AI lifecycle and supply chain, based on ISO/IEC 5338:2023, to efficiently identify and resolve risks and impacts. Our findings are significant for the implementation of the GDPR and the AI Act and for cooperation between their respective authorities, and highlight the necessity of harmonising the application of DPIAs with the AI Act's high-risk areas.

Keywords:

GDPR, impact assessment, DPIA, FRIA, High-risk, EU AI Act, rights, AI Value Chain, AI Lifecycle.

Methodology:

To address this important yet underexplored overlap between the GDPR and the AI Act, we investigate the intersections in the categorisation of high-risk technologies across the GDPR and the AI Act, as well as the implications of potential overlaps and divergences. To achieve this, we pursue the following research objectives:

RO1: We identify the key concepts that determine high-risk processing activities in the GDPR and its national implementing legislation (ADD link to Section);

RO2: We analyse high-risk AI systems in the AI Act Annex III to identify the potential applicability of the GDPR DPIA based on identified key concepts in RO1 (ADD link to Section);

RO3: We compare high-risk categorisations in the GDPR and the AI Act to identify overlaps, gaps, and variance (ADD link to Section); and finally

RO4: We assess the implications of the findings in RO3 on the AI value chain.

High Risk Processing Activities requiring a Data Protection Impact Assessment

To identify the conditions where a DPIA is necessary, we utilised the criteria defined in GDPR Art. 35(3), the lists of processing activities requiring a DPIA published by DPAs from all 27 EU and 3 EEA member states implementing the GDPR, and the Art. 29 Working Party (WP29) guidelines on DPIA endorsed by the European Data Protection Board (EDPB). In consolidating these, we differentiated between pan-EU legally binding requirements (mentioned in the GDPR or by the EDPB) and those limited to specific countries through their respective DPIA lists. Where guidelines were not available in English, we translated the documents using the eTranslation service provided by the European Commission. We expressed each DPIA-required condition as a set of 'key concepts' (further described in Section: Annex III (#Annex-III-and-High-Risk-Processing-Activities)), based on prior work applying similar techniques to GDPR's Records of Processing Activities (ROPA) (Source) and the AI Act's Annex III cases (AI Act), as sketched below.
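As a minimal sketch (with hypothetical identifiers and a simplified vocabulary, not our actual codebook), each condition can be encoded as the set of key concepts that trigger it, making conditions from different sources directly comparable:

```python
# A minimal sketch of encoding DPIA-required conditions as sets of key
# concepts. The identifiers and vocabulary are illustrative only.
dpia_conditions = {
    "C1": {"large-scale", "special-category-data"},                 # GDPR Art. 35(3)(b)
    "C5": {"large-scale", "systematic-monitoring", "public-area"},  # GDPR Art. 35(3)(c)
    "C8": {"automated-decision-making", "legal-effect"},            # GDPR Art. 35(3)(a)
    "C32": {"ai-system"},                                           # member-state lists only
}

def triggers(condition: set, activity: set) -> bool:
    """An activity triggers a condition when it covers all its key concepts."""
    return condition <= activity

# Example: a large-scale AI system processing health (special category) data.
activity = {"large-scale", "special-category-data", "ai-system"}
print([cid for cid, c in dpia_conditions.items() if triggers(c, activity)])
# ['C1', 'C32']
```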

Through this exercise, we compiled a list of 106 distinct activities that represents all DPIA-required conditions from the GDPR, the EDPB's, and member states' guidelines. Each member state is referred to by its ISO 3166-1 alpha-2 code, for example AT for Austria or FR for France. The different processing activities are listed vertically (Y-axis), and whether they are present in the GDPR, the EDPB guidelines, or the jurisdictions of specific member states is indicated horizontally. Y is used to denote that “Yes”, the processing activity is explicitly listed in that source's list of high-risk processing activities.

Processing Activities Requiring a DPIA

ID Activity GDPR EDPB AT BE BG HR CY CZ DK EE FI FR DE GR HU IS IE IT LV LI LT LU MT NL NO PL PT RO SK SI ES SE TOTAL OF MEMBER STATES AND EEA COUNTRIES REQUIRING THE ACTIVITY
C1 Large Scale processing of Special category personal data (Art.35-3b) Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y 32
C2 Processing of Special Category of personal data for decision- making (Art.35-3b) Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y 32
C3 Large scale processing operations (Recital 91) Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y 32
C4 Profiling and/or processing of vulnerable persons data (Art.35-3b) Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y 32
C5 Large scale Systematic monitoring of a publicly accessible area (Art.35-3c) Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y 32
C6 Processing resulting in legal effects (Art.35-3a) Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y 32
C7 (Large scale) profiling (Art.35-3a) Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y 32
C8 Automated decision making and/or automated processing with legal or similar effect (Art.35-3a) Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y 32
C9 Use of new technology or innovative use (Art.35-1) Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y 32
C10 Large scale Processing of personal data relating to criminal offences or unlawful or bad conduct (Art.35-3b) Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y 32
C11 Processing of Biometric data Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y 25
C12 Processing of Genetic data Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y 24
C13 (Large-scale) Processing of communication and location data Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y 22
C14 Evaluation or scoring of individuals (including profiling or predicting) Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y 21
C15 Matching or Combining separate data sets/ registers Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y 19
C16 (Large scale) processing of employee activities Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y 19
C17 Processing resulting in Access to or exclusion of services Y Y Y Y Y Y Y Y Y Y Y Y 12
C18 Processing of data generated by devices connected to the Internet of things Y Y Y Y Y Y Y Y Y Y Y Y 12
C19 Processing preventing a data subject from exercising a right Y Y Y Y Y Y Y Y Y Y Y 11
C20 Profiling resulting in exclusion from, suspension of, or termination of a contract Y Y Y Y Y Y Y Y Y Y Y 11
C21 Processing data concerning asylum seekers Y Y Y Y Y Y Y Y 8
C22 Processing of data revealing political opinions Y Y Y Y Y Y Y 7
C23 The purpose of data processing is the application of ‘smart meters’ set up by public utilities providers (the monitoring of consumption habits). Y Y Y Y Y Y Y 7
C24 Combining and/or matching data sets from two or more processing operations carried out for different purposes and/or by different controllers in the context of data processing that goes beyond the processing normally expected by a data subject, (provided that the use of algorithms can make decisions that significantly affect the data subject.) Y Y Y Y Y Y Y 7
C25 Processing operations aimed at observing, monitoring or controlling data subjects, in particular by means of image or video and associated acoustic data processing Y Y Y Y Y Y 6
C26 Processing of data related to minors Y Y Y Y Y Y 6
C27 Profiling Y Y Y Y Y Y 6
C28 Processing concerning personal data that has not been obtained from the data subject, where providing this information would prove difficult or impossible Y Y Y Y Y Y 6
C29 (credit score) The purpose of data processing is to assess the creditworthiness of the data subject by way of evaluating personal data in large scale or systematically Y Y Y Y Y Y 6
C30 Processing of personal data for scientific or historical purposes where it is carried out without the data subject's consent and together with at least one of the criteria Y Y Y Y Y Y 6
C31 Processing concerned with evaluating individuals for various insurance purposes Y Y Y Y Y 5
C32 Use of AI in processing Y Y Y Y Y 5
C33 Large scale processing in the context of fraud prevention Y Y Y Y Y 5
C34 Large scale processing of financial data Y Y Y Y Y 5
C35 The processing of children’s personal data for profiling or automated decision-making purposes or for marketing purposes, or for direct offering of services intended for them; Y Y Y Y Y 5
C36 Use of facial recognition technology as part of the monitoring of a publicly accessible area Y Y Y Y Y 5
C37 The use of new technologies or technological solutions for the processing of personal data or with the possibility of processing personal data to analyse or predict the economic situation, health, personal preferences or interests, reliability or behaviour, location or movements of natural persons; Y Y Y Y 4
C38 The processing of personal data by linking, comparing or verifying matches from multiple sources Y Y Y Y 4
C39 Large scale systematic processing of personal data concerning health and public health for public interest purposes, such as the introduction and use of electronic prescription systems and the introduction and use of electronic health records or electronic health cards. Y Y Y Y 4
C40 Large scale data collection from third parties Y Y Y Y 4
C41 Processing of data used to assess the behaviour and other personal aspects of natural persons Y Y Y Y 4
C42 Health data collected automatically by implantable medical device Y Y Y Y 4
C43 Processing operations aimed at observing, monitoring or controlling data subjects, in particular on roads with public transport which can be used by everyone Y Y Y Y 4
C44 The processing of considerable amounts of personal data for law enforcement purposes. Y Y Y 3
C45 Processing of personal data for the purpose of systematic assessment of skills, competences, outcomes of tests, mental health or development (sensitive personal data or other information of a sensitive nature, and systematic monitoring). Y Y Y 3
C46 Extensive processing of sensitive personal data for the purpose of developing algorithms Y Y Y 3
C47 Extensive processing of data subject to social, professional or special official secrecy, even if it is not data pursuant to Article 9(1) and Article 10 GDPR Y Y Y 3
C48 Processing that involves an assessment or classification of natural persons Y Y Y 3
C49 Use of a video recording system for monitoring road behaviour on motorways. The controller intends to use a smart video analysis system to isolate vehicles and automatically recognise their plates. Y Y Y 3
C50 Electronic monitoring at a school or preschool during school/care hours (systematic monitoring of vulnerable data subjects). Y Y 2
C51 The purpose of data processing is to assess the solvency of the data subject by way of evaluating personal data in large scale or systematically Y Y 2
C52 where personal data are collected from third parties in order to be subsequently taken into account in the decision to refuse or terminate a specific service contract with a natural person; Y Y 2
C53 Processing of personal data of children in direct offering of information society services. Y Y 2
C54 Migration of data from existing to new technologies where this is related to large-scale data processing. Y Y 2
C55 (Systematic) transfer of special data categories between controllers Y Y 2
C56 Large-scale processing of behavioural data Y Y 2
C57 Supervision of the data subject, which is carried out in the following cases: a. if it is carried out on a large scale; b. if it is carried out at the workplace; c. if it applies to specially protected data subjects (e.g. health care, social care, prison, educational institution, workplace). Y Y 2
C58 Large-scale tracking of data subjects Y Y 2
C59 Large-scale processing of health data Y Y 2
C60 Processing of personal data where data subjects have limited abilities to enforce their rights Y Y 2
C61 Processing personal data with the purpose of providing services or developing products for commercial use that involve predicting working capacity, economic status, health, personal preferences or interests, trustworthiness, behavior, location or route (Sensitive data or data of highly personal nature and evaluation/scoring) Y Y 2
C62 Innovative sensor- or mobile-based, centralised data collection Y Y 2
C63 Collection of public data from social media networks for profiling Y Y 2
C64 Processing operations including recording of places which may be entered by anyone on the basis of a contractual obligation Y 1
C65 Processing operations including recording places which may be entered by anyone on the basis of the public interest; Y 1
C66 Processing operations including image processing using mobile cameras for the purpose of preventing or defending against dangerous attacks or criminal conduct in public and non-public spaces; Y 1
C67 Processing operations including image and acoustic processing for the preventive protection of persons or property on private residential properties not exclusively used by the person responsible and by all persons living in the common household Y 1
C68 Processing operations including monitoring of churches, houses of prayer, as far as they are not already covered by lit. b and lit. e, and other institutions that serve the practice of religion in the community. Y 1
C69 Camera Surveillance Y 1
C70 Camera surveillance in schools or kindergartens during opening hours. Y 1
C71 Processing operations carried out pursuant to Article 14 of the General Data Protection Regulation, where the information that should be provided to the data subject is subject to an exception under Art. 14 Y 1
C72 Processing of personal data with a link to other controllers or processors Y 1
C73 Large scale systematic processing of data of high significance or of a highly personal nature Y 1
C74 The use of the personal data of pupils and students for assessment. Y 1
C75 Processing operations establishing profiles of natural persons for human resources management purposes Y 1
C76 Processing of health data implemented by health institutions or social medical institutions for the care of persons. Y 1
C77 Investigation of applications and management of social housing Y 1
C78 Processing for the purpose of social or medico-social support of persons Y 1
C79 File processing operations that may contain personal data of the entire national population. Y 1
C80 Insufficient protection against unauthorised reversal of pseudonymisation. Y 1
C81 When the data controller is planning to set up an application, tool, or platform for use by an entire sector to process also special categories of personal data. Y 1
C82 Large scale systematic processing of personal data with the purpose of introducing, organizing, providing and monitoring the use of electronic government services Y 1
C83 Processing of location data for the execution of decisions in the area of judicial enforcement Y 1
C84 Processing of personal data in the context of the use of digital twins Y 1
C85 Processing of personal data using neurotechnology Y 1
C86 The processing of personal data using devices and technologies where the incident may endanger the health of an individual or more persons Y 1
C87 where large-scale data is collected from third parties in order to analyse or predict the economic situation, health, personal preferences or interests, reliability or behaviour, location or displacement of natural persons Y 1
C88 Processing of personal data carried out by a controller with a main establishment outside the EU Y 1
C89 Regular and systematic processing where the provision of information under Article 19 of Regulation (EU) 2016/679 Y 1
C90 Processing operations in the personal area of persons, even if the processing is based on consent. Y 1
C91 (Large-scale) Managing alerts and social and health reports or professional reports e.g. COVID-19 Y 1
C92 Processing using data from external sources Y 1
C93 Anonymisation of personal data Y 1
C94 Acquiring personal data where source is unknown Y 1
C95 Processing of location data, including matching or combining datasets Y 1
C96 Processing of location data concerning vulnerable data subjects Y 1
C97 Processing of location data using systematic monitoring of data subjects Y 1
C98 Processing of location data aimed at automated-decision making with legal or similar significant effect Y 1
C99 Processing of location data when it prevents data subjects from exercising a right or using a service or a contract Y 1
C100 Processing of personal data in whistleblower systems Y 1
C101 Large scale processing that might pose a risk of property loss (particularly in banking and credit card services) Y 1
C102 Large scale processing that might pose a risk of violation of secrecy of correspondence (particularly in communication services). Y 1
C103 Large scale processing that might pose a risk of identity theft or fraud (particularly in digital trust services and in comparable identity management services). Y 1
C104 Large scale processing that might pose a risk of disclosure of personal economic standing (particularly taxation data, banking data, credit ranking data – publicly available data is not taken into account). Y 1
C105 Large scale processing that might pose a risk of discrimination with legal consequences or with similar impact (particularly in labour brokering services and in assessment/evaluation services that have an impact on salaries and career). Y 1
C106 Large scale processing that might pose a risk of loss of statutory confidentiality of information (restricted information, professional secrecy). Y 1
Conditions applicable (total) 10 13 28 21 17 24 29 23 20 24 24 30 26 30 29 25 21 22 26 22 19 20 15 25 25 31 18 17 19 26 19 15 32
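The structure of this table lends itself to simple tallying. The following minimal sketch, using an illustrative three-row subset rather than the full 106 rows, shows how the final column (row totals) and the bottom row (per-source totals) can be derived:

```python
# Illustrative subset of the Y-matrix above: each activity maps to the
# sources (GDPR, EDPB, or country codes) that explicitly list it ("Y").
matrix = {
    "C1": {"GDPR", "EDPB", "AT", "FR", "MT", "PL"},  # large-scale special category data
    "C32": {"AT", "CZ", "DK", "DE", "GR"},           # use of AI in processing
    "C100": {"PL"},                                  # whistleblower systems
}

# Row totals: how many sources list each activity (the table's last column).
row_totals = {cid: len(sources) for cid, sources in matrix.items()}

# Column totals: how many activities each source lists (the table's bottom row).
all_sources = sorted(set().union(*matrix.values()))
col_totals = {s: sum(s in sources for sources in matrix.values()) for s in all_sources}

print(row_totals)  # {'C1': 6, 'C32': 5, 'C100': 1}
print(col_totals)  # e.g. 'AT': 2, 'PL': 2, 'GDPR': 1, ...
```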



Number of Processing Activities requiring a DPIA

To show the variance between the member states and EEA countries, we compiled a bar chart of each country and the number of processing activities it considers high-risk and requiring a DPIA. We included the number of activities listed by the GDPR and the EDPB to show how much pan-EU guidance exists, and to allow a comparison of how many activities individual countries have added to their lists. Below is a summarised representation of the variance in the number of DPIA-required conditions across the mentioned sources; each member state is referred to by its ISO 3166-1 alpha-2 code, for example AT for Austria or FR for France. A minimal plotting sketch follows the figure.

[Figure: Bar chart of the number of processing activities requiring a DPIA, per country and for the GDPR and EDPB]
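As a minimal sketch, assuming only the totals from the bottom row of the table above (abbreviated here to a few sources), such a chart could be produced with matplotlib:

```python
# Minimal sketch of the bar chart; counts are taken from the table's
# totals row, abbreviated to a few illustrative sources.
import matplotlib.pyplot as plt

counts = {"GDPR": 10, "EDPB": 13, "PL": 31, "FR": 30, "GR": 30, "MT": 15, "SE": 15}

plt.bar(list(counts.keys()), list(counts.values()))
plt.ylabel("Processing activities requiring a DPIA")
plt.title("DPIA-required conditions by source")
plt.tight_layout()
plt.savefig("dpia_conditions_by_source.png")
```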

Poland has the most conditions for a DPIA (n=31), France and Greece have the second most (n=30), and Malta and Sweden have the fewest (n=15). Of note, the bulk of DPIA-required conditions in our list come from country-specific lists (93 out of 106). Among these, notable activities include (large-scale) processing of communication and location data (22 countries), (large-scale) processing of employee activities (19 countries), and processing with legal effects such as access to or exclusion of services (12 countries). The use of AI requires a DPIA in Austria, Denmark, Germany, Greece, and the Czech Republic.

Another visualisation of this variance is the following map of the EU, colour-coded to show which countries have made the fewest additions and which the most. The gradient, from dark red through red, orange, and yellow to pale yellow, conveys the disparity in regulatory clarity across the EU: red marks countries listing the fewest processing activities beyond those mentioned in the GDPR and by the EDPB, signalling concern over the lack of clarification of which activities require a DPIA, while yellow marks countries that have made many additions and whose guidance on when to conduct a DPIA is therefore clearer. A sketch of the bucketing logic follows the figure.

[Figure: Map of the EU colour-coded by the number of DPIA-required processing activities listed per country]
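A minimal sketch of this colour-coding, assuming the observed range of 15 to 31 conditions and treating the colour names as illustrative labels rather than plotting colours, could bucket each country's count into the five-step gradient as follows:

```python
# Minimal sketch of bucketing per-country counts into the five-step
# gradient described above (dark red = fewest additions, pale yellow = most).
PALETTE = ["dark red", "red", "orange", "yellow", "pale yellow"]

def bucket(count: int, lo: int = 15, hi: int = 31) -> str:
    """Map a count in [lo, hi] to one of the five gradient colours."""
    step = (hi - lo) / len(PALETTE)
    index = min(int((count - lo) / step), len(PALETTE) - 1)
    return PALETTE[index]

print(bucket(15))  # 'dark red'  (Malta, Sweden: least guidance)
print(bucket(24))  # 'orange'
print(bucket(31))  # 'pale yellow'  (Poland: most additions)
```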

Annex III and High Risk Processing Activities

We then proceeded to ascertain whether any of the high-risk AI systems listed in Annex III of the EU AI Act utilise personal data or may do so. We found that personal data is likely involved in 23 of the 25 Annex III clauses, and identified that it may be involved in the remaining 2 under certain conditions. This led us to ask: when do high-risk AI systems involving risky (personal) data processing require a DPIA? To answer this, we utilised the key concepts in the following manner: (1) purpose of the AI system; (2) involvement of personal data, especially whether it is sensitive or special category; (3) the subject of the AI system, especially whether they are vulnerable; (4) possible involvement of specific technologies that trigger a DPIA, such as use of smart meters in Annex III-2a; and (5) processing context, such as involvement of automated decision making, profiling, and producing legal effects. In addition, we also had to codify certain exemptions mentioned in Annex III, such as III-5b, where the purpose is detecting financial fraud. The output from this activity, which can be found in Appendix 4, allowed us to understand which of the GDPR's DPIA-required conditions could be applicable to each of the AI Act Annex III clauses.

First, we looked at the information within the 25 high-risk descriptions (8 clauses and their subclauses) listed in Annex III and assessed whether they involve personal data explicitly (i.e. it can be reasonably inferred from the description) or conditionally (i.e. it may potentially be involved in a particular application), in order to assess the applicability of the GDPR to the Annex III clauses. Similarly, for each of the key concepts, we identified whether it is explicitly or conditionally applicable in each Annex III clause. Where a conditional applicability was identified, we added an identifier to distinguish the conditional clause from the main or explicit clause present in Annex III. Finally, we identified whether the combination of key concepts matched any of the DPIA-required conditions identified in Section 3. In this, we distinguished whether each condition came from the GDPR (i.e. GDPR Art. 35 or the EDPB) or a specific member state list, through which we determined whether the applicability of a DPIA-required condition was uniform across the EU (i.e. it came from the GDPR) or varied (i.e. it is only present in one or more countries). A minimal sketch of this matching step follows.
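The sketch below uses a hypothetical encoding of two Annex III clauses and two DPIA-required conditions; the concept vocabulary and markings are illustrative, not our full annotation:

```python
# Minimal sketch of matching Annex III clauses (annotated with key concepts,
# each marked explicit or conditional) against DPIA-required conditions.
EXPLICIT, CONDITIONAL = "explicit", "conditional"

annex_iii = {
    "III-1a": {"ai-system": EXPLICIT, "biometrics": EXPLICIT, "personal-data": EXPLICIT},
    "III-2a": {"ai-system": EXPLICIT, "personal-data": CONDITIONAL},  # critical infrastructure
}

dpia_conditions = [
    # (id, required key concepts, source: "GDPR/EDPB" = uniform, else country list)
    ("C11", {"biometrics"}, "GDPR/EDPB"),
    ("C32", {"ai-system"}, "AT,CZ,DK,DE,GR"),
]

for clause, concepts in annex_iii.items():
    for cid, required, source in dpia_conditions:
        if required <= set(concepts):  # all required concepts present in the clause
            status = (EXPLICIT if all(concepts[c] == EXPLICIT for c in required)
                      else CONDITIONAL)
            scope = "uniform across EU" if source == "GDPR/EDPB" else f"varies: {source}"
            print(clause, cid, status, scope)
# III-1a C11 explicit uniform across EU
# III-1a C32 explicit varies: AT,CZ,DK,DE,GR
# III-2a C32 explicit varies: AT,CZ,DK,DE,GR
```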

From this exercise, we found that personal data is explicitly involved, and hence the GDPR applicable, in 23 out of the 25 Annex III clauses, as seen in the following table. The only clauses where the GDPR is not always applicable are Annex III clauses 2a and 8a, though the GDPR may be conditionally applicable based on the involvement of personal data in a particular use-case. To capture such conditional applicabilities, we created additional identifiers to distinguish between each variation; for example, in Annex III-2a, related to critical infrastructure, we identified 3 variations based on the involvement of: III-2a.1 smart meters, III-2a.2 road traffic video analysis, and III-2a.3 public transport monitoring systems. Each such variation involves the conditional applicability of a concept (technology, for III-2a) and allows us to match the Annex III clause to a DPIA-required condition; e.g. smart meters in III-2a.1 match DPIA-required conditions from Hungary, Poland, and Romania. We found 36 such additional use cases based on the conditional applicability of key concepts in DPIA-required conditions. In sum, there are 61 total (sub)clauses and use cases: a DPIA is always required for 21, conditionally required for 38, and not required for 2, as seen below (a small tally sketch follows).
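As a minimal sketch of the tally, with hypothetical status assignments for a handful of entries (the paper's full tally over all 61 entries is 21 always, 38 conditional, and 2 not required):

```python
# Minimal sketch of tallying DPIA requirement status over Annex III
# (sub)clauses and their derived use cases; assignments are illustrative.
from collections import Counter

use_cases = {
    "III-1a": "always",         # biometrics: personal data explicitly involved
    "III-2a": "not required",   # critical infrastructure without personal data
    "III-2a.1": "conditional",  # smart meters (HU, PL, RO)
    "III-2a.2": "conditional",  # road traffic video analysis (AT, CY)
    "III-2a.3": "conditional",  # public transport monitoring (RO, SK)
}

print(Counter(use_cases.values()))
# Counter({'conditional': 3, 'always': 1, 'not required': 1})
```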

AI Act Annex III and GDPR’s DPIA

Involvement of GDPR DPIAs and High-Risk AI systems from Annex III (expandable explanation table)

The table described above, which shows the applicability of the GDPR to the various AI systems listed in Annex III, can be viewed here. It is an interactive table: clicking on an entry in the first column (e.g. “1a. Biometrics”) reveals an explanation of what data we found to require a DPIA, either according to the GDPR or according to individual member states' data protection authorities.

A snapshot of the table behind the link is displayed below, without the interactive explanations.

AI Act Annex III

Snippet of the table describing the applicability of the GDPR's DPIA to the high-risk AI systems in Annex III
AI Act Annex III Clause Purpose Produces Legal Effects Processing Context Special categories of personal data AI/ Data subject Organisation / Agent DPIA required by GDPR EDPB DPIA required by member states
1a. Biometrics identification of people MAY remote processing* biometrics* natural persons - YES YES AT,BE,BG,HR,CZ,DK,EE,FR,DE,GR,HU,IS,IE,LV,LI,LT,LU,MT,NL,NO,PL,PT,SK,SI,ES (Biometrics)
1b. Biometrics - MAY infer data*, profiling(categorisation*) biometrics*, protected attributes or characteristics* natural persons - YES YES AT,BE,BG,HR,CZ,DK,EE,FR,DE,GR,HU,IS,IE,LV,LI,LT,LU,MT,NL,NO,PL,PT,SK,SI,ES (Biometrics)
1c. Biometrics - MAY - biometrics* natural persons - YES YES AT,BE,BG,HR,CZ,DK,EE,FR,DE,GR,HU,IS,IE,LV,LI,LT,LU,MT,NL,NO,PL,PT,SK,SI,ES (Biometrics)
2a. Critical infrastructure controlling safety of critical digital infrastructure/ road traffic/ supply of water/gas/heating/electricity MAY - - - critical infrastructure operations/ management NO NO AT,CZ,DK,DE,GR (Use of AI in processing)
2a.1. controlling safety of critical digital infrastructure/ road traffic/ supply of water/gas/heating/electricity MAY Large-scale processing of personal data - - critical infrastructure operations/ management YES YES HU, PL, RO (smart meters), AT,CZ,DK,DE,GR (Use of AI in processing)
2a.2. controlling safety of critical digital infrastructure/ road traffic/ supply of water/gas/heating/electricity MAY Large-scale processing of personal data - - critical infrastructure operations/ management YES YES AT,CY (road traffic analysis) AT,CZ,DK,DE,GR (Use of AI in processing)
2a.3. controlling safety of critical digital infrastructure/ road traffic/ supply of water/gas/heating/electricity MAY Large-scale processing of personal data - - critical infrastructure operations/ management YES YES AT,CZ,DK,DE,GR (Use of AI in processing) RO,SK (public transport)

References

[1] Sergio Barezzani. “Data Protection Impact Assessment (DPIA)”. In: Encyclopedia of Cryptography, Security and Privacy. Ed. by Sushil Jajodia, Pierangela Samarati, and Moti Yung. Berlin, Heidelberg: Springer Berlin Heidelberg, 2024, pp. 1–3. isbn: 978-3-642-27739-9. doi: 10.1007/978-3-642-27739-9_1813-1. (Visited on 04/26/2024).
[2] Alessandra Calvi and Dimitris Kotzinos. “Enhancing AI Fairness through Impact Assessment in the European Union: A Legal and Computer Science Perspective”. In: 2023 ACM Conference on Fairness, Accountability, and Transparency. Chicago IL USA: ACM, June 2023, pp. 1229–1245. isbn: 9798400701924. doi: 10.1145/3593013.3594076. (Visited on 04/19/2024).
[3] Maximilian Castelli and Linda C. Moreau Ph.D. “The Cycle of Trust and Responsibility in Out-sourced AI”. In: Proceedings of the 2nd Workshop on Trustworthy Natural Language Processing (TrustNLP 2022). Seattle, U.S.A.: Association for Computational Linguistics, 2022, pp. 43–48. doi: 10.18653/v1/2022.trustnlp-1.4. (Visited on 04/26/2024).
[4] CNIL. CNIL Privacy Impact Assessment. Accessed: 21.02.2025. 2024. url: https://www.cnil.fr/en/privacy-impact-assessment-pia.
[5] European Commission. eTranslation. European Commission Tool. This is an online tool. Accessed and used throughout the period of March 2024-August 2024. 2024. url: https://commission.europa.eu/resources-partners/etranslation_en.
[6] Dataschutzkonferenz. German DPIA Guide. Accessed: 21.02.2025. Oct. 2018. url: https://datenschutzkonferenz-online.de/media/ah/20181017_ah_DPIA_list_1_1__Germany_EN.pdf.
[7] Datatilsynet. Denmark DPIA Guide. Accessed: 21.02.2025. 2024. url: https://www.datatilsynet.dk/hvad-siger-reglerne/vejledning/sikkerhed/konsekvensanalyse/konsekvensanalyser.
[8] Datatilsynet. Norwegian DPIA Guide. Accessed: 21.02.2025. 2024. url: https://www.datatilsynet.no/en/#:~:text=Do%20you%20wonder%20if%20you,always%20will%20require%20a%20DPIA.
[9] Datenschutzbehörde. Austria DPIA Guide. Accessed: 21.02.2025. 2024. url: https://www.dsb.gv.at/download-links/dokumente.html.
[10] Katerina Demetzou. “GDPR and the Concept of Risk: The Role of Risk, the Scope of Risk and the Technology Involved”. In: Privacy and Identity Management. Fairness, Accountability, and Transparency in the Age of Big Data. Ed. by Eleni Kosta et al. Vol. 547. Cham: Springer International Publishing, 2019, pp. 137–154. isbn: 978-3-030-16743-1 978-3-030-16744-8. doi:10.1007/978-3-030-16744-8_10. (Visited on 03/07/2024).
[11] Autorité de protection des données. Belgium DPIA Guide. Accessed: 21.02.2025. Jan. 2019. url: https://www.autoriteprotectiondonnees.be/publications/decision-n-01-2019-du-16-janvier-2019.pdf.
[12] Commission nationale pour la protection des données. Luxembourg DPIA Guide. Tech. rep. Accessed: 21.02.2025. Aug. 2023. url: https://cnpd.public.lu/en/professionnels/obligations/AIPD/liste-dpia.html.
[13] Mario Draghi. The Future of European Competitiveness: Report by Mario Draghi. Sept. 9, 2024. url: https://commission.europa.eu/topics/eu-competitiveness/draghi-report_en (visited on 02/21/2025).
[14] E. Drouard et al. “The Interplay between the AI Act and the GDPR”. In: Journal of AI Law and Regulation 1.2 (2024), pp. 164–176. issn: 29424380, 29424372. doi: 10.21552/aire/2024/2/4. (Visited on 08/29/2024).
[15] EDPB. EDPB Guidelines on Data Protection Impact Assessment (DPIA) (Wp248rev.01). Accessed: 21.03.2024. Oct. 2017. url: https://ec.europa.eu/newsroom/article29/items/611236.
[16] European Parliament Legislative Resolution of 13 March 2024 on the Proposal for a Regulation of the European Parliament and of the Council on Laying down Harmonised Rules on Artificial Intelligence (Artificial Intelligence Act) and Amending Certain Union Legislative Acts (COM(2021)0206 – C9-0146/2021 – 2021/0106(COD)). Mar. 2024.
[17] European Union. Regulation—EU—2024/1689—EN—EUR-Lex. EU AI Act. European Union, 2024. url: https://eur-lex.europa.eu/eli/reg/2024/1689/oj.
[18] Janneke Gerards et al. Fundamental Rights and Algorithms Impact Assessment. 1–99. 2022.
[19] Delaram Golpayegani, Harshvardhan J. Pandit, and Dave Lewis. “To Be High-Risk, or Not To Be—Semantic Specifications and Implications of the AI Act’s High-Risk AI Applications and Harmonised Standards”. In: 2023 ACM Conference on Fairness, Accountability, and Transparency. Chicago IL USA: ACM, June 2023, pp. 905–915. isbn: 9798400701924. doi: 10.1145/3593013.3594050. (Visited on 04/26/2024).
[20] Maximilian Grafenstein. “Reconciling Conflicting Interests in Data through Data Governance. An Analytical Framework (and a Brief Discussion of the Data Governance Act Draft, the Data Act Draft, the AI Regulation Draft, as Well as the GDPR)”. In: SSRN Electronic Journal (2022). issn: 1556-5068. doi: 10.2139/ssrn.4104502. (Visited on 08/29/2024).
[21] Nemzeti Adatvédelmi és Információszabadság Hatóság. Hungary DPIA Guide. Accessed: 21.02.2025. 2024. url: https://naih.hu/data-protection/gdpr-35-4-mandatory-dpia-list.
[22] Information and Data Protection Commissioner. Malta DPIA Guide. Accessed: 21.02.2025. 2024. url: https://idpc.org.mt/for-organisations/data-protection-impact-assessment/.
[23] Datu valsts inspekcija. Latvia DPIA Guide. Accessed: 21.02.2025. 2024. url: https://www.dvi.gov.lv/lv/novertejums-par-ietekmi-uz-datu-aizsardzibu-nida.
[24] Valstybinė duomenų apsaugos inspekcija. Lithuania DPIA Guide. Accessed: 21.02.2025. 2024. url: https://vdai.lrv.lt/en/news/list-of-data-processing-operations-subject-to-the-requirement-to-perform-data-protection-impact-assessment/.
[25] Andmekaitse Inspektsioon. Estonia DPIA Guide. Accessed: 21.02.2025. Jan. 2024. url: https://www.aki.ee/en/guidelines-legislation/cross-border-data-protection-impact-assessment.
[26] Integritetsskyddsmyndigheten. Sweden DPIA Guide. Accessed: 21.02.2025. Jan. 2019. url: https://www.imy.se/verksamhet/dataskydd/det-har-galler-enligt-gdpr/konsekvensbedomningar-och-forhandssamrad/.
[27] Ireland DPIA Guide. Accessed: 21.02.2025. url: https://www.dataprotection.ie/en/organisations/know-your-obligations/data-protection-impact-assessments#identifying-whether-a-dpia-is-required.
[28] ISO/IEC 29134:2023. https://www.iso.org/standard/86012.html. 2023. (Visited on 05/14/2024).
[29] Commission Nationale Informatique Libertes. France DPIA Guide. Accessed: 21.02.2025. Nov. 2018. url: https://www.cnil.fr/sites/cnil/files/atoms/files/liste-traitements-aipd-requise.pdf.
[30] French Data Protection Authority Commission Nationale de l’Informatique et des Libertes. CNIL AI How to Sheets. Accessed: 21.02.2025. 2024. url: https://www.cnil.fr/en/ai-how-sheets.
[31] Datenschutzstelle Liechtenstein. Liechtenstein DPIA Guide. Accessed: 27.03.2024. Aug. 2020. url: https://www.datenschutzstelle.li/application/files/7615/9670/5293/DPIA_list_Liechtenstein_EN.pdf.
[32] Alessandro Mantelero. The Ai Act’s Fundamental Rights Impact Assessment. 2024. doi: 10.2139/ssrn.4782126. (Visited on 04/26/2024).
[33] Alessandro Mantelero. “The Fundamental Rights Impact Assessment (FRIA) in the AI Act: Roots, Legal Obligations and Key Elements for a Model Template”. In: Computer Law & Security Review 54 (Sept. 2024), p. 106020. issn: 02673649. doi: 10.1016/j.clsr.2024.106020. (Visited on 08/29/2024).
[34] Jeroen Naves and Pels Rijcken. EU Model Contractual AI Clauses to Pilot in Procurements of AI. Sept. 2023.
[35] Urząd Ochrony Danych Osobowych. Poland DPIA Guide. Accessed: 21.02.2025. 2024. url: https://archiwum.uodo.gov.pl/pl/424.
[36] Harshvardhan J. Pandit. “A Semantic Specification for Data Protection Impact Assessments (DPIA)”. In: (June 2022). doi: 10.5281/ZENODO.6783203. (Visited on 11/29/2023).
[37] Garante per la protezione dei dati personali. Italian DPIA Guide. Accessed: 21.02.2025. 2016. url: https://www.garanteprivacy.it/valutazione-d-impatto-della-protezione-dei-dati-dpia-.
[38] Persónuvernd. Iceland DPIA Guide. Accessed: 21.02.2025. 2024. url: https://www.personuvernd.is/media/leidbeiningar-personuverndar/MAP-Mat-a-Ahrifum-a-Personuvernd.pdf.
[39] Autoriteit Persoonsgegevens. Netherlands DPIA Guide. Accessed: 21.02.2025. Nov. 2019. url: https://autoriteitpersoonsgegevens.nl/themas/basis-avg/praktisch-avg/data-protection-impact-assessment-dpia#wat-zijn-de-criteria-van-de-europese-privacytoezichthouders-6668.
[40] Agencija za zaštitu osobnih podataka. Croatia DPIA Guide. Accessed: 21.02.2025. 2024. url: https://azop.hr/procjena-ucinka/.
[41] Informacijski pooblaščenec. Slovenia DPIA Guide. Accessed: 21.02.2025. url: https://www.ip-rs.si/dokumenti/razno/Seznam_dejanj_obdelav_osebnih_podatkov__za_katere_velja_zahteva_po_izvedbi_ocene_ucinka_v_zvezi_z_varstvom_osebnih_podatkov.pdf.
[42] Comissão Nacional de Protecção de Dados. Portugal DPIA Guide. Accessed: 21.02.2025. 2024. url: https://www.cnpd.pt/organizacoes/obrigacoes/avaliacao-de-impacto/.
[43] Agencia Española de Protección de Datos. Spain DPIA Guide. Tech. rep. Accessed: 21.02.2025. Apr. 2023. url: https://www.aepd.es/documento/listas-dpia-en-35-4.pdf.
[44] Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the Protection of Natural Persons with Regard to the Processing of Personal Data and on the Free Movement of Such Data, and Repealing Directive 95/46/EC (General Data Protection Regulation) (Text with EEA Relevance). Apr. 2016.
[45] Paul Ryan, Rob Brennan, and Harshvardhan J. Pandit. “DPCat: Specification for an Interoperable and Machine-Readable Data Processing Catalogue Based on GDPR”. In: Information 13.5 (May 2022), p. 244. issn: 2078-2489. doi: 10.3390/info13050244. (Visited on 04/26/2024).
[46] Jonas Schuett. “Risk Management in the Artificial Intelligence Act”. In: European Journal of Risk Regulation (Feb. 2023), pp. 1–19. issn: 1867-299X, 2190-8249. doi: 10.1017/err.2023.1. (Visited on 04/19/2024).
[47] Úrad na ochranu osobných údajov Slovenskej republiky. Slovakia DPIA Guide. Accessed: 21.02.2025. 2024. url: https://dataprotection.gov.sk/en/legislation-guidelines/guidelines-faq/office-guidelines/list-processing-operations-that-are-subject-an-impact-assessment/.
[48] Autoritatea Naţională de Supraveghere a Prelucrării Datelor cu Caracter Personal. Romania DPIA Guide. Accessed: 21.02.2025. 2018. url: https://www.dataprotection.ro/servlet/ViewDocument?id=1870.
[49] Anna Thomaidou and Konstantinos Limniotis. “Navigating Through Human Rights in AI: Exploring the Interplay Between GDPR and Fundamental Rights Impact Assessment”. In: Journal of Cybersecurity and Privacy 5.1 (Feb. 11, 2025), p. 7. issn: 2624-800X. doi: 10.3390/jcp5010007. url: https://www.mdpi.com/2624-800X/5/1/7 (visited on 03/10/2025).
[50] Úřad pro ochranu osobních údajů. Czech DPIA Guide. Accessed: 21.02.2025. 2024. url: https://uoou.gov.cz/profesional/metodiky-a-doporuceni-pro-spravce/posouzeni-vlivu-na-ochranu-osobnich-udaju.
[51] Tietosuoja Virasto. Finland DPIA Guide. Accessed: 21.02.2025. 2024. url: https://tietosuoja.fi/vaikutustenarviointi.
[52] Greece DPIA Guide. Accessed: 21.02.2025. Aug. 2018. url: https://www.dataprotection.gov.cy/dataprotection/dataprotection.nsf/page2c_gr/page2c_gr?opendocument.
[53] Cyprus DPIA Guide. Accessed: 21.02.2025. 2024. url: https://www.dataprotection.gov.cy/dataprotection/dataprotection.nsf/page2c_en/page2c_en?opendocument.
[54] Bulgaria DPIA Guide. url: https://cpdp.bg/en/list-of-processing-operations-requiring-data-protection-impact-assessment-dpia-pursuant-to-art-35-paragraph-4-of-regulation-eu-2016-679/ (visited on 02/21/2025).