Artificial intelligence at the intersection of data protection

6 September 2024 | Knowledge, News, The Right Focus

With the rapid development of artificial intelligence, there is a need to adapt legal regulations in a way that effectively protects the rights of individuals while supporting technological innovation.

In this context, a key challenge is to harmonise the AI Act with existing data protection regulations such as the GDPR.

The AI Act responds to this challenge in several ways.

Firstly, it emphasises the importance of privacy protection in the broad sense. Secondly, it explicitly states that it is not intended to affect existing EU law on the processing of personal data, nor to interfere with the roles and powers of the independent supervisory authorities operating in this area.

To the extent that the design, development or use of AI systems involves the processing of personal data, the AI Act also does not affect the data protection obligations, under EU or national law, of providers and users of AI systems acting as data controllers or data processors.

The AI Act states that data subjects retain all the rights and guarantees granted to them under Union law (including, among others, the GDPR), such as those related to automated individual decision-making, including profiling.

The rules set out in the AI Act for the placing on the market, putting into service and use of AI systems should facilitate the effective implementation of such systems and enable data subjects to benefit from the rights and other remedies guaranteed in the EU.

The Polish approach

The President of the Polish Personal Data Protection Office (UODO) also emphasises that ensuring compatibility between these regulations is one of the most important tasks of the Polish legislator.

Indeed, the AI Act aims to regulate the use of artificial intelligence in a way that minimises risks to privacy and data security. At the same time, it should promote the development of modern technology.

What does this mean in practice?

A legislative tandem: the GDPR + the AI Act

Firstly, the AI Act and the GDPR should be treated as independent pieces of legislation of equal standing. For this reason, they are often referred to as ‘tandem legislation’.

Given that personal data is often a key part of the functionality of AI-based technologies, GDPR compliance is essential to ensure their legitimacy.

In practice, this means that companies developing AI systems need to consider data protection from the technology design stage (privacy by design) and apply an approach based on data minimisation and other data protection rules under the GDPR.

How can this be done?

Experience to date suggests that, although the two regimes overlap in subject matter, each is likely to require its own compliance approach from a regulatory perspective.

Developing AI systems will therefore need to meet the requirements of two independent checklists:

  • The first, based on the existing provisions of the GDPR
  • The second, adapted to the requirements of the AI Act

An alternative solution could, of course, be a common checklist, although our experience suggests that there may be some difficulties in this regard.

The GDPR and the AI Act – how to reconcile the differences

Speaking of the separation of the regulations, it’s worth noting that the independence of the two regimes is reflected in the diversity of their approaches.

The GDPR sets out fairly detailed requirements for the processing of personal data, imposing obligations on controllers and processors in relation to, among other things:

  • Transparency
  • Data minimisation
  • Purpose limitation and data subjects’ rights

Most importantly, it grants data subjects a number of rights.

While the AI Act is based on similar values such as the protection of human rights and the prevention of discrimination, it is more focused on the risks associated with AI technology, whether these risks are related to personal data or other factors.

An obvious element of this independence is that the AI Act can also apply to technologies that do not process personal data and are therefore not subject to the GDPR, such as AI systems used in the industrial sector to optimise production processes.

This shows that the two pieces of legislation, while complementary, have de facto separate purposes and areas of application.

A risk-based approach: what it means for the GDPR and the AI Act

The differences between the GDPR and the AI Act are many.

Although it is often said that both regulations take a risk-based approach, their practical application differs significantly, both in terms of risk assessment procedures and risk classification.

At the heart of the GDPR are the rights of the individual whose data is being processed, and the associated requirements are aimed at eliminating the risks associated with such processing.

The AI Act, by contrast, takes an approach based on the obligations it imposes on providers and deployers of AI systems. In this sense it is more of a prohibitive regulation, focused on compliance management.

Thus, while the GDPR focuses on protecting individuals from data processing risks, the AI Act addresses a wide range of risks associated with the use of artificial intelligence in different social and economic contexts.

Data – input and output

The issue of the data itself is also an interesting one.

In the GDPR, the main focus is on data collected by the controller for a strictly defined purpose (and that purpose is a very important concept). Input data is therefore key.

In the AI Act, input data is also very important, but in a broader sense than in the GDPR, for example in the context of training. The AI Act also places emphasis on output data, a concept largely absent from the GDPR, with only a few exceptions. Output is relevant from the point of view of risks related to potential discrimination, copyright issues and similar concerns.

The biggest challenge at the interface between the two regulations will undoubtedly be AI systems used in medicine, employment, biometrics and similar areas, i.e. generally those classified as high-risk systems. In these cases, the personal data aspect will be crucial: in addition to meeting a number of requirements under the AI Act, deployers will have to ensure that data is processed in accordance with the GDPR in order to effectively protect privacy and prevent misuse.

Thus, the future of the practical application of artificial intelligence regulations is certainly inextricably linked to the protection of personal data.

The harmonisation of regulations and de facto their enforcement is key to ensuring that AI is developed responsibly, transparently and in accordance with the rights of individuals.

This challenge will have to be met by companies implementing AI-based solutions, by developers in the programming phase, by lawyers advising on the solutions to be implemented and, finally, by supervisory authorities which, depending on the approach adopted in a given country, will have to address AI and data protection under a single remit or work harmoniously across two independent regimes.

Source: BPCC Contact Online


Contact us:

Natalia Kotłowska-Wochna

Attorney-at-Law / New Tech, IP, Trade & Logistics Practice Group / Head of New Tech M&A

+48 606 689 185

n.kotlowska@kochanski.pl