  • Blog
  • Published on: 31.03.2022
  • 4:01 mins

A digital future “Made in the EU”

Why the GDPR is not enough

If data is the new oil, then not only the value of data but also the problems associated with it are set to skyrocket. The problem of data protection is essentially the climate emergency of the digital world.

The European Union (EU) is already aware of how important it is to protect data in our digital world. This is why the General Data Protection Regulation (GDPR), the most prominent piece of data protection legislation introduced in recent years, has applied since 2018.

In addition, the European Commission published its “Shaping Europe’s Digital Future” strategy in 2020, which aims to create a single market for data with data protection at its core. The goal of the EU’s commitment to digitalization, also described as the European path into the digital decade, is to ensure that the EU keeps pace with international digital trends while remaining competitive in the future. Trust is at the forefront of the European vision for the digital future, which encompasses a responsible approach to handling data and algorithms in the interests of its citizens.

Implementing the following legislative proposals will be crucial to realizing this vision: the Digital Services Act (DSA), the Data Governance Act (DGA), the Digital Markets Act (DMA), and the Artificial Intelligence Act (AIA). These proposals are intended to accelerate digitalization programs, strengthen the fundamental rights of EU citizens, and establish additional control mechanisms.

Below, we use the Artificial Intelligence Act as an example of how this will work.

Artificial Intelligence Act (AIA)

The Artificial Intelligence Act is the European Commission’s proposal for ensuring that artificial intelligence (AI) used within the EU is safe, transparent, impartial, ethical, and under human control. The current draft of the act was published in April 2021 and takes a risk-based approach that divides AI systems into four risk classes: “unacceptable risk,” “high risk,” “low risk,” and “minimal risk.” The draft focuses on extensive regulation of AI systems deemed to pose a high risk, while the use of AI systems that pose an unacceptable risk is prohibited outright. Unacceptable systems include those that evaluate people’s social behavior, such as “social scoring” solutions. AI systems that pose a low or minimal risk will remain largely unregulated, a deliberate move to keep conditions conducive to innovation given the sometimes extensive obligations the act places on companies. Taking its cue from the GDPR, the draft AIA sets out severe penalties for infringements: violations of the AIA can be punished with a fine of up to six percent of a company’s total worldwide annual turnover.
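
To make the tiered logic more tangible, here is a minimal sketch that maps the four risk classes named in the draft to the kind of regulatory consequence described above. The class names come from the draft act; the mapping of consequences is a simplified illustration for this article, not a legal reading of the AIA.

```python
from enum import Enum


class RiskClass(Enum):
    """The four risk classes named in the draft Artificial Intelligence Act."""
    UNACCEPTABLE = "unacceptable risk"
    HIGH = "high risk"
    LOW = "low risk"
    MINIMAL = "minimal risk"


# Simplified, illustrative mapping from risk class to regulatory consequence.
# It reflects the tiered logic described in the article, not the act's full text.
CONSEQUENCES = {
    RiskClass.UNACCEPTABLE: "prohibited (e.g. social scoring systems)",
    RiskClass.HIGH: "permitted, but subject to extensive obligations",
    RiskClass.LOW: "largely unregulated",
    RiskClass.MINIMAL: "largely unregulated",
}


def consequence_for(system_risk: RiskClass) -> str:
    """Return the illustrative regulatory consequence for a given risk class."""
    return CONSEQUENCES[system_risk]


if __name__ == "__main__":
    for risk_class in RiskClass:
        print(f"{risk_class.value}: {consequence_for(risk_class)}")
```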

When it comes to data protection, it is worth noting that the GDPR remains unaffected by this draft act. The AIA will supplement the data protection guidelines within the EU by setting out harmonized rules for the design, development and use of specific high-risk AI systems. The European Commission has deliberately chosen a flexible and risk-based approach to AI regulation to ensure that there is sufficient scope for future innovations that are “made in the EU.”

The tension that already exists under the GDPR between the use of artificial intelligence and the need for data protection will become even more complex once the Artificial Intelligence Act adds another legal framework. At present, the quality of AI systems depends heavily on the amount of training data available. In addition, it is not always clear during the development phase of self-learning systems what else the AI may be used for in the future aside from its originally defined purpose. These scenarios demonstrate that current AI development is often diametrically opposed to core principles of the GDPR, such as data minimization and purpose limitation.

Despite these conflicts between European data protection law and the use of artificial intelligence, it is by no means impossible to press ahead with innovative AI projects. Technical progress, such as the production of synthetic data sets and other solutions, is likely to remedy this situation in the future.
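
As an illustration of the idea behind synthetic data, the following sketch fits a simple statistical model (a multivariate Gaussian) to a handful of hypothetical customer records and then samples artificial records with similar aggregate properties. The data and the modelling choice are assumptions made for this example; production-grade synthetic data generators are considerably more sophisticated and typically add formal privacy guarantees.

```python
import numpy as np

rng = np.random.default_rng(seed=42)

# Hypothetical "real" customer records: [age in years, annual mileage in km].
# Invented values, used only to illustrate the principle.
real_data = np.array([
    [34.0, 12_000.0],
    [45.0, 8_500.0],
    [29.0, 20_000.0],
    [52.0, 6_000.0],
    [41.0, 15_500.0],
])

# Fit a simple multivariate Gaussian to the real records.
mean = real_data.mean(axis=0)
cov = np.cov(real_data, rowvar=False)

# Sample synthetic records that share the aggregate statistics of the originals;
# no individual synthetic row corresponds to a real person.
synthetic_data = rng.multivariate_normal(mean, cov, size=1000)

print("Mean of real data:     ", mean)
print("Mean of synthetic data:", synthetic_data.mean(axis=0))
```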

Given that data protection sits at their core, the final form of the legislative proposals listed above will have a crucial impact on whether the European path into the digital decade is successful. But one thing is very clear: in choosing this path, the EU has certainly understood the challenges of our digital future.

Corporate Digital Responsibility (CDR)

If we come back to the comparison made at the beginning, future data protection regulations are set to be just as important as today’s environmental regulations. As things stand, Corporate Social Responsibility (CSR), and with it the responsible use of natural resources, often goes several steps beyond existing legislation, as companies try to meet the rising expectations of customers, social stakeholders, and capital market players. The digital equivalent of this analog commitment to sustainability is the still relatively new concept of Corporate Digital Responsibility (CDR). Like CSR, CDR goes beyond existing legal requirements such as the GDPR and deals with ethical issues and the question of what constitutes good business practice in a digital world.

It is important that companies view the statutory regulations and the CDR concept as a whole in the same way that they already treat sustainability issues: as an opportunity rather than a disadvantage. According to a study by the Smart Data Accompanying Research team commissioned by the Federal Ministry of Economic Affairs and Climate Action, CDR can “serve as a competitive advantage and key differentiator because customers are aware of the problem and there is an increasing demand for data protection-compliant packages and data processing guidelines that prioritize data protection.” In the future, compliance with statutory regulations and the implementation of responsible (CDR) measures across data processing chains will be the greatest lever for gaining customer trust, and therefore for guaranteeing business success in a digital world.

About our author

A "Better Tomorrow" is not possible without:

  • laying the foundations for sustainable, digital innovation
  • the ability to respond to change
  • diversity

My heart beats faster for:

  • my family, nature and sport

Bastian Vogt

Senior Consultant, MHP

LinkedIn