
How dark patterns impact the GDPR

Dark patterns are interfaces designed to push users into taking unintended or unwanted actions that are potentially dangerous for their privacy. The European Commission is working on tackling and regulating these “deceptive design patterns”; indeed, the European Data Protection Board has published guidelines on how to recognize and deal with dark patterns, highlighting in particular the relevance of GDPR Articles 5 and 25: the first concerns the principles of fairness, transparency, purpose limitation and data minimization, while the second establishes the essential data protection by design requirements to be applied when building an interface, so that deceptive patterns are avoided.


What are dark patterns?

On 9 December 2022, the European Commissioner for Justice and Consumer Protection, Didier Reynders, announced that in 2023 the European Commission would concentrate its efforts on the regulation of dark patterns and their relationship with the GDPR, as well as on transparency in the online advertising market.

Dark patterns, according to the definition provided by the Italian Data Protection Authority (Garante per la protezione dei dati personali, GPDP), are “deceptive design patterns” that can negatively influence people’s behavior and hinder their ability to protect their privacy online.

Specifically, dark patterns are interfaces and navigation paths designed to push users into taking unintended or unwanted actions that are potentially harmful to their privacy but serve the interests of the platform or the service provider. In other words, they are UX design strategies adopted by businesses to nudge people into activities they would not otherwise have considered.

In February 2023, the European Data Protection Board (EDPB) published guidelines explaining how to recognize and avoid dark patterns. The document offers practical advice to providers, administrators, social media designers and users on how to deal with interfaces that violate the privacy requirements of the GDPR.

6 categories of dark patterns according to the EDPB

The EDPB guidelines highlight that dark patterns can be divided into six categories:

  • Overloading: occurs when users are faced with a large number of requests, pieces of information, options or possibilities that, one way or another, push them to share more data or to consent to the processing of personal data against their will and expectations;
  • Skipping: occurs when the interface is designed so that the navigation journey confuses users, who end up forgetting or overlooking key aspects of data protection;
  • Stirring: affects the choice users would probably make by appealing to their emotions or using visual prompts;
  • Obstructing: happens when users are hindered or even blocked in obtaining clear information about the use and management of their data;
  • Fickle: occurs when users consent to the processing of their data without clearly understanding its purposes, because the interface is inconsistent or unclear;
  • Left in the dark: happens when interfaces are designed to hide information or data protection control tools, so users navigate in uncertainty, not knowing how their data is processed, for what purposes, and whether or what kind of control they can still exercise over it.

In any case, the interfaces and information presented to users should always faithfully reflect the consequences of the action taken. A similar situation occurs with web cookies, a topic users frequently run into while surfing websites and apps: you can learn more here. It is therefore key that the design approach never undermines a person’s decision or steers their choices toward a less protective environment in terms of data security. Instead, users should be warned when a specific choice could compromise the safety of their data and privacy.
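As a purely illustrative sketch, not taken from the EDPB guidelines or the GDPR text, the TypeScript snippet below models a cookie consent dialog in which every option states its consequence, non-essential processing is off by default, and rejecting is as easy as accepting; all names (ConsentOption, rejectAll, acceptAll) are hypothetical.

```typescript
// Hypothetical model of a cookie consent dialog that avoids common dark patterns:
// non-essential categories start disabled, and "Accept" / "Reject" carry equal weight.

interface ConsentOption {
  id: string;          // e.g. "analytics", "advertising"
  label: string;       // what the user sees
  consequence: string; // plain-language effect of enabling this category
  required: boolean;   // strictly necessary cookies only
  enabled: boolean;    // privacy by default: optional categories start disabled
}

const consentOptions: ConsentOption[] = [
  {
    id: "essential",
    label: "Strictly necessary cookies",
    consequence: "Needed for the site to work; cannot be switched off.",
    required: true,
    enabled: true,
  },
  {
    id: "analytics",
    label: "Analytics cookies",
    consequence: "Your visits are measured and shared with our analytics provider.",
    required: false,
    enabled: false, // stays off until the user actively opts in
  },
];

// "Reject all" must be as easy as "Accept all": both act in a single step.
function rejectAll(options: ConsentOption[]): ConsentOption[] {
  return options.map((o) => ({ ...o, enabled: o.required }));
}

function acceptAll(options: ConsentOption[]): ConsentOption[] {
  return options.map((o) => ({ ...o, enabled: true }));
}

console.log(rejectAll(consentOptions));
```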

How does the GDPR regulate dark patterns?

As reported in the EDPB guidelines, Article 5 of the GDPR sets out the principles applicable to the data protection compliance of user interfaces. The principle of fairness set out in Article 5(1)(a) serves as a starting point for assessing whether a design pattern actually works as a “deceptive design pattern”. Other principles playing a role in this assessment are transparency, data minimization and accountability, as stated in Article 5(1)(a), (1)(c) and (2), as well as purpose limitation under Article 5(1)(b). In some cases, the assessment is also based on the conditions for consent under Article 4 and Article 7, or on other specific obligations, such as those of Article 12. As for the rights of data subjects, Chapter III of the GDPR must also be taken into account. Most importantly, Article 25 plays a key role, as it establishes the data protection by design and by default requirements to be applied before an interface is built, so that dark patterns are avoided.
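Article 25 prescribes principles rather than code, but as a hedged illustration of what “data protection by design and by default” can mean for an interface, the hypothetical sketch below starts a new account from the most protective settings and changes them only on an explicit user action; the PrivacySettings type and its field names are assumptions for the example, not part of the regulation.

```typescript
// Hypothetical default settings for a new social media account, sketching the
// "by default" idea of GDPR Article 25: optional processing starts disabled and
// is only turned on by an explicit, informed user choice.

interface PrivacySettings {
  profileVisibility: "private" | "contacts" | "public";
  personalizedAds: boolean;
  shareDataWithPartners: boolean;
  locationTracking: boolean;
}

// The most protective configuration is the starting point, not the exception.
const defaultSettings: PrivacySettings = {
  profileVisibility: "private",
  personalizedAds: false,
  shareDataWithPartners: false,
  locationTracking: false,
};

// Any deviation from the defaults must come from an explicit user action.
function applyUserChoice(
  settings: PrivacySettings,
  change: Partial<PrivacySettings>
): PrivacySettings {
  return { ...settings, ...change };
}

console.log(applyUserChoice(defaultSettings, { profileVisibility: "contacts" }));
```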

Dark pattern examples: the life cycle of a social media account

The European Data Protection Board’s paper makes it clear that the GDPR applies to the entire course of personal data processing involved in operating social media platforms, that is, to the whole life cycle of a user account. The EDPB also offers several concrete use cases of deceptive design patterns throughout the life cycle of a social media account, from sign-up to account closure. To enable effective implementation of the General Data Protection Regulation, each use case gives interface designers an in-depth explanation of which GDPR requirements are relevant to it.
