The European Union has accused X, formerly known as Twitter, of using deceptive “dark patterns” to trick users. According to preliminary EU findings, these practices breach the bloc’s new social media regulations under the Digital Services Act (DSA).
What are dark patterns?
Dark patterns are user interface (UI) designs crafted to trick users into taking actions they might not otherwise take. Coined by Harry Brignull, founder of Deceptive Patterns, the term describes UI elements that manipulate users into decisions that benefit the service provider, often at the expense of the user. These patterns exploit users’ cognitive biases and inattention to lead them down paths they may later regret.
Examples of dark patterns include:
- False scarcity: Pressuring users to act quickly by falsely indicating that the item they are viewing is almost gone.
- Disguised ads: Tricking users into clicking on ads that look like navigation elements.
- Roach Motel: Making it easy to enter a situation (e.g., signing up for a service) but difficult to exit (e.g., canceling a subscription), as sketched in the code below.
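To make the Roach Motel asymmetry concrete, here is a minimal sketch of how it can be baked into product logic. The class and method names are hypothetical and for illustration only; the point is simply that signing up takes one call while cancelling is gated behind deliberately inconvenient steps.

```typescript
// Hypothetical sketch of the "Roach Motel" pattern: one-click entry, obstructed exit.
class SubscriptionService {
  private subscribers = new Set<string>();

  // Signing up is a single, frictionless call.
  signUp(email: string): void {
    this.subscribers.add(email);
    console.log(`${email} subscribed in one click.`);
  }

  // Cancelling is gated behind extra, deliberately inconvenient steps.
  cancel(email: string, opts: { completedSurvey: boolean; calledSupport: boolean }): void {
    if (!opts.completedSurvey) {
      console.log("Please complete our exit survey before cancelling.");
      return;
    }
    if (!opts.calledSupport) {
      console.log("Cancellation can only be completed by phoning support.");
      return;
    }
    this.subscribers.delete(email);
    console.log(`${email} finally unsubscribed.`);
  }
}

const service = new SubscriptionService();
service.signUp("user@example.com"); // instant
service.cancel("user@example.com", { completedSurvey: false, calledSupport: false }); // blocked
```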
EU charges against X
The European Commission’s investigation under the DSA found that X’s blue checkmarks, available for a monthly fee, function as a dark pattern. For most of Twitter’s history, these checkmarks were a sign of verification for celebrities, politicians, and other influential accounts, indicating that the person’s identity had been confirmed.
Other social networks such as LinkedIn continue to use a “verified” badge to indicate that they have verified the member’s identity.
In contrast, under Elon Musk’s ownership, any X user can get a blue checkmark by paying $8 a month. This change has led to confusion and fraud, as users can no longer rely on the blue checkmark as a sign of authenticity and trustworthiness.
A quick scan of the trending topics on X showed plenty of blue checkmarks: some familiar names, some unknowns that may or may not be real, and many anonymous accounts. At this point, the checkmark apparently means only that someone paid X eight dollars; whether that someone is a person, an organization, or a bot farm operator, there’s no way to tell.
According to the EU, this practice “negatively affects users’ ability to make free and informed decisions about the authenticity of accounts and the content they interact with”.
The blue checks are just one aspect of the broader allegations, which also include blocking data access for researchers and deficiencies in X’s advertising database.
Understanding the EU’s claims against X
The EU’s findings suggest that the current blue badge system on X may mislead users into believing that some accounts are more trustworthy or authentic than they actually are. This practice undermines the original purpose of verification, which was to provide a reliable indication of an account’s legitimacy. By allowing anyone to purchase a blue checkmark, X effectively erodes the badge’s clarity and credibility as a marker of genuine verification.
The accusations also extend to X’s compliance with transparency rules for advertising. The DSA requires platforms to keep a searchable and reliable database of all digital ads, including details of who paid for them and their target audience. The EU investigation found that X’s advertising database does not comply with these standards, hampering researchers’ ability to study the risks posed by online advertising.
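To illustrate what a searchable ad repository implies in practice, here is a hypothetical sketch of the kind of record such a database might hold. The field names are illustrative assumptions, not the DSA’s actual schema or X’s implementation.

```typescript
// Hypothetical shape of an ad-repository entry for transparency research.
interface AdTransparencyRecord {
  adId: string;
  advertiser: string;          // who paid for the ad
  content: string;             // the ad creative, or a reference to it
  targetingCriteria: string[]; // main parameters used to target the audience
  firstShown: Date;
  lastShown: Date;
}

// Researchers would need to be able to query records like this reliably.
const exampleRecord: AdTransparencyRecord = {
  adId: "ad-000123",
  advertiser: "Example Brand Ltd.",
  content: "Banner promoting a seasonal sale",
  targetingCriteria: ["country:DE", "age:25-44", "interest:fitness"],
  firstShown: new Date("2024-03-01"),
  lastShown: new Date("2024-03-31"),
};

console.log(exampleRecord.advertiser, exampleRecord.targetingCriteria.join(", "));
```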
The wider issue of dark patterns
Dark patterns are not unique to X. Many platforms use similar tactics to encourage users to make decisions that benefit the company. These can range from making it difficult to cancel subscriptions to automatically enrolling users in services without their explicit consent.
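One common form of that automatic enrollment is “consent by default”: the opt-ins are pre-set to true, so a user who clicks straight through never makes an explicit choice. The sketch below is hypothetical; the preference names are illustrative only.

```typescript
// Hypothetical sketch of default opt-in: everything is pre-enabled.
interface SignupPreferences {
  marketingEmails: boolean;
  shareContacts: boolean;
  personalizedAds: boolean;
}

const defaultPreferences: SignupPreferences = {
  marketingEmails: true,
  shareContacts: true,
  personalizedAds: true,
};

function completeSignup(overrides: Partial<SignupPreferences> = {}): SignupPreferences {
  // Users who never open the (buried) settings panel inherit every default.
  return { ...defaultPreferences, ...overrides };
}

console.log(completeSignup());                         // all opt-ins silently enabled
console.log(completeSignup({ shareContacts: false })); // only explicit action opts out
```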
All of the major tech companies have been reported to Deceptive Patterns multiple times; even Trump campaign fundraisers make the list.
The most common dark patterns include hiding the user-friendly choice or making the brand-friendly action appear to be the default or only option. Years ago, I inadvertently shared my LinkedIn contacts by clicking a “Continue” button after adding a connection; LinkedIn later suggested that I connect with my late mother on the platform.
Regulatory response and user awareness
The EU’s action against X is part of a wider effort to regulate digital platforms and protect users from manipulative practices. The Digital Services Act represents an important step in this direction, setting clear guidelines for transparency and accountability.
As users, we need to be aware of dark patterns and understand how they can influence our decisions.
The EU is investigating X for its alleged use of dark patterns, but the issue is common across many digital platforms. Understanding and recognizing dark patterns is essential for consumers to make informed decisions and for businesses to maintain trust and transparency. By not only recognizing these tactics but also calling out the offenders, we can make better choices and hold platforms accountable.