As Companies Wrongly Invoke the Guide to Justify Deception, Agency Seeks Public Input on Possible Revisions Around Dark Patterns and Other Deceptive Tactics
"Recently, the European Data Protection Board (EDPB) adopted for public consultation its 'Guidelines [...] These guidelines, like the AEPD guide, take article 5.1.a of the RGPD as a starting point to assess when a design pattern in a user interface corresponds to a dark pattern."
This paper discusses the regulation of "dark patterns" using two European Union legal frameworks, highlighting that the General Data Protection Regulation (GDPR) offers potential through data-protection-by-design but struggles due to unclear fairness definitions. It suggests that a pluralistic approach combining the strengths of GDPR and the EU's consumer protection acquis would be more effective in addressing manipulative design techniques.
"By analysing CMP services on an empty experimental website, we identify manipulation of website publishers towards subscription to the CMPs paid plans and then determine that default consent pop-ups often violate the law"
Introductory video about Dark Patterns by NNgroup
The use of unfair practices to distort consumers’ economic behaviour is not new, but it takes on a new and important dimension as a result of the massive collection of data and the use of technology to build consumer profiles and anticipate consumer behaviour. EU consumer law already has partial capacity to address these situations, but it is currently not sufficiently enforced. In addition, EU law must be updated to tackle these unfair practices and ensure consumers are not harmed by misleading user interfaces and data personalisation techniques.
EU data protection authorities find that the consent popups that plagued Europeans for years are illegal. All data collected through them must be deleted. This decision impacts Google’s, Amazon’s and Microsoft’s online advertising businesses.
In this post (Part Two), we examine the FTC’s approach to this issue, now and in the past. Here, we conclude that, despite the new terminology, the practices that comprise today’s dark patterns have been core elements of FTC law and policy for years.
State and federal regulators have definitely put a new emphasis on combatting so-called “dark patterns” – but other than a catchy name, is there really anything new about the types of conduct that state and federal officials are calling illegal? This two-part blogpost will take a closer look at that question.
In this article and the associated Twitter thread, Cennydd Bowles opines that design is not manipulative by definition. In his words: "Design influences. It persuades. But if it manipulates, something’s wrong."
"The bitter truth of addiction is obscured by the smarmy ads and compromising relationships, and yet federal oversight is downright nonexistent."
The CMA has secured improvements for Xbox online players, following concerns about Microsoft’s use of auto-renewing subscriptions for online gaming services.
"The first edition of the book came out in 2013, and our knowledge on some topics, such as social networks and mental health, is changed A LOT."
This investigation provides extensive information about the scope of the data flows and the web of third-party companies that receive that data to build detailed and intimate profiles of individuals, often without their knowledge.
The complaints allege the company has deployed ‘dark patterns,’ design tricks that can subtly influence users’ decisions in ways that are advantageous for a business.
Article 13a in the DSA "explicitly forbids the use of specific techniques to extort consent to collect personal data, for instance, via repeatedly showing pop-ups. It also prevents platforms from requesting such consent if users already choose via ‘automated means’, which might be a setting in the web browser or operating system."
"If [the sellers] can confuse the consumer enough then the consumers won't necessary know what choice they're making and they can be talked into just about anything." - Richard Cordray (Former Director of CFPB, 2014)
The French data protection authority hit Facebook and Google with multimillion-dollar fines yesterday for their use of deceptive design in their cookie consent banners.
Today, the CNIL said it’s fined Google €150M (~$170M) and Facebook €60M (~$68M) for breaching French law, following investigations of how they present tracking choices to users of google.fr, youtube.com and facebook.com.
The FTC has made clear that, to comply with the law, businesses must ensure sign-ups are clear, consensual, and easy to cancel.
Following investigations, the CNIL noted that the websites facebook.com, google.fr and youtube.com do not make refusing cookies as easy as accepting them. It has therefore fined FACEBOOK 60 million euros and GOOGLE 150 million euros and ordered them to comply within three months.
"Google for years has used misleading notifications to lure users into disabling its rival’s browser extensions [...] The changes include requiring users to answer whether they would rather “Change back to Google search” after adding the DuckDuckGo extension and showing users a larger, highlighted button when giving them the option to “Change it back”.
Many health apps also have a dark side — selling your most personal data to third parties like advertisers, insurers and tech companies. [Podcast episode]
"Cancel anytime" actually means "you need to call a phone number, wait for someone to pick up and maybe you can cancel then. Or not."
"Slack-fill is the difference between the actual capacity of a container and the volume of product contained therein."
"A container that does not allow the consumer to fully view its contents shall be considered to be filled as to be MISLEADING if it contains [...] slack-fill"
On 14 December 2021, the Internal Market and Consumer Protection (IMCO) Committee of the European Parliament adopted its report on the Digital Services Act.
In a world with the EU Digital Services Act, online platforms must design web services in a way that does not trick users into giving away their personal data. If they fail, they’ll be held accountable.
Feature requires subscription even though it doesn’t use connected services.
Hidden away in #Google adtech antitrust complaint, in ref to internal docs: “We have been successful in slowing down and delaying the [ePrivacy Regulation] process and have been working behind the scenes hand in hand with the other companies.”
A digital research platform linking together theory, methods, and practice for mapping media manipulation and disinformation campaigns.
"The roughly translated “big data swindling” (大数据杀熟, dà shùjù shā shú [...] is a hotly debated term used to describe a mix of dark patterns and dynamic pricing that online platforms employ to exploit users..."
This German-language article on spiegel.de introduces the concept of dark patterns.
A review of recent (2020) work on dark patterns. The authors demonstrate that the literature does not reflect a singular concern or consistent definition, but rather a set of thematically related considerations.
Academic analysis of how Fortnite is using its platform to manipulate users.
The 'platformisation' of the games industry is posing some serious challenges for Europe and the internet at large.
The third-party cookie is dying, and Google is trying to create its replacement. No one should mourn the death of the cookie as we know it.
WASHINGTON – U.S. Sen. Mark R. Warner (D-VA) released the following statement after Governor Ralph Northam signed the Consumer Data Protection Act into state law:
Font size can be the difference between compliance and a class action lawsuit
News article summarising the findings from the research paper "Price Salience and Product Choice". "StubHub concluded that so-called “drip pricing” [...] resulted in people spending about 21% more."
"In today's video, we will go through dark patterns in UI and UX. These patterns are often misleading and almost blackmailing in nature. They make you feel bad about certain decisions you take and only benefit the business."
Browser windows that open unexpectedly, garish colours that catch the eye, tiny print… the internet is full of annoyances of all kinds. All of it is carefully designed to trap the user, and it has a name: dark patterns. A French-language explainer.
A not-for-profit project building a collaborative, online directory of ethical companies of all kinds.
This paper emphasizes the potential of consumer protection legislation as a powerful enforcer against dark patterns, offering more protection and enforcement opportunities than the GDPR. The modernization of consumer protection rules and enhanced harmonization, along with stronger remedies and enforcement capabilities, are expected to contribute to a more effective response to manipulative design features in digital environments.
This paper provides the end-user perspective on felt manipulation without directly using the language of dark patterns, but its examples illustrate strategies that align with dark patterns as defined in the literature.
This paper introduces the concept of "Asshole Design" and describes the properties of asshole designers. Most relevant here, the authors differentiate dark patterns from asshole design properties, positioning the definition of dark patterns in relation to bad design, value-centered design, and asshole design.
A new California law (the California Privacy Rights Act) prohibits efforts to trick consumers into handing over data or money. A bill in Washington state (SB 5062 - 2021-22) used similar language.
"UX doesn't live up to its original meaning of 'user experience.' Instead, much of the discipline today, as it's practiced in Big Tech firms, is better described by a new name. UX is now 'user exploitation.'"
The Federation of German Consumer Organisations (vzbv) filed a complaint against “advocado”, an online service that helps people find a lawyer. With its lawsuit, the consumer protection group challenged the use of dark patterns in the service’s cookie banners.
The CPRA defines a “dark pattern” as “a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision-making, or choice” and clarifies that it should be “further defined by regulation.”
"First, the CPRA adds a new definition of "consent" to the CCPA. The new definition explicitly states that "[A]greement obtained through the use of dark patterns does not constitute consent." Then, paralleling the definitions from Deceived by Design and the DETOUR Act, the CPRA defines a "dark pattern" as "a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision-making, or choice, as further defined by regulation." Finally, the law directs that regulations regarding the sale or sharing of personal information ensure that a business obtaining consumer consent to such sale or sharing "does not make use of any dark patterns.""
On 21 January 2019, the CNIL’s restricted committee imposed a financial penalty of 50 million euros against the company GOOGLE LLC, in accordance with the General Data Protection Regulation (GDPR), for lack of transparency, inadequate information and lack of valid consent regarding ads personalization.
"Video about the difference between dark patterns & things badly designed by accident. With some hilarious examples of bad design"
Like millions of others, Netflix r̶e̶c̶o̶m̶m̶e̶n̶d̶e̶d̶ autoplayed The Social Dilemma documentary to my iPhone, and it made an impression.
An interaction criticism analysis of dark patterns in consent banners.
In this video Professor Lior J. Strahilevitz presents new experimental research on Dark Patterns. He examines their effectiveness, and assesses the role of market forces and legal regulation in constraining their use.
"Last year, researchers from Princeton University and the University of Chicago published a study looking at roughly 11,000 shopping sites, and found dark patterns on more than 11 percent of them, including major retailers like Fashion Nova and J.C. Penney. The researchers discovered that the more popular the website, the more likely it was to feature dark patterns."
Some say designers are uniquely positioned to stop the madness. What will it take to make the changes we desperately need?
This paper reports on qualitative research (focus groups and interviews) carried out on the theme of Dark Patterns.
In Mexico, large black octagons are now placed on the packaging of products that are high in saturated fat, trans fat, sugar, sodium or calories.
The findings of this paper "support the notion that the EU’s consent requirement for tracking cookies does not work as intended. Further, we give insights into why this might be the case and recommendations on how to address the issue."
Academics working with StubHub carried out a huge test of hidden fees vs upfront fees. Users who weren’t shown fees upfront spent ≈21% more and were 14% more likely to complete a purchase. The research involved several million participants.
"A challenging exploration of
user interactions and design patterns. To play the game, simply fill in the form as fast and accurate as possible."
Manipulative political discourse undermines voters’ autonomy and thus threatens democracy. Using a newly assembled corpus of more than 100,000 political emails from over 2,800 political campaigns and organizations sent during the 2020 U.S. election cycle, we find that manipulative tactics are the norm, not the exception. The majority of emails nudge recipients to open them by employing at least one of six manipulative tactics that we identified; the median sender uses such tactics 43% of the time. Some of these tactics are well known, such as sensationalistic subject lines. Others are more devious, such as deceptively formatted “From:” lines that attempt to trick recipients into believing that the message is a continuation of an ongoing conversation. Manipulative fundraising tactics are also rife in the bodies of emails. Our data can be browsed at electionemails2020.org
"GDPR expects specific prerequisites for a lawful consent, which should be valid, freely given, specific, informed and active… however… the majority of people do not seem to be empowered to practice their digital right to privacy and lawful consenting"
Researchers analyzed Dark Patterns in 240 apps and ran an experiment with 589 users on how they perceive Dark Patterns. They found 95% of the apps contained Dark Patterns and that most users do not recognize Dark Patterns unless informed beforehand.
An interesting counterpoint to the many stories of organisations making it hard for consumers to cancel their subscriptions. Here, Netflix does the opposite and automatically cancels premium subscriptions on inactive accounts.
This paper traces the origins of dark patterns, highlights contemporary issues, and describes where they might be heading in the future. It offers recommendations for designers to steer clear of these patterns.
Policy brief with suggestions for how to regulate dark patterns.
"We show that digital manipulation erodes users’ ability to act rationally, which empowers platforms to extract wealth and build market power without doing so on the merits. [...] our
research asserts that antitrust enforcement should go further in promoting decisional privacy."
New consent management platforms (CMPs) have been introduced to the web to conform with the EU's General Data Protection Regulation, particularly its requirements for consent when companies collect and process users' personal data.
A book featuring research on human and automated methods to deter the spread of misinformation online, such as legal or policy changes, information literacy workshops, and algorithms that can detect fake news dissemination patterns in social media.
"Designers face the same challenges as everyone else in the complex conditions of contemporary cultural life-choices about consumption, waste, exploitation, ecological damage, and political problems built into the supply chains on which the global systems of inequity currently balance precariously. But designers face the additional dilemma that their paid work is often entangled with promoting the same systems such critical approaches seek to redress: how to reconcile this contradiction, among others, in seeking to chart an ethical course of action while still functioning effectively in the world."
Business-facing guidelines by the Netherlands Authority for Consumers and Markets (ACM) showing which manipulative practices to avoid.
"After reviewing 200 of the top shopping sites, including Amazon, eBay and Macys.com, a study by the University of Michigan’s School of Information found that all the sites had an average of 19 features that could encourage impulse buying, such as limited-time discounts and wording that made an item seem like it was almost out of stock.
The Subcommittee on Consumer Protection and Commerce of the Committee on Energy and Commerce held a hearing entitled, “Americans at Risk: Manipulation and Deception in the Digital Age.” Witnesses included Monika Bickert, Joan Donovan, Ph.D., Tristan Harris and Justin (Gus) Hurwitz.
"Sludge" is an alternative term for Dark Patterns. In this paper, Cass Sunstein argues that institutions should conduct Sludge Audits to catalogue the costs of sludge, because it can hurt the most vulnerable members of society.
Researchers analyzed 300 data collection consent notices from news outlets to assess compliance with the GDPR. The analysis uncovered a variety of dark patterns that circumvent the intent of the GDPR by design.
In Design for Cognitive Bias, David Dylan Thomas lays bare the irrational forces that shape our everyday decisions and, inevitably, inform the experiences we craft. Once we grasp the logic powering these forces, we stand a fighting chance of confronting them, tempering them, and even harnessing them for good.
"If you’ve wondered whether there were actually 30 people trying to book the same flight as you, you’re not alone. As Chris Baraniuk finds, the numbers may not be all they seem."
"Many e-commerce offers are pushed with fake notifications, bogus countdown timers and other misleading tactics"
"Members of Princeton’s Web Transparency & Accountability Project (WebTAP) used automated web-crawling programs to assemble a list of the dark patterns the programs could see in a page’s text. Then they classified the dark patterns’ methods systematically.
A study of dark patterns in cookie consent dialogues.
"We present automated techniques that enable experts to identify dark patterns on a large set of websites. Using these techniques, we study shopping websites, which often use dark patterns to influence users into making more purchases or disclosing more information than they would otherwise. Analyzing ∼53K product pages from ∼11K shopping websites, we discover 1,818 dark pattern instances, together representing 15 types and 7 broader categories. We examine these dark patterns for deceptive practices, and find 183 websites that engage in such practices."
Dark patterns, present across platforms and devices, work to undermine consumer choice and autonomy — but we currently have no framework for evaluating them. How might we evaluate these deceptive design interfaces to better support consumer empowerment?
Written by Cambridge Analytica Whistleblower Christopher Wylie, this book is a detailed account of the way modern marketing technologies were applied in targeted disinformation propaganda campaigns, including those that gave us Trump and Brexit.
Comparative study on the privacy practices of Amazon, Spotify and Netflix in the EU and the US. Also looks at the use of dark patterns.
DETOUR was a bi-partisan bill that aimed to curb manipulative dark pattern behavior by prohibiting the largest online platforms (those with over 100 million monthly active users) from relying on user interfaces that intentionally impair user autonomy, decision-making, or choice.
"Senators Mark Warner (D-Virginia) and Deb Fischer (R-Nebraska) have introduced legislation to ban so-called “dark patterns” tactics designed to trick users..."
Interesting case study from the UK's Behavioural Insights team (aka "nudge unit"). NOT a dark pattern, obviously! But very relevant because the same methodology and techniques are used to create and optimise dark patterns.
This essay shows how cognitive biases and dark patterns are used to manipulate people into disclosing private information. It then explains how current law allows this to continue and proposes a new approach to rein in the phenomenon.
"Online shopping turns your brain against you, but you can fight back."
Tom Scott interviews Harry Brignull about Dark Patterns.
"We show that many services that claim compliance today do not have clear and concise privacy policies. We identify several points in the privacy policies which potentially indicate non-compliance; we term these GDPR vulnerabilities. We identify GDPR vulnerabilities in ten cloud services. Based on our analysis, we propose seven best practices for crafting GDPR privacy policies."
"Gain product design foundations by bringing design processes to light, especially for growing organizations with evolving design systems. Fast-track design work by providing practical examples of patterns for a variety of real-world purposes. Level up the breadth of your skills and understanding by illuminating user experience design concepts, such as usability, accessibility, microcopy, motion design, and information architecture."
After an FTC workshop about the astronomical fees added on to most concert tickets, it is fairly clear that nothing is being done.
A well written introductory article about Dark Patterns. Also provides 10 guidelines for designers to help steer their company away from deceptive practices.
Article on how dark patterns in cookie banners are not legally valid consent mechanisms under the GDPR.
Researchers analyzed 1002 posts from the subreddit '/r/assholedesign' to identify the types of artifact being shared and the interaction purposes that were perceived to be manipulative or unethical.
"my prediction for 2019 - let’s do this like a TV show is this is the year where dark patterns really becomes the kind of thing that we’re really talking a lot about." - Paul Ohm
"Sens. Mark Warner (D-VA) and Deb Fischer (R-NE) introduced a bill that would prohibit large internet platforms like Facebook, Twitter, and Google from using deceptive design tricks as methods to trick users into handing over their personal data."