It’s Dark, and We’re Wearing Sunglasses
The Challenge of Defining Dark Patterns in Privacy Consent User Interfaces and a Suggested Approach
Eye-catching advertisements, cunning copywriting, and influencer endorsements are ubiquitous in the global online experience. Beyond overt advertisements, user interface design offers businesses another opportunity to sway consumers to buy their products. Just as unsavory merchants can create false advertisements to trick consumers, companies can design user interfaces to trick consumers into doing things they do not want to do. These deceptive user interfaces are known as “dark patterns.”
Some consumer advocates have accused businesses of using dark patterns to gain consent to use their customers’ personal information.¹ Ostensibly drafted in response to this, the California Privacy Rights Act (CPRA), promulgated by Proposition 24 in November 2020, directs the California Attorney General to develop regulations defining the border between user interfaces that lawfully persuade users to share their information and user interfaces that use dark patterns to trick consumers into consenting. Unfortunately, a decade of ambiguous use of the term has set regulators up for a convoluted conversation about defining it. Proposition 24 has sent California on a wild ride into privacy regulation. However, unlike the characters in The Blues Brothers,² regulators are going to have to take off their sunglasses and see the issue clearly before they “hit it.”
The CPRA introduces the term “dark pattern” to Californian law in its definition of consent.
“Consent” means any freely given, specific, informed and unambiguous indication of the consumer’s wishes by which he or she, or his or her legal guardian, by a person who has power of attorney or is acting as a conservator for the consumer, such as by a statement or by a clear affirmative action, signifies agreement to the processing of personal information relating to him or her for a narrowly defined particular purpose. Acceptance of a general or broad terms of use or similar document that contains descriptions of personal information processing along with other, unrelated information, does not constitute consent. Hovering over, muting, pausing, or closing a given piece of content does not constitute consent. Likewise, agreement obtained through use of dark patterns does not constitute consent [emphasis added].³
The CPRA further defines dark patterns as “[…] user interface[s] designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision-making, or choice, as further defined by regulation [emphasis added].”⁴ This definition leaves ample room for regulators to decide what constitutes a dark pattern interface.
The Difficulty of Defining Dark Patterns
The term “dark pattern” has been used to describe numerous interfaces that are undesirable from the user-consumer perspective. Harry Brignull, who coined the term in 2010, defines dark patterns as “tricks used in websites and apps that make you do things that you didn’t mean to, like buying or signing up for something.”⁵ Brignull proposed many dark pattern categories, some referring to design methods and some to their purpose or effect. Most of his examples were not related to privacy options; however, businesses could conceivably adapt the designs for that purpose. For example, Brignull coined the term after discovering a “sneak into basket” design on Ryanair’s website. The site automatically added insurance to a customer’s cart unless the customer selected a “No travel insurance required” option buried, between Latvia and Lithuania, in a drop-down menu labeled “Please select a country of residence.” Although this dark pattern was designed to trick consumers into buying something, the “sneak into basket” design could be adapted to gain putative privacy consents by making the privacy-intrusive option the default and hiding the non-consent option in a long, unrelated menu.
Brignull’s expansive definition was tacitly adopted by FTC Commissioner Rohit Chopra in a statement about an order to Age of Learning concerning its service ABCmouse.
Typically, digital tricks and traps work in concert, and dark patterns often employ a wide array of both. Dark pattern tricks involve an online sleight of hand using visual misdirection, confusing language, hidden alternatives, or fake urgency to steer people toward or away from certain choices. This could include using buttons with the same style but different language, a checkbox with double negative language, disguised ads, or time pressure designed to dupe users into clicking, subscribing, consenting, or buying.⁶
He then admonished Age of Learning for using a “roach motel” interface, which is a type of dark pattern under Brignull’s definition.⁷
This usage of the term “dark pattern” is not helpful in divining its meaning in the CPRA. It merely describes a range of online deceptive practices accomplished with the aid of purposefully designed user interfaces, practices that are already prohibited under the FTC’s Section 5(a) authority. It raises the question: how much “steering” constitutes a dark pattern?
FTC’s Clear and Conspicuous Standard
The FTC’s primary mission is to prevent unfair and deceptive business practices.⁸ This authority naturally extends to advertisements, and advertisers sometimes rely on disclosures in order for their claims not to be unlawfully deceptive. These disclosures are often designed to minimize consumer attention to, or comprehension of, their content; think of the traditional “fine print” or the “fleeting supers” on television.⁹ Since the FTC has a well-established framework for evaluating advertising disclosures, I suggest that this approach be adopted to define dark patterns in the context of privacy consent.
In various publications and orders, the FTC has opined that disclosures must be “clear and conspicuous.”¹⁰ The FTC’s guidance on what “clear and conspicuous” means in the advertising context may be instructive about how California should define dark patterns for the purpose of evaluating privacy consent interfaces.
Importantly, the FTC considers “clear and conspicuous” to be a performance standard. In a recent order, the agency stated that “clearly and conspicuously” means that a required disclosure is “difficult to miss (i.e., easily noticeable) and easily understandable by ordinary consumers.”¹¹ This approach allows flexibility and puts consumers’ perceptions at the forefront.
To evaluate whether a particular disclosure is clear and conspicuous, the FTC asks advertisers to consider:
● the placement of the disclosure in the advertisement and its proximity to the claim it is qualifying;
● the prominence of the disclosure;
● whether the disclosure is unavoidable;
● the extent to which items in other parts of the advertisement might distract attention from the disclosure;
● whether the disclosure needs to be repeated several times in order to be effectively communicated, or because consumers may enter the site at different locations or travel through the site on paths that cause them to miss the disclosure;
● whether disclosures in audio messages are presented in an adequate volume and cadence and visual disclosures appear for a sufficient duration; and
● whether the language of the disclosure is understandable to the intended audience.¹²
Privacy regulators could borrow many of these factors by replacing the word “disclosure” with “privacy option” and “advertisement” with “website.” If the privacy options meet this analog of the clear and conspicuous standard, then no dark pattern is present. If the user interface does not meet the standard, regulators would then evaluate whether its shortcomings are so severe that they subvert or impair user autonomy, decision-making, or choice. At this phase of the analysis, the performance of the privacy option interface is critical. Are users actually being misled? If so, regulators should dub the interface a dark pattern, and consumers’ agreement should not constitute consent.
Conclusion
Unlike Dan Aykroyd in The Blues Brothers, privacy regulators in California do not know precisely where they are going or how to get there. The inclusion of dark patterns in the CPRA only adds to the difficulty of finding the way to a common understanding among consumers, businesses, and regulators. Fortunately, the FTC’s longstanding effort to curb advertising abuses can act as a roadmap for navigating this nebulous threat. CPRA implementation is going to be a white-knuckle ride regardless; regulators need to take their sunglasses off and look at the maps they already have before driving off in the dark.
Corwin Hockema, CIPP/US
[1] See Arielle Pardes, How Facebook and Other Sites Manipulate Your Privacy Choices, August 12, 2020.
[2] John Landis, The Blues Brothers (1980).
[3] Cal. Civ. Code § 1798.140(h) (amended November 3, 2020, by initiative Proposition 24, § 14).
[4] Cal. Civ. Code § 1798.140(l) (amended November 3, 2020, by initiative Proposition 24, § 14).
[5] https://darkpatterns.org/index.html
[7] https://darkpatterns.org/index.html A “roach motel” design is one where entering into an arrangement is easy, but getting out is difficult; for example, hiding a cancellation option behind a series of counter-intuitive links.
[8] Federal Trade Commission Act, 15 U.S.C. § 45(a).
[9] See Lesley Fair, Full Disclosure, September 23, 2014 (FTC Business Blog).
[11] See FTC Decision and Order In the Matter of Warner Bros. Home Entertainment Inc. (2016).