
In brief

Having to click through a gauntlet of screens to cancel a recurring subscription. Being told you are foolish if you decline a service. Discovering you were charged extra fees that were never clearly brought to your attention. Finding it hard or confusing to set your privacy settings to their most protective levels. These and similar experiences arise when you encounter “dark patterns”, a term that US authorities are increasingly using to refer to interface design strategies that manipulate users into making choices they likely would not have otherwise made and that may cause harm.


Numerous US authorities are clamping down on these strategies by defining and restricting “dark patterns” in laws, regulations and guidance documents, and by instituting related enforcement actions. For example, the Federal Trade Commission (FTC) published a detailed report in September 2022 examining a multitude of practices that the agency considers to be dark patterns, some of which the FTC has consistently found to be unlawful, and others of which may be found unlawful based on a case-by-case evaluation of all attendant facts. State regulators are also increasingly characterizing as “dark patterns” manipulative design practices they consider unfair and deceptive. For example, the attorneys general of 17 states recently sent an open letter to the FTC urging it to scrutinize dark patterns in digital advertising, including by recognizing that the negative effects of dark patterns cannot, in all cases, be cured merely by more prominent disclosures.

Companies have paid significant sums to the FTC and state attorneys general to settle allegations that they manipulated consumers through dark patterns, such as by making it difficult for subscribers to cancel their subscriptions, obfuscating information to subvert consumers’ privacy choices, using confusing interfaces to lead consumers into making unwanted purchases, permitting players of an online game to purchase items without confirming the CVV number of a saved payment card, or locking users out of their accounts if they submit a complaint.

A number of recently enacted US laws also define and restrict the use of dark patterns. For example, the California Consumer Privacy Act, California Age-Appropriate Design Code Act, Colorado Privacy Act, and Connecticut’s Act Concerning Personal Data Privacy and Online Monitoring all expressly restrict the use of “dark patterns”. Draft regulations under the privacy laws of California and Colorado include a number of examples of dark patterns, covering topics such as symmetry in choice and choice architecture, and establish that a business’ intent not to be deceptive and the widespread use of a practice do not make a dark pattern permissible. Even if the practices at issue are not expressly labeled as “dark patterns”, businesses should be aware that other US laws prohibit various deceptive or misleading practices. For example, the Restore Online Shoppers’ Confidence Act prohibits charging consumers for goods or services over the Internet through a “negative option feature” unless certain conditions are met, such as making clear and conspicuous disclosures and implementing “simple mechanisms” for a consumer to stop recurring charges.

To help avoid allegations that your business’ practices constitute unlawful dark patterns, we recommend that you:

  1. Disclose all material terms relating to an offer, service or feature upfront, clearly and prominently. Use plain, straightforward language free of technical jargon, presented in a format that is easy to read. Consider specifically drawing the user’s attention to any details that are unfavorable to them.
  2. Present your options in a symmetrical way. For example, when providing a user with options, avoid emphasizing one option with a brighter color or bigger font, or using loaded language to steer the user toward one option over another. Symmetry also applies to the number of steps: the option that is less favorable to you should not take more time or steps to select (see the sketch following this list).
  3. Check for unintuitive design. This includes confusing language (e.g., double negatives) and unintuitive or inconsistent placement of buttons (e.g., presenting consumers with options that read “yes” then “no” on one screen, then suddenly switching the order to “no” then “yes” on the next).
  4. Make it easy for users to unsubscribe, opt-out, or change their settings to ones that are less favorable to you. For example, do not hide unsubscribe links, do not require users to scroll through unnecessary text or ads to unsubscribe, and honor unsubscribe requests as soon as practicable.
  5. Have a neutral third party test your interface with a view to detecting and removing dark patterns. A fresh set of eyes can help spot issues you would otherwise have missed, especially if those eyes are trained to identify practices that an average consumer may find confusing, manipulative, unfair or deceptive.
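
By way of illustration only, the following minimal sketch shows one way the symmetry principle in recommendation 2 might look in practice. It is written in TypeScript against the browser DOM; the markup and the savePreference helper are hypothetical stand-ins for whatever your product actually uses to present and record a choice. The point is simply that both options share the same styling and each is honored in a single click, so neither path is longer, harder or more prominent than the other.

    // Illustrative sketch only: a consent prompt with symmetric choices.
    const consentPrompt = document.createElement("div");
    consentPrompt.innerHTML = `
      <p>We'd like to send you marketing emails.</p>
      <button id="accept" class="choice">Yes, sign me up</button>
      <button id="decline" class="choice">No, thanks</button>
    `;
    // One shared class means identical size, color and font for both options.
    document.body.appendChild(consentPrompt);

    // Hypothetical helper that records the user's choice.
    function savePreference(optedIn: boolean): void {
      console.log(`Marketing emails: opted ${optedIn ? "in" : "out"}`);
    }

    // Each choice resolves in a single step; neither option routes the user
    // through extra confirmation screens or loaded language.
    consentPrompt.querySelector("#accept")?.addEventListener("click", () => savePreference(true));
    consentPrompt.querySelector("#decline")?.addEventListener("click", () => savePreference(false));
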
Author

Jonathan Tam is a licensed attorney in California and Ontario. He focuses on privacy, advertising, intellectual property, content moderation and consumer protection laws. He is passionate about helping clients achieve their commercial objectives while managing legal risks associated with activities involving data, information technology and media. Jonathan regularly writes about information technology and privacy, and is the Vice Chair of the Cybersecurity and Privacy Law Section of the Bar Association of San Francisco. He has completed secondments at a global payment services provider based in London, England and a world-leading tech company based in Silicon Valley. He joined Baker McKenzie as a summer associate in 2012 and has also worked in the Firm's Toronto office.

Author

Michelle Shin is an associate in the International Commercial Group and is based in our San Francisco office. She advises US and multinational companies on data privacy compliance, intellectual property, and consumer protection laws.