Abstract
Dark Patterns are deceptive design strategies used in user interfaces to manipulate users into actions they might not otherwise take—such as subscribing to unwanted services or making unintended purchases. The term was coined in 2010, as e-commerce platforms rapidly expanded and designers, often under pressure to meet business targets, began adopting these manipulative tactics.
This article is split into two key parts. The first offers a detailed look at what Dark Patterns are, their origins, and the various forms they take—supported by real-world examples. The second briefly explores the wider implications of such practices and potential solutions.
In essence, this article argues that Dark Patterns violate ethical design standards. Designers must take responsibility for their impact and shift toward more honest, transparent user experiences.
Introduction
Great UX design is built around the user—ensuring that interactions are seamless, helpful, and respectful. But what happens when that power is misused? In pursuit of growth, some businesses deliberately employ manipulative design to push users toward decisions that serve corporate interests rather than the users’ own. These are known as Dark Patterns.
What Are Dark Patterns?
A Dark Pattern is a deceptive interface design that exploits human behavior to push users toward actions that are not in their best interest. Coined by cognitive scientist and UX designer Harry Brignull in 2010, the term covers intentionally misleading designs such as hidden subscriptions and unclear opt-ins.
Unlike bad design—which might stem from poor planning or lack of skill—Dark Patterns are deliberately constructed. They rely on a deep understanding of human psychology, not to help, but to exploit.
The Evolution of Dark Patterns
Dark Patterns have existed in various forms since long before the digital age. Credit card offers that advertise “0% interest” while hiding the long-term terms in fine print are an offline example.
Online, early examples included spammy pop-up ads and misleading banners. But today, these patterns have become more subtle and integrated. For instance, LinkedIn once sent unsolicited emails to users’ contacts, posing as if they came from the user. This tactic led to a class-action lawsuit, which the company settled for $13 million in 2015.

Common Types of Dark Patterns
Brignull’s website darkpatterns.org outlines a taxonomy of deceptive UI behaviors. Here are some of the most prevalent:
1. Bait and Switch
This involves presenting a desirable option but delivering an entirely different outcome. A famous example: users clicking the ‘X’ on a Windows 10 upgrade prompt expecting it to close, only for the upgrade to begin.
2. Disguised Ads
Ads are embedded in a way that mimics normal content, tricking users into clicking. Font websites like Dafont.com often feature misleading “Download” buttons that lead to unrelated software.
3. Forced Continuity
Users sign up for a free trial with their credit card, and once it ends, they are automatically charged—often without a clear opt-out. Coursera is one example where the free version of a course is difficult to find, nudging users toward paid subscriptions.
4. Friend Spam
This pattern involves misusing a user’s contact list under the guise of connecting them with friends, only to send spam messages on their behalf. LinkedIn’s infamous contact-harvesting practice is a prime case.
5. Hidden Costs
These appear at the final stages of a checkout process. For instance, Curology promotes a $19.95 monthly treatment, but extra fees like shipping are revealed much later in the funnel.
6. Misdirection
This design tactic draws attention to one element while hiding critical information elsewhere. For example, Skype’s 2016 software update preselected options to change users’ homepage and search engine settings.
7. Price Comparison Prevention
Sites like LinkedIn often hide or obscure the pricing of their premium tiers, making it difficult for users to compare plans or make an informed choice.
8. Privacy Zuckering
Named after Facebook CEO Mark Zuckerberg, this refers to interfaces that nudge users into oversharing personal data. Zapier was noted for offering two sets of Terms—one in plain language and another filled with dense legal jargon.
9. Roach Motel
It’s easy to sign up but incredibly hard to opt out. Times Jobs India, for instance, makes account deletion nearly impossible, continuing to send emails years after signup.
10. Trick Questions
These are intentionally confusing prompts designed to get users to opt into something unknowingly. Sky’s checkout page in 2015 included an opt-out sentence so poorly phrased that many users mistakenly agreed to receive marketing emails.
Addressing the Issue
Fixing Dark Patterns requires more than user complaints—it calls for a shift in design culture. Some industry voices, like Bunker (2013), have proposed ethical guidelines centered on privacy, honesty, and respect.
Author Nir Eyal, in his book Hooked (2014), introduced the Hook Model to build engaging, habit-forming products. He acknowledges the potential for misuse and introduces a “Manipulation Matrix” to help designers evaluate their intentions ethically.
Ultimately, responsibility falls on the designer. Each interface is a touchpoint with real-world impact. The question is not only “Can we do this?” but “Should we?”
Conclusion
As our digital ecosystems grow more complex, the use of Dark Patterns must be scrutinized. Designers are more than just creators—they are decision-makers with the power to influence behavior at scale.
Short-term gains from deceptive tactics may boost metrics temporarily, but they damage trust, brand integrity, and user wellbeing in the long run. As Steve Fisher once said at the Generate NY Conference:
“Find a way to help the vulnerable around you. If you have privilege, use it for good.”
Ethical design is no longer optional—it’s essential.
Want to deepen your knowledge of UI, UX, Branding, Development, and Illustration? Read more helpful articles at Viartisan.
Sources:
This article draws from research and examples originally presented on darkpatterns.org, Fast Company, Medium, and works by Harry Brignull, Nir Eyal, and other UX practitioners.