What if the feel-good philosophy of Human-Centered Design allows companies to weaponize empathy, scrub criticism, and manipulate users into accepting broken systems? Focusing on the human alone is a trap. Real design power exists in the ruthless intersection of human desire and the cold, hard systems that govern our world.
Designers who ignore the intersection are the first to blame their clients — and the last to examine their own blind spots.
Human Feelings vs. Machine Logic
Human-Centered Design (HCD)
HCD is the art of the interface. It thrives on empathy, intuition, and frictionless moments. Its mission is noble: make things usable, useful, and desirable. From Donald Norman’s principles to today’s UX standards, it teaches us to see through the customer’s eyes. But alone, it’s just a polished surface. It asks “how does this feel?” while ignoring the more critical question: “what kind of system does this feed?”
System Design (SD)
System Design is the architecture of reality. It’s the relentless logic of data flows, business rules, network protocols, and security layers. It’s what keeps platforms from collapsing under traffic — and what ensures, or weaponizes, information control. When HCD operates in ignorance of system design, it loses its ethical foundation. It puts lipstick on a pig: crafting delightful experiences that hide exploitative or broken systems underneath.
HCD’s tools haven’t been abandoned — they’ve been corrupted.
How “Empathy” Became the Weapon
Corporations didn’t discard the language of empathy. They hijacked it. “User needs” now mask data harvesting. “Frictionless journeys” enable addictive loops. This is the dark translation of HCD into a manipulative service layer, and the evidence is hiding in plain sight:
- The “Nudge” Turned Toxic — Behavioral psychology, originally meant to guide people toward better choices, now manufactures FOMO, anxiety, and shame to drive engagement. Instagram’s infinite scroll and like-count mechanics weren’t accidents — they were engineered using the same empathy toolkit HCD promotes.
- The “Sludge” Factory — Dark patterns aren’t design failures. They’re design successes — for someone else. Amazon’s subscription cancellation flow (the “Iliad Flow”), which once routed users through four to six screens and a retention team, is a textbook example of HCD weaponized against the very person it claims to serve. Hidden fees, pre-ticked opt-ins, and countdown timers are systemic traps built by behavioral and business subsystems working in concert — exploiting the very cognitive vulnerabilities HCD was originally designed to protect. Riot Games offers a similar example: to delete your account, you are asked to hand over even more of your own ID data. In both cases, the goal is to discourage customers from closing their accounts. In the end, you are the product, and your data is what these companies want.
- The Legal Muzzle — The ultimate system override. When negative customer reviews get buried, critics receive cease-and-desist letters, and product defects are legally sealed from the public record, this isn’t poor design — it’s a calculated architecture of intimidation. The human-centered surface remains warm and welcoming while the system underneath is fortified against any accountability.
The combined result is a marketplace where sophisticated system-aware bad actors outmaneuver honest competitors — not by building better products, but by building better traps. George Akerlof described this dynamic in his seminal paper on “lemons”: in a market flooded with deceptive sellers, buyers lose trust, quality products get priced out, and everyone ends up with inferior goods at inflated prices. Exploitative design doesn’t just harm users. It collapses markets of quality goods and services into markets of lemons.
In economics, a “lemon” represents a product with low quality or significant defects that a customer cannot easily identify until after the purchase.
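Akerlof’s unraveling dynamic can be sketched in a few lines of Python. The specifics here — uniform quality scores, a 1.5× buyer premium, the round count — are illustrative assumptions, not figures from his paper; the point is only to show how a market where buyers can’t see quality collapses toward the lemons:

```python
import random

def lemons_market(rounds=20, n_sellers=10_000, buyer_premium=1.5, seed=0):
    """Toy sketch of Akerlof's adverse-selection spiral.

    Each seller owns a good of hidden quality q (uniform on [0, 100])
    and will only sell if the market price covers q. Buyers value a
    good at buyer_premium * q but cannot observe q, so they offer
    buyer_premium times the average quality still on the market.
    Returns the price the market settles on after `rounds` rounds.
    """
    rng = random.Random(seed)
    qualities = [rng.uniform(0, 100) for _ in range(n_sellers)]
    price = 100.0  # buyers start out optimistic
    for _ in range(rounds):
        # Sellers whose quality exceeds the price exit the market.
        on_market = [q for q in qualities if q <= price]
        if not on_market:
            return 0.0  # market has fully unraveled
        avg_quality = sum(on_market) / len(on_market)
        # Buyers' new offer reflects the (now lower) average quality.
        price = buyer_premium * avg_quality
    return price
```

With a 1.5× premium, each round the average listed quality is roughly half the price, so the new offer is about 0.75× the old one and the price spirals toward zero — quality sellers are priced out first, exactly as the essay describes. Raise the premium (i.e., restore buyer trust in quality) above 2× in this toy model and the market sustains itself instead.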
Ethical Design
Human-Centered Design is not a standalone discipline. It is one critical subsystem within a greater whole. Treating it as the main event is like perfecting a car’s cupholder while the engine is on fire.
Designers must be generalists — systems thinkers who comprehend every layer of what they build:
- The Human Subsystem (HCD) — Emotion, intuition, usability.
- The Behavioral Subsystem — The psychology of persuasion, habit formation, and choice architecture — and the ethical line between guiding and manipulating.
- The Technical Subsystem — Software, data pipelines, and algorithm logic; understanding how the code behaves when no one is watching.
- The Structural Subsystem — Business rules, legal frameworks, and profit motives; knowing whose interests the system actually serves.
Consider the difference: a designer focused only on the human experience can craft a beautiful, intuitive interface — and remain completely blind to the fact that the system behind it is harvesting data users never agreed to share. A systems-aware designer sees both. They ask not just “does this work for the user?” but “does the whole thing hold up to scrutiny?” That’s not idealism. That’s the job. Visuals are just one part of it.
Good design happens at the control center — what cyberneticians call the point where all subsystems are visible, measurable, and kept in balance.
The solution isn’t to abandon human focus. It’s to expand design sovereignty.
The Integrity Mandate
Every dark pattern, every Iliad Flow, every manipulative nudge was signed off by someone who called themselves a designer. Someone who probably ran user research, built personas, and talked about human empathy. The problem was never the tools. It was the scope of responsibility designers were willing to accept.
Designers should expand that scope — or keep being complicit in systems they never bothered to understand. If they choose the latter, are they even designers?
At design studio Simple, we choose expansion.
