Most Harm in Civic Tech Isn’t Intentional — But It Still Hurts

In civic tech, most harm doesn’t happen on purpose. No one sets out to design a benefits portal that retraumatizes a parent trying to keep healthcare for their child, or a government chatbot that responds with cold, generic replies to someone in crisis. And yet, that is all too often the result.

In our most cynical moments, it can feel like these systems were built to test us, to see how much stress, confusion, and humiliation we’ll tolerate to get basic help. But most of the time, the harm isn’t deliberate. It happens because teams are trying to help without the tools to understand the conditions and infrastructures of care.

Good intent isn’t the same as informed care.

Behind nearly every flawed digital service sits a group of people who genuinely want to make government work better. They’re driven by purpose and pride. But without a shared understanding of overwhelm, distress, and trauma — how they show up, how they’re activated, and how they compound — even well-intentioned design can hurt the very people it’s meant to serve.

That’s because trauma-informed design isn’t just about better language or more empathy in workshops and interviews. It’s about recognizing that people often interact with government during some of the most vulnerable moments in their lives: when they’re grieving, scared, or stretched beyond capacity. A trauma-informed approach means understanding that systems themselves can mirror the abuse, neglect, or abandonment that people have already survived. When we ignore this, we don’t just design poor experiences; we design harm.

The patterns of harm we rarely name.

Just as Lou Downe and Sarah Drummond describe patterns of bad service design, harm in civic tech also has its own recognizable forms. They show up again and again, often unspoken, but painfully familiar.

  • Urgency as virtue. The belief that speed equals impact. Teams race to launch and measure progress in sprints, even when what’s needed is space to build trust. Trauma-responsive practices slow the tempo, not to delay, but to prevent collateral damage.

  • Empathy extraction. The ritual of asking people to share painful stories in research sessions, without safeguards, follow-up, or care for the researcher. The story gets captured, the insight slides into a deck, and the person is left holding the aftermath alone.

  • Professional detachment disguised as neutrality. Designers are told not to “get too close.” Civil servants are told to “stay objective.” But the absence of empathy is not objectivity; it’s avoidance. And avoidance in systems built on trust becomes its own form of harm.

  • Managed compassion. The institutional reflex to create the appearance of care — a few words about well-being or healing, a mindfulness session, a tone of kind efficiency, or joking and laughing about overwork — without addressing the deeper systems that cause the harm in the first place. It is care as cosplay: theatrics standing in for the real thing.

Each of these patterns protects the system more than the people within it. And like most protection strategies, they come from fear: fear of being overwhelmed, of getting it wrong, of naming trauma at all.

Fear and forbidden language.

Since early 2025, the Trump administration’s banned words list has made terms like trauma, equity, and inclusion politically risky in federally funded work. That climate of censorship has created a chilling effect that reaches far beyond government. Organizations are now quietly policing their own language, avoiding words that feel too political, even when those words describe the reality of the people they serve.

When we strip out trauma-informed language, we also strip out trauma-informed thinking. Without that vocabulary, it becomes nearly impossible to name what’s actually happening. So instead, we invent euphemisms like user stress, access barriers, and empathy fatigue while the underlying harm goes unaddressed. Silence, in this way, becomes its own form of compliance.

When the mirror is held up.

No one likes realizing they’ve been part of harm, especially when they joined civic tech to make things better. That’s the most challenging part of trauma-informed work: it asks people to look in the mirror, not with shame, but with honesty. It asks leaders to admit that the culture of urgency, extraction, and emotional avoidance isn’t just bad for the public; it’s also burning out the people on the inside.

But when the mirror is held up, something else happens: People start to see possibilities again. They remember why they came here in the first place. Repair becomes imaginable because repair is possible.

How do we start to repair?

Repair doesn’t require new jargon or new funding cycles. It starts with naming what’s real and being honest about it. It might look like:

  • Building reflection time into research, not just debriefs or retros.

  • Creating language guides that protect both participants and practitioners.

  • Reframing resilience away from endurance and toward peer support and mutual care.

  • Setting policies that value emotional safety as much as operational success.

In short: design systems where care is not a side effect but essential infrastructure. That shift can feel basic or small, but it’s radical in practice. It means slowing down when everything around you screams to go faster. It means protecting your team from the moral injury of working in systems that deny the very pain they’re meant to alleviate. It means believing that a kinder government is not naive; it’s necessary.

We can still do better.

Most harm in civic tech isn’t intentional, but when we refuse to name it, we make it inevitable. The choice to become care-centered and trauma-informed — even quietly and under current extremes of surveillance and political pressure — is an act of courageous resistance. It’s how we keep the door open for care, language, and humanity to survive inside and outside systems that would much rather forget them.

The hardest part isn’t seeing the harm. The hardest part is believing we can still do better and then doing so.

__________

My name is Rachael Dietkus, and I work with educators, public interest designers, civic tech teams, and government partners to move from trauma-aware to trauma-responsive practice. If you, your team, or your organization are ready to build safer, more trustworthy, careful, and sustainable practices into how you design and deliver services, let’s connect.

You can learn more about my work at Social Workers Who Design.
