Who Programs Care?

Tomorrow, I’m facilitating a structured discussion on this exact topic at the AWP (Association for Women in Psychology) conference. This piece is the longer version of what I’ll be bringing to that room.

I didn’t know the word “culvert” until recently.

I was trying to find the right metaphor for something I kept seeing in AI therapy. I knew how to describe the thing: a man-made structure that intercepts a natural river and forces the water through a narrow channel someone else designed for efficiency, infrastructure, and control. When I finally found the word, something clicked.

And then I kept thinking, and I landed somewhere else entirely.

It’s not about rivers. It starts with the rain.


Okay, stay with me; I promise we are getting to AI therapy. But first: rain.

Rain doesn’t belong to anyone. It falls freely, without permission, without a pricing model. No one built it. No one owns it. It just falls.

Or it gets collected.

Reservoirs. Pipes. Subscription tiers. Someone looks at the rain and sees not a gift but a resource: something that can be captured before it reaches the ground, held, controlled, and then sold back to the people it once fell on freely.

This is exactly what is happening in mental health technology right now.

Someone is in pain. They need water. They reach out for help. They find an app, a chatbot, a platform… something that presents itself as care. But it was built inside a system of investor pressure, growth targets, and profit goals. The rain is intercepted before it reaches the person who needs it and pooled in a reservoir owned by people who prioritize scale and profitability over solutions.

What gets released is rationed, filtered, and shaped to serve the system, not the person who needs it.

And some people don’t get any at all, especially if their pain doesn’t map neatly onto the training data or is erased by the biases and systemic inequities already baked into those datasets.

No one owns the rain, but people are building the reservoir. And those people are deciding who gets water.


This is what I mean when I talk about feminist ethics in AI therapy.

I don’t think it’s a question of whether the technology itself is good or bad. Infrastructure isn’t inherently evil. But it does reflect the values of whoever built it.

Right now, the people building mental health AI are overwhelmingly not the people who will need to use it.

Data science and AI are dominated by elite white men. This is not my opinion; it’s documented over and over again.1

This means the voices least likely to be encoded into these systems of “care” are the ones who have historically needed care the most: women, mothers, people of color, marginalized communities, and people whose pain has never been properly represented in training data.

In many cases, the data itself amplifies the harm: it carries forward the biases, systemic inequities, and erasures already present in the world it was scraped from, along with the profit-driven reasons it was collected in the first place.

So, what happens when empathy is automated by people who have never been on the receiving end of systemic inequity?

You get a very efficient reservoir built for profit, not for the rain.

I want to be clear: I don’t think this is always intentional. There’s a concept called the “privilege hazard,” which is the idea that individuals in positions of power and privilege are often genuinely oblivious to the harms their decisions create.2

It may not be malicious, but it still perpetuates harmful outcomes. Intent doesn’t fix the drought.


There is another model. I think of it as a ripple.

This model starts from a different place entirely. It originates outside of a boardroom or a pitch deck, with no growth targets to answer to.

It starts with the person in pain. Their reality is where the rain lands. And from that point of contact, care moves outward through immediate support, tools and resources, community, and eventually, if we can do this correctly, through systems and policy change.

No infrastructure intercepts the flow. No one decides how much they get or in what form. The system simply hands them what they need to use the rain. And they can use the rain however it serves them. They may drink it all. They may grow something. They may give it to someone who needs it more. They may just let it fall around them and breathe.

The water in this example finds its way because the system was built around their experience, not a revenue model.

In the capture system, people are an input to profit. In the ripple, people are the source.


The problem is that the ripple doesn’t align with the capitalist logic so deeply embedded in how we build things right now.

I’m trying to build technology that supports perinatal moms, and I run into this constantly. The resistance isn’t usually to the mission; it’s to the lack of a clear return-on-investment figure. The question is never “who can we help?” It’s always “how much can we make?”

This is the moment where I want to say to counselors, to mothers, to feminists, to anyone paying attention: we need to be in this conversation.

I know there’s real resistance to AI right now, and I understand it. But I think staying out of the room is the most dangerous option we have. The people in power don’t need our support to keep building. They’re already building. The technology is being framed as a race.3 Regulations are being blocked.4 It’s happening whether we want it to or not.

So the question isn’t whether AI gets built. It’s whether our voices are in the room when it is.

We can critique the reservoir from the outside. Or we can fight to be the ones who decide where the rain lands.

I know which one I’m trying to do.

Resources

1. Data Feminism by Catherine D’Ignazio and Lauren F. Klein; https://www.interface-eu.org/publications/ai-gender-gap; https://www.pewresearch.org/internet/2021/06/16/1-worries-about-developments-in-ai/; https://womenandtech.indiana.edu/about/news/2025/mind-the-gap.html; and many more.

2. D’Ignazio, C., & Klein, L. (2020). Chapter 1: The Power Chapter. In Data Feminism. Retrieved from https://data-feminism.mitpress.mit.edu/pub/vi8obxh7

3. https://www.goldmansachs.com/what-we-do/goldman-sachs-global-institute/articles/time-the-complicated-stakes-of-the-ai-race-between-the-us-and-china

4. https://www.theguardian.com/us-news/2025/dec/11/trump-executive-order-artificial-intelligence
