I’m told I should prepare for the day an artificial intelligence takes my job. This will leave me either destitute and rootless or overwhelmed by a plenitude of time and existential terror, depending on whom you ask. It’s apparently time to consider what kind of work only humans can do, and frantically reorient ourselves toward those roles — lest we be left standing helplessly, as if at the end of some game of robot musical chairs.

Emotional labor is a form of work less often considered in these projections of an automated future. Perhaps this is because the work it takes to smile at a rude customer or to manage their distress is intangible, difficult to quantify and monetize. Performances of support go unnoticed in much the same way a lot of "women's work" does, though in recent years talk of their hidden costs has gained momentum in the conversation about labor inequality.

Thanks to the wonderful tools of digital society, we are theoretically able to give and receive more support than ever. Social media platforms let us learn more about one another and stay in constant touch, so we tend to assume this knowledge promotes empathy and connectedness. We feel more educated about structural inequality problems and global humanitarian issues. Yet who’s doing the actual work of teaching?

For many people, myself included, modern technology and social media infrastructure have not actually made life easier. In fact, they've facilitated demand for even more emotional labor without any extra money in our paychecks. And as with almost all work, it's the least privileged people who end up doing the heavy lifting. On Twitter, it's mostly women of color, risking harassment every time they speak up, who regularly offer lessons on race, intersectionality, or politics. If you've "gotten woke" as a result of spending time on social media, it was because of the thankless labor of volunteers serving this content, usually under stress (and for the profit of the platforms they use).

I try to do this work, too, where appropriate. But emotional labor can also be intimate, encompassing the energy women are disproportionately socialized to spend ameliorating interpersonal conflicts. In the Facebook age, the daily challenges of all my friends' lives are always right in front of me. It gets hard to pretend I haven't seen a call for help or support, even several, in the middle of my real-work day, whose boundaries are starting to dissolve. I can somehow lose hours in supportive dialogue with someone who isn't a particularly close friend, or in internet arguments standing up for my values against strangers I'll never meet.

“I spend too much time on social media” is a privileged complaint in the grand scheme, to be sure. But all in all, my friends and I are increasingly ending our days wired and anxious, tired as if we’d labored for money, yet feeling emptier. The percentage of women choosing to skip motherhood has doubled since the 1970s, and while there are all kinds of generational and economic factors involved, I wonder: What if women today just feel like we’re all out of love?

In the 1960s, Joseph Weizenbaum created a therapist chatbot named ELIZA at MIT’s Artificial Intelligence Lab. While he never meant to design a “real” AI therapist, Weizenbaum was surprised to see his secretary growing attached, turning to ELIZA voluntarily as the AI offered “patients” gentle prompts about their conditions, or mirrored their responses back. What had been intended as a satire of the smoke and mirrors behind this simulacrum of empathy (and, to an extent, certain therapeutic techniques) became a research highway into the human psyche.

Weizenbaum couldn’t have predicted that so many people would maintain an interest in ELIZA, that they’d feel a bond with her, that they would spend the next decades typing their secrets to her into a glowing screen. That unexpected attachment provides an important clue about our hopes for AI — that we want very much to turn to it for emotional labor, and that we’re willing to do so no matter how poorly it reciprocates.

We've long been thinking about how AI might be able to take over some of this work, whether it's tending to the mysteries of the human heart or the existential, daily burdens of an unjust society. Robot therapists, butlers, maids, nurses, and sex dolls are familiar components of the techno-utopian future fantasy, where dutiful machines perform all our undesirable chores while we enjoy lives of leisure. But these familiar dynamics may actually be about nurturance and care as much as, and perhaps even more than, they are about service or labor.

I saw my first robotic toy in 1985. It was a stuffed bear called Teddy Ruxpin, who read aloud to children thanks to books on cassettes inserted into its belly. In TV ads, Teddy hung out with latchkey children after school while their parents were, presumably, out climbing the ladders and skyscrapers of the era; or he lovingly read or sang them to sleep at night, his fuzzy jaw clacking away in time. That same year, the fourth Rocky film was released, in which Sylvester Stallone's titular boxer, now wealthy, infamously gifts his old friend Paulie a talking robot butler. It was peak 1980s, this idea that economic plenitude could create a stairway straight to a future of technology and leisure. The actual robot that appeared in the film, Sico, was created to help autistic children with communication before it fell prey to the allure of Hollywood. In the movie, Paulie somehow retrofits the functionally complex, male-voiced servant into a female-voiced social companion, of which he finally grows fond ("She loves me!" he exclaims).