The moment one is tempted to call Echo or Home a “she,” a battle has already been lost. A truly feminist Alexa, one that might decouple service work from passive femininity, wouldn’t have been cast as “Alexa” to start with, but perhaps as a baritone named Alex instead.

What’s worse than a stereotypically subservient female automaton? One that is also a bad service worker. The Star Trek computer (its name is LCARS), the fantasy origin of voice-activation devices, also had the voice of a woman, after all. But it was an eminently competent computer-woman, one who could carry out any request and access any tidbit of information instantly and accurately. LCARS is fictional, of course, so all the computer’s responses are accurate because they are scripted. Even so, the characters’ requests to “Computer” set the bar for competence in a female voice assistant. Perfection is assumed, not proficiency.

Alexa (or Siri, or Home, or any of them) works remarkably well given the current state of voice recognition, machine learning, and the other technologies that help it field requests. But it still works pretty badly. In my experience with Echo, Alexa often can't respond at all; instead, it reports back with an apology like, "I'm sorry, I'm not sure." When it does understand, sometimes it can't answer effectively. When one of my kids recently asked Alexa, "How do I make popcorn on the stove?" it responded, "Sorry, no recipes were found for popcorn." It's kind of right, but also utterly useless.

When the robot does respond successfully, often its answers feel out of touch, inhuman even. “When did Mozart die?” for example, produces the over-detailed response, “Wolfgang Amadeus Mozart’s date of death is Monday, December 5, 1791.” Probably “in 1791” would have been sufficient; adding “Monday” makes Alexa seem pedantic and sanctimonious—other bad traits that women are sometimes accused of harboring.

Alexa can send messages to other Echo devices, and receive them. It transcribes those messages in the Amazon Alexa smartphone app, which makes it possible to read them as text, on the go. That's convenient in certain circumstances, but like most transcriptions, Alexa's are sometimes garbled or inscrutable. Transcription is a tough problem, and Amazon isn't alone in bungling it. But when a message comes through mangled, the character of Alexa takes the blame for the flub.

Given the foibles of voice technology, it was inevitable that people would start abusing Alexa. It's frustrating when Alexa gets something wrong, because the service has set such a high bar for functionality. People are supposed to be able to query Alexa (or Siri) hands-free, on any subject, in circumstances that might make manipulating a smartphone difficult.

It's worth comparing the interactions just described with similar ones on other information services that were not cast as women. If you Googled for popcorn instructions or a Mozart biography, the textual results might also disappoint. But you'd just read over that noise, scanning the page for useful information. You'd assume a certain amount of irrelevant material and adjust accordingly. At no point would you be tempted to call Google Search a "bitch" for failing to serve up exactly the right knowledge at whim. And at no point would it apologize, as Alexa does.