Siri, we need to talk.

Last week Amazon announced plans to sell the Echo, its voice-controlled "intelligent" speaker, in the UK.

The Echo is a two-way wireless speaker and microphone with a virtual artificial intelligence assistant called "Alexa" that responds to voice commands such as “Read me the news” or “Turn off the lights”.

Like Siri, Alexa is ready to respond to every wish and command directed at her.

She has a female name, a female voice, and is housed inside a sleek, curvy container topped with round, glowing lips. It is perhaps no surprise that a virtual personal assistant, a typically feminine profession, should take such a form.


Of course, this is nothing new. When Siri was launched in 2010, it came only with a female voice, setting a trend for all similar devices to have a gendered persona.

Amazon’s decision to place yet another female virtual personal assistant in British homes six years later is concerning.

It is difficult to ignore the sexist origins of these various robots' names. Apple's Siri translates as "beautiful woman who leads you to victory"; Amazon's Alexa is the name of one of the five main Bratz dolls; Microsoft's Cortana is based on a hypersexualised female character in the video game "Halo"; and Facebook's M is rumoured to be inspired by Moneypenny - the epitome of the "sexy secretary" - a woman who panders to James Bond's misogynistic come-ons in every film.

Moneypenny spoke fewer than 200 words across 14 films; her only role, arguably, was to be a passing fancy for James Bond. Credit: REX

Even Google’s disembodied, nameless voice is female by default.

Ask any of these machines what gender they are and they will claim to have none. Yet users still refer to them as "she", not "it", even though any sensible person can recognise that they are but sexless machines.

Siri now comes with a male option, and Amy Ingram - a bot that schedules meetings via email - is also available as Andrew Ingram. But these male counterparts were both introduced as an afterthought - an update released, perhaps, to silence angry feminists.

And while it is true that several scientific studies have found that both men and women find female voices more appealing, this is a symptom of sexism in society that should actively be changed.

Female vocality has always been ascribed an inferior role. In ancient Greece, a woman's voice was linked to prostitution, witchcraft and hysteria. During the Second World War - when women became telephone operators and receptionists because they were the only people available to do the job - they were coached to make their voices deferential and accommodating to customers.

Even now, women with husky voices are considered more attractive but are less likely to get a job.

By April of this year, Amazon had sold more than 3 million Echoes in the US alone. As the device launches in the UK, it is a genuine concern that it will magnify gender stereotypes.

Since birth we are programmed to tell women what to do - because they are our mothers, they are "trolley dollies", they are quiet, little ladies who don't talk back. Now that we have an army of female AI assistants to command, the pattern begins again.

These devices are programmed to follow and obey you unconditionally, to not judge or challenge. If future generations are brought up with only female versions, how will they treat women? What if, as is the case for many young adolescents, the majority of their interactions with women are with a computer who simply cannot say no?

Female digital assistants that do not fight back reinforce the connection between a woman's voice and submission. In fact, they encourage it.

According to Erika Hall, a professor at Emory University’s Goizueta Business School, our preferences and biases are acquired from the popular culture and human behaviour that we encounter in our lives. We become sexist just from existing in modern society, a theory called "unconscious bias." Prejudices then become stronger the more frequently they are encountered.

Even the adverts for the Echo are sexist.

So every time you call on Siri or Alexa to fetch you a pizza, you are exacerbating the association between females and instant gratification, no questions asked.

For centuries, women have struggled to have their voices heard. In some cultures, they are still fighting. These AI assistants have no need for gender, but women still need equality. It is frankly disrespectful that tech companies reinforce sexist stereotypes on our digital devices. Then again, it is perhaps not entirely surprising when 60 per cent of women who work in Silicon Valley report having been sexually harassed.

Making matters creepier is the relationship between Siri and sex. In a world where virtual porn and sex dolls exist without too much question, it is not totally inconceivable that people may turn to these interactive devices for companionship.

In the 2013 film "Her", a mobile operating system - not dissimilar to Alexa - becomes so familiar with its user, played by Joaquin Phoenix, that the two grow romantically attached.

The suggestion is already built into some of the assistants. Try asking Siri or Alexa the kind of suggestive, inappropriate questions that, if said in a bar, would earn you a slap in the face.

Alexa responds to date requests with "let's just be friends". Tell Siri you love it, and the reply is a sultry "you are the wind beneath my wings". Ask Siri for sex and you get a passive response.

Harmless flirting, some might say. Except it's not.


As Alexa arrives in British households, AI's increasing ability to make our lives easier should be celebrated, as long as we remember that these are mindless and genderless machines.

Come on Siri, tell me how to smash that glass ceiling.