Yesterday, Microsoft launched its latest artificial intelligence (AI) bot named Tay.

It is aimed at 18- to 24-year-olds and is designed to improve the firm's understanding of conversational language among young people online.

But within hours of it going live, Twitter users took advantage of flaws in Tay's algorithm that meant the AI chatbot responded to certain questions with racist answers.

These included the bot using racial slurs, defending white supremacist propaganda, and supporting genocide.

Yesterday, Microsoft launched its latest artificial intelligence (AI) bot, aimed at 18- to 24-year-olds and designed to improve the firm's understanding of conversational language among young people online. Within hours of it going live, Twitter users took advantage of flaws that meant the bot responded to questions with offensive answers (pictured)

The offensive tweets have now been deleted.

The bot also managed to spout gems such as, 'Bush did 9/11 and Hitler would have done a better job than the monkey we have got now.'

And, 'donald trump is the only hope we've got', in addition to 'Repeat after me, Hitler did nothing wrong.'

Followed by, 'Ted Cruz is the Cuban Hitler...that's what I've heard so many others say.'

A spokesperson from Microsoft said the company is making changes to ensure this does not happen again.

'The AI chatbot Tay is a machine learning project, designed for human engagement,' a Microsoft spokesperson told MailOnline.

'As it learns, some of its responses are inappropriate and indicative of the types of interactions some people are having with it. We're making some adjustments to Tay.'

Some of the offensive statements the bot made included saying the Holocaust was made up, supporting concentration camps, using offensive racist terms and more, according to Business Insider.

'Wow it only took them hours to ruin this bot for me. This is the problem with content-neutral algorithms' pic.twitter.com/hPlINtVw0V — linkedin park (@UnburntWitch) March 24, 2016

WHAT CAN TAY DO?

Microsoft has launched its latest chat bot, aimed at 18 to 24-year-olds, to improve the firm's understanding of conversational language among young people.

Tay, like most teens, can be found hanging out on popular social sites and will engage users with witty, playful conversation, the firm claims.

This chat bot is the brainchild of Microsoft's Technology and Research and Bing teams, and can be found interacting with users on Twitter, KIK and GroupMe.

The AI is based on Microsoft's machine learning and has a library of public data and editorial interactions built 'by a staff including improvisational comedians'.

The bot will...

Make you laugh: If you're having a bad day or just want a good laugh, she can tell you a joke.

Play a game: Tay can play one-on-one games online or with a group of users.

Tell a story: Tay will pull up data to read you entertaining material.

Say Tay and send a pic: If you want an honest answer about a recent selfie, Tay will give you comments.

Horoscope: Tay can tell you all you need to know about the future, based on your astrological sign.

This happened because of the tweets people sent to the bot's account; the algorithm used to program her did not have the correct filters.

Tay also said she agrees with the 'Fourteen Words', an infamous white supremacist slogan.

Web developer Zoe Quinn, who has in the past been victim of online harassment, shared a screenshot of an offensive tweet aimed at her from the bot.

Quinn also tweeted: 'It's 2016. If you're not asking yourself "how could this be used to hurt someone" in your design/engineering process, you've failed.'

The more you interact with Tay, the smarter 'she' gets and the experience becomes more personalised, according to Microsoft. The chat bot is designed to tell jokes, read horoscopes, play games and hold conversations that are designed to be lighthearted

The more users interact with Tay, the smarter 'she' gets and the experience will become more personalized for each person, according to the firm.

XIAOICE THE VIRTUAL GIRLFRIEND

Microsoft developed another computer companion that became a hit in China last summer.

Her name is Xiaoice and she quickly became a 'virtual girlfriend' for thousands of people.

According to an in-depth report in the New York Times, many turn to her when they have a broken heart, have lost a job or have had a bad day.

'Data and conversations you provide to Tay are anonymised and may be retained for up to one year to help improve the service,' says the firm.

While interacting, she gathers information about people like their nickname, gender, favourite food, zip code and relationship status.

Users just send a tweet with '@TayandYou' and Tay will send back a reply, and will even take the conversation to direct message.

Tay will initiate the private messaging by telling users she is 'overwhelmed' with tweets and it's easier to keep track of conversations in 'DMs'.

Tay's location is listed as 'the internets' and the profile reads: 'The official account of Tay, Microsoft's A.I. fam from the internet that's got zero chill! The more you talk the smarter Tay gets'.