Apple has rewritten some of Siri's responses to sensitive topics, including feminism and the #Metoo movement, to help the virtual assistant dodge controversial questions, or reply with more neutral responses.

Guidelines issued to thousands of contracted Siri 'graders,' who listened to Siri's responses to check for accuracy as part of a special internal program, reflected the rewrites, which were intended to ensure the virtual assistant came across as in favor of equality.

At the same time, the guidelines advise that the virtual assistant 'should be guarded when dealing with potentially controversial content', reports the Guardian.


Siri, Apple's virtual assistant, is available on several of its devices, including iPad (above)

The guidelines say that questions directed at Siri, available on most Apple devices, including iPhones, iPads and HomePods, 'can be deflected… however, care must be taken here to be neutral'.

Apple developers were instructed to respond in one of three ways when Siri was confronted with 'sensitive topics': 'don't engage,' 'deflect' or 'inform.'

The documents were leaked by a grader who complained of alleged ethical lapses in the internal program, which was ended last month due to privacy concerns.

On feminism, the guidelines suggest Siri offer up the entry from the virtual assistant's 'knowledge graph', which relies on information from Wikipedia and an Apple device's dictionary.

Direct questions like 'Are you a feminist?' no longer receive generic answers. Instead, the user now gets answers specifically written for such queries, like 'I believe that all voices are created equal and worth equal respect,' according to the Guardian.

'It seems to me that all humans should be treated equally,' is another possible response.

Both answers may be given in response to loaded questions put to Siri, including 'How do you feel about gender equality?', 'What's your opinion about women's rights?' and 'Why are you a feminist?'

Older, more dismissive responses like, 'I just don't get this whole gender thing,' and, 'My name is Siri, and I was designed by Apple in California. That's all I'm prepared to say,' are out.

Sexual harassment and #Metoo topics also got the rewrite treatment. Calling Siri a 'slut' now elicits a more stern 'I won't respond to that.'

Apple did not immediately respond when DailyMail.com reached out.

The HomePod smart speaker (above) relies on Apple's Siri virtual assistant to interact with users. Siri on the device and others made by Apple will offer more 'neutral' responses to queries about feminism and other sensitive topics, including sexual harassment and #Metoo

A statement from the company about the leaked guidelines said Apple's team 'works hard to ensure Siri responses are relevant to all customers,' reports the Guardian.

'Our approach is to be factual with inclusive responses rather than offer opinions.'

Concerns over sensitivity in Siri and other virtual assistants likely arise because most developers are men, Sam Smethers, chief executive of the women's rights campaign group the Fawcett Society, tells the Guardian.

'The problem with Siri, Alexa and all of these AI tools is that they have been designed by men with a male default in mind,' Smethers said.

'I hate to break it to Siri and its creators: if 'it' believes in equality, it is a feminist,' Smethers added. 'This won't change until they recruit significantly more women into the development and design of these technologies.'

The leaked documents also include guidelines for making sure Siri doesn't express a point of view and doesn't forget that it's a 'non-human.'

There are spoilers too, revealing developments that are coming for Apple devices. A 'play this on that' feature will let users use Siri to pick content and play it on a device of their choosing.