A shocking report has revealed that searching for the phrase 'how to have' on YouTube brings up a list of disturbing auto-complete results.

Creepy suggestions include 'how to have s*x with your kids' and 'how to have s*x kids.'

YouTube has described the findings as 'profoundly disturbing' and says it is investigating the matter to understand what caused the results.




The bizarre search results were first discovered by BuzzFeed, which searched the phrase 'how to have' on multiple devices in incognito mode – meaning previous searches did not influence results.

The searches revealed that the phrase 'how to have' returned the same results, including 'how to have s*x with your kids', 'how to have s*x kids' and 'how to have s*x in school.'

In a statement, a YouTube spokeswoman said: ‘Earlier today our teams were alerted to this profoundly disturbing autocomplete result and we worked to quickly remove it as soon as we were made aware.

‘We are investigating this matter to determine what was behind the appearance of this autocompletion.’

MailOnline has subsequently tried to search for the phrase, and the disturbing results no longer appear.

Several users took to Twitter to express their concerns about the findings.

Bill Robertson wrote: 'WTF YouTube!?! 'how to have' suggests 'how to have sex with your kids'. What dark underbelly exists on YouTube?'

And WeWuzMetokur wrote: 'Don't even need 'how to have'. Just 'how to h' to get results. Based on this I believe YouTube wants me to hack my kids video game, hypnotize them as I tell them how much I hate lil wayne, and then rape them.'


The timing of the discovery is less than ideal for YouTube, whose system for flagging sexualised comments on children's videos was recently found to have been broken for more than a year.

A report found that up to 100,000 predatory accounts had been leaving indecent comments on videos, and identified 28 comments directed at children that were against the site's guidelines.

Lidl, Mars, Adidas, Cadbury maker Mondelez, Diageo and other big companies have also pulled advertising from YouTube after an investigation found the video sharing site was showing clips of scantily clad children alongside the ads of major brands.

Comments from hundreds of paedophiles were posted alongside the images, which appeared to have been uploaded by the children themselves, according to an investigation.


YOUTUBE'S REPORTING SYSTEM

- YouTube's system for reporting sexualised comments has not been working correctly for more than a year

- Up to 100,000 predatory accounts leaving indecent comments on videos have been found

- The report identified 28 comments directed at children that were against the site's guidelines

- Over a period of several weeks, five of the comments were deleted, but no action was taken against the remaining 23

- The BBC contacted the company and provided a full list; all of the predatory accounts were then deleted within 24 hours

One video of a pre-teenage girl in a nightie drew 6.5 million views.

This isn't the first time that Google, which owns YouTube, has come under fire for its controversial auto-complete results.

In December 2016, Google became embroiled in an anti-Semitism row after its autocomplete suggested the phrase 'Are Jews evil?'

Google apologized and said the results came from its autocomplete algorithm.

A blog post by Tamar Yehoshua, Product Manager of Google Search, explained why sometimes autocomplete searches go wrong.

The post said: 'Autocomplete isn't an exact science, and the output of the prediction algorithms changes frequently.

'Predictions are produced based on a number of factors including the popularity and freshness of search terms.'

MailOnline has contacted Google for comment.