SAN FRANCISCO — Facing longstanding criticism that they had not done enough to protect people from harassment, YouTube executives announced Wednesday that the video service would start policing material that insulted or demeaned others because of their race, gender or sexual orientation.

The policy applies to videos and comments directed at anyone, including public officials, private individuals and YouTube creators.

Enforcement will roll out over the coming weeks and months, the company said. Thousands of so-called raters, hired by YouTube, will eventually screen flagged videos for prohibited content. YouTube said it had put together guidelines for weighing the context of videos and comments in order to properly identify harassment.

The new policy is one of several adjustments YouTube has made over the last few years in an attempt to make the site less toxic. The company has introduced a range of policies restricting hate speech, extremist content and the exploitation of children.