SAN FRANCISCO — As violence spread in the Arab world over a video on YouTube ridiculing the Prophet Muhammad, Google, the owner of YouTube, blocked access to it in two of the countries in turmoil, Egypt and Libya, but did not remove the video from its Web site.

Google said it decided to block the video in response to violence that killed four American diplomatic personnel in Libya. The company said its decision was unusual, made because of the exceptional circumstances. Its policy is to remove content only if it is hate speech that violates its terms of service, or in response to valid court orders or government requests. And it said it had determined that under its own guidelines, the video was not hate speech.

Millions of people across the Muslim world, though, viewed the video as one of the most inflammatory pieces of content to circulate on the Internet. From Afghanistan to Libya, the authorities have been scrambling to contain an outpouring of popular outrage over the video and calling on the United States to take measures against its producers.

Google’s action raises fundamental questions about the control that Internet companies have over online expression. Should the companies themselves decide what standards govern what is seen on the Internet? How consistently should these policies be applied?