Google is not the only Internet company to grapple in recent days with questions involving the anti-Islamic video, which appeared on YouTube, the video service that Google owns. Facebook on Friday confirmed that it had blocked links to the video in Pakistan, where it violates the country’s blasphemy law. A spokeswoman said Facebook had also removed a post that contained a threat to a United States ambassador, after receiving a report from the State Department; Facebook declined to say in which country the ambassador worked.

“Because these speech platforms are so important, the decisions they take become jurisprudence,” said Andrew McLaughlin, who has worked for both Google and the White House. Most vexing among those decisions are ones that involve whether a form of expression is hate speech. Hate speech has no universally accepted definition, legal experts say. And countries, including democratic ones, have widely divergent legal approaches to regulating speech they consider to be offensive or inflammatory.

Europe bans neo-Nazi speech, for instance, but courts there have also banned material that offends the religious sensibilities of one group or another. Indian law frowns on speech that could threaten public order. Turkey can shut down a Web site that insults its founding president, Kemal Ataturk. Like the countries, the Internet companies have their own positions, which give them wide latitude on how to interpret expression in different countries.

Although Google says the anti-Islamic video, “Innocence of Muslims,” was not hate speech, it restricted access to the video in Libya and Egypt, citing the extraordinarily delicate situation on the ground and respect for local cultural norms.

Google has not yet explained why its cultural-norms rationale applied to only those two countries and not to others where Muslim sensibilities have been demonstrably offended.