Facebook is working to stop bad content from showing up, but the company's head of global policy said it helps if users provide context for anything that might be considered offensive.

"That's why we say if you're using, for instance, an ethnic slur to say 'we shouldn't be using this word' or 'this is something I heard somebody call someone today and thought it was terrible,' make that clear in your post and we'll leave it on the site," Monika Bickert, Facebook's head of global policy management, said on CNBC's "Squawk on the Street."

"It's often a challenge and that's because we don't always have the context to know why somebody is posting something," Bickert said.

Users have complained about their posts being viewed as hate speech — and thus removed — when the goal was education rather than harassment. Facebook's team of content reviewers adheres to rigid standards, but there is always room for human error, and a post with unclear intent may be unfairly taken down, Bickert said.

Facebook recently made its Community Standards public, outlining the types of posts that can get users banned. The site will allow users to appeal if they believe a post was wrongfully taken down. If another reviewer finds that the content was originally misinterpreted, it will reappear on the site.