The assassination of Iranian physicist Masoud Ali-Mohammadi on Tuesday prompted a number of questionable accusations from the Iranian government and media about who was behind the killing, claims that have been countered by sources who knew the victim.



Iran's state-controlled media blamed the U.S. and Israel. And President Mahmoud Ahmadinejad, in his first public remarks about the incident Thursday, said the murder method was "Zionist," according to Bloomberg News. U.S. officials dismissed these accusations, calling them "absurd." In stark contrast to Iranian media depictions of Ali-Mohammadi as a loyalist, anonymous government sources and colleagues said he was an outspoken critic of the regime and suggested Tehran was actually behind the killing. Although a physicist, Ali-Mohammadi was not known to be involved in Iran's nuclear enrichment program.



History is riddled with examples of governments and media spreading information that lacks supporting evidence or is slanted to push an agenda. A recent U.S. example is the Bush administration's rationale for invading Iraq in 2003—that Saddam Hussein's regime was behind the attacks on the World Trade Center and that the nation could be harboring weapons of mass destruction. There was no evidence to support the former assertion, and after the invasion no such weapons were found.



But government and media messages are only two potential components of misinformation campaigns. It could be said that, in general, the true power of such campaigns lies with the public, or audience, and how thoroughly they accept the messages.



To get a better idea of the effectiveness of misinformation campaigns, Scientific American spoke with David Altheide, a sociologist at Arizona State University in Tempe. For several decades, he has been studying the mass media and propaganda. In his books, most recently Terror Post 9/11 and the Media (Peter Lang Publishing, 2009), Altheide explores how politicians and governments use fear and how the idea of terror has become ingrained in our society.



[An edited transcript of the interview follows.]



Could you give an idea of how you would define a government misinformation campaign?

Based on a lot of my work looking at propaganda over several decades, I would define a government misinformation campaign as one in which the government intentionally distorts and/or promotes some very questionable information for public dissemination for a particular purpose. Usually the purpose is to gain support for a policy, an action—and typically this will involve some sort of an international conflict.



Would this be an instance where government groups say the opposite of what they know is true or fill in the truth with speculations? Or both?

I think that more typically it's situations in which the information is partially true and a certain very clear slant is given to it. We've known since World War II and the work by the Nazi propagandist Joseph Goebbels that extreme blatant lies that fly in the face of what an audience directly experiences don't work. So Goebbels really argued that sometimes it's more important not to deny, say, that a building was bombed, but rather to give it a particular spin to, for example, minimize the damage.



Have there been sociological studies to look at how effective different misinformation campaigns are in the public?

Yes, of sorts. The broad category is probably best referred to as propaganda research. And there's been decades of research on that for what sorts of messages, what sorts of appeals, seem to work the best. Often we find that when, let's say, a country is on the edge [of suspecting an enemy] and has had a number of reports—true or false—about experiences involving a threat, then it becomes very easy to sort of sneak in another report about that enemy, and [for it] to be believed.



What do you think are the key ingredients in a successful misinformation campaign? It seems it goes along with public opinion, plays off of fear, and maybe off of a lack of evidence?

Those are key things, and that [the message] is visual, and that it is repeated [with] some of the same kind of language and discourse—"Here [the enemies] are again. There they go again." It can be very effective.



Look what we've done with health care in this country. We've somehow cast the whole health debate into cost. Whether or not children get health care [depends on whether] we can afford it. How things get cast and then ratcheted up as being more important is always the fascinating bit. This is something that strategists and manipulators work at very carefully. We call it framing. How can we frame this issue in a way that will tap into something people are already worried about [such as cost] and that will discredit some other point of view? The cost of a war, however, is almost never an issue because this is something that we just have to do.



Would an example be the Bush administration's campaign that led to the invasion of Iraq in 2003?

Yes, that'd actually be a terrific example and something I studied pretty extensively in the book Terrorism and the Politics of Fear. Basically the real negative image of Muslims…probably went back to the late '70s with the Iranian hostage crisis. But then, to jump ahead 20-plus years, there was a much more direct history of dislike and mistrust of Saddam Hussein and his regime, and we had really been looking for a long time for reasons to take stronger military action against him. The 9/11 attacks provided an opportunity to frame those attacks by blaming Iraq in general, and Hussein in particular, and therefore to justify action [by saying that], "In order to defend ourselves, we need to do this."



In your studies, did you look at how people responded to these situations, like the invasion of Iraq, and why they felt the way they did?

By looking at statements people made in the news media, by looking at opinion polls, listening to spokespersons and so forth, you're able to start piecing together the kinds of views that people had. The very broad notion here is that people were operating with such a sense of fear and mistrust of this very broad enemy—the foreigners or what we call in sociology, "the others." [Propagandists don't want us] to see everyone in the world as essentially just moms and dads trying to get by and trying to do what's best for their kids and [that] everybody's struggling. You want to have some real sharp distinctions.



[Then] it became very easy for a lot of people to support military action and then, once that happens, the whole argument shifts. It's no longer, "Are we right in our action?" "Was the original information correct that led us to this action?" It shifts from that to "Well, now we have to support the people that are fighting for us." We have to support our policies in order to keep everybody safe and in order to be good citizens.




ScientificAmerican.com became interested in the idea of misinformation because of the Iranian media reportedly saying that the U.S. and Israel were involved in the assassination of the physicist in Iran. Does this example play into distrust of Iranians in the U.S.?

I think that is really an important issue to look at. In my view, it's not at all clear yet that those reports [by the Iranian media] are totally absurd. The chances are probably pretty good that it's absurd, but we have too many examples where our government in the past has been involved in various kinds of undesirable and secretive activities. Over time, if other evidence [that Tehran was involved] becomes available, then the fact that people still might be pushing [the blame onto the U.S. and Israel] becomes very important.



You've got to remember that how we view things has less to do with what we might call the objective facts of the case than with context and our experience. For example, after 9/11 a number of opinion polls in, broadly speaking, many Arab communities showed that many citizens on the street actually believed that Israel and even the U.S. were involved in planning and carrying out the 9/11 attacks. They saw it as a way to generate anger and hostility to the Arab world and, in particular, Iraq.



No amount of evidence is able to shake those beliefs from many people. That's the challenge that we face: When people already have a mind-set and they already operate with certain kinds of narratives about how the world works and who's responsible for things, it becomes a little bit easier to slot in all sorts of things that are consistent with those views.



What about in the case of misinformation campaigns where they are not really playing off public mistrust of a group of people or a sense of "us" versus "them"? Can those campaigns still work?

They can, but not as effectively. If public support and reaction to a program or a policy or a course of action is necessary, then one of the things that you can count on is fear. "Can we rely on tapping into people's fear to get them to support something?" That's why fear has become such an important part of our news media, our entertainment industry and, increasingly, our public policy.



It seems like it would be a lot harder for governments to advance misinformation campaigns if it weren't for news reports and pop culture getting behind them. What's the motivation for those groups?

We developed [a concept] called "media logic" 30 years ago that basically refers to the way in which media operate, especially electronic media, in terms of their grammar, format and so forth in order to attract audiences. A key element is entertainment, and fear became a staple of entertainment not only in popular culture and in movies, but in news. Politicians and government officials recognized this [and] that tapping into fear, including some of the same images that could be seen in movies, paid dividends.



Did you see news images that went from street crime to, say, images of Arab people?

The connection that was made early on with the Iraq war, and I detail that in Terrorism and the Politics of Fear, was with drugs. The terrorists were claimed to be involved in heavy drug trafficking, and they were benefiting from drug profits. So now when we fight drugs, we're not only fighting the drug [traffickers], we're also fighting the terrorists. There were a couple of commercials that ran during the 2002 Super Bowl, which basically said, "If you're buying drugs, you're supporting terrorism." I couldn't make this stuff up. We've directed the concern with terrorism to talk about immigration, too, people crossing our borders and so forth.



Are there groups of people who are immune to misinformation or can this really affect everyone?

It affects everyone. Generally speaking, the more media literate (familiar with how the media operates), the more educated, the more critically thinking people are, the less likely [they are] to be swayed by it. People who just have more opportunity to think and evaluate information and who have access to more information and to different kinds of media are less likely to be so directly influenced.



The life-saving part of it is that falsehoods will [eventually] be recognized as that by enough people so as to discredit them. [One example is] the U.S. involvement in Vietnam. There will be some people that will never change but, over time, as long as you have sort of a free and open society, enough information comes out so that people do start seeing the falsehoods and the old claims start losing credibility.