(Apropos of nothing in particular, though this article on Gaddafi’s cult of personality and this article on the indoctrination of children at a school in Libya probably had something to do with it. I’m also lecturing tomorrow on the mechanisms of control used by dictators, and this is something I might want to tell my students; writing helps with self-clarification.)





Cults of personality are hardly ever taken seriously enough. They are often seen as a sort of bizarre curiosity found in some authoritarian regimes, their absurdities attributed to the extreme narcissism and megalomania of particular dictators, who wish to be flattered with ever greater titles and deified in ever more grandiose ways. And it is hard not to laugh at some of the claims being made on behalf of often quite uncharismatic dictators: not only is Kim Jong-il, for example, the greatest golfer in the world, but he also appears to have true superhero powers:





In 2006 Nodong Sinmun published an article titled “Military-First Teleporting” claiming that Kim Jong-il, “the extraordinary master commander who has been chosen by the heavens,” appears in one place and then suddenly appears in another “like a flash of lightning,” so quickly that the American satellites overhead cannot track his movements. (Ralph Hassig and Kongdan Oh, The Hidden People of North Korea, p. 55)





To the extent that cults of personality are taken seriously, moreover, they are often analyzed in terms of their effects on the beliefs of the people who are exposed to them. Thus, the typical (if at times implicit) model of how a cult of personality “works” is one in which people are indoctrinated by exposure to the cult propaganda and come to believe in the special qualities of the leader, no matter how implausible the claims, simply because alternative sources of information about the leader do not exist. On this model, the cult of personality creates loyalty by producing false beliefs in the people, and the best way of combating its effects is by providing alternative sources of information. Even scholars who are well aware of the basic unbelievability of cults of personality often speak as if their function were to persuade people, even if they fail to achieve this objective. Hassig and Oh, for example, write that “[e]ven in North Korea few people have been convinced by this propaganda because since Kim came to power, economic conditions have gone from bad to worse” (p. 57), which makes it seem as if the main purpose of the cult of personality were to convince people of the amazing powers of Kim Jong-il.





But this way of thinking about cults of personality misses the point, I think. Not because it is entirely wrong; it is certainly plausible that some people do come to believe in the special charisma of the leader because they have been exposed to the propaganda of the cult since they were children, though the evidence for this is scarce. In Lenin’s Tomb, David Remnick’s compulsively readable account of the last days of the Soviet Empire, one occasionally comes across descriptions of such people, usually elderly men and women who reject or rationalize any and all evidence of Stalin’s “errors” and hang on to their belief in Stalin’s godlike powers. Remnick also tells many stories of people who claim that they used to believe in Stalin but lost their faith gradually, like groupies who eventually outgrow their youthful infatuation with a band.
And there is evidence that significant numbers of Russians (how many exactly it’s hard to say) remain “proud” in some sense of Stalin, though this “pride” appears to have much less to do with Stalin’s actual cult of personality than with his supposed achievements as a leader (e.g., winning WWII, industrializing the country, making Russia into a “high status” country that needed to be taken seriously on the world stage, etc.). Identification with a leader can be a form of “status socialism,” a way of retaining some self-respect in a regime that would otherwise provide little except humiliation. Yet, though I do not want to deny that cults of personality can sometimes “persuade” people of the superhuman character of leaders (for some values of “persuade”), or that they draw on people’s gullibility in the absence of alternative sources of information and their need for identification with high-status individuals, they are best understood in terms of how dictators can harness the dynamics of “signalling” for the purposes of social control.





One of the main problems dictators face is that repression creates liars (preference falsification, in the jargon), yet repression is necessary for them to remain in power. This is sometimes called the dictator’s dilemma: it is hard for dictators to gauge their true levels of support, or whether officials below them are telling them the truth about what is going on in the country, because repression gives everyone an incentive to lie; yet they need repression if they are to avoid being overthrown by people exploiting their tolerance to organize themselves. Moreover, repression is costly, and it works best when it is threatened rather than actually used. All things considered, then, a dictator would often prefer to minimize repression – to use it efficiently, limiting its distorting effects on his knowledge while preserving its effectiveness. He can either allow relatively free debate and run some risk of being overthrown (this happens especially in poor dictatorships, which cannot construct a reliable monitoring apparatus, as Egorov, Guriev, and Sonin show [ungated]), or he can use repression and risk being surprised by a lack of support later.





Here is where cults of personality come in handy. The dictator wants a credible signal of your support; merely staying silent and not saying anything negative won’t cut it. In order to be credible, the signal has to be costly: you have to be willing to say that the dictator is not merely OK but a superhuman being, and you have to be willing to take some concrete actions showing your undying love for the leader. (You may have had this experience: you are served some food, and you must provide a credible signal that you like it so that the host will not be offended; merely saying that you like it will not cut it, so you go for seconds and layer on the praise.) Here the concrete action required of you is typically a willingness to denounce others when they fail to say the same thing, but it may also involve bizarre pilgrimages, ostentatious displays of the dictator’s image, etc. The cult of personality thus has three benefits from the point of view of the dictator (aside from stroking his vanity):





1. When everybody lies about how wonderful the dictator is, there is no common knowledge: you do not know how much of this “support” is genuine and how much is not, which makes it hard to organize against the dictator, and it exposes anyone who so much as tries to share their true views to risks, sometimes enormous risks, since others can signal their commitment to the dictator by denouncing them. This is true of all mechanisms that induce preference falsification, however: they prevent coordination.

2. What makes cults of personality interesting, however, is that the more baroque and over the top the cult, the better (though the “over the top” level needs to be reached by small steps), since differences in signals of commitment indicate gradations of personal support for the dictator, and hence give the dictator a reasonably accurate measurement of his true level of support that is not easily available to the public. (The dictator does have to be willing to interpret these signals as signals, though, and not naively take them at face value.)

3. Finally, a cult of personality can in fact transform some fraction of the population into genuine supporters, which may come in handy later. In a social world where everyone appears to be convinced of the godlike status of the leader, it is very hard to “live in truth,” as Havel and other dissidents in communist regimes argued.
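The screening logic in point 2 can be made concrete with a deliberately stylized toy model (the payoff function, signal levels, and numbers below are my own illustrative assumptions, not anything from the post or the signalling literature): suppose the benefit a citizen gets from praising the leader scales with their genuine loyalty, while the cost of praise grows convexly with how extravagant it is.

```python
from collections import Counter

def chosen_signal(loyalty, levels=(0, 1, 2, 3), reward=4.0):
    """A citizen picks the praise level that maximizes private payoff:
    benefit scales with genuine loyalty, while the cost (effort, dignity,
    risk of ridicule) grows convexly with the level of extravagance."""
    return max(levels, key=lambda s: loyalty * reward * s - s ** 2)

# Because high praise is cheaper to bear for the genuinely loyal, the
# chosen level rises with loyalty, so the observed distribution of
# signals gives the dictator gradations of true support.
population = [0.0, 0.1, 0.5, 0.9, 1.0]
print(Counter(chosen_signal(v) for v in population))
```

The point of the toy is the sorting: lukewarm citizens stop at mild praise, true believers escalate, and the dictator who reads the signals as signals (rather than at face value) recovers a rough map of genuine support.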





To be sure, in order for a cult of personality to work, you must start small, and you must be willing both to reward (those who denounce) and to punish (those who do not praise) with sufficient predictability, which presents a problem if control is initially lacking; there must be a group committed to enforcement at the beginning, one capable of slowly raising the threshold “signal” of support required of citizens. (Some dictators fail at this: consider, e.g., Mobutu’s failures in this respect, which stemmed partly from his inability to monitor what was being said about him or to punish deviations with any certainty.) But once the cult of personality is in full swing, it practically runs itself, turning every person into a sycophant and basically destroying everyone’s dignity in the process. It creates an equilibrium of lies that can be hard to disrupt unless people get a credible signal that others hate the dictator as much as they do and are willing to do something about it.
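The fragility of this equilibrium of lies is exactly what threshold models of the Granovetter/Kuran variety capture. A minimal sketch, with made-up thresholds: each citizen dissents publicly only once the number of visible dissenters reaches their private threshold.

```python
def cascade(thresholds):
    """Threshold dynamics a la Granovetter/Kuran: each citizen dissents
    publicly once the count of visible dissenters reaches their private
    threshold. Iterate until no one else flips; return the final count."""
    dissenters = 0
    while True:
        flipped = sum(1 for t in thresholds if t <= dissenters)
        if flipped == dissenters:
            return dissenters
        dissenters = flipped

# With thresholds 0, 1, 2, ..., 99, each dissenter emboldens the next:
# one brave soul triggers a full cascade of all 100 citizens.
print(cascade(list(range(100))))   # 100

# Change a single person's threshold from 1 to 2 and the chain breaks:
# the cascade stalls at 1 and the equilibrium of lies survives.
print(cascade([0, 2] + list(range(2, 100))))   # 1
```

This is why preference falsification makes regimes look stable right up until they collapse: the distribution of private thresholds is invisible, so neither the dictator nor the dissidents can tell in advance whether one credible signal of shared hatred will stall or snowball.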



