Wise Words Podcast Now available on all major podcast channels.


Age of Propaganda Book Summary – Anthony R. Pratkanis

What you will learn from reading Age of Propaganda:

– How master persuaders cleverly use specific words and labels to change people’s perceptions.

– How questions can be used to frame debates and persuade stealthily.

– Why and how credibility can be manufactured and used to persuade others. 

Age of Propaganda Book Summary:

Age of Propaganda is a fascinating book on the persuasion techniques used and abused by advertisers, politicians and salespeople, to name a few. It also touches on interesting philosophical questions, such as the role of propaganda in a democracy and the difference between propaganda and education.

It’s a must-read for anyone who wants a deeper understanding of the way mass media changes perceptions, and who wants to get to grips with how propaganda can be weaponised by a select few without others becoming conscious of it.


The Age of Propaganda:

Every time we turn on the radio or television, every time we open a book, magazine, or newspaper, someone is trying to educate us, to convince us to buy a product, to persuade us to vote for a candidate or to subscribe to some version of what is right, true, or beautiful. This aim is most obvious in advertising: Manufacturers of nearly identical products (aspirins, for example, or toothpastes, or detergents, or political candidates) spend vast amounts of money to persuade us to buy the product in their package.

Persuasion shows up in almost every walk of life. Nearly every major politician hires media consultants and political pundits to provide advice on how to persuade the public and how to get elected. Virtually every major business and special-interest group has hired a lobbyist to take its concerns to Congress or to state and local governments.

Every day we are bombarded with one persuasive communication after another. These appeals persuade not through the give-and-take of argument and debate but through the manipulation of symbols and of our most basic human emotions. For better or worse, ours is an age of propaganda.

Every society requires a mechanism for making decisions, resolving disputes, and coordinating activities. Our society has opted for persuasion.


What is Propaganda?

The forms of persuasion that have come to dominate our twentieth-century lifestyle are much different from those seen in any other age of persuasion, much different from those experienced by colonial Americans. For this reason, we use the term propaganda to refer to the mass persuasion techniques that have come to characterise our postindustrial society.

Propaganda was originally defined as the dissemination of biased ideas and opinions, often through the use of lies and deception. However, as scholars began to study the topic in more detail, many came to realise that propaganda was not the sole property of “evil” and totalitarian regimes and that it often consists of more than just clever deceptions.

The word propaganda has since evolved to mean mass “suggestion” or influence through the manipulation of symbols and the psychology of the individual. Propaganda is the communication of a point of view with the ultimate goal of having the recipient of the appeal come to “voluntarily” accept this position as if it were his or her own.


How Propaganda works:

In general, we humans seek to conserve our cognitive energy by taking mental shortcuts whenever we can, and we attempt to rationalise our thoughts and behaviour so that they appear reasonable to ourselves and others. Most, if not all, propaganda appeals attempt to take advantage of these two basic human tendencies.



Pre-persuasion is what Aristotle called atechnoi and Cicero termed stasis. Pre-persuasion refers to how the problem is structured and how the decision is framed. By cleverly establishing how an issue is defined, a communicator can influence cognitive responses and obtain consent without even appearing to be attempting to persuade us.


The Routes to Persuasion:

Richard Petty and John Cacioppo argue that there are two routes to persuasion, peripheral and central.

In the peripheral route, a message recipient devotes little attention and effort to processing a communication. Some examples might include watching television while doing something else or listening to a debate on an issue that you don’t care much about. In the peripheral route, persuasion is determined by simple cues, such as the attractiveness of the communicator, whether or not the people around you agree with the position presented, or the pleasure and pain associated with agreeing with the position.

In contrast, in the central route, a message recipient engages in a careful and thoughtful consideration of the true merits of the information presented. For example, in the central route the person may actively argue against the message, may want to know the answer to additional questions, or may seek out new information. The persuasiveness of the message is determined by how well it can stand up to this scrutiny.

The personal relevance of the issue determined the route to persuasion. In Petty and Cacioppo’s experiment, for those students for whom the issue of comprehensive exams was personally relevant, the strength of the message’s arguments was the most important factor in determining whether or not they were persuaded. In contrast, for those students for whom the issue of the comprehensive exam was not personally relevant, the source of the communication mattered: the source high in expertise convinced; the one from the high school class failed to do so.


The State of Modern Democracy:

We have a state of affairs that may be called the essential dilemma of modern democracy. On the one hand, we, as a society, value persuasion; our government is based on the belief that free speech and discussion and exchange of ideas can lead to fairer and better decision making.

On the other hand, as cognitive misers we often do not participate fully in this discussion, relying instead not on careful thought and scrutiny of a message, but on simplistic persuasion devices and limited reasoning.

Mindless propaganda, not thoughtful persuasion, flourishes.


The Rationalising animal:

When our self-esteem has been threatened by our own past behaviour, we all have a powerful tendency to become rationalising animals.

We humans like to think of ourselves as rational animals. However, it is closer to the truth that we are rationalising animals: we attempt to appear reasonable to ourselves and to others.

Albert Camus said that humans are creatures who spend their entire lives in an attempt to convince themselves that their lives are not absurd. How do we demonstrate that our lives are not absurd? By finding ways to justify our actions, as the following example illustrates.


Reducing Cognitive Dissonance:

Psychologists have reported that the people who are least likely to believe in the dangers of smoking are those who tried to quit and failed. They have become more committed to smoking. By reducing dissonance, a person defends the ego and retains a positive self-image. But self-justification can reach startling extremes; people will ignore danger in order to avoid dissonance, even when that ignorance can cause their deaths and the deaths of loved ones.


How Propagandists use Cognitive Dissonance:

The reduction of dissonance is a part of everyday life; this is used to advantage by propagandists in what we call the rationalisation trap. The trap goes like this. First, the propagandist intentionally arouses feelings of dissonance by threatening self-esteem: for example, by making the person feel guilty about something, by arousing feelings of shame or inadequacy, or by making the person look like a hypocrite or someone who does not honour his or her word. Next, the propagandist offers one solution, one way of reducing this dissonance: by complying with whatever request the propagandist has in mind. The way to reduce that guilt, eliminate that shame, honour that commitment, and restore your feelings of adequacy is to give to that charity, buy that car, hate that enemy, or vote for that leader.

Example: Suppose you are at home and someone knocks at your door, asking you to contribute to a worthy charity. If you don’t want to contribute, you probably wouldn’t find it too difficult to come up with reasons for declining: you don’t have the money, your contribution probably won’t help much anyway, and so on. But suppose that, after delivering a standard plea for a donation, the fund-raiser adds one simple phrase: “even a penny will help.” There goes your easy excuse. Refusing to donate after hearing this statement would undoubtedly stir up some dissonance by challenging your self-concept. After all, what kind of person is it who is too mean or stingy to come up with a penny? Your previous justifications would no longer apply. The rationalisation trap is set. The request threatens your self-esteem and there is only one way to reduce this negative feeling: give to the charity.


War Propaganda:

War causes a great deal of harm and destruction, often to innocent bystanders and children. The cognition “I and my country are decent, fair, and reasonable” is dissonant with the cognition “I and my country have hurt innocent people.” If the harm is clear, then you cannot reduce your dissonance by claiming that it wasn’t done or it wasn’t really violence. In this situation, the most effective way to reduce dissonance is to minimise the humanity or maximise the culpability of the victim of your action to convince yourself the victims deserved what they got.

Dehumanisation succeeds in resolving any dissonance that may be aroused by our cruelty toward our enemies. However, watch out; the more we justify our cruelty, the easier it becomes. The rationalisation trap becomes an escalating spiral: “I committed an act of cruelty; I justify this act by believing that the victim deserved it. If the victim deserved that cruelty, well maybe they deserve more and maybe I am just the one to give it to them.”


The power of words:

Perhaps Joseph Goebbels, Hitler’s minister of propaganda, best described the power of words:

It would not be impossible to prove with sufficient repetition and psychological understanding of the people concerned that a square is in fact a circle. What after all are a square and a circle? They are mere words and words can be moulded until they clothe ideas in disguise.

We all know that candy cannot be turned into a nutritional foodstuff simply by calling it by a different name. Yet this is what advertisers and political pundits do every day for their products, with great success.

Through the labels we use to describe an object or an event, we can define it in such a way that the recipient of our message accepts our definition of the situation and is thus pre-persuaded even before we seriously begin to argue. This simple rule of persuasion was recognised by Cicero as long ago as the first century B.C. Cicero claimed that one of the factors in his success at establishing the innocence of some of Rome’s most notorious murderers was his ability to argue that their heinous crimes were not “crimes” at all, but virtuous deeds: that the victims were villains who deserved to be killed.

Often the full meaning of the inference is left to the imagination of the audience. W. H. Werkmeister calls this the use of glittering generalities. In such cases, the propagandist uses “purr” words: words that have positive connotations but are usually ambiguous in the context in which they are used. Some examples include: “A kindlier, gentler America”; “Let’s make America strong again”; “The best that money can buy”; “We must support our valiant freedom fighters.” Few would disagree that kindlier, gentler, strength, the best, and fighting for freedom are good things; in most concrete situations, however, even fewer would agree on the meaning of each of these words.


Using labels effectively:

The psychologist Gordon Allport pointed out that it is the nature of language to divide up and categorise the buzzing boom of information that comes our way every second of the day. It is this inherent nature of language that gives it the power to persuade. By labeling someone a “man,” a “woman,” a “philanthropist,” an “attractive Chinese,” a “physician,” an “athlete,” we emphasise some particular feature of the object “human being” over many other possible ones.

We then respond to these features, organising our realities around the object label. Nouns that “cut slices” (such as we/they, black/white, rich/poor, free/Soviet, male/female) serve to divide up the world into neat little packages and to imply the range of appropriate courses of action to take.

Given the flexibility of the human mind, there are many ways to label any given event. This flexibility provides the key to overcoming the propagandist’s intent: When presented with a definition of the world, we can always ask, “Why this label? What other definitions of the situation would shed more light on the issue?” By looking at any given event from many different perspectives we can gain new insights to guide our decision making.


How Television influences our worldview:

Let’s look at the relationship between watching television and images of the world by looking more closely at how we picture criminal activity. In an analysis of “television criminology,” Craig Haney and John Manzolati point out that crime shows dispense remarkably consistent images of both the police and criminals. For example, they found that television policemen are amazingly effective, solving almost every crime, and are absolutely infallible in one regard: The wrong person is never in jail at the end of a show.

Television fosters an illusion of certainty in crimefighting. Television criminals generally turn to crime because of psychopathology or insatiable (and unnecessary) greed. Television emphasises criminals’ personal responsibility for their actions and largely ignores situational pressures correlated with crime, such as poverty and unemployment.

Haney and Manzolati go on to suggest that this portrayal has important social consequences. People who watch a lot of television tend to share this belief system, which affects their expectations and can cause them to take a hardline stance when serving on juries.


How mass media can set agendas:

The content of the mass media sets the public’s political and social agenda. As just one example, in a pioneering study of an election in North Carolina, researchers found that the issues that voters came to consider to be most important in the campaign coincided with the amount of coverage those issues received in the local media.

Former Secretary of State Henry Kissinger clearly understood the power of the news media in setting agendas. He once noted that he never watched the content of the evening news but was only interested in “what they covered and for what length of time, to learn what the country was getting.”

Why are the pictures of the world painted by the mass media so persuasive? For one thing, we rarely question the picture that is shown. We seldom ask ourselves, for example, “Why are they showing me this story on the evening news rather than some other one? Do the police really operate in this manner? Is the world really this violent and crime-ridden?” The pictures that television beams into our homes are almost always simply taken for granted as representing reality.


When mass media persuades us the most:

The information and impressions we receive through the media are relatively less influential when we can also rely on firsthand experience. Thus those of us who have been in close contact with several women who work outside the home are probably less susceptible to the stereotypes of women portrayed on television. On the other hand, regarding issues with which most of us have had limited or no personal experience, such as crime and violence, television and the other mass media are virtually the only vivid source of information for constructing our image of the world.


Courses of action and ambiguous events:

The debate over war with Iraq was really a debate over whose definition of ambiguous events was “correct.” And with good reason: once it is decided how an event or person should be categorised, it becomes clear what course of action should be taken.

In the end, debates on a course of action must come down to which definition of the situation is perceived to be correct—Is Saddam more like Hitler or is Iraq more like Vietnam? Of course, we should entertain the hypotheses that both analogies are true, that neither is true, or that perhaps other analogies also fit the situation.


Analogies and Persuasion:

Classical theories of rhetoric look with disdain upon analogy as a form of persuasion; any given analogy is vulnerable to the attack that it is based on faulty comparisons, that the points of similarity presented in the analogy are irrelevant and inconsequential.

According to classical theory, analogies should be evaluated using two rules:

1. The similarities between two things must concern pertinent, significant aspects of the two things.

2. The analogy must not ignore pertinent dissimilarities between the two things being compared.


Using questions to persuade:

Question-asking can be a subtle form of pre-persuasion: a carefully crafted question can be used to define the issue at hand, to suggest subtly what the “right” answer may be, and to organise the way we think about the issue.

Lawyers also know the importance of a carefully phrased question. Textbooks describing how to cross-examine a witness stress the maxim, “Never ask a question for which you don’t already know the answer.” Put more accurately, “Never ask a question that doesn’t get the answer you want.”

Question asking can be a powerful persuasion device because questions structure our decision-making process. They do this by directing our thoughts about the issues at hand and by implicitly specifying the range of possible answers. Consider the question, “Do you support the constitutional right to bear arms?” The question directs our thoughts and attention to the constitutionality of gun ownership as opposed to other concerns, such as the right to a safe neighbourhood. The question also implicitly defines the issue as guns versus no guns and excludes middle-of-the-road proposals such as gun registration, waiting periods, and restricted access to certain types of arms (such as Saturday night specials, automatic assault weapons, or nuclear arms).


How to tackle tough questions:

The stinging question about supporting the Constitution can be avoided by rephrasing it (“It’s a good question; just what will I do about the crime in our streets?”), by postponing it (“Let’s first clear up a few important issues”), by glossing over it in a way that appears to answer the question (“You know I support the Constitution of the United States and the people’s right to live in safety as they see fit”), or by diverting attention by asking yet another question (“But do you believe in the right of every American to live in safety?”).


Using decoys to create value:

The lesson to be learned from research on decoys is that the context makes a difference. Judgment is relative, not absolute.

Depending on the context, objects and alternatives can be made to look better or worse. Often we do not pay much attention to the influence of context, much less question the validity of the alternatives presented.

This greatly enhances the power of “context-makers,” such as politicians, advertisers, journalists, and sales agents. The context they set can pre-persuade us by influencing our perceptions and judgments; we are thus lulled into decisions that we would not normally make.


The rise of Factoids:

The novelist Norman Mailer, who coined the term factoid, defined it as “facts which have no existence before appearing in a magazine or newspaper.” We define a factoid as an assertion of fact that is not backed up by evidence, usually because the fact is false or because evidence in support of the assertion cannot be obtained.

Factoids are presented in such a manner that they become widely treated as true. In our workplaces and neighbourhoods, they are known as rumours and gossip. In courts of law, factoids are called hearsay and inadmissible evidence. In the mass media, they are called libel, slander, innuendo, and the reporting of currently circulating rumours, gossip, and hearsay as news.

The use of factoids is also a common practice in campaigns against other nations. Adolf Hitler and his propaganda minister Joseph Goebbels mastered the art of what has been termed “the big lie.” According to Nazi propaganda theory, one effective way to persuade the masses is to develop and repeat falsehoods: for example, “the German people are a master race; Europe is menaced by the Jewish conspiracy.” Such big lies are difficult to prove false.

For example, the fact that there is no evidence that a Jewish conspiracy exists is just further evidence regarding the cleverness of Jews. The big lie is then supported by many small, but often irrelevant, facts to make the big lie all that more believable—for example, some Jews own banks and Marx, the founder of communism, was Jewish. The big lie technique continues to be used by many governments today.


How do Factoids work?

As Mark Twain once put it, “A lie can travel halfway around the world while the truth is putting on its shoes.” Why are factoids so persuasive? We can suggest three reasons.

First, few attempts are made to verify the truth of a factoid. We often hear rumour and gossip directly from trusted friends whom we are not in the habit of questioning. We also turn to the “news”, whether on television or in other mass media, expecting, quite frankly, “the news”, and are often not prepared to debunk each and every presented “fact.” Factoids often just sneak by our persuasion defences. We rarely think to ask, “Is this factoid really true?”

Second, we accept factoids because they often meet one or more psychological needs. For example, many factoids are entertaining and thus capture our attention: it is fun to look for clues that Paul McCartney is dead. More seriously, the very best factoids help us rationalise and justify our most fundamental worries and concerns. The “Leaflet of Villejuif” confirms the belief held by many that big corporations have conspired to kill us for profit.

Finally, and perhaps most importantly, factoids function as a form of pre-persuasion; they create social reality. Factoids serve as bits and pieces that are used to construct our picture of the world. As such, factoids direct our attention and suggest how we should interpret the world.

Even when the factoid is shown to be false, it can still direct attention and thinking.

For example, a political candidate falsely accused of embezzling must spend his or her time refuting and denying the charge instead of proposing substantive new programs in an attempt to win the election. And, as we saw in the research by Daniel Wegner and colleagues, such denials are usually doomed to fail, because they often serve to remind the casual reader of the original accusation.


Associating with negative activities:

It appears that simply questioning a candidate’s connection with unseemly activities can be enough to damage that candidate’s public image. What’s more, the source of the innuendo made little difference: the candidates were still rated negatively even if the source of the headline was a newspaper lacking in credibility (the National Enquirer or the Midnight Globe, as opposed to the New York Times or the Washington Post). Negative political advertising and slur campaigns often do indeed pay off.


Faking Credibility:

Credibility has become a commodity not only to be feigned but also to be bought and sold on the open market.

The fact that we humans often use the credibility of a communicator to guide us in the acceptance or rejection of a message opens the door once again for mindless propaganda. Although it makes sense to believe the credible source when his or her expertise and trustworthiness are directly related to the issue at hand, it is often easier to feign credibility than to actually achieve it.

One of the important tasks of media research is to keep tabs on the “reputation and credibility” of public figures such as movie stars, sports figures, and other public “personalities.” Advertisers want to know which celebrity figures are most believable, who among the stars is most liked by the public, who has been on the cover of leading magazines, and who has been overexposed.


How to manufacture credibility:

One of the morals of Han Fei-Tzu’s story is that communicators can make themselves seem trustworthy by apparently acting against their own self-interest. If we are led to believe that communicators have nothing to gain and perhaps something to lose by convincing us, we will trust them and they will be more effective.

Another way of increasing the perception of credibility: The apparent trustworthiness of a person can be increased and the apparent bias of the message decreased if the audience is absolutely certain the person is not trying to influence them. To illustrate, let us bring this discussion into the twentieth century. Suppose a stockbroker calls you up and gives you a hot tip on a particular stock. Will you buy? It’s hard to be sure. On the one hand, the broker is probably an expert, and this might influence you to buy.

On the other hand, the stockbroker has something to gain by giving you a tip (a commission), and this could lower her effectiveness. But suppose you accidentally happened to overhear her telling a friend that a particular stock was about to rise. Because she was obviously not trying to influence you, you might be more readily influenced.

The manufacture of credibility can lead to a cult of personality. Instead of thinking about important issues for ourselves, we turn to credible-looking leaders for their solutions. This strategy would make some sense if the people we turn to do indeed possess the required expertise. Sadly, it is often the case that “credibility” has been subtly manufactured and sold for propaganda purposes. And the media often serve as unwitting handmaidens in this charade.


Role models and Persuasion:

The power of models to persuade has not been lost on advertisers.

Thirty-second spots are full of chubby people losing weight with the right diet aid; housewives impressing their husbands with clean houses brought about by an advertised cleanser; boyfriends and girlfriends being rewarded with love, affection, and prestige for just one smile polished by a special toothpaste; young couples achieving the good life with the use of a major credit card. These models do more than just sell products; they also reinforce values (e.g., thin is good) and teach lifestyles (e.g., housewives as husband-pleasers, yuppies have the good life).

Mass media models are effective for two primary reasons. First, they teach new behaviour. For example, a young child learns the “specifics” of how to shoot and kill a “suspect” by watching television crime shows such as “Miami Vice” or “Hunter.” A newly married couple discovers that the good life can be purchased on credit by watching others on television use a credit card. A teenager learns about the attractiveness of committing suicide from a news report.

This brings us to the second reason why mass media models are persuasive: They serve as a cue to indicate that a certain behaviour is legitimate and appropriate. After watching a prizefight, it appears to be okay to “punch someone’s lights out;” hearing about a teenage suicide says that this may be a reasonable way to deal with life’s problems or that those who have wronged us will now be forced to grieve for us; watching a housewife mop the floor convinces us that this is an appropriate lifestyle for women (but not necessarily for men). A steady diet of mass media models can shape and twist our understanding of what is right and wrong with the world.


Heuristics and product quality:

Other heuristics are also used to encourage consumers to infer product quality and thus buy a particular brand. Three of the most important are price, store image, and brand name.

Each heuristic comes with its own rules for inferring product quality.

Price – The higher the price, the better the quality. This is probably true for products such as Yugos and Rolls-Royces, but not necessarily the case for wines, medicines, sports sneakers, prepackaged foods, and a host of other products.

Store image – The same pair of jeans looks better in a high-class department store than in the local discount mart.

Brand name – Nationally recognised brand names are deemed superior to store brands and generics.

This raises an important question: What conditions are most likely to lead to heuristic rather than rational decision making? Research has identified at least five such conditions.

Heuristics are most likely to be used when we do not have time to think carefully about an issue, when we are so overloaded with information that it becomes impossible to process it fully, or when we believe that the issues at stake are not very important.

Heuristics are also used when we have little other knowledge or information on which to base a decision and when a given heuristic comes quickly to mind as we are confronted with a problem.


The Self-Sell:

Subsequent persuasion research has shown that self-generated persuasion (whether induced by group discussion, by getting someone to role-play an opponent’s position, or by asking a person to imagine adopting a course of action) is one of the most effective persuasion tactics ever identified. Indeed, one recent set of studies found that just thinking about how to transmit a persuasive message to others will result in changes in attitude that persist for at least twenty weeks.

The technique of self-generated persuasion combines many of the facets of successful persuasion discussed throughout this book. It gains its power from providing subtle social cues and directions that ask the target of influence, in effect, to “think up as many positive cognitive responses about the issue as you can and, if you do happen to come up with some counterarguments, to be ready to refute them.”

The resulting message will come from a source that you almost always consider credible, trustworthy, respected, and liked: yourself. The act of generating arguments is an act of commitment to the cause. After all, they’re your ideas, aren’t they?


Vividness and Persuasion:

By vividness, we mean a message that is

(1) emotionally interesting (it attracts our feelings),

(2) concrete and imagery-provoking, and

(3) immediate (it discusses matters that are personally close to us).

For example, one anti-fluoridation leaflet consisted of a picture of a rather ugly rat, along with the inscription “Don’t let them put rat poison in your drinking water.” The referendum to fluoridate the water supply was soundly defeated.

Why do vivid appeals work? Vivid messages affect our cognitive responses in at least three possible ways.

First, vivid information attracts attention. It helps the communication stand out in the message-dense environment.

Second, vividness can make the information more concrete and personal. We saw earlier the persuasive power of self-generated arguments and images.

Finally, a vivid presentation can make the material more memorable. This is especially important if we do not reach an immediate conclusion but base our later judgments on information that comes readily to mind.


Repetition and Persuasion:

The power of repetition in propaganda was well understood by Joseph Goebbels, the head of the Nazi propaganda ministry. His propaganda crusades were based on a simple observation: What the masses term truth is that information which is most familiar. As Goebbels put it:

The rank and file are usually much more primitive than we imagine. Propaganda must therefore always be essentially simple and repetitious. In the long run only he will achieve basic results in influencing public opinion who is able to reduce problems to the simplest terms and who has the courage to keep forever repeating them in this simplified form despite the objections of intellectuals.

The repetition of simple messages, images, and slogans can create our knowledge of the world, defining what is truth and specifying how we should live our lives.


How advertisers use repetition:

Advertisers know that repeated exposure can lead to what is known as “wear-out,” when an ad loses its effectiveness because consumers find repeated exposures tedious and annoying. Wear-out effects are most likely to occur with ads that attract much attention, such as humorous ads and informational messages. Advertisers attempt to eliminate wear-out by using a technique known as “repetition with variation.”

In this technique, the same information or theme is repeated many times, but the presentation format is varied. For example, the Marlboro man is posed at different angles and is seen in different settings, or the halitosis-fighting power of Listerine is demonstrated at work, in the home, and on a date.


Changing people’s minds with dissonance:

The greater the discrepancy, the greater the discomfort for the members of the audience. But this does not necessarily mean the members of an audience will change their opinion.

There are at least four ways in which the members of an audience can reduce their discomfort: (1) They can change their opinion; (2) they can induce the communicator to change his or her opinion; (3) they can seek support for their original opinion by finding other people who share their views, in spite of what the communicator says; or (4) they can derogate the communicator, convincing themselves the communicator is stupid or immoral, and thereby invalidate that person’s position.

This accounts for the conflicting results: When a communicator has high credibility, the greater the discrepancy between the view he or she advocates and the view of the audience, the more the audience will be persuaded; on the other hand, when a communicator’s credibility is doubtful or slim, he or she will produce maximum opinion change at moderate discrepancies.


Protagoras’ ideal

When it comes to persuasion, the early Greeks possessed a vision of the ideal. The sophist Protagoras believed that there are two sides to every issue. His fellow citizens agreed; they thought that persuasion happened best when all parties concerned knew both sides of the issue at hand. By juxtaposing one argument against an opposing one, the issue is made clear and the advantages and disadvantages of the available courses of action are revealed.

But Protagoras’ ideal is a long way from the type of persuasion one finds in the mass media today. The typical ad is one-sided and consists of little information, much puffery, and many exaggerated claims. “This pain reliever is gentler, faster, best.” “Brand X will take fat off fast.” “You can’t beat the taste of Brand Y.” “Only presidential candidate Z will strengthen defense, increase domestic spending, and balance the budget (Congress willing).” For the most part, debate and comparison are missing from today’s persuasion landscape.


Can two-sided arguments be more persuasive?

Is it ever possible for a two-sided argument to be more persuasive than a one-sided message? Or, to put it more bluntly, what factors make one-sided arguments so effective that they seem to be the tactic of choice for most modern propagandists? To sharpen our understanding of the uses and abuses of one- and two-sided communications, let’s examine the conditions under which each is most persuasive.

Before trying to answer this question, let’s look closely at the factors involved. If a communicator mentions the opposition’s arguments, it might indicate that he or she is an objective, fair-minded person; this could enhance the speaker’s trustworthiness and thus increase his or her effectiveness.

On the other hand, if a communicator so much as mentions the arguments on the other side of the issue, it might suggest to the audience that the issue is a controversial one; this could confuse members of the audience, make them vacillate, induce them to search for counterarguments, and ultimately reduce the persuasiveness of the communication.

Research generally finds that the more well informed the members of the audience are, the less likely they are to be persuaded by a one-sided argument and the more likely they are to be persuaded by an argument that brings out the important opposing arguments and then attempts to refute them.


Fear and Persuasion:

What does the evidence tell us? Experimental data overwhelmingly suggest that all other things being equal, the more frightened a person is by a communication, the more likely he or she is to take positive preventive action.

Is this true for all people? It is not.

There is good reason why common sense can lead us to believe that a great deal of fear leads to inaction: it does, for certain people, under certain conditions.

What Leventhal and his colleagues discovered is that the people who had a reasonably good opinion of themselves (high self-esteem) were the ones most likely to be moved by high degrees of fear arousal. People with low opinions of themselves were the least likely to take immediate action when confronted with a communication arousing a great deal of fear, but (and here is the interesting part) after a delay, they behaved very much like the subjects with high self-esteem. People who have a low opinion of themselves may have difficulty coping with threats to themselves.

Very similar results were obtained in Leventhal’s cigarette experiment. Leventhal found that a high-fear communication produced a much greater intention to stop smoking; unless it was accompanied by recommendations for specific behaviour, however, it produced few results. Similarly, specific instructions (buy a magazine instead of a pack of cigarettes, drink plenty of water when you have the urge to smoke, and so on) without a fear-arousing communication were relatively ineffective. The combination of fear arousal and specific instructions produced the best results; the students in this condition were smoking less four months after they were subjected to the experimental procedure.

In sum, a fear appeal is most effective when (1) it scares the hell out of people, (2) it offers a specific recommendation for overcoming the fear-arousing threat, (3) the recommended action is perceived as effective for reducing the threat, and (4) the message recipient believes that he or she can perform the recommended action.


The use and abuse of fear appeals:

But there is a broader question: Should fear appeals be used at all? Given the power of fear to motivate and direct our thoughts, there is much potential for abuse. Illegitimate fears can always be invented for any given propaganda purpose.

As persuaders, it is our responsibility, if we decide to use a fear appeal, to ensure that the fear we instill is, at least, legitimate and that it serves to alert the target to potential dangers as opposed to obscuring the issue with emotion. As targets of such appeals, we owe it to ourselves to first ask, “How legitimate is this fear?” before taking the propagandist’s bait.


The Granfalloon Technique (labeling others):

What makes Tajfel’s research so curious are the results that are often obtained. Despite the fact that the subjects were total strangers prior to the study, that they had never interacted with one another and never would, and that their actions were completely anonymous, they acted as if those who shared their meaningless label were their good friends or close kin.

Subjects indicated that they liked those who shared their label. They rated others who shared their label as more likely to have a pleasant personality and to have produced better output than out-group members. Most strikingly, subjects allocated more money and rewards to those group members who shared their label and did so in a competitive manner.

Herein lies the secret to the persuasiveness of the granfalloon. If the professional persuader, the advertiser, the politician, the televangelist can get us to accept his or her granfalloons, then we have a ready-made way to make sense of our lives: the propagandist’s way. And as our self-esteem becomes increasingly linked to these groups, we have a strong motivation to defend the group and to go to great lengths to proudly adopt its customs. What the propagandist is really saying is: “You are on my side (never mind that I created the teams); now act like it and do what we say.”

Sometimes granfalloons come ready-made. In recent decades, marketers have put forth extraordinary effort to classify America into groups and lifestyles, with the goal of designing communications and products to fit each group. For example, one scheme advanced by the Claritas Corporation divides America into 40 neighbourhood types based on values, lifestyles, income, and so on. You could be a member of the “Young Influentials,” “Blue-collar Nursery,” or “Blue-blood Estates.” Each group is associated with a certain self-image and lifestyle. Advertising and products are then targeted to profitable markets. In other words, products are given a “personality” that fits the image of the target market; this advertising then goes on to further create the image of each granfalloon by specifying what needs to be done to maintain a certain image.

Often workplace distinctions can be so trivial that they are not easily understood by an outside observer or a new group member. The cunning office politician may be so bold as to create temporary, fluctuating group identities. For example, today’s Machiavellian may set one group against another by going to one person and saying “we women must stick together,” to another and saying “we newcomers have to watch out for the old guard,” and to a third and saying “those of us with an eye to the bottom line better pull together on this one”—all in an attempt to secure allegiance to a proposal.


Defending against the granfalloon:

Here are five simple rules of thumb that might help.

First, be wary of those who attempt to create minimal groups and to define you as a member of a certain category. There are many ways to define and label a person. Ask yourself, “Why is this particular label being suggested?”

Second, follow the old civil rights motto, “Keep your eye on the prize.” Try linking your self-esteem to achieving an objective, be it a quality purchase at a reasonable price or a social good, rather than maintaining a self-image.

Third, don’t put all of your self-esteem eggs in one basket, one granfalloon; it might lead to fanaticism.

Fourth, look for common ground (goals that might be acceptable to both the in-group and the out-group) as a means of reducing the importance of group boundaries.

And finally, try to think of an out-group member as an individual, someone who may share more in common with you than you might have previously thought.


Guilt and Persuasion:

The power of guilt to convict and to persuade stems, as with most emotional appeals, from its power to direct our thoughts and to channel our energies. When we feel guilty we typically pay little attention to the cogency of an argument or to the merits of a suggested course of action. Instead, our thoughts and actions are directed to removing the feeling of guilt, to somehow making things right or doing the right thing. We fall into the rationalisation trap.

Nevertheless, many effects of guilt are, of course, not positive; many guilty feelings are undeserved. Guilt can be induced by reminding the target of past sins that have long since been atoned for, by making small transgressions loom large, or by making it appear that the target is responsible for a crime that he or she did not commit. Once we are filled with guilt, our thoughts and behaviour are directed toward ridding ourselves of this feeling. The end result is, at best, the manipulation of our behaviour and, perhaps at worst, long-term damage to our self-esteem.


How to Diagnose persuasive communication:

Here are some questions that can be asked to help diagnose a persuasive communication.

The questions can be used when the intention to persuade is clearly recognised and, perhaps most importantly, when it is not but decisions must still be made.

What does the source of information have to gain?

Why are these choices being presented to me in this manner?

Are there other options and other ways of presenting those options?

What would happen if I chose something other than the recommended option? What are the arguments for the other side?


Propaganda vs Education:

The American Heritage Dictionary of the English Language defines propaganda as “the systematic propagation of a given doctrine” and education as “the act of imparting knowledge or skill.” Again, we could all agree that breakfast cereal and aspirin ads are propaganda designed to promote the sale of certain brands. But what about American television programs, which still tend to depict women and minorities in stereotyped roles? Or, more subtly, what about the vast majority of high school textbooks in American history that virtually ignore the contributions of blacks and other minorities? Is this merely imparting knowledge?

The issue, though, isn’t as clear-cut as it may appear. Remember the examples used in your elementary school arithmetic text? Most of the examples dealt with buying, selling, renting, working for wages, and computing interest. As Phillip Zimbardo and his colleagues have pointed out, these examples do more than simply reflect the capitalistic system in which the education is occurring: They systematically endorse the system, legitimise it, and, by implication, suggest it is the natural and normal way.


Propaganda and Value Misalignment:

In many ways, it is dangerous to apply the labels “education” and “propaganda” to a communication merely on the basis of whether it agrees or disagrees with one’s values.

The point is that the bias of a communication is often in “the eye of the beholder.” What is labeled as propaganda and what is labeled as education depend on one’s own propaganda purposes. Protagoras’ book was considered heretical in his day. Today, a similar volume may be called a “textbook.”

Consider a scenario in which we learn about a disagreeable fact, for example, that the policies of the president we elected are in many ways responsible for increased racism and poverty. To reduce our dissonance, we label the disconcerting information “propaganda” and deny the validity of the claim. Thus discredited, the fact no longer needs to be considered in our decision making. In such cases, we miss the opportunity to educate ourselves about important issues of the day and to discover perhaps more successful courses of action.


Selection is Propaganda:

Selectivity of news is the beginning of propaganda. As Walter Lippmann once put it:

Without some form of censorship, propaganda in the strict sense of the word is impossible. In order to conduct a propaganda there must be some barrier between the public and the event. Access to the real environment must be limited, before anyone can create a pseudo-environment that he thinks wise or desirable. For while people who have direct access can misconceive what they see, no one else can decide how they shall misconceive it, unless he can decide where they shall look, and at what.

Accordingly, when those in charge of news programming make decisions about which events to cover and which fraction of the miles of daily videotape to present to the public, they make their decisions, at least in part, on the basis of the entertainment value of their material. Film footage of a flooded metropolis has much more entertainment value than footage devoted to a dam built to prevent such a flooding: It is simply not very exciting to see a dam holding back a flood. And yet, construction of the dam may be more important news.

Just as such action events as football games are more entertaining on television than such quiet events as chess matches, so it is more likely that riots, bombings, earthquakes, massacres, and other violent acts will get more air time than stories about people helping one another or people working to prevent violence.

The result is sound-bite news-a montage of brief visual images that play to the crowds. Each event and every idea must be part of a dramatic story amply illustrated with visual displays. Stories that are easily dramatised and visualised, such as a child trapped in an abandoned well, are readily covered. More complex issues, such as the economy or regulatory policy, receive little attention unless they can be made concrete and visual.


How to Become a Cult Leader:

Indeed, in case any of you would like to be a cult leader, we can list seven mundane but nonetheless “proven-effective” tactics for creating and maintaining a cult.

1. Create your own social reality.

The first step in creating a cult is to construct your own social reality by eliminating all sources of information other than that provided by the cult. Cult headquarters should be isolated from the rest of the world: a farm in Oregon, a secluded house on the outskirts of town, a jungle in Guyana. Members’ mail should be censored. Family should be prevented from visiting members. Strict boundaries between “believers” and the “unredeemed” must be maintained. Such censorship can be physical, that is, forcibly excluding outsiders and physically restraining wayward members. However, it is much more practical to teach members self-censorship by labelling everything that is not “of the cult” as “of the devil.”

The second step in constructing a social reality is to provide a cult’s-eye view of the world. This picture of the world is then used by members to interpret all events and happenings. For example, Jim Jones taught that there is a constant threat of nuclear war and that the world is full of racism. To be prepared to live in this evil world, one must be ready to die.

One useful technique for constructing social reality is to create your own language and jargon. For example, Divine Light Mission members call their services “satsang” and “darshan.” The Unification Church refers to lying to the unredeemed as “heavenly deception.” Scientology teaches that the “thetan” (soul) is impeded in its many lives by “engrams” (aberrations) that result from wrongdoing. A good vocabulary is useful for putting the “right” spin on things. By teaching a battery of clichés such as “bourgeois mentality,” “Krishna consciousness,” “of the world,” and “in the Lord” any event is quickly defined as good or evil and critical thinking is abruptly terminated.

When it comes to teaching your social reality, here is one additional point to keep in mind: Repeat your message over and over and over again. Repetition makes the heart grow fonder, and fiction, if heard frequently enough, can come to sound like fact.

2. Create a granfalloon.

The granfalloon technique requires the creation of an in-group of followers and an out-group of the unredeemed. The technique allows you to control members by constantly reminding them: “If you want to be a chosen one, then you must act like a chosen one. If you are not chosen, then you are wicked and unredeemed. To be saved, you must act like you are supposed to act.”

The essential ingredient in establishing an in-group of believers is the creation of a social identity: an image of who “we” are. Joining a cult represents a break from the “other” world and the acceptance of this new identity.

The reverse side of the granfalloon tactic is the creation of an outgroup to hate. The Children of God teach members to hate their parents. One Mo letter states that parents are evil; they are not your true family; we are your family now.

The creation of an evil out-group serves the dual purpose of making members feel good about belonging to the group (“I’m glad I’m not like them”) and increasing their fears about leaving the group (“I don’t want to be like them”).

If granfalloon techniques are correctly applied, then you should be successful in creating fear of the “outside” world and the belief that the cult is the only solution to a happy life. Life is thus seen as impossible outside the cult, the only solution to life’s problems.

3. Create commitment through dissonance reduction.

Cults can ensure members’ obedience by establishing a spiral of escalating commitment; the cult member at first agrees to simple requests that become increasingly demanding. Jim Jones used such a technique, extracting great trust from his followers one step at a time.

After making an initial commitment, one does not feel comfortable reneging on the deal. To justify the sensibility of the initial commitment, the member is often willing to do more and then still more, making increasingly demanding commitments. In this way, the resolution of dissonance and the maintenance of one’s self-image as honouring commitments form a powerful rationalisation trap.

4. Establish the leader’s credibility and attractiveness.

Most cults have leader myths: stories and legends passed from member to member concerning the life and times of the cult leader. Unification Church biographers compare Moon’s birth in Pyung-buk, North Korea, with Jesus’ birth in Bethlehem: both were at night, both were in small unknown villages, and both marked the dawn of a cosmic transition.

What is the purpose of such myths? It is hard to disobey a person believed to be “the son of God” or, at least, blessed by a divine purpose. Anybody in their right mind should seek to identify with and be like a holy person.

5. Send members out to proselytise the unredeemed.

Witnessing to the unconverted has the obvious advantage of bringing in new members. Perhaps just as important, proselytising can ensure that members are constantly engaged in self-sell, or self-generated persuasion.

The act of witnessing requires the member to state anew to many different people the positive advantages of being in a cult. In arguing to convince others, members convince themselves. Evangelical activity also strengthens resolve. Each witness is likely to elicit many negative responses or an attack on the cult. In defending their beliefs, cult members learn to refute a wide range of attacks, thus inoculating themselves against counterarguments and thereby maintaining belief in the cult.

6. Distract members from thinking “undesirable” thoughts.

Most cult doctrines are hard to take seriously, much less accept. The cult member, especially a new recruit, is likely to question and counterargue such basic points as the value of giving all one owns, especially a new sports car, to the cult and the merits of working sixteen-hour shifts and turning over all proceeds to the cult leadership. The old advertising saw “If you don’t have anything to say, sing it” probably never applied more. How, then, does a cult leader distract a member from carefully scrutinising and questioning the cult’s doctrine?

There are other ways to disrupt counterarguing in a cult. Chanting and singing prevent thinking about anything else but the chant and the song. Meditations such as those performed by the Divine Light Mission, in which the premie spends hours trying to visualise light, hear music, taste nectar, and speak the primordial vibration of existence, prevent the meditator from focusing on other, more worldly concerns.

Nonstop activities such as proselytising, working, and cooking and cleaning for other members further limit the opportunity for careful thought and personal reflection.

7. Fixate members’ vision on a phantom.

The successful cult leader is always dangling a notion of the promised land and a vision of a better world before the faithful. Unification Church members follow and attend to the teachings of the “Lord of the Second Advent” so that they can be resurrected to the perfection state during the third and final testament stage.

Phantoms can establish hope, a powerful motivator of human behaviour, by providing a sense of purpose and mission.