Tricks of the trade

Some concepts discussed in Age of Propaganda:

Magic bullets

The notion that propagandists can influence the masses in mysterious and inscrutable ways–think of subliminal mind-control tapes, or the rogue movie frame inserted to send moviegoers to the concession stand like zombies to spend money on overpriced snacks. Propaganda may be sophisticated, but it’s based on research and observation, not magic. A politician could, for instance, mention ‘al Qaida’ and ‘Iraq,’ or ‘bin Laden’ and ‘Saddam Hussein,’ close together to create an association in the minds of the target audience without ever asserting it (preserving what is often called in politics ‘plausible deniability’). But that’s not so mysterious once you start recognizing it.

Law of cognitive response

The above doesn’t mean propagandists don’t go in for lots of indirect persuasion–that’s their specialty. For instance, an individual commercial may not cause an individual to go out and buy a product (again, in zombie-like fashion). But the cumulative effect of commercials may be substantial and largely unconscious–they may in a sense ‘rule out’ alternative views of the world that don’t see consumption of some sort as the answer to most problems. There is plenty of evidence of more direct effects, too: a TV advertising campaign can increase sales (but it costs money–it’s an investment, so a business has to decide whether the increased sales justify the cost). Political ads, especially negative ones, have been found to be effective, at least among media consumers who seek little other information about candidates. And certain techniques work better than others, fear appeals being among the most effective. What does this mean? For one thing, watching TV news in 2003 would have made you more likely to support the Iraq invasion, despite a welter of evidence that the case for invasion was based on flimsy, falsified, and selective intelligence. This was no accident.

Six sales principles

Among thousands, but these have been shown through research to work: 1) ads containing the words new, quick, easy, improved, now, suddenly, amazing, and introducing sell; 2) eye-level placement of goods in supermarkets is preferred (see where the sugary cereals are, and think about the eyeball height of their biggest fans …); 3) end-of-aisle placement is effective–it just looks like a sale! 4) ads using animals (Geico, AFLAC, the Coke polar bears …), babies (E*Trade, Michelin tires), and sex (too numerous to mention …) sell; 5) bundle pricing works (at one point in the La Grande Safeway, Yoplait yogurt was selling at 40 for $20, meaning $0.50 each; Costco is built on the bulk principle); 6) asking people ‘how they’re feeling’ on sales calls increases sales (even if we know deep down the caller is on commission and reading a script …). So face it … we’re suckers for a clever–or at least well-researched and tested–sales pitch.

Granfalloon technique

The term comes from Kurt Vonnegut’s novel Cat’s Cradle. A granfalloon is a group that claims some shared sense of identity or purpose, but a meaningless one; the technique exploits that sense of belonging. Hey, you’re a Capricorn! And from Texas! And a Phi Beta Kappa! We’re practically soul mates (so would you like to buy a really high-end vacuum cleaner?)! This technique is used more often than you might think. I’ve had it used on me personally a few times, by people quite skilled in the art of persuasion (the stories are too long for a lecture page, but they had me going for a while–though not long enough to make me feel trapped into accepting their request for money).

Rationalization trap

One of the most important concepts from the book. The key, as the authors put it, is to make the target of a persuasion attempt feel guilty about something. They use the example of giving to a charity. Take two groups, each asked for a charitable contribution. The first is simply asked. For the second, the request adds the phrase ‘every penny counts.’ Seems unimportant. But the second group gave more often, and gave as much on average as the first group. The researchers speculated that the added phrase made people feel bad–guilty–about refusing, since saying no would mean they weren’t willing to give even a penny. And once they did contribute, they didn’t want to seem cheap, so they gave in generous amounts.

There are other ways to think about this. For instance, a telemarketer has you on the phone, trying to get you to donate (say, to the Fraternal Order of Police). The longer they can keep you on the line, the more likely you are to feel sucked into a spiral of commitment to give something–you’ve kept this nice person on the phone for so long, for such a good cause! Fairly early on, the Iraq War was not going well in terms of meeting its initial objectives. But the White House continued to use the logic that we had been at it for so long that we had to honor the fallen and meet our commitments (though what those commitments were became rather obscure after a few years).

The idea here has to do with the psychological concept of cognitive dissonance–once you’ve been convinced of something and committed to it, admitting you were wrong feels bad, so you’re more willing to find ways to justify or rationalize your continued commitment than to change course. That’s the rationalization trap. It’s effective and pervasive. I mean, you’ve taken the time (unless you cheated and skipped to the last paragraph, and you know who you are …) to make it all the way to the end of this lecture page–and it’s one of the longest (I promise).

You don’t really want to believe it’s all just a load of hooey, do you??

To review some concepts from class (drawn from Brian Patrick’s ‘Ten Commandments of Propaganda’):

  • Control the flow of information
    • Source filtering
    • Exclusion
  • Reflect the values and beliefs of the audience
    • Using ‘metrics’
    • ‘Personalization’
  • Disambiguation
  • Distance propaganda from its source
  • Group ‘horizontal’ pressure

Likability (in this case, of presidents)


Anthony Pratkanis and Elliot Aronson. 2007. Age of Propaganda. New York: Holt.

Brian Anse Patrick. 2013. The Ten Commandments of Propaganda. London: Arktos.