Some Tools For Optimizing Our Media Use

Benevolent and malevolent media producers possess the power to influence society in positive and negative ways. They can do this through agenda setting, framing, priming, spreading memes, altering perceptions of groups and individuals, outright propaganda, and other methods.

I think more attention should be paid to the pathways from content to effects, so that we can optimize our cultural landscape.

Lest this post soon turn into Applause Light Vegas, I’ll now get into some methods I think can be used to sway mass opinion in a direction amenable to making the world better. Many of these methods deal with familiar biases, heuristics, and psychological effects.

Media Use Facilitating Positive Social Change

First, the mass media possess the power to alter estimates of the likelihood and frequency of specific occurrences. Think back to some of the classic examples of the availability heuristic. When asked to compare the number of homicides in the USA to the number of suicides, people answer that there are far more homicides, even though the reverse is true. The mass media report on homicides far more often than on suicides, so people have more available instances of homicide in memory, and these come to mind more easily. This influences their beliefs about the real world, which can then be politicized into different stances on gun control and education. The priorities of a culture with a homicide problem are not the priorities of a culture with a suicide problem.

This effect is consistent with some theoretical models of the mass media’s impact on society. Cultivation theorists understand the media, especially television, as a system of coherent memes and messages reflecting a society’s dominant ideology. If we accept the fundamental claim of cultivation theory, then we should expect exposure to television to correlate positively with status-quo beliefs and attitudes. We might then expect heavy exposure to non-fiction television to foster mean- and scary-world beliefs, given the disproportionate coverage homicides receive. One cultivation theorist found exactly this result among heavy viewers of non-fiction, yet did not find the same effect among heavy viewers of fiction.

Malevolent, benevolent, and clueless media producers could capitalize on the availability heuristic to adjust mass estimates of society’s biggest problems, and by extension, mass assessments of social priorities. This ability of the mass media to affect the perceived importance of subjects by representing or not representing them is sometimes called agenda setting. Want different kinds of people to take existential risks seriously? Get existential risks mentioned in the media outlets those people already read. Use Medium.com, pitch articles to The Guardian, write a letter to the editor of your local newspaper, increase the representation of important issues on Wikipedia, and so on. You don’t even need to convince people that Friendly AI should be a global priority so much as you need to convince them that thinking so doesn’t make you crazy. Exposing people to AI concerns without coming off as a clear member of a disliked outgroup (e.g. conspiracy theorists) can play a big role in legitimizing the issue in the public’s eyes.

Politicians and media outlets can also make use of framing devices to influence audience perspectives on news stories by tweaking irrelevant factors. A newspaper headline claiming, “Public condemnation of democracy should not be allowed” will receive more support than will one that claims, “It is right to forbid public condemnation of democracy.” If you’ve ever heard a politician speak, you’ve probably noticed how they frame everything they say in a way that makes it sound better than it is. Similarly, a headline will have very different connotations if it describes an event as a “strike” or as an “invasion” or as a “bombing.” (And was it committed against “soldiers” or “forces” or “rebels” or “terrorists”?)

Framing isn’t purely a word-selection thing. It can be done with audio-visual media as well. Film fans among you may have heard of the Kuleshov effect, discovered with a famous experiment that used and re-used a single close-up of a man’s face against a series of different images, such as a bowl of soup, a little girl smiling, a funeral. You can watch a short example here. Each time we cut back to the man, his face appears to express a different emotion even though it’s actually an identical shot of his face. Soviet Montage filmmakers capitalized on this effect in their movies to express meanings through the juxtaposition of different shots.

Biases Facilitating Social Stagnation

Media producers have much more to think about than the biases and heuristics that facilitate persuasion. They also have to examine the psychological and cultural factors that entrench ideas in our heads. The mind is the Hotel California of ideas – once one gets in there, it might never see the light of day again. What are some of those forces that keep us from changing our minds?

The first important factor to consider is selective exposure. Before an idea can persuade you, it has to get in front of you. This is harder than it sounds, because people don’t want to be confronted with ideas they disagree with; confirmation bias predisposes them to seek out ideas they already hold. If I’m an atheist surfing YouTube, am I going to click on “Creationist moron DESTROYED with a Hitchslap” or on “How to prove atheism wrong in 8 seconds”? People avoid content that doesn’t seem likely to cater to their beliefs.

If you want to get existential-risk and AGI messages in front of new audiences, you need to make your stances on those issues seem at least somewhat consistent with many other people’s current views. Getting important, undercovered ideas into the public eye will probably mean smuggling them there. A TV station covering only existential risks can easily be ignored by everyone with no interest in existential risks. (LessWrong is sort of an online equivalent to this.) Instead, you may have to smuggle your important ideas into a mixture of more mainstream content.

The primacy effect suggests that the earliest information people receive about an issue is likely to anchor their thinking on that issue, biasing them in favour of that initial viewpoint. This suggests to me that it might be a good idea to find subjects on which people haven’t yet formed their opinions. If you can give people good ideas before they have a chance to form bad ones, they’ll be more partial to your ideas than if you try to convince them that their fully-formed “bad” ideas are inferior to yours. My impression is that, relative to secularism and the dangers of technological progress, anti-speciesism and effective altruism are subjects on which people are still forming their views.

Mass media agenda setting also works in combination with other biases. The third-person effect is people’s tendency to overestimate the magnitude of the media’s influence on others. Do you ever assume that a political attack ad, or a marketing pitch, or porn, or a violent video game probably affects a lot of people – while being very confident that you aren’t one of the people being affected? You, of course, are much too clever, but those other people are surely easy pickings for propagandists. This view is probably closely related to overconfidence and the bias blind spot.

Davison points out that the third-person effect can feed into pluralistic ignorance: misperceptions of public opinion can lead the majority to reinforce behavioural norms that only a minority of the population actually agrees with. If we assume that most of the persuasion tactics we see in the media succeed on other people, then we’ll wind up with skewed ideas of what everyone else believes. In our current era of demassification and social media, we live in so-called “cyber ghettos,” where most of our information comes from the people on our social media feeds and others who already agree with us. This probably leads us to overestimate the popularity and mainstream-ness of our ideas.

Media Use Facilitating Negative Social Change

So, all sorts of biases and heuristics prevent ideas from leaving Hotel California—and on the scale of a culture, this creates memetic stagnation. Now, let’s look at how the mass media can be deliberately used to create negative change. Understanding how this works can help us squash the deliberate spread of misinformation or improve our more benevolent methods of media persuasion.

There is a field called “agnotology” that studies exactly this: ignorance – how and why it is produced and maintained. When examining a field, a good indicator of disinformation at work is a divide between expert opinion and public opinion. My impression is that this is the case for many of the issues that interest LessWrongers.

The strategies of disinformation are well known to those familiar with the debates on climate change and evolution by natural selection. One strategy is to assert the absence of scientific consensus by citing the dissenting opinions of scientists in unrelated fields. Another is to point out past blemishes on science’s track record, often using examples taken from the popular literature, rather than from peer-reviewed academic journals. Finally, deceivers can draw attention to fringe parts of a theory that are indeed controversial rather than focusing on the core tenets of the theory that are widely accepted by experts.

Along the same lines as the availability heuristic, media coverage can alter estimates of the extent of scientific consensus on empirical questions. News programs can intentionally or unintentionally contribute to the appearance of scientific controversy by giving both sides of an argument equal time, creating an illusion of equal credibility. Further, in attempting to make science palatable to mass audiences, the mainstream media may inadvertently oversimplify or misrepresent scientists, thereby spreading misunderstanding. As a result, it might make more sense to write your own articles than to go through a middleman who is knowledgeable about journalism but not about your topic.

Hotel California only truly shuts its doors once you’ve left the front lobby and gone up to your room. New information is briefly “believed” before it is rejected. When reading a novel, we do not so much suspend disbelief as willingly construct disbelief immediately after believing. But it doesn’t feel that way to the Rider on the Elephant. Sometimes we err in figuring out which ideas have gone up to their rooms and which exited Hotel California right after entering the lobby. There is a whole literature on narrative persuasion – how fiction can lead to the absorption of false beliefs. After reading fictional narratives containing statements like “mental illness is contagious” and “brushing your teeth doesn’t actually make your teeth cleaner,” people are more likely to reproduce those errors on later tests. The effect is even stronger after a two-week delay, revealing an absolute sleeper effect.

Even when a retraction immediately follows a statement, it usually fails to eliminate the initial effect. If I tell you, “Woody Allen’s real name is Jacob Allen,” then I have just poisoned your mind in a sense even if I immediately tell you I made the name up. If you were on Who Wants To Be A Millionaire and Jacob was one of the options, it would sound more familiar to you than the alternatives even though I’m making it perfectly clear I have no idea what Woody Allen’s real name is. For all I know, it’s Woody Allen.

One reasonable explanation of this phenomenon is that listeners form mental models of the stories they hear (e.g. Event A leads to Event B leads to Event C). When one of the events is retracted (“Actually, I lied: Event B never happened!”), the listener’s mental model is left with a gaping hole: Event A would not lead to Event C without the prior occurrence of Event B. Filling this coherence gap with an alternative account of events is one demonstrated way to break the continued influence of misinformation. Many other helpful techniques are offered here.

More Ideas for Optimizing Media Use

One good tool for world-changers to have is a list of memes they’d like to spread to a larger audience. Since our uncertainty about the future is high, the selected memes should be very safe messages that are difficult to abuse or to steer society in bad directions if accepted en masse. For instance, a meme like “Technological progress is good” may be generally true, but it could easily lead to untrue beliefs or bad consequences if accepted too dogmatically. In contrast, “Racism is bad” seems almost impossible to misuse.

Examples of “safe” memes I would expect to have net positive consequences:

Racism is bad
Sexism is bad
Speciesism is bad
Homophobia is bad
Xenophobia is bad
Belief without evidence is bad
Recycling is good
Defining one’s terms before an argument is good
It’s important to be willing to change one’s mind
One should learn the basic skills of rationality
A lack of absolute certainty does not equal a lack of objectivity
Moral reasoning can be useful
Etc.

The point of a fluctuating list of good memes is that it forces you to rank ideas and to consider how likely specific memes are to be misinterpreted or misused. It also keeps you from getting off track: if you have a list of memes in mind, you can use it to guide your creative decision-making.

It could also be helpful to focus on specific political issues that are hot at a given time. For example,

Party X should win the election
War X should not happen
Apartheid X should be stopped
Abortion should(n’t) be legal
Gay marriage should(n’t) be legal
Capital punishment should(n’t) be practiced
Gun control laws should be stricter/​left alone
Climate change should(n’t) be taken seriously
The rich should(n’t) be taxed more
Etc.

Some of these issues might be far less important than the media and politicians make them seem, but knocking them down, one by one, could pave the way for more meaningful change. Perhaps more importantly, doing so wins a battle of principles and prevents momentum from building in the opposite direction.

More specific to the issue of an intelligence explosion, the uncanny valley hypothesis suggests that people experience revulsion at the sight of something humanlike but not quite human. This suggests that if one wished to spread general resistance toward the development of AGI, it would be wise to associate AGI with uncanny, almost-human depictions. Conversely, if one wanted to spread general acceptance of AGI, it would be good to avoid such depictions.

Another approach is culture jamming. Culture jamming usually means “subvertising” ads by creating TV commercials and billboards that turn corporate ads on their head. Click here for some basic examples. These campaigns build cynicism against corporations and politicians, fuel dissent, and prime people for more world-changing behaviour.

It’s also important to consider the audience of a given message. The average person may not need a reminder to develop their social skills or learn how to communicate, but the average LessWronger probably does. Similarly, there’s no need to convince rationalists that atheism is acceptable – they already believe it – but it remains, I think, a good meme to spread to the broader public. The outward image activists present to the public should consist mainly of moderate, socially acceptable ideas. These topics are not necessarily more important than the more esoteric ones, but they are more likely to be memetically effective because they are consistent with a wide range of outlooks.

Lastly, an important tool for social change is the “nudge” because it guides people toward better decision-making without removing their freedom to choose. The clearest cases where nudges are effective in shaping culture involve appeals to social proof.

Some examples from Thaler and Sunstein’s book:

Obesity is socially contagious
Federal judges are influenced by the votes of their colleagues
12% of participants choose “subversive activities” as the biggest current issue when asked in private, compared to 48% when asked publicly
Self-reported musical taste is hugely influenced by the self-reported tastes of others
The amount of food people eat correlates with the number of people they eat with
Tax and recycling compliance can be increased by informing people that the compliance level is high
Binge drinking and smoking rates can be reduced by informing the public of unexpectedly low drinking and smoking rates
People can be nudged to reduce their energy use by informing them that their use is above average

Do you have any others to add to this list? Was there anything useful in this post you didn’t already know?