The global media landscape has changed significantly in this century. The increasing digitalisation of the media sphere continues to alter patterns of media consumption. Fewer and fewer people access news through daily or weekend print newspapers, or through cable and broadcast services. There is an ongoing shift from characterising the general public as somewhat passive consumers of media towards seeing them as more active “prosumers”. This evolving scenario of media and communication in the public sphere poses a number of alarming, but also fascinating, challenges for science communication.
An interesting example is provided by the general concerns over social media, which enable, and encourage, users to be active participants in the processes of creating and sharing content, and which have become a vehicle for news coverage. After a period of optimism, anxieties over the role of social media have grown in recent years. Social media has opened up new spaces for the widespread circulation of misinformation and conspiracy theories regarding the contribution of science to public policy. Misinformation and conspiracy theorising have a long history and are nothing new; what is new is the opportunity social media gives ‘voices in the wilderness’ to build communities of conspiracy theorists promoting groundless and discredited claims about global warming and climate change, rubbishing Covid vaccines, and alleging a so-called Great Reset organised by the likes of the World Economic Forum and Bill Gates. Such conspiracy fictions are the fuel of populists at the extreme fringes of politics.
The anti-science fraternity believe that they alone, and not scientists and other experts, have the insights needed to save the world. Many are evangelists committed to saving those who (in their view) have been duped by scientific nonsense emanating from the dark forces of evil.
All technologies co-evolve with society, so it is unreasonable to attribute responsibility for any contemporary malaise, including the spread of conspiracy theories about science, to the latest technologies of communication. When looking for solutions, it is more fruitful to turn to social science to understand the societal changes that are taking place and what can be learned from the past.
Dibble, an archaeologist at Cardiff University, writes that “for the last decade, scholars and experts have dealt with misinformation and pseudoscience either by trying to ignore it, in order not to amplify it, or by debunking it once it has spread far enough. But recent misinformation research highlights the importance of pre-bunking rather than debunking. An audience primed with real facts is armed to understand the issue with pseudoscientific narratives”.
Some classic studies in social psychology point to ways in which science communicators can take on misinformation and conspiracy theories. In the aftermath of WW2 and the emergence of the Cold War, substantial empirical efforts were devoted to the systematic study of persuasion and propaganda, from the style and structure of messages to the characteristics of the receiver. The Yale Communications Programme was established to research “who says what, to whom and with what effect”. It is still part of the core literature on the psychology of communication. One thesis in particular has direct relevance to the discussion of tools for counteracting misinformation – inoculation theory.
The idea of pre-bunking resonates with inoculation theory, dating back to research by Lumsdaine and Janis on the effects of one-sided and two-sided communications. The subject matter was the Soviet Union’s ability to produce atomic bombs. In one experimental condition students received a communication saying that the Soviets would not be able to do so. In the second condition, students received the same supportive arguments but also opposing arguments some of which were refuted and others not. The initial impact of the one- and two-sided communications was the same. Subsequently, all the students were exposed to a communication that the Soviets would produce a large number of bombs. The students in the two-sided group were more resistant to this counter communication than those exposed to the one-sided communication. Lumsdaine and Janis concluded that the greater resistance to persuasion created by the two-sided communication had “given an advanced basis for ignoring or discounting the opposing communication”.
In 1964 McGuire linked the idea to medical immunisation, in which a weakened dose of a virus is injected in order to develop immunity. He argued that attitudes and beliefs can be vulnerable to attack from persuasive counter-arguments, and that protection against such attacks could likewise be achieved through “attitudinal inoculation”: exposure to weakened forms of the attacking message. McGuire also suggested that beliefs that had never been attacked, such as cultural truisms (for example ‘it is good to clean one’s teeth’), would be more vulnerable to attack, in the same way that those with little exposure to viruses are most vulnerable to viral illnesses.
A feature of an attitudinal inoculation treatment is refutational pre-emption. Pfau et al. (1997) argue that this component of a message “provides specific content that receivers can employ to strengthen attitudes against subsequent change”. In health promotion, refutational pre-emption is introduced by raising and refuting ‘unhealthy’ counterarguments. Thus, a conventional inoculation message begins with a forewarning of impending challenges to a held position, then raises and refutes some possible challenges that might be advanced by opponents. For example, an inoculation message designed to discourage young people from vaping might begin with a warning that peer pressure will challenge their rejection of vaping, then present potential counterarguments they might face from their peers (e.g. “vaping isn’t bad for you”), followed by refutations of those counterarguments (e.g. “Actually, vaping has been found to lead to serious respiratory diseases”). To implement such an inoculation strategy, communicators need to know the counterarguments likely to be employed in attack messages, so that they can present and refute weakened versions of them.
In 2020 the Wellcome Trust’s Global Monitor surveyed representative samples of the public across the world. In Europe, trust in scientists and the wider scientific system is solid: for example, 79% of respondents trust scientists and 83% trust science to arrive at accurate findings. It is possible that for many people this trust is a ‘cultural truism’ that has seldom, if ever, been challenged. As such, it is at risk of attack and persuasion from conspiracy propaganda. Furthermore, a not insignificant minority of Europeans, around 8-10%, say they have ‘not much’ trust in science or none ‘at all’. Among them will be some who believe in one or more anti-science and/or conspiracy theories that include the rejection of scientific expertise and advice.
Science communicators should recognise the potential vulnerability of some of the majority who trust science. Rather than seeing science communication as a mechanism to promote trust in science, they should consider it a means of providing the majority of citizens with the arguments that will inoculate them against misinformation.
References
Lumsdaine, A. and Janis, I. (1953). Resistance to counter propaganda produced by one-sided and two-sided propaganda presentations. Public Opinion Quarterly, 17, 311-318.
McGuire, W.J. (1961). The effectiveness of supportive and refutational defences in immunising and restoring beliefs against persuasion. Sociometry, 24, 184-197.
Eagly, A.H. and Chaiken, S. (1993). The psychology of attitudes. Harcourt, Brace & Jovanovich.
Pfau, M. et al. (1997). Nuance in inoculation. Communication Quarterly, 45, 461-481.