The Age of Anti-Science

Reading Time: 9 minutes

Although scientific sources present data and validated evidence, an increasing number of people do not believe in science. What causes this attitude, and how does it affect society?

Over time, the creation, scope, and impact of myths on the scientific community have been shaped by political, religious, social, psychological, and economic factors. Some myths persist long after solid scientific evidence has offered alternative explanations. Society seems to have descended into a Dark Age in which scientists are portrayed as enemies and conspirators serving global interests.

There is no better example of this than the COVID-19 pandemic, which began in 2020 and claimed the lives of millions of people. During this time, many people refused to believe in the virus or in the scientific community, which demonstrated the disease's severity and developed vaccines that saved millions of lives.

Vaccines do not appear miraculously; they are developed by medical scientists who work day and night to stop outbreaks of threatening viruses and other pathogens. Candidate vaccines must pass through rigorous monitoring systems and receive the approval of governmental agencies, such as the Centers for Disease Control and Prevention (CDC) and the Food and Drug Administration (FDA) in the United States, which have been operating for decades.

According to research by the Peterson Center on Healthcare and the Kaiser Family Foundation (KFF), from June 2021 to March 2022, about 234,000 COVID-19 deaths in the United States could have been prevented if those who contracted the virus had been vaccinated. In Texas alone, an estimated 40,000 people died because they refused to get vaccinated.

Speaking about his experience with vaccines during the pandemic, Dr. Robert Froehlke commented to the New York Times, “In the past, we could convince more people thanks to our experience and training.” Now, he cites the Centers for Disease Control and Prevention or other official sources so he does not get accused of being an accomplice to some grand conspiracy. “This lack of trust is very concerning,” says Dr. Froehlke.

In truth, the lack of trust is nothing new. In 1998, the discredited physician Andrew Wakefield published a now-retracted paper in The Lancet reporting misleading research that claimed a link between the measles, mumps, and rubella (MMR) vaccine and autism. The research was later proven fraudulent, and Wakefield was struck off the medical register. Still, thousands continue to believe his study despite it being debunked.

Other examples outside the medical field include those who believe the Earth is flat, despite the photographs and videos proving otherwise, and those who deny climate change, one of the most critical issues of our time. Undoubtedly, this era of anti-science has brought misinformation and deadly consequences.

This lack of trust in data from reliable sources goes hand in hand with the post-truth era and the rise of fake news, intensified by the ease of sharing and consuming information on social media. How people produce, share, and consume news plays a critical role in spreading misleading data, and those who share and publish it often lack the knowledge or skills to evaluate content.

The scientist and pediatric specialist Peter Jay Hotez published a book entitled The Deadly Rise of Anti-science: A Scientist’s Warning. In it, Hotez explains that while anti-scientific movements are not new, they have become more organized, better funded, and adopted by political ideologies in recent years.

These forces have vigorously attacked the scientific community, to the extent that Peter Hotez himself has received threats and harassment at his home; this is not an isolated case. According to Hotez, two out of five scientists who spoke publicly about COVID-19 and vaccines were similarly attacked. A survey of 300 scientists by Nature corroborated this data, reporting that dozens of researchers shared stories of death threats or threats of physical or sexual violence for speaking out about the coronavirus. Anti-vaccine groups and politicians in many countries have led these attacks.

Why are people against science?

The Wellcome charitable foundation published research on the state of science and society in November 2021, covering 119,000 people from 113 countries. The study found that trust in scientists correlates so closely with trust in national governments that it becomes difficult to disentangle where the credibility of one ends and the other begins. So, does the anti-science movement have its roots in politics? Politics does not create anti-scientific attitudes, but it triggers, amplifies, and strengthens them.

So why is it that when different people are provided with the same scientific evidence, some accept it while others reject it? What are the psychological principles that explain people's anti-scientific views? An investigation by Aviva Philipp-Muller, Spike W. S. Lee, and Richard E. Petty, published in the scientific journal Proceedings of the National Academy of Sciences (PNAS), identifies four primary factors that drive these attitudes. These observations are based on decades of research on persuasion, social influence and identity, information processing, and attitudes. Mistrust occurs:

  1. When a scientific message comes from sources perceived as lacking credibility.
  2. When recipients identify with social groups that hold anti-scientific attitudes.
  3. When the scientific message contradicts what the recipients consider accurate, favorable, valuable, or moral.
  4. When there is a mismatch between the delivery of the scientific message and the epistemic style of the receiver.

Each of these points involves specific antecedents and provokes different psychological reactions. Still, all four reveal how scientific information can clash with the content or style of thinking already ingrained in people. Such conflicts are difficult to accept and easy to reject, complicating the effective communication of science. However, this difficulty becomes surmountable once its underlying causes are clarified.

The source of the scientific message

Most people rely on scientists, journalists, health officials, politicians, or opinion leaders to build their understanding of the world. Traditionally, the more credible the source, the more likely people were to accept its information and be persuaded. This brings us to the anecdote of Dr. Robert Froehlke, who said that being a doctor was enough for his patients to believe him, but now he lacks credibility, especially on issues such as vaccines. Why?

According to the study published in PNAS, credibility rests on three pillars: "expertise (i.e., possessing specialized skills and knowledge), trustworthiness (i.e., being honest), and objectivity (i.e., having unbiased perspectives on reality)." If scientists are perceived as lacking these qualities, they will be seen as inexpert, dishonest, or biased, and unable to change public opinion.

While experts were previously seen as experienced and competent, the integrity of their findings is now questioned in fields from the social sciences to the medical sciences. According to the authors, this is because the very mission of science can undermine scientists' credibility: conducting legitimate debates and defending varied, sometimes contradictory perspectives, theories, hypotheses, findings, and recommendations. These apparent contradictions make the scientific community look inconsistent.

Another issue is that research is often funded by pharmaceutical companies, elite institutions, or government organizations, which undermines its perceived reliability because many people suspect the funders' motives. In addition, people perceive scientists as cold, insensitive, and atheistic; many conservative and fundamentalist religious people do not trust scientific conclusions because they go against their beliefs.

Recipient of the scientific message

Aviva Philipp-Muller, Spike W. S. Lee, and Richard E. Petty mention substantial research that discusses social identity theory and how the social groups to which people belong influence their responses to the information they receive. Social identities can play a role in anti-scientific attitudes and behaviors, as people tend to reject scientific information incompatible with their identities.

It is common for individuals to distort scientific findings to suit their values and to dismiss those that threaten their cultural identity. For example, if people like video games, they are more likely to accept research highlighting their benefits than studies reporting their harm to health. Beyond this, some people identify with groups that ignore and wholly shut down scientific thoughts, recommendations, and evidence in general, such as the famous "anti-vaxxers." These groups are often tied to personally meaningful political or religious identities.

Philipp-Muller, Lee, and Petty caution that "an important nuance and caveat, however, is that while scientists may characterize some social groups as anti-science, individuals who identify with these groups may not think that they explicitly or consciously repudiate science." The authors note that these individuals may believe their own views are scientifically sounder than those of the experts. In several cases they rely on pseudoscience, effectively contradicting the scientific method for generating and accepting knowledge.

The danger arises when these individuals harbor hostile feelings toward people with differing opinions, and that hostility carries them into violence. Scientists experienced this during the COVID-19 pandemic at the hands of people who rejected scientific messages.

Cognitive Dissonance

Sometimes, when scientific information contradicts people's beliefs, they may reject even the strongest scientific evidence rather than harbor contradictory cognitions. This phenomenon is called cognitive dissonance.

Cognitive dissonance arises when an individual is exposed to information that conflicts with their beliefs, attitudes, or behaviors, causing discomfort. It is easier to reject a piece of scientific information than to revisit an existing belief system accumulated and integrated into a worldview over the years, often reinforced by social influences. Aviva Philipp-Muller, Spike W. S. Lee, and Richard E. Petty confirm this, noting that rejecting new scientific information is often the path of least resistance compared with revising existing moralized attitudes.

Sometimes these beliefs come from science itself, in the form of previously accepted scientific information now considered obsolete or erroneous, as with Andrew Wakefield's study that "proved" vaccines cause autism. That information seemed to come from a reliable source, a doctor and scientist, but it was shown to be untrue. Many still believe it to be valid research because it does not contradict what they have believed for years.

On the other hand, it is true that in recent years, fake news has increased, and it has spread faster due to social networks. Fake news spreads faster because it evokes stronger emotional reactions and appears more novel than the real thing. Once spread, the misinformation is difficult to correct, mainly when it infiltrates a group that sees it as trustworthy because like-minded people share it.

Mismatch between the scientific message and the recipient's epistemic style

Sometimes, scientific information does not conflict with the individual's beliefs but is still rejected because of how it is delivered; it may be at odds with the person's way of thinking about the subject or their general approach to processing information. This is called epistemic style.

According to the PNAS paper, there are four dimensions of epistemic style: the level of abstraction, the regulatory focus, the need for closure, and the need for cognition. The first refers to how people may reject scientific information presented at a level of abstraction different from the one at which they think. For example, if people think about climate change abstractly (global environmental degradation), concrete information about carbon savings may be discounted.

The second is regulatory focus, where someone concentrates on losses rather than gains. For example, describing a vaccine as 90% effective falls on deaf ears among loss-focused people, who interpret it as 10% ineffective. Another epistemic style involves the need for closure: such individuals do not tolerate uncertainty and reject information that is not definitive or conclusive. Finally, a person may have a low need for cognition: they do not enjoy effortful thinking, so they are less receptive to complex information, no matter how high its quality.

What can we do about anti-scientific attitudes?

The following strategies can be implemented to combat anti-scientific attitudes:

  1. Increase the perception of science as a credible source of information.
  2. Diminish identification with anti-science groups.
  3. Increase the acceptance of scientific information.
  4. Adapt the message to the recipient’s thinking style.

The first point addresses how people no longer see scientists as credible sources, viewing them instead as inexpert, unreliable, and biased. To improve perceptions of the quality of their work, scientists must strengthen the validity of their research and make their findings reproducible. They must also contribute to public debate, explaining their disagreements and why debating them is inherent to, and healthy for, the scientific process. Finally, they should reach out to journalists, health officials, politicians, and key opinion leaders and join forces, as it is easier to reach the public through sources people already trust.

The scientific community must also strive to use language that conveys its message clearly and precisely and is accessible to a general audience. Here again, it is worth emphasizing the value of outlets such as the IFE Observatory, which can produce lay summaries for those who are not experts but are interested in obtaining information in terms they understand.

On the second point, science communicators must appeal to their audience's social identities. Engaging identities shared with the audience helps reduce hostility and increase receptivity. Communicators can also build groups around shared goals, which increases the chances that their message will be heard and accepted by those who were initially opposed to scientific information.

The scientific community must also strive to gain the trust of groups historically exploited or excluded by the scientific community, those who have been used as objects of study. Researchers can work collaboratively with members of these communities, develop cultural competencies, and engage these oppressed and racialized communities.

Concerning the third point, the population must be trained in scientific reasoning. Teaching people how to assess the quality of scientific information can help them accept high-quality scientific evidence, even when it contradicts their beliefs. In addition, warning people about false information and refuting it can help them better resist believing erroneous data.

Science communicators must present solid, well-reasoned, substantiated arguments to alter entrenched attitudes. Where possible, framing scientific information in terms of the recipient's moral values can increase receptivity to the message. In general, it is vital to use varied strategies to improve the acceptance of scientific information, especially when it contradicts people's beliefs and attitudes. Finally, scientists must identify the recipient's thinking style and tailor the message to that way of thinking.

The reality is that science is going through a crisis: it is no longer accepted or perceived as a reliable source the way it once was. This has many consequences, not only socially but also for public health and well-being. Another important consideration is the role of teachers and education in this age of anti-science. How can educators teach something their students do not believe? This problem limits the knowledge they can impart. Teaching about climate change, for example, is becoming increasingly urgent, but what can teachers do if families and students hold strong anti-science attitudes?

Translation by Daniel Wetta

Paulette Delgado

This article from Observatory of the Institute for the Future of Education may be shared under the terms of the license CC BY-NC-SA 4.0