Distrust of science is a massive problem. In our current environment, it's directly leading to people's deaths. Much of the misinformation we face is intentional and organized, and even worse, research has found that lies tend to spread faster online and are often stickier than the truth.

So psychologist Aviva Philipp-Muller, now at Simon Fraser University, and colleagues dug into the scientific literature on persuasion and communication to outline an up-to-date, cohesive overview of how to tackle this wicked problem.

One of the biggest myths about communicating science is that merely presenting people with information will lead them to act on it logically. This is known as the information deficit model (and, incidentally, it's the mode of communication we're using here), but between the global pandemic and the climate crisis we now have countless examples of how this often doesn't work.

"Vaccinations used to be a standard thing that everyone accepted," says Ohio State psychologist Richard Petty. "But there have been a few developments in recent years that have made it easier to persuade people against the scientific consensus on vaccinations and other issues."

While that may be hard for many of us to swallow, people do have plenty of legitimate reasons for their distrust.

For starters, industries are degrading trust in science by hijacking scientific credentials, using "sciency" sounding claims to bolster their clout for profit; pharmaceutical companies have most certainly given us plenty of reasons not to trust them. What's more, science doesn't always get things right, and large factions of the media are stoking sentiments against "elitist" experts and bolstering anti-science views.

All this doubt, conflict, and information overload is eroding people's trust in scientists, and those of us often responsible for conveying scientific information to the public, like the media and government officials, are faring even worse on the trust scales.

This distrust of the source of information is one of the four main barriers to accepting science Philipp-Muller and colleagues identify in their review.

The other main barriers the team highlighted arise when information challenges a person's core beliefs, challenges the group they identify with, or doesn't match their preferred style of learning.

"What all four of these bases have in common is they reveal what happens when scientific information conflicts with what people already think or their style of thought," explains Petty.

1. Distrust in the information source

As mentioned above, lack of trust in the information source comes up time and time again as one of the key reasons people don't accept scientific information.

Legitimate and robust scientific debate can also confuse people who are not familiar with the scientific process, further damaging trust when it spills into the public domain.

To combat these trust issues the researchers suggest highlighting the communal nature of science and emphasizing the wider, prosocial goals of research. Honestly acknowledging other people's positions and any drawbacks in your own, rather than brushing them away, can also go a long way to better establishing trust, the team explains.

"Pro-science messages can acknowledge that there are valid concerns on the other side, but explain why the scientific position is preferable," says Philipp-Muller.

2. Tribal loyalty

The way our thinking is wired as an obligatorily social species makes us very vulnerable to sometimes blindly believing those we identify with as part of our own cultural group – no matter how much education we have had. This phenomenon is called cultural cognition.

"Work on cultural cognition has highlighted how people contort scientific findings to fit with values that matter to their cultural identities," write Philipp-Muller and colleagues.

Political polarization and social media have only enhanced this. For example, conservatives are more likely to believe scientists who appear on Fox News, and liberals are more likely to trust those on CNN.

"Social media platforms like Facebook provide customized news feeds, which means conservatives and liberals can get highly varied information," explains Philipp-Muller.

To combat this we need to find common ground, create information that's framed for specific target audiences, and collaborate with communities holding anti-science views, including people traditionally marginalized by science.

3. Information goes against personal beliefs

The internal conflict created by information that challenges our social or personal beliefs, such as morals and religion, can trigger cognitive dissonance, leading to logical fallacies and biased reasoning.

"Scientific information can be difficult to swallow, and many individuals would sooner reject the evidence than accept information that suggests they might have been wrong," the team wrote in their paper. "This inclination is wholly understandable, and scientists should be poised to empathize."

So a key strategy to counter this is showing an understanding of the other person's viewpoint.

"People get their defenses up if they think they are being attacked or that you're so different from them that you can't be credible," says Petty. "Find some places where you agree and work from there."

Counterintuitively, increasing someone's general scientific literacy can actually backfire, because it provides the skill to better bolster their pre-existing beliefs. Increasing scientific reasoning and media literacy skills, prebunking, or inoculating people against misinformation are advised instead, as is framing information in line with what matters to your audience and using relatable personal experiences.

4. Information is not being presented in the right learning style

This problem is the most straightforward of the four bases – a simple mismatch in how information is being presented and the style best suited to the receiver. This includes things like preferring abstract compared to concrete information, or being promotion or prevention focused.

Here, Philipp-Muller and team suggest borrowing some of the same tactics that anti-science forces have been using. For example, like the technology and advertising industries, researchers could use metadata to better target messaging based on profiles built from people's online habits.

While the current level of public acceptance of research can be disappointing, the good news is that although trust in scientists has fallen, it is still relatively high compared to other information authorities.

As much as we pride ourselves on being logical beings, in reality, we humans are animals with messy minds that are just as governed by our social alliances, emotions, and instincts as our logic. Those of us involved with science, whether as supporters or practitioners, must understand and account for this.

The review was published in PNAS.