#50 How do we challenge misinformation in sport & exercise? Hint: education doesn't work
Whether it’s keto/low-carb diets, complementary and alternative medicines like acupuncture or chiropractic, CrossFit, HIIT, politics, religion, or vaccine hesitancy: ideological bias is a common feature of life in the 21st century.
Breaking myths and challenging pseudoscience and misinformation in exercise and health (and contemporary culture more broadly) is a complex process, but a crucial one if we are to increase the uptake of scientifically derived guidelines. Some would suggest that education should be the priority; i.e., that we emphasize facts and figures in order to inform educated decisions among the public. In some limited circumstances, this might actually work.
Research from Dyer and Hall showed that a critical thinking class, specifically addressing pseudoscience, produced a large and significant reduction in pseudoscientific beliefs, whereas classes in both Research Methods and an unrelated subject didn’t. This is a win, unquestionably. But education alone may be an insufficient guarantor of epistemically warranted beliefs, especially in adults who aren’t paying to attend an educational institution, and whose beliefs may be altogether less malleable.
Consider that when someone subscribes to an ideology, it nearly always manifests as a profound confirmation bias: a predisposition to favourably interpret data that confirms their beliefs, while simultaneously looking unfavourably on data that contradicts them. In this way, they’re able to insulate themselves from corrective arguments.
According to Nyhan and Reifler: “Once a belief is formed, people generate explanations that fit and further reinforce this belief and tend to vigorously reject counter-arguments that make them uncomfortable, regardless of their validity”. This might explain why error correction, fact-checking, and persuasive arguments sometimes evoke stronger retention of misinformed beliefs, especially in those who hold them more staunchly.
So, to challenge incorrect/epistemically unwarranted beliefs, education alone isn’t enough, and may even do more harm than good. We must use techniques that are more subtle and more sophisticated. I’ll outline just a few of them.
Cognitive forcing is a technique that encourages individuals to recognize and mitigate their own biases. This integrated educational approach can be considered a form of “metacognition”, because individuals study the decision-making process itself, engaging critical thinking as a way of learning how to think rather than what to think. This type of choice architecture increases the likelihood of arriving at a factually correct conclusion. So, we should be ingraining basic critical faculties in youngsters at home, and integrating them into school and college curricula, so that graduates are better equipped to navigate the world, regardless of their field of study or chosen career path.
Another potentially powerful strategy for combating ideological bias in those with less malleable beliefs is defeasibility. Defeasible reasoning is a mode of thinking in which the individual is encouraged to search for evidence that might undermine their own position. In other words, the practitioner doesn’t challenge the client’s preconceived notions directly, but rather urges them to do so themselves (or, at the very least, to consider what a valid challenge to that notion would look like).
For instance, a proponent of chiropractic would be asked to consider the circumstances under which they might revise their ideas about the practice; e.g., convincing data, testimony from a friend or colleague, testimony from a chiropractor, a negative personal experience, etc. For the skeptical practitioner, this approach may elucidate the extent to which that belief is revisable (if at all).
An approach that may be complementary to defeasibility is self-nudging, which encourages the subject to conduct an internal inquiry into how strongly their conviction is felt. In practice, this could be done by rating the strength of the conviction on a scale, and considering, from a qualitative perspective, why those convictions are felt. Both defeasibility and self-nudging may be preferred in scenarios where an individual harbours longstanding or unwavering beliefs. Using these strategies, one may be able to carve out room for doubt, enabling discussion of alternative hypotheses.
Finally, others suggest pre-emptive strategies are crucial in challenging the future spread of misinformation. So-called inoculation theory supports the notion that individuals can be protected against persuasive attacks on their attitudes, in a manner similar to immunization against a virus. For example, exercisers could be inoculated against falling for a disproven sports supplement (e.g., a “fat burner”) by pre-emptively exposing them to weakened versions of the arguments likely to be used in its marketing rhetoric. As a result, the individual would develop counterarguments that improve their ability to recognize similar types of misinformation in the future. A recent meta-analysis showed that inoculation was more effective at reducing individual susceptibility to persuasive arguments than corrective facts were. Even certain video games, designed to pre-emptively expose individuals to the strategies used in “fake news”, have helped confer cognitive immunity against the spread of misinformation.
As the saying often attributed to Jonathan Swift goes: you cannot reason someone out of a position they were not first reasoned into. That’s why simple education with facts and figures may be ineffective in the majority of cases, particularly where the individual holds a belief that’s not evidence-based. We must use more sophisticated and creative techniques, in addition to efforts at education, if we are to maximize the uptake of scientifically derived guidelines.