Opinion

Is bothsidesism killing us? (And why scientific consensus matters)

Our information ecosystem has become a massive false-balance machine. Fringe positions that have already been studied and shown to be wrong are legitimized, given a huge profile and presented as reasonable and respectable alternatives to the existing body of evidence.

False balance – also known as bothsidesism – occurs when opposing views are represented as being more equally valid than the evidence suggests. At its core, false balance is the misrepresentation of the scientific consensus. And it has become a significant issue.

It is happening with debates on the causes of climate change, the safety of GMOs, the effectiveness of unproven therapies, the value of transgender care, abortion and, of course, the benefits and risks of vaccines.

While the issue of false balance is usually linked to how journalists represent topics, false balance is increasingly driven by social media echo chambers, the fragmentation of the news media and the ideologically motivated embrace of fringe ideas. Too often this has allowed a small cohort of vocal contrarians to have an outsized impact on public policy and public perceptions.

A recent study published in the Journal of the American Medical Association highlights how a small group of U.S. physicians spread “inaccurate and potentially harmful assertions” about COVID and the vaccines in a manner that, the authors speculate, had a significant impact on public understanding and adoption (or lack thereof) of effective public health measures, contributing to the deaths of hundreds of thousands of Americans.

Other studies have found that false balance can have an adverse impact on the intention to vaccinate, which has been associated with a reduction in vaccine coverage. And research I did with my colleagues at the University of Alberta found that bothsidesism in media coverage of COVID public health policies – such as coverage touting the value of a natural herd immunity, let-it-burn approach – potentially undermined the adoption of more widely accepted mitigation strategies.

Much of the harm that flows from false balance happens because it creates an impression that the science is more contested than it is. An interesting study from 2022 illustrates how this can fuel harmful misperceptions about important health topics. The study, done in the Czech Republic, found that 90 per cent of the public underestimated how much physicians trust the COVID vaccines. Indeed, almost half believed that only 50 per cent of doctors trust the vaccines, and four out of five “assumed the medical community is undecided.” In reality, almost 90 per cent of physicians say they trust the COVID vaccines and only 2 per cent say they don’t. Importantly, this lack of knowledge about the strong physician consensus had an adverse impact on the intention to vaccinate.

False balance is likely another reason that those who believe conspiracy theories and misinformation wildly overestimate how many people agree with them. A study by Gordon Pennycook and colleagues, for example, found that while most (but, alas, not all) conspiracy theories are believed by a relatively small minority, “conspiracy believers thought themselves to be in the majority 93 per cent of the time.”

Valuing the body of evidence over fringe views may feel like giving in to conformity, embracing groupthink or stifling innovation. I understand these concerns. Contrarian and controversial scientific views are important parts of the scientific process. For example, plate tectonics and the idea that bacteria cause ulcers were both once viewed as fringe-y. But once these controversial views were raised – at scientific venues, it is worth noting – they were subsequently supported by data and cogent arguments. The evidence wasn’t ignored, and there was no ranting about conspiracy theories on social media, popular podcasts or cable news shows.

So, I’m not arguing for a silencing or ignoring of controversial ideas. In fact, many of the topics that have been the subject of the most intense bothsidesism – vaccines’ link to autism, the therapeutic value of ivermectin for COVID, the dangers of GMOs, the role of human activity in climate change, Trump’s “Big Lie” – have not been, and have never been, ignored or silenced. They have been openly debated (probably too much). They have received a ridiculous amount of attention in popular culture (probably too much). And they have been studied and studied and studied (probably too much). And the fringe version has been consistently and demonstrably shown, over and over, to be wrong.

Indeed, one could argue that our current bothsidesism culture has also led to a significant waste of scientific resources. But for the false balance fuelling public perceptions, would we have spent so much time, energy and money studying ivermectin, homeopathy or the lie that vaccines cause autism?

What is needed is a more accurate representation of the science and the degree and nature of the scientific consensus.

Yes, the scientific consensus might shift. Scientific paradigms evolve. And conventional wisdom should be continually tested and questioned. We must also be honest and transparent about areas where there is a plurality of legitimate views and scientific uncertainty. But to not recognize the value of the existing body of evidence – especially in the context of decision-making at the level of both the individual and public policy – is to largely gut the value of science. It is a paralyzing position. Agreed-upon, science-informed conclusions become meaningless.

Notably, in this cultural moment, the false-balance problem seems to emerge most often with politically polarized topics, highlighting the degree to which bothsidesism misrepresentations are more about ideological positioning than debating what the science actually says. Consider these questions: Would you rather fly with an airline that uses planes designed using the scientific consensus or on All Knowledge Is Relative Airline? Would you want to drive across a bridge informed by the existing body of evidence or one where the builders embraced a contrarian view of physics? The fact that these questions are so obviously absurd illustrates that in many, perhaps most, areas there is an acceptance that the scientific consensus matters.

There is good news. Studies have consistently shown that both explaining the scientific consensus and using a weight of evidence approach – that is, accurately representing what the available evidence says – can have positive impacts on correcting misperceptions. To this end, journalists should take great care in how they represent contrarian views by, for example, reporting how other reputable scholars view the issue and, when possible, referencing the scientific consensus on point. The scientific community – including universities, public funding entities and scientific and professional organizations – should create accessible and shareable content about the scientific consensus on important topics. There is a new project at Durham University, called the Institute for Ascertaining Scientific Consensus, that is designed to help with this goal by measuring and compiling the strength of scientific consensus on a range of key topics. The team, led by Professor Peter Vickers, hopes it will be a useful tool for policymakers and will “serve to inform laypersons, fighting against ‘fake news’ and misinformation.”

Finally, we also need to correct misrepresentations wherever they emerge, be it on social media, in the news, on popular podcasts or out of the mouths of our political leaders.

7 Comments
  • Dr. Rob Murray says:

    I agree with Professor Caulfield’s main point, having spent hours trying to convince a few people about the safety and efficacy of COVID vaccines and the serious existential threat of climate change. They tend to rely on only a few sources and don’t investigate their qualifications.

    Medicine is supposed to be based on science, but it’s a highly siloed, hierarchical profession that ignores science when it becomes inconvenient, as in the case of Lyme disease. Let’s face it, doctors are smart people, and infectious disease doctors are regarded by some as being the very smartest.

    Lyme is a multi-staged, multi-system, life-altering disease, the infectious disease equivalent of cancer and the first epidemic of climate change, but infectious disease doctors misclassified it as a minor nuisance disease not worth further investigation when, in 1994, the insurance industry red-flagged it as too expensive to treat.

    The CDC gave the management of Lyme and all the procedures around it to the private, 13,000-member Infectious Diseases Society of America [IDSA]. Infectious disease doctors have adopted the tactics of the fossil fuel and tobacco industries to cast doubt. They even have a front organization, the American Lyme Disease Foundation [ALDF]. The members of the Association of Medical Microbiology and Infectious Disease [AMMI] Canada take their direction from and owe their loyalty to the IDSA.

    Now we know much more about how this stealth pathogen operates, but the IDSA Lyme guidelines have been pruned of all science that doesn’t agree with the dogma that Lyme disease is difficult to acquire, easy to diagnose and readily cured with a short course of antibiotics. If a patient has symptoms following treatment, either the initial diagnosis was wrong or they have Post-Treatment Lyme Disease Syndrome [PTLDS], since there is no such thing as chronic Lyme disease.

    We hear mostly about mosquito-borne diseases, but ticks are responsible for 95% of vector-borne diseases in Canada. Yet there is deep institutionalized bias and denial of this hidden, ignored epidemic that takes people off work, out of school, into wheelchairs and often onto social assistance.

    You can do a lot of damage in medicine when you cherry-pick and edit the science you use. The IDSA Lyme Guidelines used in Canada are evidence-biased.

    AMMI, PHAC, CIHR, the Pan-Canadian Public Health Network, PMRA and Health Canada may speak with one voice on the Lyme file, but consensus is not a substitute for being scientifically correct. Science is a way of knowing, a process, not a belief system.

    Medicine is a self-regulating profession but that only works if everyone is acting altruistically.

    “The greater the ignorance, the greater the dogma.” -Osler

    Rob Murray [DDS retired]
    Board member Canadian Lyme Disease Foundation [www.CanLyme.org]

  • Adelaide says:

    A study published in 2005 (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1182327/) examines why most published research findings are false. Today more than ever, bias exists depending on which side of the research is being funded by vested corporate interests – namely Big Pharma and biotech. It is undeniable that COVID vaccine research was flawed and biased, since very few studies examined the immune systems of those who remained unvaccinated yet had recovered from the infection. Furthermore, T-cells and long-term immunity have gone largely unstudied. Why? There is little interest by scientists in conducting studies that could discredit public health efforts and the media with evidence contrary to what corporations and governments would want us to believe. I am not convinced that a link has not been established between some cases of autism and vaccines, because the science stopped when public pressure became too great. In fact, many studies that should have been published were not, the science was rebuked by powerful interests, and scientists and professionals were muzzled. A lot of intelligent people out there are sounding alarms and being muzzled. My idea of a conspiracy theorist is an uneducated and unintelligent individual, but that’s not what I’m seeing. And why is a contrary viewpoint automatically labeled a conspiracy? Couldn’t the popular sentiment then also be labeled as such, if all it means is a poorly studied viewpoint backed by poorly done (or no) studies?

    • Mike Fraumeni says:

      Excellent to point out John Ioannidis’ analysis, cited, according to Google Scholar, more than 12,000 times by other papers. Also of relevance is the increasing burden put on physicians to adhere to clinical practice guidelines – or else. In Ontario there is increasing pressure for physicians to follow practice guidelines; for example, Cancer Care Ontario produces cancer guidelines on behalf of the Ministry of Health and Long-Term Care, work contracted out to McMaster University in the past, albeit I’m not sure if this is still the case. In the U.S., Dr. Paul Hsieh pointed this out a number of years ago with reference to Obamacare:
      “ObamaCare will worsen the current physician shortage. The law will also drive physicians to become hospital employees or to join large Accountable Care Organizations (ACOs), where their treatment decisions will be monitored with mandatory electronic medical records. Government and private insurers will increasingly link payments to adherence to “comparative effectiveness” practice guidelines. Physicians will face significant conflicts-of-interest when their patients might benefit from treatments outside the guidelines, but the physician risks nonpayment (or losing his ACO contract) as a result…. Ask your doctor if he will be joining an ACO. (Not all doctors will.) If so, ask if your personal medical records can be excluded from his ACO practice statistics. If ACO rules allow it, this will help him practice outside the guidelines when medically appropriate (e.g., ordering an MRI scan sooner than usual or prescribing a stronger but more expensive antibiotic) without fear of hurting his overall statistics.”
      Source: https://www.forbes.com/sites/paulhsieh/2012/11/13/5-ways-to-protect-yourself-against-obamacare/

    • Mike Fraumeni says:

      Also with respect to guidelines, one could say that the Ontario Ministry of Health and Long-Term Care is our Accountable Care Organization, our health insurer. Remember, of course, that clinical practice guidelines are population-based and not necessarily specific to any particular individual or any particular case of a medical issue. Do a search for the word “guideline” in the document below to see how family physicians practice within the province. Here is one instance of the word “guideline” in that document, and clearly it is tied into payment for the physician to practice a generic, population-based guideline:
      “A flow sheet or other documentation that records all of the required elements of the most current CDA guidelines must be included in the patient’s permanent medical record, or the service is not eligible for payment.”
      Source: https://www.health.gov.on.ca/en/pro/programs/ohip/sob/physserv/a_consul.pdf

  • Mike Fraumeni says:

    Excellent article, and I completely agree with relying on scientific consensus and the weight of the evidence. I might also add, as Diane O’Leary mentions, that attention to philosophical assumptions about what is “true” or “valid” is necessary. For example, this article questions the philosophical assumptions about Cartesian dualism, which is most often assumed to be the “correct” framing for the Western medical model:

    “Your impressive work on dualism in medicine and psychiatry has forced me and many others in medicine and psychology to reexamine long-standing assumptions. I would refer readers to your papers on medicine’s metaphysical confusion, the biopsychosocial model, and your recorded talk as part of the Philosophy of Psychiatry webinar series to learn about your views in detail.1-3 Can you briefly explain your argument that medicine has misunderstood dualism?” …
    Source: The Case for Dualism in Medicine—Philosophical Misunderstandings and Clinical Implications: Diane O’Leary, PhD. Psychiatric Times. 2023. 40(7)
    Link to fulltext online:
    https://www.psychiatrictimes.com/view/the-case-for-dualism-in-medicine-philosophical-misunderstandings-and-clinical-implications-diane-o-leary-phd

  • John Deacon says:

    I totally agree with you. The amount of time we spend debating issues confounded by misinformation is tearing us apart. It is not that truth is hidden, but that it is ridiculed, that has us confounded. When truth is bent to suit personal agendas rather than used to rightly inform us with what the best evidence is telling us, things get so tainted that both common civility and the common good suffer.
    We are literally killing ourselves by giving an equal say to truth and falsehood. Both need to be voiced for the sake of rightly discerning where we are best headed, with truth our only healthy way forward.

  • Wendy Smith says:

    This has bugged me for years. Bothsidesism was huge in the 70s. As a teenager(!) I would rant and rave. There is such a thing as truth. And yes, science changes…that’s how it fucking works!

Authors

Timothy Caulfield

Contributor

Timothy Caulfield is an author and Canada Research Chair in Health Law and Policy, University of Alberta.
