Ed. note: Author Anne Borden King examines the health misinformation crisis and its impact on children in a three-part series. Next: Sen. Stanley Kutcher on legislative attempts to curtail dis- and misinformation.
Autistic children are among the most vulnerable victims of the crisis of health disinformation. As Cécile Guerin has documented in Wired, much health disinformation spreads in private Facebook groups or through ads that use platform algorithms to target vulnerable audiences, such as parents of autistic children.
The sale of harmful autism treatments to parents on social media is especially troubling because the children are not choosing or consenting to the ordeal of products such as MMS (a bleach product), stem cell “therapy” and medical tourism for fecal transplants. These products are dangerous, causing health issues, trauma and even death.
Since 2020, I have made scores of reports to Facebook about advertising of fake autism treatments like MMS, turpentine and unregulated stem cells via ads and groups on the platform. But I soon learned social media companies often do not remove user-reported health disinformation. Since every ad and group post generates revenue and growth for a social media company, there is little financial incentive to remove any of it. As social media expert Charles Arthur has noted, these platforms have typically allowed mis- and disinformation to flourish, unless it reaches “the point where it’s driving the good users away.”
During the pandemic years, there was enough pressure that Meta, the company that runs Facebook, made some efforts by “pre-bunking” COVID-related posts with links to information from public health authorities. This small measure has been scaled back because the pandemic is over. The problem is, the infodemic is not over.
There’s now a movement in Canadian government to take on the issue of misinformation and disinformation on social media. Shortly before the Senate’s summer break, Senator Stanley Kutcher tabled Motion No. 113, calling for a study “on health misinformation, its impacts on Canadians and potential remedies.” Canada’s National Security and Defence Committee has also begun to address political disinformation, which too often flows along the same pipeline as health disinformation. Federal Bill C-292, introduced by Member of Parliament Peter Julian, seeks to require social media companies to inform users about how their personal information is being used.
These initiatives are crucial, because relying on social media companies to self-regulate their platforms is not working. But they also raise the question of how our government can develop the capacity to keep up with scammers on social media, especially when they evade regulators by migrating to new locations, virtually and in real life.
Consider the case of British Columbia naturopath Jason Klop. In 2020, the CBC reported that Klop had been selling fecal transplants as an autism treatment, a process that is neither evidence-based nor approved by Health Canada. As the CBC described it, Klop had been running a medical tourism storefront of sorts, persuading parents to pay around $15,000 for an all-inclusive package that included a stay at a Hilton Hotel in Rosarito, Mexico. At night, they could walk on the beach. During the day, their children would be given enemas containing someone else’s fecal matter, known as fecal microbiota transplants (FMT), with claims that the procedure could treat autism.
Experts described the practice as non-evidence-based and very risky for children. The British Columbia College of Naturopaths took action, ordering Klop to stop selling medical tourism for fecal transplants as an autism treatment and to cease selling fecal capsules, some of which, the College alleges, had been made from his nephews’ feces. The “Klop Kids” Facebook page, where products and services were marketed, was shuttered, and the website selling the program (called NovelBiome) was rendered inaccessible on Canadian browsers. Klop challenged the College’s decision in the B.C. Supreme Court, which ruled in favour of the College in December 2022. In March 2023, the B.C. Court of Appeal dismissed Klop’s appeal application.
But just two months after the Court of Appeal decision, I noticed a parent in an autism Facebook group mention an upcoming trip to Mexico for their son’s FMT treatments. It turned out there was a new Facebook group in which a group administrator was sharing an appointment scheduling tool for an FMT Zoom consultation with Klop. A month later, an administrator announced in a Facebook post that the company would no longer offer medical tourism after summer’s end and would focus instead on capsule production; however, those capsules have not been approved by Health Canada. (A NovelBiome representative recently confirmed the changes to the CBC.) With affiliated clinics in Hungary, Australia and Panama, the company’s direction outside of Canada remains to be seen.
When companies selling unproven and risky autism treatments are international in scope, regulation and enforcement become even more complicated. An investigation may involve many moving pieces and complex, multi-year collaborations. For example, in the case of the Genesis II company, which sold MMS, a bleach product, as a “treatment” for autism, AIDS, COVID and more, the U.S. Justice Department and the Food and Drug Administration worked together, in addition to collaborating with law enforcement in Colombia, where two of the dealers had travelled. Ultimately, the dealers were arrested and jailed, their products were destroyed, and years of evidence stood ready to be presented at the coming trial.
But even after the arrests, Genesis II and other MMS groups continued to pop up on social media, especially on platforms that do not prioritize stopping disinformation, such as Telegram. Kerri Rivera, another promoter of bleach “cures” for autism, has continued to elude law enforcement, moving from country to country and thriving on platforms like Rumble and Telegram. Chillingly, administrators on Rivera’s channels have advised parents not to seek medical care for their children when they experience side effects from the bleach, claiming these are a positive sign of the child’s “detox” process.
Unfortunately, the COVID era has complicated the issue of how to combat health mis- and disinformation. Some are now wary of government over-reach following pandemic measures that included limits on speech on social media. This wedge issue is playing out in U.S. courts, with a recent ruling by a Trump-appointed Court of Appeals judge that U.S. government employees should be barred from discussing misinformation content with social media companies at all. The cultural hindsight of the past three years has been weaponized to argue against any controls over health disinformation. (The decision is being appealed by the Biden administration.)
Why does this all matter so much? Shouldn’t a social media platform have free speech, regardless of whether that speech is true or false? Shouldn’t individuals decide for themselves about health products? These are important questions to be sorted out, but they’re irrelevant when it comes to dangerous products that target children.
If our government treads lightly on the issue of autism “cures” like bleach, unregulated stem cells and medical tourism for fecal transplants, it is abdicating its responsibility to protect the most vulnerable. By addressing the many-headed hydra of social media, our policymakers may be able to make the real world a safer place for kids.