I’ve been investigating pseudoscientific claims for 13 years now. The word “pseudoscience” means “fake science,” something that superficially looks like science—and is often sold as such—but that isn’t. It attempts to imitate it without reaching equivalence. Astrology is a classic example. Behind the star charts and calculations, we see a stagnant belief system that once attempted to explain human behaviour but that can only at best produce the occasional random hit or a statement so vague that it fits everyone.
What are the red flags that tip us off to the presence of a pseudoscience? When I worked in cancer research, I read a seminal paper called “The hallmarks of cancer.” It identified the main characteristics that define this family of diseases. I want to do a similar, informal exercise here, highlighting the traits of pseudoscience and describing the types of people who end up making lucrative careers pushing theories and products that only trivially resemble real science.
Given my background in the health sciences, I will focus exclusively on their darker counterparts, though we should keep in mind that pseudoscientific beliefs also arise in physics and archaeology, for example. My hope is that this overview will train your brain to warn you the next time you stumble upon something that looks too good to be scientifically true.
Oversimplification and certainty
If you’ve ever studied the human body—anatomically, molecularly, physiologically, memorizing kinase cascades in university or learning to identify irregularities under the microscope—you know how complicated it is. Yet, so many purveyors of pseudoscience talk of our bodies as if they were made up of tubes. Feeling ill? Your tubes must be clogged. When we do away with their jargon, that is basically the principle behind chiropractic, Reiki, and acupuncture. Their practitioners become plumbers of the flesh.
Oversimplification is a major hallmark of pseudoscience. Where real scientists see complexity and uncertainty, pseudoscientists claim unnatural simplicity with unearned confidence. “Pseudoscientists, unburdened by the ethical responsibilities of a professional order, will talk in certain terms,” Michael Marshall writes to me. He is, among many things, the project director of The Good Thinking Society in the UK, which promotes rational enquiry. “‘Take this, do this, and follow my advice entirely, and it will work.’ The certainty is understandably seductive to patients.”
These people often proclaim a single cause of all diseases and endorse a panacea that cures everything, what has academically been called an absence of boundary conditions. There are no boundaries, no limits to what their pseudoscientific treatment can achieve. If only medicine were that elementary….
Scienceploitation
Much like the tech industry, pseudosciences these days tend to move fast. By contrast, drug development is a long process, starting with cells in culture and laboratory animals before moving on to progressively larger tests in human volunteers. Who has time for that? Through a process that has been called “scienceploitation,” preliminary findings get turned into fool’s gold. Human trials are skipped, and people sell you products based on small studies done in rats or cells in Petri dishes. They go beyond hyping up these studies; they distort their meaning to make you think we’ve reached the point where their findings can be commercialized.
Fields of study that are still in their infancy—like epigenetics, stem cell research, and studies on the microbiome—go from “promising” to “revolutionary.” Echoing one of James Carville’s slogans for Bill Clinton’s 1992 presidential campaign, pseudoscientists will tell you that “it’s the mitochondrion, stupid” or “it’s epigenetics, stupid.” Whatever is fashionable in research becomes the one true cause of all diseases. It doesn’t matter that the supplements you are being sold as a cure-all have never been tested in humans.
Misappropriation of real science
Closely related to this is extrapolating inappropriately from a real scientific finding, whether it comes from an emerging field of study (as in scienceploitation) or a more foundational one. I heard from Alice Howarth, who has a doctorate in cellular and molecular physiology and is the deputy editor of The Skeptic magazine in the UK. She highlighted a classic example that often entices cancer patients. It is true that cancer cells feed on glucose, and that the more glucose there is, the more energy these cells will have. Pseudoscientists, though, will leap to a conclusion not supported by the evidence. “For example,” she writes, “‘therefore, you can starve cancer cells of glucose by restricting sugar in the diet.’ This red flag can be exceptionally tricky to spot because, while the conclusions are faulty, the premise contains some truth.”
Absence of progress
New pseudoscientific beliefs can manifest and get quickly monetized based on misinterpreted scientific findings, yes; but older pseudosciences like homeopathy still exist, and they are characterized by an absence of progress.
Medicine changes and improves, astrophysicists discover new phenomena, biologists are still identifying new species of beetles; but pseudosciences that have been around for a while ossify. This is because the belief comes first, and the evidence must support it at any cost, so there is no self-correction as there would be in science. The same heavily diluted homeopathic concoctions that were around 100 years ago can be bought at your local pharmacy—not because they work (they don’t), but because homeopaths are only interested in proving that they are right. Hop into a time machine to see a physician in the early 1900s and you will realize just how much the applications of real science have changed.
Blinding with science
Scientific papers are commonly used as a smokescreen to convince you that a pseudoscientific claim is real. Their titles are long and complex, and often the paper itself is behind a paywall and can’t easily be accessed. This is called “blinding with science”: listing papers that are only vaguely related to the product being sold in an attempt to convince a customer that the product itself has a scientific seal of approval.
When you see a scientific article positioned as evidence for a product, ask yourself: did the study even test this product? A paper about quantum mechanics—a real field of physics—does not prove that a new gadget will cure your headaches by some bit of quantum science. Likewise, a review article on epigenetics does not prove that you need to buy an expensive kit to test your epigenetic markers or enrol in a monthly delivery of supplements said to tweak your epigenetic health. Don’t be fooled.
Science washing of ancient beliefs
Pseudosciences originally rooted in spiritual beliefs can get reinvented when superstitions are painted over with a thin coat of science. “Demonic possession and blocked energy meridians don’t get many fans these days,” Nicholas Tiller, who has a doctorate in applied physiology and who regularly interrogates trendy claims in the world of fitness, tells me. “One of the more subtle tactics in advocating for pseudoscience is when the purported mechanisms evolve over time, usually to drag an outdated approach into modern times.” Cupping, the practice of applying suckers or heated cups to the body and creating a vacuum that causes ecchymoses, was once meant to unblock meridians, channels along which the ancient Chinese imagined an energy force coursing—we’re back to plumbing, once again.
Now? Cupping service providers speak of “blood flow” and “oxygenation,” washing away the antiquated notions in scientific verbiage to give it a sheen of credibility. Tiller tells me he sees this with acupuncture, reflexology, gua sha, heat and cold therapy, and many other pseudoscientific interventions. What started as spiritual dogma becomes “scientific” to reach a savvier audience.
A biased handling of positive and negative results
When I was doing experiments in the lab and the outcome was not what I expected, I would troubleshoot the experiment, repeat it, change the reagents, have someone else do it… but if the result was the same each time, I had to grapple with the reality that I had been wrong. This is a core principle of science: an aspiration to being objective and to changing your mind in the face of strong evidence.
The pseudosciences pretend to do the same, but they do not. Their proponents often seek out confirmation for their beliefs in experiments that do not have a real threat of failure. When they get positive results from very loose studies that are bound to go their way, they cite this as definitive evidence for their theory. But when the outcome is not what they expected, they come up with brand-new hypotheses—excuses, really—for why the test did not work. In research into telepathy and other purported powers of the mind, a common way to explain negative results is that a skeptic was in the room. Therefore, psychic phenomena were “disturbed” by their negativity.
So-called complementary and alternative medicine, when put through the wringer of a rigorous clinical trial and yielding negative results, will be saved by saying that trials can’t work because every patient is so “radically unique” that treatment needs to be hyper-personalized. Higher forms of evidence thus cannot be trusted; conveniently enough, we can only rely on anecdotes. Pseudosciences are thus immunized from criticisms and contradictions.
Overreliance on testimonials
An outdated hallmark of pseudoscience is that its adherents are no fans of peer review. Scientific papers get published after being revised and commented upon by other scientists (the “peers” in “peer review”); but you wouldn’t want an actual scientist criticizing your paper on a nonsensical intervention and highlighting all the holes in it. This is why pseudoscientific services tended not to have papers published about them in the past.
But this has changed. With the rise of predatory journals that will publish anything for a fee and the emergence of journals dedicated to a particular pseudoscience like homeopathy, ill-scrutinized papers can be weaponized to give credence to nonsense.
Despite this recent development, we still see an overreliance on testimonials. Pseudoscientists can’t wrangle together enough funds to study everything; they thus continue to publish the endorsements of end users as if they were evidence of benefit. Multilevel marketing companies selling pseudoscientific skin patches to cure just about anything depend heavily on the people who sell the patches talking about how their lives have changed. They must also give voice to other customers who have tried the patches, testifying that they really do work. Imagine if drug approval by our government were done solely on the basis that a dozen people claimed it had helped them. Testimonials are not a reliable form of evidence, because we don’t know what would have naturally happened without the intervention, or what else the person was taking that could have benefitted them, or if what they are reporting is even reliable.
But to sell a pseudoscience, these personal stories work just fine.
Vague claims
A medical product that has been shown to lower blood pressure can claim that it is a treatment for high blood pressure. A pseudoscientific intervention that pretends to do the same can legally phrase the claim in an insidious way: “promotes the maintenance of healthy blood pressure.”
I’d argue that anyone stepping into a pharmacy having just been told by their doctor that they have high blood pressure would see this phrase on a bottle of dietary supplements and think, “Clearly, this means it has been shown to lower blood pressure.” No.
This kind of ambiguous language—“helps with,” “promotes,” “maintains”—can be used for just about any product without legal repercussions and without good scientific evidence. It really means that the main ingredient has been shown, in some study or other, to play some kind of a role in blood pressure, to use the above example. It does not mean that taking more of it will either lower or increase your blood pressure. Sneaky, isn’t it?
In the United States, this is referred to as a “structure/function claim,” meaning that the ingredient is known to be involved in a particular anatomical structure or its function. When you see these weasel words inside of a health claim (“helps,” “promotes,” “maintains,” or my favourite expression, “helps the body heal itself”), remember that they are there to sell you something that has not been scientifically proven to work. They are potential red flags for pseudoscience.
Logical fallacies
A tried-and-true technique to sell you something that isn’t backed by science is to make arguments that sound good but aren’t actually sound, what are called logical fallacies. The appeal to nature is the most common one I’ve seen in the health space. Marketing material will tell us that it’s natural, therefore it is both beneficial and safe. But many natural things are not safe for us: venoms, toxins, asbestos. A product needs to be tested; its origin tells us nothing about its safety or its effectiveness.
Similarly, if something has been used for a long time, it does not necessarily mean that it works. Doctors performed bloodletting for centuries and, outside of a few narrow modern needs, it doesn’t treat anything and can be dangerous. Something new is not necessarily better than what came before, either, and just because a product is popular does not imply it works. Those are all fallacies: superficially convincing, but bad arguments at their core.
Politicizing spin
An emerging hallmark that was pointed out to me by Timothy Caulfield is the politicizing spin. Caulfield is research director of the Health Law Institute at the University of Alberta and the author of many books about the wellness industry and its pseudoscientific tendrils. “They play to their political audience,” Caulfield tells me, “placing their pseudoscience on the political spectrum.” For example, he writes, “they’ll use manly language to make their bunk product feel like a fit for a manosphere audience. Do not let grievance or polarizing language overwhelm scientific facts!”
Associated with science denialism and conspiracy theories
If I’m selling what I believe to be an all-natural cure-all that renders modern medicine obsolete, I will likely deny that vaccines and pharmaceutical drugs work. Selling a powerful alternative means the mainstream intervention is either outdated or it never worked at all. I then have to contend with the fact that few doctors endorse my product. Why don’t they? Why can’t they see that what I’m selling is revolutionary?
The answer often comes in the form of a conspiracy theory: it’s because every doctor is in the pocket of the pharmaceutical industry, whose singular goal is to keep people sick and get them addicted to expensive products that do not heal them. And the theory grows in scope. It’s not just the doctors but everyone employed by a university who teaches medical students… and the journalists who cover the health space… and the government agencies that regulate health products. Pretty soon, most people are part of the alleged racket, and these days there seems to be a special appetite for these conspiracy theories. They help explain, in simple terms, why society appears to be crumbling, and they boost the egos of the people who believe in the theory by telling them that they are superior. Real conspiracies exist, yes, but be skeptical of grand conspiracy theories in health circles.
The traits of a secular guru
Pseudoscience has its hallmarks, but its loudest proponents share characteristics too, and it’s only right to mention the work of academics Christopher Kavanagh and Matthew Browne, who on their podcast Decoding the Gurus have crafted and fine-tuned a Gurometer. This tool evaluates how much an influencer or public intellectual behaves like a secular guru, meaning someone who uses sophisticated language to say little while creating the illusion of expertise. Those who make a living peddling pseudoscience tend to share these traits.
They pretend to be experts at everything, linking together different types of knowledge in novel yet incorrect ways. They flatter their followers and dismiss criticism as coming from an outgroup that doesn’t understand them. They claim their ideas are repressed or taboo, since mainstream thinking, in their eyes, is always wrong and irrelevant. They frequently air out their grievances against an establishment that ignores them and wonder why their views are not more popular.
They are in love with their own ideas and cultivate attention on themselves and what they say. Their warnings of a world lurching from calamity to calamity get ignored, much like Cassandra in Greek mythology. They exist to scare you and to convince you that only they can be trusted. Importantly, they have a revolutionary theory that will serve to immortalize them in the history books, a paradigm shift that lesser scientists have been incapable of ushering in. They invoke a powerful conspiracy to suppress this knowledge while monetizing most of their activities in order to profit handsomely—and in the health sphere, the most common pseudoscientific product is a dietary supplement: easy to manufacture and partially or completely exempt from regulatory oversight.
What I would add to this list is a suspicion that has grown in me in the last year, fed by the ubiquitousness of video recordings and what they reveal when we watch long enough: that under the confidence they radiate, many of these pseudoscience influencers are not mentally well.
Self-aggrandisement, esoteric verbiage, and whale carcasses
Last June, Brian Johnson, AKA The Liver King, was arrested after making a series of threats against podcaster Joe Rogan. He travelled to Austin, Texas, to pick a fight with Rogan… to the death. The Liver King had become famous on Instagram, preaching the alleged benefits of eating organ meat and adopting a so-called ancestral lifestyle. He also had his own line of supplements: ground-up bovine organs for people who couldn’t stomach eating a cow’s pancreas. He railed against the alleged dangers of sunscreen and Wi-Fi. In short, The Liver King was a pseudoscience peddler.
In the days leading up to his arrest and during the arrest itself, the Liver King’s crew kept filming and uploading to Instagram. What I saw were warning signs that the wellness influencer was not well. Addressing Rogan via Instagram, he referred to himself with a straight face as “willing to die, hoping that you’ll choke me out, I pray to God, because that’s a dream come true.” When law enforcement officers arrived at his hotel room to arrest him, he asked them for a toilet break first because he had done four coffee enemas that morning. The videos posted in that short time period—and there are many—show him to be rambling, self-aggrandizing, and intense. He does not sound like an actor pretending to espouse a fringe lifestyle to make money; he sounds to my ears like he is detaching himself from reality.
In the early days of the COVID-19 pandemic, another peddler of pseudoscience, Dr. Christiane Northrup, filmed herself espousing all sorts of esoteric concepts, far removed from the medicine she had been trained in. In between musical passages played on the harp, she spoke of Indigo children, geomancers, and time travellers, claiming to have once lived in Atlantis and that humanity was on the cusp of becoming a new species: Homo illuminus, freed from Evil with a capital “E.”
Then, of course, there’s Dr. Joe Mercola, whose conversations with an alleged spiritual channeller were leaked to our Office. In them, he says he will earn more Nobel Prizes than anyone before him for a new revolutionary theory he is putting together, which involves pushing carbon dioxide gas up the rectum to feed the bacteria living in the colon.
Mercola raised money for Robert F. Kennedy Jr’s unsuccessful presidential campaign. Kennedy, who has endorsed all manner of health pseudoscience over the years, including taking a very vocal stance against vaccines, has left behind him a trail of questionable incidents. He dumped a dead bear he had wanted to eat but didn’t have time to take with him behind a bush in Central Park; he wrote in his diary that he once cut off a road-killed raccoon’s penis to study its genitals; he chainsawed a dead whale’s head off and strapped it to the top of his minivan, driving for five hours with his daughter while both were wearing plastic bags over their heads to protect themselves from the whale juices; and his cousin said he used to put mice and baby chickens in his dorm room blender to feed to his hawks.
Finally, there’s Braden Peters, better known as Clavicular, a young influencer who claims that hitting your jaw with a hammer will help the bone grow back stronger and thus give your jaw a more masculine (and desirable) look. To some, Clavicular gives the impression of an influencer who has figured out life; but last April, he experienced a drug overdose while live-streaming. He later explained that all the substances he was taking were a way to cope, to fit in with other people.
We often denounce those who sell pseudoscientific interventions as “grifters”—swindlers who lie simply to make money—but we forget that many of them could use a few sessions on a therapist’s couch… or possibly a psychiatrist’s.
In summary
Pseudosciences continue to mutate and, while their core characteristics are unlikely to change much, new hallmarks will emerge and be recognized by those few of us who wade through this morass every day. It’s important to acknowledge that a pseudoscience is a simplistic simulacrum of science, kept in motion by people who either do not understand real scientific inquiry or whose egos are bruised by it. Research fosters humility; pretending to do science, however, rapidly feeds someone’s self-image in a way that science struggles to do.
New pseudoscientific ideas spring up, yes, but as Dr. David Gorski, an oncologist and managing editor of Science-Based Medicine, likes to phrase it, “Everything old is new again.” Pseudoscientific conjectures like terrain theory—in which viruses and bacteria do not cause disease, a real mockery of Pasteur’s germ theory—get repackaged for younger generations who weren’t around to see them thoroughly debunked.
If you stumble upon an influencer who claims a revolutionary new health theory, who relies heavily on people’s testimonials, who can explain away any criticism of their idea without truly engaging with it, and who politicizes their stance and claims they are the victim of a grand conspiracy theory, you’ll know that the idea you are being sold is only barely trying to be scientific.