Read This, Not That: The Hidden Cost of Nutrition Misinformation
Our daily lives are inundated with misleading claims about nutrition. That’s not just distracting — it’s also harming our health.
“Things all came crashing down when I got a blister on the bottom of my foot that didn’t heal,” David explained.
“It put me in the hospital in danger of losing my foot.”
David had spent the last nine years treating his Type 2 diabetes with a low-fat vegan diet, on the advice of a doctor who authored a popular diet book. As part of this doctor’s program, David was told that diet is the best treatment for his condition, that the medical system is designed to keep us sick, and that he should stop taking his diabetes medications. Despite the doctor’s confident assurances, the diet failed to control David’s diabetes. Years of extreme blood sugar levels left him with nerve damage in his feet and eyes and a reduced ability to heal.
In the hours after I asked my Twitter audience to share how they’ve been harmed by nutrition misinformation, a steady trickle of stories like David’s began appearing in my inbox. One woman was hospitalized for oxalate kidney stones caused by a very-high-vegetable diet she had designed based on paleo and vegetarian sources online. One man lost excessive weight and had to be hospitalized for fecal impaction after following the carnivore diet. Another carnivore dieter had already gone public with his story of requiring a triple coronary bypass after years of believing the claim — common in the carnivore and low-carb diet communities — that the very high LDL cholesterol sometimes caused by these diets isn’t harmful.
Reading these accounts, I wondered how many people had been harmed by nutrition misinformation in less extreme ways that are harder to detect. Investigating further, two things occurred to me. First, the public health burden of nutrition misinformation is probably larger than most of us realize. Second, we know very little about it.
What Is Nutrition Misinformation?
Any discussion of nutrition misinformation must start by answering a deceptively simple question: What is it? Misinformation has been defined in various ways, but here I use it to mean information that is incorrect or misleading. Misleading information can be factually accurate, but lead us to incorrect conclusions. For example, the terms “multigrain” and “wheat” on bread packaging are technically accurate, but may lead shoppers to think they’re buying whole-grain bread when in reality it’s closer to white. Misleading claims also include those that are supported by some amount of evidence but exaggerate its effect size or level of certainty.
Yet this leads immediately to a thornier question: How do we decide what information is incorrect or misleading? The uncomfortable truth is that there’s no bright line that separates accurate claims from misinformation. Sometimes a simple citation check reveals a clear-cut case of misinformation; for example, a passage cites a study that directly contradicts it. More often, the judgment call is murkier. If a book accurately cites a study but doesn’t mention an important limitation that weakens its findings, is that misinformation? What if a book makes an argument that one or two researchers believe, but the rest of the field thinks is hogwash?
Further complicating matters, nutrition is a notoriously slippery science compared to most other biological or physical sciences. There are many vigorous debates in nutrition in which neither side’s arguments are misinformation.
As the director of Red Pen Reviews, a nonprofit that grades the information quality of popular nutrition books, I grapple with these problems often. Despite the challenges, we think it’s possible to make useful judgments — with subject-specific knowledge, a formalized scoring system, and the right mindset.
I’ll use the term “nutrition misinformation” in this piece with the understanding that it’s often hard to judge and it’s a flawed proxy for information quality. While it can be hard to define, it’s not hard to find, and it shows up in many forms, from the obvious to the insidious.
The Water We Swim In
Consider the case of Brian Wansink, former director of the Cornell Food and Brand Lab. Until 2018, Wansink was a prolific and influential researcher who wrote the popular book Mindless Eating and helped develop the 2010 Dietary Guidelines for Americans. In 2006, he published a position paper on nutrition misinformation for the Academy of Nutrition and Dietetics, the largest organization of nutrition professionals in the world, admonishing members to “provide consumers with sound, science-based nutrition information and help them to recognize misinformation.”
In 2017, a small group of academics called the “data thugs” revealed a number of troubling irregularities in Wansink’s papers. Subsequent investigation by the data thugs and others revealed a smorgasbord of scientific shenanigans ranging from sloppiness to possible data manipulation. Wansink eventually resigned his position after Cornell determined he had committed scientific misconduct, and at least 18 of his papers have been retracted. In retrospect, much of his work appears to be sophisticated misinformation.
Although this is an extreme and unusual situation, it shows that misinformation can come from anywhere, including the scientific community — and it can be hard to detect. Misinformation from the scientific community is especially corrosive precisely because scientific research provides the gold standard of information quality.
Yet I believe that people like Wansink are only a small part of the problem. Most of the misinformation that comes from the scientific community isn’t due to outright misconduct but rather suboptimal research practices by presumably well-intentioned people. Failure to follow best practices in study design, analysis, and reporting can, and often does, lead to misleading results.
This is a problem that afflicts all fields of science, but nutrition research faces several key challenges that make it an especially slippery science. First, since it lies at the intersection of countless food properties and many facets of human physiology, nutrition is incredibly complicated. Second, some of the most important nutrition-related conditions, like heart disease, emerge over many years. This makes them hard to study in tightly controlled trials, meaning that much of the evidence in nutrition science comes from observational studies that have a harder time teasing out cause-and-effect relationships. Third, it’s difficult to accurately measure what people eat in their usual lives, including by asking them, as most observational studies do. Randomized controlled trials, in which people are randomly assigned to different diets and compared over time, are a partial solution, but it’s hard to get people to stick to the assigned diet unless they’re locked inside a research facility (which happens in some studies).
Collectively, these limitations are so serious that some researchers want to relax the standards of evidence for nutrition relative to other areas of science.
Others (ahem, perhaps including me) dismiss this as “grading nutrition research on a curve.”
Even under ideal conditions, nutrition research is hard — and conditions are not always ideal.
Although public misinformation can come directly from published research, more often it arises as the information passes through the bullhorn of academic press releases and popular media. To illustrate this, let’s follow the path of a research finding as it winds its way from a scientific journal article to the public. In September 2022, the European Journal of Preventive Cardiology published a study reporting that British adults who drink two to three cups of coffee per day are at a lower risk of developing cardiovascular disease and dying than those who don’t drink coffee. Although it’s tempting to conclude that coffee is good for the heart and keeps us from dying, this is an observational study and it’s not clear that the association is due to the coffee itself, rather than some other difference between people who drink a lot of coffee versus people who don’t.
The authors of the paper are fairly cautious in their interpretation; they describe the finding as an association and only obliquely imply that drinking coffee reduces health risks.
The academic press release is similarly judicious, although it puts more emphasis on the less cautious statements in the paper. By the time we get to the CNN Health article, the shackles of restraint have been cast off. “Coffee lowers risk of heart problems and early death,” the headline proclaims. The article spends two sentences acknowledging that the findings don’t necessarily imply cause and effect, then proceeds as if they do.
Although in this example the CNN Health article is the main culprit, research suggests that academic press releases are a major part of the problem. When press releases exaggerate research findings or provide unwarranted health advice based on them, it tends to be repeated in the news. Carried away by enthusiasm for their own work, researchers themselves are often complicit. “The problems that a lot of scientists complain about are driven by their own choices to approve press release material that is in fact exaggerated,” says Chris Chambers, who led research on the media’s role in scientific communication. The problem is then amplified as the information passes to journalists and popular nutrition writers, because they often have limited science literacy and a strong incentive to write articles that get clicks.
Do you know that the scent of vanilla increases blood flow to the penis?
That only people with Type B blood can eat all types of dairy and remain in good health?
That eating too little salt as a child increases a person’s risk of drug addiction later in life?
Questionable claims like these abound in popular nutrition books. While these particular claims may be chuckle-worthy, what’s less amusing is that Americans buy about 5 million diet books per year, many of them saturated with misinformation.
What is the information quality of the average popular nutrition book? Red Pen Reviews has published 18 reviews of these books, which we score using a structured method that yields percentage scores for scientific accuracy, reference accuracy, and healthfulness. Among these books, scientific accuracy scores range from 20% to 95%, with an average of 48%.
Reference accuracy and healthfulness do somewhat better, with average scores of 65% and 67%.
This suggests that the information quality of popular nutrition books is highly variable, low to medium on average, and particularly poor in scientific accuracy.
Of course, nutrition books are only a fraction of the nutrition media environment. How common is nutrition misinformation in other media? Very little research has been done on this to date, and the few studies that have been published cover only narrow slices of the nutrition media environment. A 2019 review on health-related misinformation shared on social media identified only three studies on nutrition misinformation; across very different contexts (Italian social media, YouTube videos on anorexia, and Arabic Twitter), misinformation was common.
An additional study searched the scientific literature and popular media for “myths and presumptions” about obesity — mostly related to nutrition — and concluded that “false and scientifically unsupported beliefs about obesity are pervasive in both scientific literature and the popular press.”
Although the problem hasn’t been quantified well, anyone who has used the Internet will recognize that fishy nutrition claims are common there. Take, for example, the ancestral lifestyle advocate Brian “Liver King” Johnson, who likes to throw spears and pose shirtless with enormous raw beef livers. With nearly 6 million followers across social media, Liver King is an influential source of diet and lifestyle advice. Until recently, he insisted that his “all-natural” bodybuilder physique was entirely due to his “ancestral” diet, exercise, and lifestyle. Late in 2022, he was forced to come clean after a video exposed his extensive use of bodybuilding drugs. Now consider that Liver King is just one of thousands of diet influencers across the Internet.
It may not shock you to learn that a man who looks like a professional bodybuilder and wears animal pelts on his head owes his physique more to drugs than to beef liver. But nutrition misinformation isn’t always so obvious. At times, it’s so subtle and pervasive that it’s simply the water we swim in. Walking through the grocery store, you may have noticed that many foods and supplements make health claims of one kind or another. Since explicit claims about the impact of a food on specific health conditions are regulated in the U.S. and Europe, health claims are usually more indirect. They’re typically statements about a food’s nutrient content like “low fat” or vague structure/function claims like “supports brain health,” which are only lightly regulated.
To explain how these health claims could be harmful, consider Tang, a drink that is little more than flavored sugar water. The citrus-adjacent beverage advertises that it provides 100% of a person’s daily requirement for vitamin C. Although vitamin C is not a nutrient of concern for the U.S. public
and therefore likely has no nutritional benefit for most people, this may give consumers the impression that Tang is healthier than it is, increasing the likelihood that people will buy it and drink it. As we saw previously, breads labeled “multigrain” and “wheat” often confuse consumers into thinking they’re buying healthier whole-grain bread, when it’s actually closer to refined white bread. Studies tend to suggest that health claims on product packaging increase the likelihood that people will buy and consume those foods and beverages.
Health claims on food packaging can be informative and may lead to healthier choices in some cases. Nonetheless, some of these claims are harmful because — despite being technically accurate — they mislead consumers into buying food that isn’t as healthy as they think it is. Although the impact of this misdirection is probably subtle, it has the potential to happen each time we buy food, so it could add up to a substantial public health burden. Yet its impact remains largely unknown.
It seems likely that nutrition misinformation is pervasive. Why is there so much of it, and why do people fall for it?
We Are What We Eat
My father once told me that the three topics people are the most irrational about are religion, politics, and nutrition. While you may or may not agree with this, these topics have two things in common that favor misinformation: They relate strongly to our identities, and they’re hard to get definitive answers about.
The foods we eat, and how we eat them, are tightly intertwined with our identities. We’re attached to the foods that represent our cultures and our families. We’re attached to our personal eating habits. We display our values and discernment to others through the foods we select. And as something we put into our bodies several times a day, food has a personal intimacy that most areas of science don’t.
The internet has also facilitated our atomization into various diet tribes. Some even describe themselves in these terms, like the World Carnivore Tribe on Facebook. On Twitter, users form communities and announce their affiliations using icons and emojis: steak emojis for carnivores and Ⓥ icons for vegans.
Adopting a set of tribal beliefs, whether accurate or not, may play an important social role by signaling and reinforcing group affiliation.
Once a person has identified with a tribe, they tend to adopt tribal beliefs,
minimize the downsides of those beliefs, and defend the tribe against perceived attacks from alternative ideas. This dynamic can favor misinformation and cause people to resist corrective evidence.
To understand why this happens, we have to understand the appeal of popular diets and the communities that surround them: they offer solutions to peoples’ problems. “A lot of people are just desperate to not be in pain,” explains Alan Levinovitz, associate professor of religion at James Madison University. Levinovitz is the author of the book Natural: How Faith in Nature’s Goodness Leads to Harmful Fads, Unjust Laws, and Flawed Science, which draws parallels between religion and diet cultures. “Nutrition misinformation is part of every single magico-religious tradition and is part of a broader area of misinformation, which is misinformation around healing.” In addition to providing community, Levinovitz argues, diets can provide a sense of hope for people who suffer from health conditions or worry about developing them. Diets give us a feeling of control over our vulnerable human lives. They also offer simple rules to navigate life’s thicket of daily decisions more easily. In these ways, diets can fill a similar psychological niche as religion (this is not to deny that diets can have physical health benefits as well). From this perspective, diets are more than just food, and believing misinformation is more than just a cognitive error.
People who are suffering want relief, and if none is available from their doctor or other conventional sources, they may turn to alternative sources that offer sympathy, community, and extraordinary claims. And there are plenty of people ready to make such claims.
Consider two books, both written in an accessible style for a general audience. One, Why Calories Count, argues that century-old findings showing that calorie intake impacts body weight are still true. Another, The Calorie Myth, argues that experts have been wrong about calories all along. Which do you think sold more copies? If you guessed the second, you’re correct.
Research suggests that misinformation often spreads more readily than accurate information and this is correlated with its greater novelty. Novel and exaggerated claims command more attention and sell more books, so authors have a strong incentive to make them and publishers have a strong incentive to print them. Compounding the problem, there is little disincentive against publishing misinformation because the general public usually doesn’t have the resources to critically evaluate it. Inaccurate or misleading nutrition claims rarely come back to bite the author. In fact, when delivered skillfully, the dividends of these types of claims are attention, respect, and money. Yet their impact on the audience is often not as beneficial.
Dying for Health
Sifting through the academic literature on nutrition misinformation, I was struck by the absence of answers to obvious questions. In particular, I wasn’t able to find a single estimate of the total public health burden of nutrition misinformation. So I built a model. It estimates the number of premature deaths caused by nutrition misinformation each year in the U.S.
It’s not hard to understand why no one has done this before. Since we don’t have estimates of the total impact of nutrition misinformation on eating behavior, we can’t hope to infer a precise estimate of its impact on health. Academic researchers probably aren’t excited about publishing a model whose output relies on guesses and has a 30-fold uncertainty range. So why do it?
First, it allows us to place plausible bounds around our estimate. Second, it identifies our main sources of uncertainty and helps us think about how those might be addressed in the future. Third, it provides a foundation on which more precise models can be built.
Conceptually, the model is very simple. It multiplies together three parameters:
How many people die each year in the U.S.?
What percentage of all deaths are premature and caused by suboptimal nutrition?
What percentage of suboptimal nutrition is caused by nutrition misinformation?
About 2.9 million people die each year in the U.S. What percentage of these are premature deaths caused by suboptimal nutrition? The published estimates I found range from 14% to 22%. This implies that about 400,000 to 600,000 people die from suboptimal nutrition in the U.S. each year. Although it’s fair to be skeptical of these figures, I don’t think this is the model’s main source of uncertainty so I take them at face value.
The model’s main source of uncertainty is the percentage of suboptimal nutrition that is caused by nutrition misinformation. I reached out to several experts, and my Twitter audience, to brainstorm ways to estimate this parameter, but I didn’t come up with anything workable. So this is where we enter the world of plausible guesses. I think nutrition misinformation impacts average eating habits to some degree, but I don’t think it’s one of the main reasons most people eat a suboptimal diet. Obvious factors like cost, convenience, taste, habits, and culture are probably more important for most people. Therefore, I take 1% as my lower bound and 20% as my upper bound.
I also use corresponding lower-bound and upper-bound estimates for the percentage of all deaths that are caused by suboptimal nutrition.
Using these inputs, the model estimates that between 4,000 and 127,000 Americans are killed by nutrition misinformation yearly. If you don’t like my guesses, you can make your own: As a rule of thumb, for each 1% divergence from an optimal diet that is caused by nutrition misinformation, 5,000 Americans die each year.
For context, the U.S. Department of Transportation estimates that about 43,000 Americans were killed in motor vehicle accidents in 2021 — an unusually deadly year. The Centers for Disease Control and Prevention (CDC) estimate that about 19,000 Americans were killed in gun homicides in 2020. Both statistics have triggered urgent national conversations about public safety. For the death count from nutrition misinformation to exceed each of these individually, it would only have to account for about 9% of our suboptimal diet choices. I think it’s plausible that nutrition misinformation could kill more Americans each year than both combined.
Dying is bad, but so is wasting money, and in the U.S. we waste a lot of money on ineffective dietary supplements and health foods. In 2020, Americans spent about $50 billion on dietary supplements, or $194 per adult on average. A subset of these supplements are well supported by evidence, but most are probably ineffective.
For example, Americans spend about $2.1 billion per year on weight loss supplements, and despite grandiose claims from the likes of Dr. Oz, not one of these has been shown to cause clinically meaningful weight loss.
The ineffectiveness of weight loss supplements is particularly striking when compared with the new generation of weight loss drugs like Wegovy, which causes 15%-18% loss of body weight in people with obesity when paired with diet and exercise advice.
Nutrition misinformation can also harm us by leading us to forgo effective medical treatment. When Steve Jobs was diagnosed with a typically treatable form of pancreatic cancer in 2003, he initially declined surgery, and instead tried to treat it with a vegan diet and other alternative treatments. This was ineffective, and nine months later he had the surgery. While it’s impossible to say whether Jobs’ flirtation with alternative remedies is the reason why he eventually succumbed to his cancer, we do know that, on average, cancer patients who choose alternative treatments (which may include nutritional supplements and/or special diets) instead of conventional cancer treatment are about 2.5 times as likely to die during follow-up as patients who receive conventional treatment.
Nutrition misinformation can also foster an irrational fear of ordinary foods. According to the popular book The Carnivore Code, most plant foods are riddled with toxins, meat that isn’t pasture-raised is suboptimal, and even tap water is a questionable “fluoride, chlorine, and pharmaceutical-enriched liquid.”
The way to avoid obesity, disease, depression, and low-libido is a strict diet of pasture-raised meats and organs like liver and testicles. While it’s normal and healthy to have rules around what we eat, when rules become irrationally rigid and enforced by fear, they can cross the line into disordered eating. The archetype of this is orthorexia nervosa, an eating disorder in which people excessively restrict the types of foods they eat, as opposed to the total amount of food, as in anorexia nervosa. This can cause serious psychological distress and disconnect people from their culture, friends, and family. People who follow The Carnivore Code and other strict diets don’t necessarily have an eating disorder, but it’s not hard to see how the arguments might incline some of them in that direction. And while a strict carnivorous diet is an extreme example, less extreme diets like gluten-free, plant-based, and low-carb diets can have a similar impact on susceptible people.
Trust in institutions is a fundamental building block of a well-functioning society. I believe nutrition misinformation has a subtly corrosive effect on society by reducing public trust in scientists, nutrition professionals, doctors, and the government. A common theme in popular nutrition books is a contrarian narrative that paints these groups as incompetent, biased, and/or corrupt.
Delegitimizing conventional nutrition authorities lets authors fill the void with their own arguments, which in my experience, are usually weaker than those they seek to dismiss. That said, conventional nutrition authorities do make mistakes, and sometimes big ones.
Nutrition science is hard, most people are naturally overconfident in their beliefs, and some people have real conflicts of interest. For these reasons, we should be somewhat skeptical of conventional nutrition authorities! But probably not as skeptical as certain nutrition contrarians would like us to be.
Setting the Table
How can we address nutrition misinformation? The first challenge is that we don’t know much about it. How much does it guide our eating behavior? What are its main sources? What is its public health impact? And, fundamentally, what even counts as nutrition misinformation?
It’s easy to pen a concise definition of nutrition misinformation, but the devil is in the details. As we’ve seen, nutrition is a complex and uncertain science, and there’s no bright line between accurate information and misinformation.
However, I believe it’s nevertheless possible to judge the information quality of nutrition claims in a useful, if imperfect way. This topic is too complex to fully address here, but I think the following principles are important. First, due to the uncertainty involved, we shouldn’t necessarily try to identify nutrition misinformation per se. A better approach is to judge the information quality of claims using a semiquantitative scale, like the zero-to-four scale we use at Red Pen Reviews.
Second, scoring should be done according to a well-defined method that’s designed to maximize informativeness and consistency and minimize human bias. Third, reviewers need both nutrition-specific expertise and a more general ability to think critically about the strength of different evidence types.
Once we’ve identified low-quality nutrition information, what do we do about it? It’s important to point out that we’re already addressing this problem to some extent. In the U.S. and the European Union, health claims on foods and supplements are regulated, and watchdog organizations like the Center for Science in the Public Interest help enforce this regulation through regulatory letters and in the courts. Foods and supplements cannot make direct claims about treating or preventing specific health conditions without convincing evidence. Although largely invisible to the public, and not completely effective, government regulation of health claims on foods and supplements is a vast dam holding back a churning sea of nonsense.
Yet across most nutrition media, there is very little protecting the public from misinformation. Health claims in diet books aren’t regulated, publishers rarely fact-check them, and authors and publishers have little incentive to maintain high information quality because the public doesn’t have the resources to critically evaluate most of their claims.
An analysis of the books we’ve reviewed so far at Red Pen Reviews reveals that there is no correlation between our overall information quality score for a book and its Amazon rating. While book reviews in respected media outlets like The New York Times and The Atlantic may appear more informative, most aren’t written by experts and don’t fact-check even a single citation in the book they’re reviewing. One-off reviews by experts can be informative but they aren’t available for most books and they may not be easily accessible to the public. The public simply does not have a reliable way to gauge the information quality of most popular nutrition books, and I would argue, nutrition media in general.
This is why we need organizations that assess and communicate the information quality of popular nutrition media in a rigorous and accessible way. Not only does this give the public a much-needed resource for judging the information quality of nutrition media, it provides a long-term incentive for authors to publish better information in the first place. I’d like to see these types of efforts expand into other media classes like social media and news media, and other domains of health and performance like medicine and athletics, but they will need more attention and resources to realize their potential.
Social media is a major conduit for misinformation, so voluntary or legislated anti-misinformation measures could potentially apply some portion control to the public’s nutrition misinformation diet. Research suggests that a combination of modest anti-misinformation measures could reduce the spread of viral political misinformation on social media by more than half. For example, platforms could curtail the algorithmic amplification of misinformation topics, reducing their impact without having to remove them outright. Or they could “nudge” users to consider information accuracy before sharing content more likely to contain misinformation, which reduces the spread of misinformation. However, studies like this haven’t yet been done on nutrition misinformation specifically. In addition, curtailing nutrition misinformation on social media requires being able to identify it in the first place, which can be challenging.
Another strategy for combating misinformation is to give the public stronger signals of the quality of online information sources. One project along these lines is NewsGuard, a browser extension that uses a green-red signal and a zero-to-100 “trust score” to indicate whether a news website meets basic standards of credibility and transparency. Nothing like this currently exists in nutrition media.
We should also work toward addressing misinformation in scientific research and its translation through university press releases and news media. There are many possibilities for addressing this problem, including emphasizing best research practices in scientific training and journal requirements, identifying and addressing the countless “predatoryjournals” that don’t meet scientific standards, supporting watchdogs like the data thugs and Retraction Watch, improving how researchers communicate their findings to journalists, and training journalists to communicate science more accurately. Since we know that many studies don’t replicate, public communication about science should focus on bodies of evidence rather than the unreliable churn of the latest nutrition studies.
Ultimately, we may also need to ask a deeper question: How do we address the suffering and alienation that drive some people toward nutrition misinformation? If we’re going to pull the rug out from under beliefs that give people hope and a sense of control over their lives, can we offer them a better alternative?
I don’t have answers to these questions, but perhaps we should consider how society might meet these needs in a more constructive way.
Given its apparent importance, I think we should be investing more resources in studying and addressing nutrition misinformation. I hope to see a world where people like David don’t have to face amputation before figuring out that the diet they’re on isn’t as infallible as they were told.
Name changed for anonymity at David’s request.
This is not an exhaustive list of the special challenges that face nutrition science. The inability to use placebo controls, and the fact that one food usually has to replace another, are two other reasons why interpreting the findings of nutrition studies can be difficult.
GRADE is a widely used system for judging and communicating the strength of conclusions in systematic evidence reviews. A group of nutrition researchers published an alternative system called NutriGrade that applies a more lenient metric to nutrition studies.
Peter Lurie, president of the Center for Science in the Public Interest, described it in this way when we spoke. Researchers from the GRADE working group made a similar argument in their response to the NutriGrade paper that advocates more lenient standards of evidence for nutrition: “lack of blinded randomized controlled trials and the resulting sparse bodies of randomized evidence is not a methodologic shortcoming of the GRADE approach but a limitation of the evidence base.”
“In concert with the findings from the present study, non-caffeinated compounds are likely responsible for the beneficial effects of coffee consumption on CVD and survival. … Mild–moderate coffee intake of all types should not be discouraged but rather considered part of a healthy lifestyle” (Chieng et al. 2022, p. 9).
The Bulletproof Diet, p. 208. See our review of this book here.
Eat Right 4 Your Type, p. 174. See our review of this book here.
The Salt Fix, pp. 100, 107. See our review of this book here.
For reference, we define scores of 0%-49%, 50%-74%, and 75%-100% as indicating low (red), medium (yellow), and high (green) information quality. A scientific accuracy score of 48% means that a book’s claims are weakly supported by evidence, on average. Before converting to a percentage score, we use a 0-4 semi-quantitative scoring system. Forty-eight percent corresponds to a score of 2, which for scientific accuracy criterion 1.1 is defined as “Overall, relevant evidence is intrinsically weakly convincing but is consistent with the author’s claim. Or, relevant evidence is intrinsically convincing but only weakly supports the author’s claim.”
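To make the conversion concrete, here is a minimal sketch of how averaged 0-4 criterion scores could be rescaled to a percentage and mapped to the color bands described above. This is my own illustration, not Red Pen Reviews’ actual scoring code; the function names are hypothetical.

```python
def to_percentage(scores):
    """Rescale a list of 0-4 criterion scores to a 0-100% quality score."""
    if not scores:
        raise ValueError("at least one criterion score is required")
    return 100 * sum(scores) / (4 * len(scores))

def color_band(pct):
    """Map a percentage score to the bands defined in the text."""
    if pct < 50:
        return "red"     # low information quality (0%-49%)
    elif pct < 75:
        return "yellow"  # medium information quality (50%-74%)
    return "green"       # high information quality (75%-100%)

# A book averaging exactly 2 out of 4 across criteria would score 50%;
# an average just under 2 yields the 48% (red) discussed above.
pct = to_percentage([2, 2, 2, 2])  # -> 50.0
band = color_band(48)              # -> "red"
```

Under this sketch, the 48% scientific accuracy score corresponds to a per-criterion average slightly below 2 on the 0-4 scale.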
Reference accuracy scores how well a citation supports the passage it’s associated with. Healthfulness is a composite of scores for how well an intervention would address the target condition in the target population, how it would impact general health, and whether it would supply all essential nutrients and important nonessential nutrients like fiber.
An Italian study reports that nutrition is one of the most popular topics in Italian Facebook’s “conspiracy echo chamber.” A second study finds that YouTube videos about anorexia often promote misleading pro-anorexia narratives. A third study reports that more than half of health-related claims on Arabic Twitter (including, but not limited to, nutrition claims) from nonofficial health institutes and dieticians were rated as “false” by a panel of expert judges.
This FDA page gives an overview of the categories of health claims, including nutrient content claims and structure/function claims, and how they are regulated.
For example, the carnivore diet, and to a lesser extent low-carbohydrate diets in general, appear to increase LDL cholesterol in some people. A common belief in the carnivore diet community is that high LDL cholesterol is harmless and possibly even beneficial. This is contrary to a large and convincing body of scientific evidence. The vegan diet, on the other hand, tends to lower LDL cholesterol, and I have not encountered any writing from that community arguing that high LDL cholesterol is harmless.
As judged by the Amazon.com sales rank and the number of ratings on December 8, 2022. Why Calories Count had a sales rank of #1,723,952 in Books and 58 ratings. The Calorie Myth had a sales rank of #349,528 in Books and 802 ratings.
This is my 80% confidence range.
This is more of a stretch if we think about it in terms of years of life lost rather than number of deaths. The reason is that nutrition-related premature deaths tend to kill people who are middle-aged and older (mostly from heart attacks and strokes), whereas motor vehicle crashes and firearm homicides tend to kill younger people.
In a Washington Post column, Craig Hopp, deputy director of the U.S. National Center for Complementary and Integrative Health, is quoted saying that the number of dietary supplements that have well-established benefits is a “short list.” A U.S. Preventive Services Task Force Recommendation Statement published in The Lancet concludes that “the current evidence is insufficient to assess the balance of benefits and harms of the use of multivitamin supplements for the prevention of cardiovascular disease or cancer.”
Defined as at least 5% average loss of body weight sustained over at least one year, which is a common threshold for clinically meaningful weight loss.
To be clear, I’m not arguing that these diets are bad overall, simply that some of the information associated with these diet communities can foster irrational fears and disordered eating in susceptible people.
Science journalist Gary Taubes makes this argument a centerpiece of his influential nutrition books. For example, a scathing passage from page 451 in Good Calories, Bad Calories argues that researchers in the fields of “nutrition, chronic disease, and obesity” are not real scientists, and what they do is not real science. Much of the rest of the book details alleged incompetence in research and government nutrition policy. Similar arguments have been repeated in a number of other popular nutrition books, including Nina Teicholz’s book The Big Fat Surprise: Why Butter, Meat and Cheese Belong in a Healthy Diet.
The biggest one that comes to mind for me is the idea, most prevalent in the 1980s to 2000s in the U.S., that low-carbohydrate diets are dangerous. The idea was gradually overturned by a steady accumulation of randomized controlled trials and observational studies.
The Fact Checker column in The Washington Post uses a similar scale to judge the factuality of political rhetoric — from one to four Pinocchios.
Credit to Alan Levinovitz for getting me to think about this.
Stephan J. Guyenet, PhD is a former researcher in the fields of obesity and neuroscience and the current director of Red Pen Reviews. His book The Hungry Brain was named one of the best books of the year by Publishers Weekly and called “essential” by The New York Times Book Review.