“Chocoholics” and other nutrition myths

We all make multiple food choices daily, and our beliefs about nourishment are often shaped more by popular culture than by science. Americans spend billions of dollars annually in search of the next weight loss miracle. But what should we believe? Here are some common nutrition myths.

Though people have used terms like “chocoholic” since the 1960s, language about food addiction has intensified in recent decades. Food Addicts Anonymous lists sugar, along with flour and wheat, as addictive in nature. Multiple companies sell supplements to help people “detox” from sugar.

But is chocolate really addictive? The high sugar and fat content of many chocolates can activate the brain's reward pathways, triggering addiction-like cravings. Yet behaviors like smiling and hugging also affect those pathways, and we don't consider them addictive.

A 2016 review in the European Journal of Nutrition found “little evidence to support sugar addiction in humans.” Ironically, the idea of sugar addiction can, in and of itself, perpetuate the myth. People who believe that a food is addictive, and restrict their diet as a result, may experience increased thoughts about off-limits foods, according to a study by psychologists at the University of Liverpool.

Chocolate is high in magnesium, which helps regulate your cardiovascular system. Antioxidants in chocolate help clear plaque out of the arteries, reducing the risk of heart disease. Flavonoids found in chocolate may lower blood pressure and improve your blood flow overall. So go ahead, have a Hershey Bar.

Social media influencers, wellness corporations and celebrities tell us to avoid all processed foods, since they are supposedly less nutritious. The website "Eat This, Not That" claims that eliminating them can slow down aging, reduce cellulite and even improve bowel movements.

Well, broadly speaking, humans have been “processing” food for centuries, by grinding and curing meat and cooking meals, among other activities. The U.S. Department of Agriculture defines processed foods as including all food that has undergone any changes to its natural state, from washing to preserving to packaging.

Modern food processing includes a range of methods — canning, grain-husk removal, fermentation and freezing — that can preserve nutrients. Frozen produce can be more nutrient-dense than those same items sold fresh in the grocery store.

Another way to process food is to fortify it: Pregnant women benefit from the folic acid that the Food and Drug Administration mandates in enriched grain products, which can help prevent neural tube defects. The vitamin D added to foods such as milk and tofu helps many of us during the winter months when the sun doesn't hit our skin. Iodized salt is often our primary source of that essential nutrient.

In 1945, the U.S. Food and Nutrition Board estimated that the average man needed 2,500 calories each day and, at roughly one milliliter of water per calorie, would therefore need 2.5 liters of water daily (though the group also noted that most of that water would come from prepared foods). From there, the notion of an eight-cup daily requirement was applied to the general population. The myth has found staying power in messaging from groups sponsored by bottled-water companies, which backed studies aiming to show that people were not hydrating enough.
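For readers who want to check the arithmetic, here is a toy sketch of the 1945 rule of thumb (roughly one milliliter of water per calorie). The function name and the default ratio are my own illustration, not an official formula:

```python
def water_estimate_liters(calories: float, ml_per_calorie: float = 1.0) -> float:
    """Rule-of-thumb estimate attributed to the 1945 Food and Nutrition
    Board: about 1 mL of water per calorie consumed per day."""
    return calories * ml_per_calorie / 1000.0

CUPS_PER_LITER = 1000 / 236.6  # one US cup is about 236.6 mL

liters = water_estimate_liters(2500)  # 2,500-calorie day -> 2.5 L
cups = liters * CUPS_PER_LITER        # about 10.6 cups, not eight

print(f"{liters:.1f} L is about {cups:.1f} cups")
```

Note that 2.5 liters works out to more than ten cups, so even the board's own estimate doesn't match the eight-cup slogan, and the board stressed that much of that water already comes from food.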

The American Physiological Society found no scientific evidence for the eight-cups-a-day recommendation. Many factors influence an individual's hydration needs, including climate, physical activity, sweat rate, body weight and hormones. "No single formula fits everyone," confirms the Mayo Clinic. Body cues such as urine color can help us gauge our fluid needs; by the time we feel thirsty, we're often already mildly dehydrated. Don't gulp down several glasses at a time, though: Taking sips throughout the day can lead to better absorption.

One fad diet holds that raw foods, in their "natural state," have more nutritional value. Advocates recommend that people not consume any foods prepared at temperatures higher than 118 degrees Fahrenheit. "The most fragile of enzymes start to die off at that 42C/115F degree mark," claims one blogger. Cooking supposedly "denatures" those enzymes and "destroys" vitamins, minerals and proteins.

Cooking with heat, or “thermal processing,” actually improves the bioavailability of many nutrients and phytochemicals, such as the lycopene in tomatoes. Foods including sweet potatoes, dry beans, grains and rhubarb also have nutrients that are better absorbed and digested when cooked. Different methods of cooking can have different effects on overall nutrient density: Boiling vegetables can reduce their water-soluble vitamins, such as thiamin and vitamin C, but steaming, baking and stir-frying can minimize this loss.

When people are trying to lose weight, they often invent or adopt rules to keep them from eating — and a common one is to not eat after dark, or to have a cutoff time for eating. As a result, many have internalized the idea that consuming food after a specific hour leads to weight gain and ill health. Writing in O, the Oprah Magazine in 2003, Oprah Winfrey described how she didn’t eat after 7:30 p.m. — “not even a grape.” Under the headline “Why Not to Eat after 7 p.m.,” the website Livestrong.com claims that people should not eat “beyond the traditional dinner hour.”

Some research shows that people who consume most of their daily calories in the evening tend to choose less nutrient-rich foods (and drink more alcohol), but that doesn’t make nighttime eating inherently unhealthy. Imposing arbitrary time restrictions on eating can have negative effects on health. Since the body continues to use energy overnight, the drops in blood sugar associated with forgoing food can disturb sleep patterns.

Meanwhile, some research has shown that pre-sleep snacking might have benefits: Scientists in the Netherlands found that having a bite before bed can improve muscle recovery after exercise training.
