Science Digest – June 18, 2021

Hey, Premium Members!

We have another awesome issue of the Science Digest for your weekend reading. Check it out to learn how…

Exercise corrects cognitive impairments following a night of sleep deprivation.
Caffeine consumption increases glaucoma risk in genetically susceptible people.
Creatine intake improves cognitive performance in older adults.

And much more!

In other news: We’ve got another Crowdcast live Q&A coming up Saturday, July 10, at 9:30 am PDT. The code for this event is BDNF. Remember, you can always access the most recent event code and Q&A calendar on the member’s dashboard.

And coming soon: A brand-new interview featuring repeat FoundMyFitness guest Dr. Satchin Panda. He and I discuss the latest on circadian rhythms, time-restricted eating, and optimizing performance. You won’t want to miss this one!

Enjoy!

Rhonda and team

Science Digest – June 18, 2021

Exercise corrects cognitive impairments caused by sleep deprivation.

Chronic sleep deprivation, a risk factor for cardiovascular and neurodegenerative diseases, is common among adults in industrialized nations, with many reporting less than seven hours of sleep each night during the workweek. Acute sleep deprivation impairs aspects of cognitive function, including working memory, attention, and executive function, increasing the rate of mistakes made while driving, at work, and at home. A recent report details the effects of aerobic exercise on cognitive function following a night of sleep deprivation.

Sleep deprivation has been shown to decrease brain oxygen saturation, contributing to cognitive impairment. However, previous research has demonstrated that aerobic exercise increases blood flow to the prefrontal cortex region of the brain, activating key regions that improve cognitive function. Enhanced blood flow to the brain during exercise increases oxygen saturation in brain tissue and improves energy metabolism.

In this study, the researchers recruited 12 participants (average age, 21 years) and measured their baseline maximum aerobic capacity while riding a stationary bicycle. On a separate study visit, participants completed cognitive testing and answered questionnaires before and after cycling for 20 minutes at 60 percent of their maximum aerobic capacity. Participants wore an electrode cap on their heads during exercise to measure electrical activity in the brain. They completed this study visit twice in random order: once after a full night of rest and once following a night of total sleep deprivation.

Sleep-deprived participants reported feeling sleepier and performed significantly worse on cognitive tests prior to exercise. Following exercise, both sleepiness and cognitive function improved under both conditions, although cognitive scores remained comparatively worse after sleep deprivation. Under both rested and sleep-deprived conditions, blood flow to the prefrontal cortex increased within 12 minutes of initiating exercise and was maintained until the end of exercise. However, the researchers did not find a relationship between oxygenation levels and cognitive function, so whether exercise-induced brain oxygenation had a beneficial effect on cognitive function is unclear.

The authors concluded that exercise corrects the cognitive impairment induced by sleep deprivation, although future research is needed to identify the best exercise protocol to maximize the benefits on brain oxygenation following sleep deprivation.

Link to full report.
Learn more about the connection between sleep and cognitive function in this video featuring Dr. Rhonda Patrick.

Higher caffeine consumption increases glaucoma risk in genetically susceptible people.

Consuming caffeinated beverages such as tea and coffee is a common practice in many cultures around the world. While tea and coffee contain beneficial plant compounds such as antioxidants that lower the risk of disease, the caffeine in tea and coffee increases blood pressure, which may contribute to hypertension-related diseases such as chronic kidney disease, stroke, and glaucoma. Findings of a recent report detail the relationship between caffeine consumption and glaucoma risk.

In glaucoma, the pressure of fluid inside the eye (called intraocular pressure) increases, damaging the optic nerve at the back of the eye. Over time, as damage to the optic nerve accumulates, the risk of vision loss increases. While previous research investigating acute caffeine intake has demonstrated an increase in intraocular pressure following caffeine consumption, research on chronic caffeine consumption has found no relationship between caffeine intake and glaucoma risk. The risk of habitual caffeine consumption may depend on genetics, as other research has found a positive relationship between habitual caffeine intake and glaucoma in those with genetic susceptibility.

The authors conducted a genome-wide association study, a type of study in which researchers look for associations between gene variations called single-nucleotide polymorphisms and disease prevalence. The authors collected genetic data, clinical data measuring intraocular pressure, and self-reported dietary data regarding coffee and tea consumption from more than 100,000 participants enrolled in the United Kingdom Biobank study. Next, the authors calculated each participant’s polygenic risk score, which estimates an individual’s genetic susceptibility to a specific disease. Finally, the authors performed a Mendelian randomization analysis, which measures variation in specific genes in order to examine the cause-and-effect relationship between environmental factors (coffee consumption and total caffeine intake) and disease (glaucoma) risk.
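
To make the polygenic risk score concept concrete, here is a minimal sketch of how such a score is typically computed – a weighted sum of a person’s risk-allele counts, with the weights taken from GWAS effect sizes. The variant IDs, effect sizes, and genotypes below are hypothetical, not values from this study.

```python
# Minimal sketch of a polygenic risk score (PRS) calculation.
# For each genetic variant, multiply the number of risk alleles a
# person carries (0, 1, or 2) by that variant's GWAS effect size,
# then sum across variants. All IDs and numbers are hypothetical.

gwas_effect_sizes = {
    "rs0000001": 0.12,   # effect size (beta) per copy of the risk allele
    "rs0000002": 0.08,
    "rs0000003": -0.05,  # a protective allele carries a negative weight
}

participant_genotype = {  # risk-allele counts for one participant
    "rs0000001": 2,
    "rs0000002": 1,
    "rs0000003": 0,
}

def polygenic_risk_score(genotype, effect_sizes):
    """Weighted sum of risk-allele counts across all scored variants."""
    return sum(beta * genotype.get(variant, 0)
               for variant, beta in effect_sizes.items())

score = polygenic_risk_score(participant_genotype, gwas_effect_sizes)
print(f"Polygenic risk score: {score:.2f}")  # higher = greater susceptibility
```

Scores like this are then ranked across the cohort so that participants with the highest genetic susceptibility can be compared against everyone else.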

The data revealed that overall, greater total caffeine intake was associated with lower intraocular pressure, but the relationship was not statistically significant. Participants consuming the most caffeine (232 milligrams of caffeine per day or more, the amount in 20 ounces of coffee) had a reduction in intraocular pressure of 0.10 millimeters of mercury (the unit used for blood and eye pressure) compared to participants with the lowest caffeine intake (less than 87 milligrams per day, the amount in one 8-ounce cup of coffee).

Among participants with a high polygenic risk score for glaucoma, however, consuming greater amounts of caffeine (more than 480 milligrams per day, the amount in 42 ounces of coffee) was significantly associated with an increase in intraocular pressure of 0.35 millimeters of mercury compared to participants consuming the least caffeine (less than 80 milligrams per day, the amount in 7 ounces of coffee). While the authors found no overall relationship between caffeine intake and glaucoma, participants with high polygenic risk scores consuming 321 milligrams of caffeine per day or more were four times more likely to develop glaucoma.
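
The ounce equivalents quoted above follow from a simple assumed caffeine density of roughly 90 to 95 milligrams per 8-ounce cup of brewed coffee (actual caffeine content varies widely by brew). A quick sketch of the conversion:

```python
# Quick arithmetic: converting a daily caffeine dose in milligrams to an
# approximate volume of brewed coffee, assuming ~92 mg of caffeine per
# 8-ounce cup. This density is an assumption; brews vary considerably.

MG_PER_OUNCE = 92 / 8  # assumed caffeine density (mg per fluid ounce)

for dose_mg in (80, 87, 232, 321, 480):  # thresholds cited in the study
    print(f"{dose_mg} mg/day is about {dose_mg / MG_PER_OUNCE:.0f} oz of coffee")
```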

The authors concluded that they found no relationship between caffeine consumption and glaucoma risk in the general population. However, their analyses revealed a significant relationship between higher caffeine consumption and increased risk of high intraocular pressure and glaucoma incidence in those with the highest genetic susceptibility for the disease.

Link to full report.

Creatine intake improves cognitive performance in older adults.

Cognitive decline diminishes quality of life for those affected and represents a growing public health burden, with one in nine adults aged 65 years and older living with some level of cognitive impairment. Previous research has identified dietary components that reduce age-related cognitive impairment, including vitamin E, folate, and vitamin D. Authors of a new report investigated the effects of creatine on cognitive function in older adults.

Creatine is a compound found primarily in muscle, with smaller amounts in the brain. It plays an important role in regenerating cellular energy in the form of adenosine triphosphate (ATP). Creatine is obtained in the diet from red meat, poultry, fish, and seafood, with an 8-ounce cut of lean beef or a 4-ounce cut of salmon providing roughly 1.5 to 2.5 grams of creatine before cooking. Creatine is also available as a dietary supplement.

In older adults, metabolic dysfunction decreases cellular energy, potentially contributing to cognitive decline. One meta-analysis found that dietary creatine improves aspects of cognitive function in young and older healthy adults.

The authors of the current study analyzed data from more than 1,300 participants (average age, 71 years) in the National Health and Nutrition Examination Survey (NHANES), a long-term study collecting lifestyle and health data from people living in the United States. Participants completed cognitive testing and a 24-hour dietary recall during an interview with NHANES staff. The researchers analyzed the dietary data for foods containing creatine and calculated each participant’s estimated daily creatine intake. This assessment did not include supplemental creatine.
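
As a rough illustration of how a daily creatine estimate can be derived from a dietary recall, the sketch below multiplies the grams of each creatine-containing food by an approximate creatine density. The food list and densities are illustrative assumptions, not the researchers’ actual nutrient database.

```python
# Illustrative sketch: estimating daily creatine intake from a
# 24-hour dietary recall. Creatine densities (grams of creatine per
# gram of raw food) are rough placeholders, not the study's values.

creatine_density = {
    "beef": 0.0088,    # ~2 g per 8-oz (227 g) serving, per the figures above
    "salmon": 0.018,   # ~2 g per 4-oz (113 g) serving, per the figures above
    "chicken": 0.0034, # assumed, for illustration
}

# One participant's recall: grams of each creatine-containing food eaten.
recall = {"beef": 120, "chicken": 90}

daily_creatine_g = sum(
    grams * creatine_density.get(food, 0.0) for food, grams in recall.items()
)
print(f"Estimated creatine intake: {daily_creatine_g:.2f} g/day")
# Above 0.95 g/day would place this participant in the study's
# top 50 percent of dietary creatine intake.
```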

Participants with higher dietary creatine intake tended to have higher cognitive scores, even after taking into account other nutritional factors and socioeconomic status. Participants in the top 50 percent of creatine intake (more than 0.95 grams of creatine per day) performed better on cognitive testing than those in the bottom 50 percent.

These findings suggest creatine from food may be protective against age-related cognitive decline. Future research is needed to determine a dose of creatine that improves cognition without damaging kidney health, which is a concern for older adults.

Link to full report.
Read more about the effects of creatine on cognition, depression, physical performance, and more in this overview article.

Vitamin D deficiency increases addictive behaviors.

Opioid use disorder is defined as a problematic pattern of opioid use that leads to serious impairment or distress. It has emerged as a public health crisis in the United States, affecting as many as two million people aged 12 years and older. A multifaceted approach to reducing opioid use disorders involves targeting preventable environmental factors in addition to restricting opioid prescriptions. One group of researchers investigated the effects of vitamin D status on reward-seeking behaviors, such as opioid use and sunlight exposure, in mice and humans.

Opioid drugs are highly addictive due to their effects on the reward-seeking centers in the brain. Previous research suggests sunbed tanning may be addictive in ways similar to the effects of opioids. Exposure to ultraviolet (UV) light causes the skin to produce vitamin D. It also induces the production of beta-endorphins, which are natural opioids with addictive properties. Humans’ UV light-seeking behavior may have developed as an evolutionary strategy to maintain adequate vitamin D levels. Whether vitamin D status has an effect on UV light-seeking or prescription opioid-seeking behavior is unclear.

To assess the relationship between vitamin D levels and opioid use disorder, the authors analyzed data from the National Health and Nutrition Examination Survey (NHANES), a long-term study collecting lifestyle and health data from participants in the United States. To determine the effects of vitamin D deficiency on opioid reward sensitivity, opioid-seeking behavior, and pain, the researchers performed multiple experiments with normal mice, mice without functioning vitamin D receptors, and mice without functioning opioid receptors.

Compared to NHANES participants with normal serum vitamin D levels (greater than 20 nanograms per milliliter), those with insufficient vitamin D levels (12 to 20 nanograms per milliliter) were 1.27 times more likely to have opioid use disorder. Participants with vitamin D deficiency (less than 12 nanograms per milliliter) were 1.62 times more likely to have opioid use disorder. In mice, vitamin D deficiency increased the rewarding and pain-relieving effects of UV light exposure through altered gene expression in key brain regions. Finally, vitamin D deficiency sensitized mice to the pain-relieving and rewarding effects of opioid drugs, while restoring vitamin D levels to normal decreased both UV light-seeking and opioid-seeking behavior.

The authors concluded that the addictive effects of sunlight exposure may be an evolutionary strategy to maintain adequate vitamin D levels and that opioid-seeking may be a consequence of vitamin D deficiency. Their results imply that obtaining and maintaining adequate vitamin D levels through UV light exposure or supplementation may help prevent or treat opioid use disorder.

Link to full report.
Learn more about vitamin D in our in-progress overview article.

Sugar intake in adolescence may cause behavioral and cognitive problems later in life.

An abundance of scientific research demonstrates that overconsumption of added dietary sugar is harmful to physical health, but research on its long-term effects on cognitive health is lacking. Findings from a recent study in mice indicate that overconsumption of added sugar during adolescence contributes to the risk of becoming obese, hyperactive, and cognitively impaired as adults.

Current public health guidelines recommend that children between the ages of two and 19 years limit their intake of added sugars to less than 10 percent of their total daily calories – roughly 12 teaspoons per day on a 2,000-calorie diet. Despite these recommendations, the average child living in the United States consumes 15 to 18 teaspoons of added sugar every day, primarily from sugar-sweetened beverages, desserts, and sweet snacks.
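
The teaspoon figure follows from straightforward arithmetic, assuming a 2,000-calorie diet, 4 calories per gram of sugar, and about 4.2 grams of sugar per teaspoon:

```python
# Converting "less than 10 percent of daily calories" into teaspoons of
# added sugar. The 2,000-calorie diet, 4 kcal/g, and 4.2 g/teaspoon
# figures are standard reference values used here as assumptions.

daily_calories = 2000
limit_fraction = 0.10
calories_per_gram = 4
grams_per_teaspoon = 4.2

sugar_calories = daily_calories * limit_fraction      # 200 kcal
sugar_grams = sugar_calories / calories_per_gram      # 50 g
teaspoons = sugar_grams / grams_per_teaspoon          # ~11.9 teaspoons

print(f"Added-sugar limit: about {teaspoons:.0f} teaspoons per day")
```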

The authors of the study conducted two experiments involving four groups of young (adolescent) mice. All of the mice could eat freely from their standard mouse food, but their water sources varied. In the first experiment, water was always available to two groups of mice: one group had access to only plain drinking water, while the other could choose between plain water and a 25 percent sugar solution. In the second experiment, one group had access to only plain drinking water, while the other received plain water plus restricted access (two hours per day, five days per week) to the 25 percent sugar solution. At the end of the 12-week intervention, the authors of the study subjected the mice to a battery of behavioral and memory tests and assessed neurogenesis (the formation of new neurons, a process that supports memory) in the animals’ brains.

They found that the mice that consumed high quantities of sugar became hyperactive when exposed to a new situation and had impaired memory formation – phenomena commonly observed in attention deficit and hyperactivity disorders. They also demonstrated impaired neurogenesis, especially within the dentate gyrus area of the hippocampus, a region associated with memory formation. About four weeks into the 12-week intervention, the mice began to gain weight, and by the end of the study, they weighed about 10 percent more than normal. Mice whose access to sugar was restricted consumed about one-quarter as much sugar as those with full access. They did not gain weight, and they experienced fewer neurocognitive deficits.

These findings suggest that overconsumption of sugar in mice promotes weight gain, behavior problems, and memory impairments. Although this was a mouse study, many of these findings are translatable to humans. Learn more about the harmful effects of sugar on the brain in this episode featuring Dr. Rhonda Patrick.

Link to full study.

Wearable device may predict inflammatory bowel disease flares.

Inflammatory bowel diseases, a group of conditions characterized by chronic inflammation of the digestive tract, affect the lives of nearly seven million people worldwide. The two most common inflammatory bowel diseases are ulcerative colitis and Crohn’s disease. A report published in 2020 describes a wearable device that detects the presence of biomarkers associated with inflammatory bowel diseases in sweat, potentially signaling a symptom flare.

A common feature of many inflammatory bowel diseases is alternating periods of remission and relapse. These periods of relapse are often referred to as “flares.” During flares, levels of interleukin-1 beta (IL-1 beta) and C-reactive protein (CRP) are often elevated. IL-1 beta is a proinflammatory cytokine that mediates the body’s inflammatory response. CRP is a protein that increases in the blood with inflammation and infection.

The authors of the study developed a wearable, watch-like device that monitored levels of IL-1 beta and CRP in the sweat of 20 healthy adults (18 to 65 years old) over a period of up to 30 hours. They studied healthy people to help establish the levels of these two biomarkers in people without inflammatory bowel disease. A removable strip on the device collected the sweat, providing real-time monitoring of the biomarkers. They also measured the biomarkers using a standard assay and compared the two assessments.
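
One simple way to quantify how closely the wearable’s readings track the standard assay is to correlate the paired measurements. The sketch below computes a Pearson correlation coefficient on made-up example values; it is not the authors’ actual analysis pipeline.

```python
# Illustrative sketch: comparing a wearable's sweat-biomarker readings
# against a standard laboratory assay using Pearson correlation.
# The paired values below are invented example data, not study results.

import math

device_readings = [12.1, 15.3, 9.8, 20.4, 14.2]   # e.g., sweat CRP via device
assay_readings  = [11.8, 15.9, 10.1, 19.7, 14.6]  # same samples, reference assay

def pearson_r(x, y):
    """Pearson correlation coefficient for paired measurements."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = math.sqrt(sum((a - mean_x) ** 2 for a in x))
    sd_y = math.sqrt(sum((b - mean_y) ** 2 for b in y))
    return cov / (sd_x * sd_y)

r = pearson_r(device_readings, assay_readings)
print(f"Device vs. assay: r = {r:.3f}")  # r near 1 = close agreement
```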

They found that the device was highly accurate at measuring sweat levels of IL-1 beta and CRP, compared to measurements with the standard assay. Their findings demonstrate proof-of-feasibility for a wearable device that can signal an inflammatory bowel disease flare. Use of such a device may offer a non-invasive way to help people with inflammatory bowel disease track their inflammatory status and help guide clinicians’ treatment decisions.

The authors of this study later measured pro-inflammatory proteins in the sweat of people with COVID-19 to predict cytokine storm, a potentially life-threatening condition that occurs when an infection provokes an excessive immune response. They found that their sweat-based sensor detected cytokine levels in passive sweat that correlated with cytokine levels in the patients’ blood. Wearable devices such as these demonstrate great utility in helping people with both acute and chronic illness.

Link to study abstract.

Learn more about wearable devices in this episode featuring Dr. Michael Snyder.
Learn more about how scientists are using wearable devices to identify whether people might have an infection – including COVID-19 – in this clip featuring Dr. Michael Snyder.

Bright light reduces symptoms of hyperactivity in young children.

Light is the primary signal that establishes circadian rhythms, the body’s 24-hour cycles of biological, hormonal, and behavioral patterns. Exposure to bright light affects mood, attention, and activity levels. Findings from a recent study suggest that exposure to bright light reduces symptoms of hyperactivity in young children.

The study involved 48 preschool children (three to six years old) who were enrolled in E4Kids, a longitudinal study conducted in Australia. The children wore ActiGraph monitors (wristwatch-like electronic devices), which provided measurements of their light exposure and activity levels in one-minute intervals over a period of two weeks. Their parents provided information about the children’s sleep habits, activity levels, behavior, and demographics. The researchers collected other data about the children, including height, weight, body mass index, activity/rest patterns, age, and gender.
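
To give a sense of how minute-by-minute monitor data are condensed for this kind of analysis, here is a small sketch that averages a simulated day of one-minute light samples and counts minutes above a brightness threshold. The simulated values and the 1,000-lux cutoff are assumptions for illustration, not the E4Kids protocol.

```python
# Illustrative sketch: summarizing one day of one-minute light readings
# (in lux) from a wrist-worn monitor. Readings are simulated, and the
# 1,000-lux "bright light" threshold is an assumed cutoff.

import random

random.seed(0)
lux_by_minute = (
    [random.uniform(0, 50) for _ in range(8 * 60)]        # overnight: dim
    + [random.uniform(100, 5000) for _ in range(12 * 60)] # daytime: brighter
    + [random.uniform(0, 200) for _ in range(4 * 60)]     # evening: dimming
)

BRIGHT_THRESHOLD = 1000  # lux

mean_lux = sum(lux_by_minute) / len(lux_by_minute)
bright_minutes = sum(lux >= BRIGHT_THRESHOLD for lux in lux_by_minute)

print(f"Mean light exposure: {mean_lux:.0f} lux")
print(f"Minutes at or above {BRIGHT_THRESHOLD} lux: {bright_minutes}")
```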

They found that children who were exposed to brighter light for longer periods of time exhibited fewer symptoms of hyperactivity, regardless of the timing of the exposure. They noted these effects even at low levels of light, suggesting that young children are more sensitive to the effects of light than adults.

This was an observational study and did not identify mechanisms that drive the relationship between light exposure and behavior. However, the authors of the study posited that light exposure might influence norepinephrine release in the brain, promoting increased vigilance, alertness, and attention. They also suggested that light exposure promotes cortisol release, which may compensate for the diminished cortisol response commonly observed in children with attention disorders.

Link to full study.

Altering energy metabolism lengthens healthspan and lifespan in mice.

Metabolic dysfunction escalates with age and increases the risk of frailty and chronic disease. Many investigational antiaging drugs target proteins involved in energy metabolism, such as the enzyme mammalian target of rapamycin (mTOR) and the hormone insulin-like growth factor 1 (IGF-1). Findings of a new study in mice detail the role of sirtuins in energy metabolism and healthy aging.

Sirtuins are enzymes that play key roles in healthspan and longevity in multiple organisms. They are linked to the regulation of a variety of metabolic processes, including the release of insulin, mobilization of lipids, response to stress, and modulation of lifespan. Sirtuins respond to physiological changes in energy levels, thereby regulating energy homeostasis and health.

During fasting or exercise, the liver helps maintain blood sugar levels using the complementary processes of glycogenolysis (the breakdown of glycogen) and gluconeogenesis (the creation of new glucose). In glycogenolysis, single glucose molecules are released from larger glucose chains, called glycogen, for use as energy. Because the body stores a limited amount of glycogen, the liver uses gluconeogenesis to create glucose from non-carbohydrate precursors, such as lactate and glycerol. The role of gluconeogenesis in aging is unclear, with one study reporting an increase in gluconeogenesis with age in rats and other studies in rats reporting decreased gluconeogenic ability.

Using breeding techniques, the investigators produced a line of mice whose livers overexpress the sirtuin enzyme SIRT6, which increases during fasting and regulates glucose metabolism. The researchers bred these mice with normal mice to produce litters with a mix of genotypes. Using this technique, the researchers were able to compare aging processes among littermates, which reduced confounding factors in their analyses.

Compared to their normal siblings, male SIRT6 mice exhibited a 27 percent increase in average lifespan and an 11 percent increase in maximal lifespan. Female SIRT6 mice showed a 15 percent increase in both average and maximal lifespan. Normal mice experienced greater metabolic dysfunction, performed less physical activity, and developed more inflammatory and degenerative diseases with age compared to SIRT6 mice. A series of metabolic tests revealed that SIRT6 mice had increased gluconeogenic gene expression in the liver and enhanced glycerol release from adipose tissue, which provided extra fuel for gluconeogenesis.

The authors concluded that SIRT6 controls healthspan and lifespan by regulating energy metabolism. This mechanistically detailed study provides groundwork for future aging research in humans.

Link to full report.
Watch aging expert Dr. David Sinclair discuss how metabolism regulates the circadian clock and sirtuin expression.
Learn more about sirtuins in our overview article.

Omega-6 and omega-3 fats kill cancer cells.

Cancer is a leading cause of death in the United States, but new therapies targeting diet and lifestyle may aid in fighting the disease. Emerging research suggests that modulating the kinds of nutrients available to cancer cells may encourage or inhibit their growth. Findings of a recent study detail the effects of polyunsaturated fats on cancer growth in mice.

Polyunsaturated fats are fatty acids that have more than one double bond between carbon atoms in their chemical structure, such as omega-3 and omega-6 fatty acids. They are present in fish, nuts, and seeds and are more prone to oxidation than other fatty acids. Polyunsaturated fats play critical roles in cardiovascular and neurological health.

While cancer is often viewed as a genetic disease, metabolic changes occur in cancer cells that contribute to their growth and spread. Cancer cells tend to produce energy by converting glucose to lactic acid, which acidifies the tumor environment, a phenomenon known as “cancer acidosis.” The acidity of the tumor environment alters the way cancer cells metabolize fats, increasing both the storage and the breakdown of fats.

Previous research demonstrated that targeting fatty acid metabolism may promote cancer cell death and prevent cancer spread by increasing lipid peroxidation (the oxidative breakdown of fats, often driven by iron), resulting in oxidative stress. Peroxidized lipids signal cellular damage, inducing a type of programmed cell death called ferroptosis (iron-dependent cell death, distinct from apoptosis). However, it is unclear how different types of dietary fats affect cancer metabolism.

First, the authors of the present study exposed cervical, colorectal, and pharyngeal (throat) cancer cells to a variety of fats and measured the fats’ effects on metabolism. The fats they used included saturated fats, like those found in butter and palm oil; monounsaturated fats, like those found in olive oil; and polyunsaturated fats, including omega-6 fats, like those found in corn oil, and omega-3 fats, like those found in fatty fish (eicosapentaenoic acid [EPA] and docosahexaenoic acid [DHA]). Next, they fed mice a diet supplemented with either olive oil or fish oil for four weeks. Two weeks into the diet, the researchers induced tumors in the mice and then measured the effects of the two diets on cancer progression. Finally, the researchers administered compounds that induce or inhibit ferroptosis to tumor-bearing mice to further explore the mechanisms of cancer cell metabolism.

In cancer cells, exposure to both omega-6 and omega-3 polyunsaturated fats enhanced fat uptake and storage, which increased lipid peroxidation and induced ferroptosis to a greater extent than saturated or monounsaturated fats. Fats with the least saturation (meaning the most double bonds) were most effective, with the omega-3 fat DHA demonstrating the most cancer-fighting ability. However, this effect occurred only in cancer cells in an acidic environment, not a neutral one. Mice that received the supplemental fish oil lost harmful white adipose tissue but gained metabolically beneficial brown adipose tissue, improving whole-body metabolism. The fish oil diet delayed cancer growth and increased survival, and this effect was magnified by the addition of a ferroptosis-inducing drug.

The authors of this important work concluded that polyunsaturated fats may be an effective add-on treatment to complement pharmacological therapies for cancer.

Link to full study.
Watch Dr. Valter Longo discuss the use of a ketogenic diet and the fasting mimicking diet as an adjunct to chemotherapy to treat aggressive cancer.

Vegan children have healthy hearts but stunted growth and nutritional deficiencies.

The number of people who follow plant-based diets is on the rise, for reasons including environmental sustainability, improved health, and concerns for animal welfare. The latest update to the Dietary Guidelines for Americans includes vegetarian and vegan options that meet all nutrient requirements; however, much of the research contributing to these recommendations was conducted in adults. A recent report describes the effect of plant-based diets on growth and cardiovascular disease risk factors in children.

Although cardiovascular disease is considered a disease of aging, risk factors in childhood, such as poor diet and sedentary lifestyle, may translate to earlier onset and increased severity of disease in adulthood. One review of observational studies reported that people who adhered to vegetarian and vegan diets had a lower risk of ischemic heart disease and cancer than people who ate a mixed diet. However, vegetarian and vegan diets tend to provide less calcium and have been associated with an increased risk of fractures, a concern for growing children.

The researchers recruited families with children between the ages of five and ten years who had a body mass index between 18.5 and 29.9 and who followed a vegetarian or vegan diet. Then they recruited a group of age-matched children who consumed an omnivorous diet, for a total of almost 200 participants. The researchers collected data from the participants, including family health history and lifestyle, cardiometabolic markers, body composition, and arterial blood flow. The participants wore an activity monitor for two days and kept a food journal for four days to assess their habitual eating and activity habits.

Compared to children who consumed an omnivorous diet, children who ate a vegetarian or vegan diet had less fat mass but similar amounts of lean muscle mass. Vegetarian and vegan children also had lower bone mineral density, although this relationship was not significant in vegetarian children when taking height into account. Vegetarian children had lower total cholesterol and lower HDL (“good”) cholesterol and, without sufficient supplementation, lower serum vitamin B12 and vitamin D. They also had higher glucose, VLDL (“bad”) cholesterol, and triglycerides – markers of poor cardiometabolic health.

Vegan children were more than an inch (3 centimeters) shorter than omnivorous children, even after taking their parents’ height into account. They also had lower total cholesterol, including both HDL (“good”) and LDL (“bad”) cholesterol, and less inflammation, but they were more likely to have low iron and vitamin B12 levels without sufficient supplementation.

The authors concluded that vegan children had lower cardiovascular disease risk but had an increased risk of nutritional deficiencies, lower bone mineral density, and shorter height. Vegetarian children had less severe nutritional deficiencies but, unexpectedly, a less favorable cardiometabolic risk profile.
