A couple weeks ago, I wrote a column for Science News about the apparent link between fast-food diets and fatty liver disease, a serious and potentially lethal condition in people. In this week's column for that magazine, I cover what might be considered its counterpoise—how eating fatty-liver products can induce a serious and potentially lethal condition, at least in the mice being tested. Presumably, humans could face a similar risk.
The fatty-liver comestible at issue: foie gras.
One somewhat reassuring aspect: at least in the United States, people don't tend to consume much foie gras. It's the excessively fatty liver of ducks or geese—often served pureed into a mousse or pâté and then doctored with any of various spices. I say doctored because I'm not a liver aficionado by any means and it would take a lot of doctoring to make it go down.
I used to take 2 hours to eat about 4 ounces of liver as a child, and I only bothered to try because in our household, the only alternative to finishing it was to leave the dinner table and go straight to bed—at 4:30 p.m. Those episodes left a bad taste in my mouth for anything linked to liver.
As I matured, I lost much of the genetically ingrained taste for fatty foods. So, as you might imagine, fatty liver is one of the last foods that would appeal to me.
However, it appeals to plenty of others, especially many who consider themselves gourmands.
A new study by researchers in the United States and Sweden now finds that the process of overfeeding waterfowl to make their livers especially fatty really stresses those livers. And that stress can lead to the development of protein abnormalities—a misfolding of the proteins into hair-like shapes known as amyloids.
In the June 26 Proceedings of the National Academy of Sciences, those researchers now show that when amyloid-rich foie gras is fed to mice, it can seed tissues in the rodents to begin making even more amyloid. The researchers describe this as the fatty-liver-based food "infecting" the animals with a propensity for amyloidosis, a life-threatening disease in which the affected tissues—which can include the liver, heart, or gastrointestinal tract—don't work properly because their proteins' shape is all wrong.
There are plenty of caveats associated with the findings. And I would direct you to read the longer article in Science News to learn more about them. They explain why there is probably little immediate cause for panic, even among most foie gras lovers.
Among the biggest of these: Affected animals were all at high risk for amyloidosis to start with. Among the human populations that would match that condition are individuals with tuberculosis and leprosy. When you think about it, the people in the United States most likely to suffer from either of those diseases are indigents. Such individuals are hardly likely to eat, much less overindulge in, foie gras, which typically goes for $6 or more per ounce.
Friday, June 22, 2007
Thursday, May 31, 2007
Food Security Stat
One of the Agriculture Department's 5-year goals has been to reduce the prevalence of very low food security among low-income households (those having incomes at 130 percent of the nation's poverty line or lower) to no more than 7.4 percent by this year. A new report, released today, suggests that achieving that goal will be next to impossible.
In 2005, the most recent year for which data are available, "the prevalence of very low food security among low-income households stood at 12.6 percent, up from 10.9 percent in 2000."
USDA defines very low food security as being where: "at times during the year, food intake of one or more household members is reduced and normal eating patterns disrupted because the household lacks sufficient money and other resources for food."
Source: Nord, M. 2007. Characteristics of Low-Income Households With Very Low Food Security: An Analysis of the USDA GPRA Food Security Indicator. USDA Economic Research Service Report #EIB-25 (May). Available at: http://www.ers.usda.gov/publications/eib25
Monday, May 21, 2007
Protein Helps Curb Hunger
All things being equal, diets higher in protein are better at holding hunger at bay than meals richer in fat or carbs. That's the finding of a set of prolonged feeding trials run by scientists at Purdue University.
John W. Apolzan and his coworkers advertised for volunteers in the local newspapers and ended up enrolling 12 men between the ages of 21 and 43 and another 10 between the ages of 63 and 79. After calculating how many calories it would take for each man to maintain his current weight, the researchers tailored diets to deliver just that much energy to each man over the course of three 18-day cycles. All foods except for water were supplied to the participants, and any uneaten food was returned and weighed.
The recommended intake of protein is 0.8 gram per kilogram of bodyweight—or about 2 ounces for a 155-pound man. In one cycle, each man got slightly more than that: 1 g of protein per kilogram of bodyweight per day. In the other cycles, he got 0.5 or 0.75 g/kg per day. The order of these 18-day dietary cycles was randomly assigned for each participant. At the end of each cycle, the scientists administered hourly questionnaires throughout the waking hours of one day to assess each volunteer's hunger and desire to eat.
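The conversion behind that "about 2 ounces" figure can be checked with a few lines of arithmetic (a back-of-the-envelope sketch; the function name and conversion constants are mine, using standard pound-to-kilogram and ounce-to-gram factors):

```python
# Standard unit conversions (not from the study itself).
LB_TO_KG = 0.453592   # kilograms per pound
G_PER_OZ = 28.3495    # grams per ounce

def recommended_protein_oz(weight_lb, g_per_kg=0.8):
    """Daily protein, in ounces, at a given grams-per-kilogram recommendation."""
    grams = weight_lb * LB_TO_KG * g_per_kg
    return grams / G_PER_OZ

# A 155-lb man at the 0.8 g/kg recommendation: about 56 g, i.e. roughly 2 oz.
print(round(recommended_protein_oz(155), 2))
# The study's highest-protein cycle, 1 g/kg, works out to about 2.5 oz.
print(round(recommended_protein_oz(155, g_per_kg=1.0), 2))
```

The same function shows how modest the gap between the cycles really was: the low-protein cycle (0.5 g/kg) comes to only about 1.2 ounces a day.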
In the May Journal of Nutrition, Apolzan's group reports that the men reported being 20 percent less hungry after the highest protein diet phase than after either of the others. Similarly, each man's desire to eat was, on average, almost 30 percent greater on the mid-level protein diet and 50 percent greater on the low-protein diet than when the volunteers got the high-protein fare.
It now appears that for those of us wishing to curb the siren call of calories, eating too little protein—as 15 to 40 percent of older Americans do—might foster overeating.
Not Enough Time to Cook
There has been the expectation that as income falls, the amount of time a family spends cooking will climb--in part to economize but also because less time employed outside the home leaves individuals more time to cook. However, contrary to patterns seen in the past, it now appears that low-income U.S. families spend very little time preparing meals.
Indeed, a study issued this week reports that low-income families don't allocate nearly as much time to food preparation as would be necessary to implement the Thrifty Food Plan, an Agriculture Department program that shows Food Stamp recipients how to prepare nutritious meals from the low-cost foods available under that program.
Preparing meals from scratch that comply with recommendations of the Thrifty Food Plan takes an estimated 80 to 130 minutes, on average, per day. In fact, the new study finds, low-income families where all adults work full-time typically reserve only 40 minutes per day for meal preparation.
Studies by the boatload have shown that people tend to down healthier fare when they eat at home. Moreover, meals cooked from scratch tend to have more nutrients, fewer preservatives, less salt, less sugar, and less fat than foods that have been commercially processed.
In their new report, "Who Has Time to Cook?", Lisa Mancino and Constance Newman of the Agriculture Department's Economic Research Service sifted through data collected by the Bureau of Labor Statistics and Census Bureau on how individuals use their time throughout the day. For this analysis, they focused, of course, on time spent cooking.
As might be expected, women who don't work outside the home--people who in the past might have been termed housewives--spent the most time in the kitchen. On average, they devoted slightly more than 70 minutes a day preparing meals. Women who worked part-time outside the home averaged about 55 minutes a day fixing meals, and full-time working women spent a mere 38 to 46 minutes a day cooking.
Single women found less incentive to cook. On average, those who worked spent 15 fewer minutes per day cooking than those who were married or lived with partners. Perhaps surprisingly, single non-working women spent a half-hour less per day cooking than those who were married or otherwise partnered.
What about men? Fuhgeddaboudit, as my New York relatives would say. Regardless of income level, those with full- or part-time jobs spent 13 to 17 minutes a day cooking; those who were unemployed spent a mere half-hour or less, on average.
The bottom line, Mancino and Newman say, is that the Thrifty Food Plan doesn't account for how little time people now find available for meal preparation. To offer useful guidance, this Plan will need a significant retooling, they argue, including recipes for alternatives that can be whipped up in far less time.
As a woman who typically spends 11 to 14 hours outside the home at work and in commuting, I can attest that even when the larder is well-stocked, I have little enthusiasm for spending an hour or more preparing dinner. Except on weekends, even breakfast is prepared on the fly.
Indeed, I'm convinced that too little time and motivation to cook has become one major fallout of our overextended workforce. A corollary: those of us who don't have the energy to cook are also unlikely to possess the energy to exercise in what little free time we can find.
It's not even that we're all doing this just to chase the almighty buck. Many jobs require long hours--and exist great distances from where the workforce is likely to live. When will society decide to value quality of life? Once we're all fat and sick? Oops...we're already there, aren't we?
Sunday, May 20, 2007
Hot Flash Newsflash II
Even people who don't cotton to tofu usually find soy nuts palatable. The hard, toasted seeds taste sort of like a cross between peanuts and pretzels. It now turns out that as snacks go, these may have an extra benefit--at least for women experiencing menopausal symptoms. Eating a handful of soy nuts at various times throughout the day cut the number of hot flashes they experienced by at least 40 percent.
The study recruited 60 healthy postmenopausal Boston-area women to take part in a pair of eight-week dietary sequences. In one, they ate a low-fat, high-carbohydrate diet rich in calcium and fish. In the other sequence, they ate this diet supplemented with a half-cup of soy nuts each day, with the "nuts" to be spread out in three or four portions several hours apart. Half started on the diet without soy nuts, the rest on the soy-supplemented one.
In addition to experiencing fewer hot-flash episodes while they were in the soy-nut phase of the trial, the recruits also reported fewer other physical and emotional symptoms of menopause. Francine K. Welty and her colleagues at Beth Israel Deaconess Medical Center in Boston report their team's findings in the April Journal of Women's Health.
The amount of soy and its consumption throughout the day were each intended to mimic, in part, the typical day-long intake of soy in many Asian cultures, where the prevalence of hot flashes is low. Indeed, Welty's group notes, an estimated 10 to 25 percent of Chinese and Indonesian women typically experience this menopausal symptom compared to some 60 to 90 percent of women in Western countries.
I happened onto this paper while I was taking advantage of a nice offer by the publisher of this journal. In honor of National Women’s Health Week, last week, it opened access to the contents of this issue and any other issue of the journal at no cost--but only through June 15. To do so, log in to http://www.liebertonline.com/jwh.
Friday, May 18, 2007
Killer Stats
Today, cancer and heart disease are neck-and-neck leaders as the top causes of death throughout the world. Each claims about 8 million lives annually. Although infections and stroke used to kill roughly equal numbers of people each year at the turn of the millennium, their trajectories are veering in very different directions, with stroke rates climbing slowly and infectious diseases other than HIV/AIDS plummeting dramatically.
If the projections indicated by the World Health Organization data hold true through 2030, the fastest climbing disease killer will be HIV/AIDS. Its death toll in 2002 was 2.8 million people. By 2030, WHO projects mortality from this infectious disease will more than double--to 6.5 million. That wouldn't put it far behind stroke, which is projected to claim about 7 million lives a year by then. Both will still be well behind heart disease at more than 9 million deaths a year and cancer at more than 11 million annually.
One of the saddest stats: WHO projects that tobacco-related deaths will reach about 8.3 million a year by 2030, which would account for about 10 percent of all deaths globally.
Source: World Health Statistics 2007, released May 18, by the World Health Organization, Geneva, Switzerland, p. 12.
Wednesday, May 16, 2007
Promising Diet Pill—Not!
A Chinese weight-loss concoction, known simply as NT, has shown promising results in animal tests. Not only have growing rodents gained less weight, but adults actually slimmed some. Subsequent U.S. tests of the herbal combo confirmed those findings—again in rodents. However, when U.S. obesity researchers gave NT to overweight people in a pilot test, a major side effect emerged: diarrhea.
In hopes of overcoming that problem--which traced to natural laxatives in the herbal preparation--the latter research team reformulated the diet preparation. Then, they fed it to 105 healthy people, 18 to 65 years old, for what was to be 24 weeks. One-third got a low-dose mix, another third got double that dose, and a final third got inactive, look-alike pills, termed a placebo. Good news, at the end of 8 weeks: No diarrhea in either treatment group. Bad news: No weight loss, either.
“In fact, the high dose gave less weight loss than the placebo,” Andrew T. Roberts of the Pennington Biomedical Research Center in Baton Rouge, La., and his colleagues report in the March Journal of Medicinal Food. So disappointing were the results that the scientists terminated their study immediately, 16 weeks early.
NT consists of ~40% rhubarb root-and-stem extract, ~26% turmeric (Curcuma longae), ~13% red-sage root (Salvia miltiorrhizae), ~13% astragalus root, and ~7% dried ginger (Zingiberis officinalis). When Roberts' group realized that these natural products also contained gallic acid, a food constituent that has its own weight-loss properties, they decided to make this the primary ingredient in their newly reformulated test preparation. For their test combo, NT became only 20 percent of the total, by weight, with gallic acid making up the remainder.
The team attributes the likely downfall of the new preparation to its reliance on gallic acid. The researchers found that no matter what they did in attempting to augment its absorption, the compound's level in blood never exceeded 20 percent of the administered dose—reaching a maximum blood concentration of only about 10 micromolar.
Roberts’ team concludes: “GA will not be an effective oral supplement for the treatment of human obesity.”
Garlic as Antibiotic
Studies have shown that raw garlic has the ability to kill many bacterial germs. A group of researchers from the United Arab Emirates now report that the bulb's juice contains at least some of the constituents responsible. However, boiling that juice for 10 to 30 minutes—such as to pasteurize it—partially or completely eliminates its antibacterial effect, depending on the germ against which it's deployed.
Even boiling the juice for just 5 minutes roughly halved its germ-killing prowess, the researchers report in the March Journal of Medicinal Food.
Storing the juice—even at temperatures approaching freezing (i.e. 4 °C)—can also significantly diminish its antibiotic properties.
The authors conclude that “in order to obtain optimum [germicidal] results, garlic juice should be used fresh, and during cooking it is advisable not to expose garlic to boiling for more than 5 minutes.”
What about just using minced garlic? That’s what a Turkish research team investigated, and their findings appear in a second report in the same journal.
Ali Aydin of Istanbul University Faculty of Veterinary Medicine mixed freshly chopped garlic into ground beef and uncooked Çiğ Köfte, a type of meatball containing bulgur wheat. Then, the researchers refrigerated some samples and left others at room temperature for up to 2 days.
Adding the garlic to ground beef somewhat slowed the growth of germs. However, it didn't kill them or even halt their growth. For as-yet-unexplained reasons, the garlic proved more effective in the meatballs. At 10% garlic, by weight, bacterial growth in the köfte was 13 percent lower at room temperature than in the untreated, similarly unrefrigerated raw meatballs. The difference between refrigerated samples was even smaller.
The 10% garlic treatment was the most effective concentration tested—but so high, the researchers admit, as to risk dramatically altering a food's taste.
Their hope had been a way to reduce the risk of food poisoning in regions where refrigeration is iffy or where street vendors hold meat for long periods prior to cooking. However, the scientists concluded, minced garlic’s “antimicrobial effect, even for the highest . . . concentration, is not satisfactory from a practical point of view.”
Bottom line: Garlic is no substitute for keeping meat refrigerated until ready to cook, and your hands and work surfaces clean.
Food Spending Stats
The average family of four spends nearly $9,000 a year on food.
That's based on federal surveys in 2004--the most recent year for which data are available--showing that the average annual per person spending on food in America was $2,207. Of that total, roughly $1,350 per person was spent for meals consumed at home, another $860 for food eaten in restaurants or elsewhere.
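Those figures hang together arithmetically, as a quick sketch shows (the variable names are mine; the dollar amounts are the rounded survey figures quoted above):

```python
# Per-person figures from the 2004 federal survey, as quoted above.
per_person_total = 2207   # average annual food spending per person, dollars
at_home = 1350            # spent on meals consumed at home
away = 860                # spent on restaurant and other away-from-home food

# A family of four at the per-person average: "nearly $9,000."
family_of_four = per_person_total * 4
print(family_of_four)

# The two rounded components roughly sum back to the per-person total.
print(at_home + away)
```

The small mismatch between $2,210 and $2,207 simply reflects the rounding of the at-home and away-from-home components.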
Spending varied substantially by income and family status. For instance, single moms spent some $1,600 per person on food, while married couples without children spent an average of $2,740 per person to eat each year. People living alone spent more than twice as much on their food each year, per capita, as did families of at least six.
Finally, food costs comprised a larger share of income for the poorest families—about 37 percent of household income compared to just 6.6 percent of income for the wealthiest households.
Source: Blisard, N. and H. Stewart. 2007. Food Spending in American Households, 2003-04. USDA Economic Research Service, Economic Information Bulletin #23 (March). Available at: http://www.ers.usda.gov
Thursday, May 3, 2007
More Fat Stats
Researchers at the Beltsville (Md.) Human Nutrition Research Center have just released data on fat-consumption trends over the past 3 decades, based upon a representative survey of the U.S. population. The good news is that overall fat intake is down--from a high of about 45 percent of calories in the late '70s, to about 37 percent of calories today. However, despite the message that saturated fats tend to be the least healthy, the new data show that even today, more than half of U.S. adults derive 10 to 15% of their calories from sat fats. Younger adults--those 20 to 50 years old--consumed the highest quantities of total fat--on average, more than 100 grams per day among men, and more than 75 grams per day among women.
What's the biggest contributor of fats to the adult diet? Desserts, at 11%, should come as no surprise. However, it turns out that pizza, burritos, and tacos were equally big contributors. Next on the list--above bacon even: regular salad dressings, butter, and margarines.
Source: "Levels and Sources of Fat in the Diets of Adults" by Alanna J. Moshfegh, Joseph D. Goldman and Randy P. LaComb, at the Experimental Biology '07 meeting this week in Washington, D.C.
Sunday, April 29, 2007
Mushrooming Immunity
Worried about a cold or some other infection? Maybe you should stock up on mushrooms.
These fungi can significantly enhance the body's immune response, according to a pair of fascinating talks I sat through today at the Experimental Biology '07 meeting. Although the reported experiments had been conducted in animals, the researchers acknowledged that their work had been prodded by hopes that the same would hold true in humans. And the fungi that proved especially potent in this regard? Those prosaic white buttons that account for 90 percent of the fungi eaten in the United States.
Dayong Wu and his colleagues at USDA's Jean Mayer Human Nutrition Research Center on Aging at Tufts University in Boston described their work with young-adult mice. They added a dry powder made from button mushrooms to the rodents' diet for 10 weeks in quantities that amounted to either 2% or 10% of the animals' meals. Other mice got just the unadulterated chow. When later stimulated with a compound that challenges the immune system, mushroom-treated mice had a more robust response. They produced higher amounts of certain immune agents known as cytokines (including some interferon and interleukin molecules and tumor-necrosis-factor alpha).
The finding suggests that for the elderly or others who might have weakened immune systems, one might enlist mushrooms as a dietary agent to shore up the body's defense against infections.
In a second study, Sanhong Yu and her colleagues at Penn State University tested the ability of five mushrooms widely available in U.S. groceries--including crimini and shiitake species--to similarly rev up the immune response of activated macrophages. These are a type of white blood cell that plays an important role in the immune system.
All of the fungi proved helpful, Yu reported. But the really big performer? White button mushrooms!
Yu's team then fed these mushrooms as 2% of the diet to mice for a month and showed that when challenged with a synthetic infection, the animals' immune systems again performed more heroically than if they had been dining on mushroom-free chow.
Who knew? Up to now, I'd always thought of mushrooms as more of a garnish for salads than as a health food.
Bittersweet Stat
Fruits are sweet because they're naturally endowed with sugars. Natural sweeteners lace even milk and various vegetable juices. However, most sugar in the diet has been deliberately added, whether it's to sweeten corn flakes or soft drinks. Boys 14 to 18 years old down the most such added sugar--a whopping 142.6 grams per day, FDA scientists reported today.
That's 21 percent of the energy consumed by boys this age. It amounts to some 570 calories per day, and is equivalent to 35.7 teaspoons of table sugar.
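The conversion behind those figures is simple enough to check. Here's a quick back-of-the-envelope sketch in Python; the 4 calories per gram of sugar and roughly 4 grams of sucrose per level teaspoon are my own standard assumptions, not figures from the FDA report:

```python
# Sanity-check the added-sugar figures: 142.6 g/day for boys 14 to 18.
# Assumed conversion factors (not from the FDA report):
KCAL_PER_GRAM = 4        # calories in a gram of sugar
GRAMS_PER_TEASPOON = 4.0  # sucrose in one level teaspoon

grams_per_day = 142.6  # FDA survey figure for boys 14-18

calories = grams_per_day * KCAL_PER_GRAM        # roughly 570 calories
teaspoons = grams_per_day / GRAMS_PER_TEASPOON  # roughly 35.7 teaspoons

print(calories, teaspoons)
```

Both results land on the article's numbers, which suggests the FDA analysts used the same rough conversion factors.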
Beverages, especially soft drinks, accounted for most of this added dietary sugar.
Source: Kathleen C. Ellwood, et al. of the U.S. Food and Drug Administration in College Park, Md., at the Experimental Biology '07 meeting, in Washington, D.C.
Saturday, April 28, 2007
Tea Time for Fido?
Excess pounds can contribute to the development of insulin resistance, a prediabetic change, in many people. Obesity triggers changes in dogs that are "nearly identical to that seen in the obese human," notes Samuel Serisier of the Ecole Veterinaire de Nantes (France). However, a commercial dietary supplement derived from green tea can restore much of the insulin sensitivity in such animals, he reported today at Experimental Biology '07, a meeting in Washington, D.C.
Serisier's team recruited 10 volunteers. These obese canines had already developed insulin resistance, a condition where their bodies had begun to ignore the presence of insulin--a hormone needed to shepherd energy into cells.
For 12 weeks, six of the pooches received 80 milligrams of a powdered green-tea extract per kilogram of body weight along with their normal day's food rations. The daily supplement provided the animals a dose of catechins--a class of plant-derived antioxidants--equivalent to what humans would derive from drinking 3 cups of tea. The remaining animals received just their normal chow.
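To see how that weight-based dosing rule scales, here's a hypothetical sketch; only the 80 mg/kg figure comes from the study, and the 30-kg example dog is my own invention:

```python
# Dosing rule from the study: 80 mg of powdered green-tea extract
# per kilogram of body weight, given daily with food.
DOSE_MG_PER_KG = 80

def daily_extract_mg(weight_kg: float) -> float:
    """Daily green-tea-extract dose, in milligrams, for a dog of the given weight."""
    return DOSE_MG_PER_KG * weight_kg

# A hypothetical 30-kg (66-lb) dog would get 2,400 mg, or 2.4 g, per day.
print(daily_extract_mg(30))
```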
At the end of 3 months, insulin sensitivity had improved by 60 percent in the tea-supplemented dogs, Serisier noted; no change occurred in the unsupplemented animals. Green-tea-catechin supplementation had no impact on weight, food intake, or body composition (i.e., percent body fat and lean tissue). Treatment was linked, however, with a 30 percent drop in serum triglycerides, fatty substances that can contribute to clogged arteries.
The findings would seem to offer a simple treatment to boost the health of pudgy pets. Better still, of course, would be to see that man's and woman's best friends get plenty of exercise and a low-calorie diet until they reach their ideal weight. Not only would that improve insulin resistance, it would also ensure that pet owners get off their duffs for a little extra, much-needed exercise.
Finally, don't attempt to treat your pooch with regular green tea. The brew is rich in caffeine, a compound that can prove lethal to dogs.
Friday, April 27, 2007
Dieting by Dairy
Consuming milk and other dairy products might help people shed unwanted pounds, according to research conducted chiefly--although not exclusively--over the past 7 years by researchers at the University of Tennessee. However, some other studies have challenged the positive findings, leading to the conundrum: Does milk really aid dieters or not?
This morning, the U.S. Department of Agriculture announced it will be weighing in. The agency is commencing a study that will recruit volunteers for a 15-week trial at its Western Human Nutrition Research Center in Davis, Calif. The men and women must tip the scales at 45 to 100 pounds above their ideal body weight and be healthy nonsmokers. The National Dairy Council and Dairy Council of California will be contributing financing for the research.
If you watch much TV, you'll undoubtedly have seen the "Got Milk?" and "24/24" industry campaigns of recent years, which have cited the putative weight-reducing benefits of dairy consumption. At least one nutrition-advocacy group--Physicians Committee for Responsible Medicine--has petitioned the Federal Trade Commission over these claims, charging that they "are false and misleading, and in violation of federal advertising guidelines."
I guess Uncle Sam is looking to resolve whether there's merit to research claims--or PCRM's petition.
Thursday, April 26, 2007
Do Schools Still Serve Milk?
When my daughter was in elementary school, I pushed and pushed and pushed her to drink more milk. As a nutrition writer, I knew that skim milk would help her bones bulk up on the strengthening calcium they so dearly needed. And reluctantly, she drank what I put in front of her...until middle school, when she really started balking.
She asked why she had to drink so much milk when her friends drank none. I, of course, assumed that was hyperbole. So, I started naming her friends, and in each case she told me that the friend didn't like milk--and didn't drink it.
As an attempt to prove her wrong, I suggested that her science research project that year should involve querying dozens of her classmates about what they drank, how much, and where they consumed it. She passed out the questionnaires, and I sat with her going over the results when she got them back. No one drank more than a glass of milk a day, and even those who drank that much amounted to no more than 10 percent of the total--3 to 5 youngsters.
My daughter informed me that most days her school didn't even offer milk in its lunch line. I had her interview the cafeteria manager about milk availability and sales. Sure enough, that woman confirmed that her cafeteria didn't always carry milk; so few children wanted it that it wasn't worth taking up the refrigerated space, she told my daughter.
In high school, my child underwent a sudden and miraculous transformation. She started guzzling milk and reaching for heaping plates of fresh fruits and veggies on a daily basis. However, she also noted that she could reliably get milk only at home, because her school kept its skim milk in the back--bringing it out only on demand. Cafeteria workers, overtaxed by feeding 1,000 students per lunch period each day, were reluctant to make a special trip to get her skim milk.
What did the other kids drink? Soda pop, sports drinks, and sometimes bottled water.
The reason for this post: Yesterday, the Institute of Medicine (IOM) of the National Academies issued a new report, "Nutrition Standards for Foods in Schools--Leading the Way Toward Healthier Youth." In it, the IOM argues that certain foods should be available in schools and their consumption encouraged for youth of all ages. Among these were low-fat or skim milk products. It would restrict sales of caffeine-free and no-calorie soft drinks to high schools--and recommends they be available only during after-school hours. Sugar-sweetened soft drinks and sugar-added fruit juices would be no-no's.
Okay, I grew up in the Dark Ages, when the only drinks available in school cafeterias were one-cup servings of white or chocolate milk. But how did we come to the point where schools have to be told by an august national health organization like IOM that it's time to start offering milk again? Shouldn't they already know this?
Tuesday, April 24, 2007
Vitamin D and Lead Poisoning
Vitamin D is the everything vitamin--or so it seems. Ample intake has been linked with fighting osteoporosis, cancer, diabetes, gum disease, muscle weakness, autoimmune disease--you name it. The rub: Few people really get ample intake.
It always seemed that more was better. But today I finally ran across a potential drawback to the sunshine vitamin.
It seems that for young children exposed to lead--and the nation's inner cities have many--increasing amounts of D are being linked to increasing body burdens of absorbed lead, a toxic heavy metal that can diminish IQ. To find out more, read the study in the April Environmental Health Perspectives. It was conducted by scientists at the University of Medicine and Dentistry of New Jersey (UMDNJ)-New Jersey Medical School (yes, the school's name really is that long).
John Bogden, an environmental health scientist and one of the study's authors, says the vitamin-lead link was not a surprise. Among its many functions, he notes, vitamin D helps the body absorb calcium. That's why lots of D is good for building strong bones and teeth. Unfortunately, the body responds to lead much as it does to calcium.
The researchers studied 142 low-income black and Hispanic children in Newark, N.J., over a period of 6 to 7 months. All the kids were between the ages of 1 and 8. The scientists measured vitamin D and lead in the children in winter and again in summer. Why? Upon exposure to sufficient sunlight, skin can make vitamin D. However, in Newark and other northern cities, sunlight is not strong enough in winter to trigger much, if any, production of D, so people are dependent on diet for this nutrient. And despite what food and dietary supplement manufacturers tell you, none are making products that are really rich in D. So, Bogden's group reasoned, children might show sharply lower vitamin D levels in winter.
And, in this study, they did.
The surprise, Bogden's group found, was that despite their living in the same neighborhoods and experiencing the same socioeconomic deprivation, Hispanic children in this study had little lead poisoning. For the purposes of this study, that was defined as lead concentrations of at least 10 micrograms per deciliter of blood. The black children, however--especially those 1 to 3 years old--had very high rates: about 12 percent in winter and 22 percent in summer.
The question, Bogden asks, is what underlies this strong ethnic difference? Is it housing? Diet? Access to sunlight? His group will be checking it out.
But the bigger problem, of course, is that these economically disadvantaged kids unwittingly face a cruel trade-off: the compromised IQ that comes with absorbing more lead versus all of the health benefits that vitamin D offers.