Weight Watchers Unveils Revolutionary Revamp of Popular Diet Plan

Weight Watchers, which has helped many people lose weight over the years, is now changing its diet formula.

The weight-loss giant has replaced its Points plan in the U.S. with a new system called PointsPlus.

PointsPlus, described by the company as its biggest innovation in more than a decade, draws on recent scientific research to create a program that goes beyond traditional calorie counting, with the aim of helping people lose weight and keep it off in a fundamentally healthier way.

The net effect is weight loss that is satisfying, healthful and flexible all at the same time.

The new formula takes into account the energy contained in each of the components that make up calories – protein, carbohydrates, fat and fibre – and it also factors in how hard the body works to process them (conversion cost), as well as their respective eating satisfaction (satiety).

In addition to the new formula, foods that are low in energy density, and therefore more satisfying, are emphasized within the program. Specifically, all fresh fruits and most vegetables now have zero PointsPlus values.
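
To illustrate the structure of such a formula, here is a minimal sketch in Python. The divisors below are placeholder assumptions, not Weight Watchers' published coefficients; the point is only to show a score built from protein, carbohydrate, fat and fibre, with fibre lowering the total and fresh produce fixed at zero.

```python
def points_estimate(protein_g, carbs_g, fat_g, fibre_g,
                    is_fresh_fruit_or_vegetable=False):
    """Illustrative macronutrient-based score (hypothetical coefficients)."""
    if is_fresh_fruit_or_vegetable:
        return 0  # fresh fruits and most vegetables get zero values in the new program

    raw = (protein_g / 11.0    # placeholder divisor: protein is more satiating
           + carbs_g / 9.0     # placeholder divisor
           + fat_g / 4.0       # placeholder divisor
           - fibre_g / 12.0)   # fibre reduces the score (placeholder)
    return max(round(raw), 0)

# Example: a hypothetical snack with 5 g protein, 20 g carbs, 3 g fat and 2 g fibre
print(points_estimate(5, 20, 3, 2))  # -> 3
```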

The new program features, combined with the fundamentals of the Weight Watchers approach, make PointsPlus the most revolutionary and innovative program the company has ever offered.

“Our new PointsPlus program is based on the latest scientific research and is designed to guide people to foods that are nutrient dense and highly satisfying, ensuring they will make healthful decisions, have successful weight loss and learn to keep their weight off long-term,” said Karen Miller-Kovach, chief scientific officer, Weight Watchers International, Inc.

Does diabetes hamper cognitive function by lowering the brain’s cholesterol?

Low cholesterol is generally a good thing. But decreasing the amount of low-density lipoprotein—LDL, the “bad cholesterol”—is only one part of the body’s equation for a healthy balance of lipids. And although lowering cholesterol can be good for the heart, it’s not always great for the brain, which contains about a quarter of the body’s cholesterol.

Recent research has shown that drugs that lower cholesterol, such as statins, might put some individuals at risk for cognitive problems. And a new study suggests that diabetes, which affects cholesterol synthesis in the liver, might also change the rate at which cholesterol is made in the brain.

Researchers found that diabetic mice had less of the crucial cholesterol in the membranes around their neural synapses. “This has broad implications for people with diabetes,” Ronald Kahn, of the Joslin Diabetes Center at Harvard Medical School and coauthor on the new study, said in a prepared statement. The results were published online November 30 in Cell Metabolism.

“Since cholesterol is required by neurons to form synapses with other cells, this decrease in cholesterol could affect how nerves function,” Kahn said. Affected systems could include “appetite regulation, behavior, memory and even pain and motor activity,” he said. Such effects are consistent with the increased risk among diabetic patients of eating disorders, depression and memory trouble.

For the study, Kahn and his team created mice that lacked sufficient insulin, mimicking a diabetic condition. In particular, the scientists focused on a gene known as SREBP-2 (which controls cholesterol metabolism) and other genes expressed in the brain; the insulin deficiency dampened the activity of these genes, causing the mice to produce, and retain, less cholesterol in key structures in the brain.

“These effects occurred in both the neurons and supporting ‘glial’ cells that helped provide some nutrients to the neurons,” Kahn said. “Ultimately this affects the amount of cholesterol that can get into membranes of the neuron, which form the synapses and the synaptic vesicles.”

When the diabetic mice received insulin injections, their genes seemed to return to normal functioning. The researchers noted that longer-term insulin depletion might cause more permanent damage to the myelin sheaths, the fatty covering of nerves that contain more than two thirds of the central nervous system’s cholesterol and are crucial to neural communication.

“It is well known that insulin and diabetes play an important role in regulating cholesterol synthesis in the liver, where most of the cholesterol circulating in the body comes from,” Kahn said. “But nobody had ever suspected that insulin and diabetes would play an important role in cholesterol synthesis in the brain.”

Recommended Daily Vitamin D Intake Gets a Boost

Vitamin D deficiency has become something of a health bugaboo in recent years, especially after a 2009 study that declared three quarters of U.S. adults and teenagers deficient. Low levels of the vitamin—which is manufactured by the body when sunlight hits the skin and can be found in some fatty fish and fortified food products—have been linked to disparate conditions, such as a sluggish immune system and psychosis.

But a new report from the Institute of Medicine (IOM), released November 30, concludes that the evidence linking vitamin D and calcium deficiency to anything but poor bone health is inconclusive. It also determined that most people in the U.S. and Canada are getting ample amounts of the vitamin. Even so, the organization is raising the level of recommended daily intake.

The new assessment recommends a daily vitamin D dietary allowance of 600 International Units (IU) for most healthy people 9 years and older (with an estimated average requirement of 400 IU per day, and no more than 4,000 IU of vitamin D per day). People 71 years and older should take 800 IU of the vitamin per day, according to the report. (Children aged 4 to 8 years should not have more than 3,000 IU per day, and those aged 1 to 3 years should not have more than 2,500 IU per day.)
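
The figures quoted above can be collected into a simple lookup table, shown here purely as a reading aid (only values stated in this summary are included; this is not dosing guidance):

```python
# Daily vitamin D figures quoted above, in International Units (IU) per day.
VITAMIN_D_FIGURES_IU = {
    "1-3 years":  {"upper_limit": 2500},
    "4-8 years":  {"upper_limit": 3000},
    "9-70 years": {"rda": 600, "estimated_average_requirement": 400, "upper_limit": 4000},
    "71+ years":  {"rda": 800, "estimated_average_requirement": 400, "upper_limit": 4000},
}

print(VITAMIN_D_FIGURES_IU["71+ years"]["rda"])  # -> 800
```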

The updated daily recommendations are not directly comparable to previous sets, which were established in 1997 and were based on “adequate intakes” rather than on the newer recommended dietary allowances and estimated average requirements. Catharine Ross, chair of the IOM’s review committee and a professor of nutrition at Pennsylvania State University, noted in a Tuesday press briefing that the two values are “like comparing apples to pears.” (The previous adequate intake recommendations were 200 IU per day for infants through age 50, 400 IU/day for ages 51 to 70, and 600 IU/day for those 71 and older.)

Adequate intake was more of “a guesstimate,” Patsy Brannon, a professor of nutritional sciences at Cornell University and member of the IOM review committee, said at the briefing. Those who did not meet the previous adequate intake levels were at a higher risk for deficiency, but “you cannot assume that individuals are deficient if they do not meet the adequate intake,” she explained.

That the IOM is “recognizing that their 1997 recommendations are too low” is substantial progress, says David Hanley, a professor in the departments of Medicine, Community Health Sciences and Oncology at the University of Calgary, who was not involved in the new report.

The antidepressant reboxetine: A headdesk moment in science

Every so often there comes a truly “headdesk” moment in science. A moment where you sit there, stunned by a new finding, thinking blankly: “OK, now what?”

For psychiatry and behavioral pharmacology, one of those moments came a few weeks ago with the findings of a meta-analysis published in the British Medical Journal (Eyding et al., 2010). The meta-analysis showed that an antidepressant, reboxetine (marketed by Pfizer in Europe, but not in the U.S., under the names Edronax, Norebox, Prolift, Solvex, Davedax or Vestra) doesn’t work. Not only does it not work, it really doesn’t work, and it turns out that Pfizer hadn’t published data on the putative antidepressant from 74% of their patients. Some people have reported that the study found that reboxetine was even “possibly harmful,” but that’s not quite true. What the study did find is that reboxetine produced more side effects (noted as “adverse events”) than placebo (as might be expected), but with no positive effects at all. While many antidepressants on the market today are not great, most are effective in around 60% of patients; reboxetine turns out to be even worse than that.

It turns out that publication bias was rampant. Pfizer and Lundbeck, the two companies running the studies, didn’t publish a lot of their data, especially the data showing no effect and unfortunate side effects. A bit nefarious, that. But bad science will out.

While sales of reboxetine never approached those of more traditional antidepressants like citalopram and fluoxetine, the study still puts a major kink in the pharmacotherapies currently available for depression. Whereas drugs like citalopram and fluoxetine primarily target the neurotransmitter serotonin, reboxetine targets the neurotransmitter norepinephrine. So it was hoped, when drugs like reboxetine came onto the market, that the different chemical focus might prove more effective or change the side-effect profiles normally associated with antidepressants. But the side effects turned out to be worse, and reboxetine proved not to be effective in patients after all.

And this is a rough moment for scientists studying depression. Why? Because reboxetine works beautifully in our animal models. It’s practically a poster-child antidepressant. It produces acute effects in tests such as forced-swim tests and tail-suspension tests (which use changes in struggle as a measure of antidepressant efficacy). It produces neurogenesis in the hippocampus, which is thought to be correlated with antidepressant effects. When behavioral pharmacologists are doing comparisons between older antidepressants and newer ones, reboxetine is often used as a positive control, a drug known to have an effect in the behavioral test of choice.
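
For readers unfamiliar with these assays, here is a minimal sketch, with invented numbers, of how forced-swim or tail-suspension data are typically read: immobility time is scored for each animal, and a compound that reduces mean immobility relative to vehicle is interpreted as having antidepressant-like activity in the model.

```python
from statistics import mean

# Hypothetical immobility times (seconds immobile out of a 6-minute test);
# the numbers are invented for illustration only.
vehicle_group = [210, 195, 230, 205, 220]
drug_group = [150, 140, 165, 155, 148]  # e.g. an antidepressant used as positive control

# A drug that lowers mean immobility versus vehicle reads as "antidepressant-like".
reduction = mean(vehicle_group) - mean(drug_group)
print(f"Mean immobility reduced by {reduction:.0f} s versus vehicle")  # -> about 60 s
```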

But it doesn’t work in patients. And patients are what matters. Now, scientists are stuck with a difficult question: What went wrong? This is more than just an issue with an antidepressant that didn’t work; it’s an issue with the tests we are using to study depression. How effective are they, really? Are we in fact modeling the right things? Do the tail-suspension test and forced-swim test detect antidepressant activity after all? And if they aren’t detecting antidepressant activity, what are they actually doing? What does this mean for both the neurochemical theory of depression and the neurogenesis theory? Reboxetine affects both but still has no clinical effect. Does this mean that both of these theories are wrong? Or does it mean that they are incomplete? And where, exactly, do we go from here?

We may need new models to study depression, or we may need to simply redefine and reexamine the ones that we have. But the latest findings on reboxetine raise more questions than those about pharma companies, scientific conduct and efficacy in patients. They raise questions about the way we study depression and what it is we need to measure to come up with the therapies that patients need. And it makes it more important than ever to study the possible mechanisms behind depression and other mental disorders, to understand how they work and what behaviors and changes we need to detect, to gain new insights into how to combat depression with more success and less…reboxetine.

A Healthy Brain Needs a Healthy Heart

When the National Institutes of Health convened a panel of independent experts this past April on how to prevent Alzheimer’s disease, the conclusions were pretty grim. The panel determined that “no evidence of even moderate scientific quality” links anything—from herbal or nutritional supplements to prescription medications to social, economic or environmental conditions—with the slightest decrease in the risk of developing Alzheimer’s. Furthermore, the committee argued, there is little credible evidence that you can do anything to delay the kinds of memory problems that are often associated with aging. The researchers’ conclusions made headlines around the world and struck a blow at the many purveyors of “brain boosters,” “memory enhancers” and “cognitive-training software” that advertise their wares on the Web and on television. One of the panel experts later told reporters in a conference call that the group wanted to “dissuade folks from spending extraordinary amounts of money on stuff that doesn’t work.”

But did the panel overstate its case? Some memory and cognition researchers privately grumbled that the conclusions were too negative—particularly with respect to the potential benefits of not smoking, treating high blood pressure and engaging in physical activity. In late September the British Journal of Sports Medicine published a few of these criticisms. As a longtime science journalist, I suspected that this was the kind of instructive controversy, with top-level people taking opposing positions, that often occurs at the leading edge of research. As I spoke with various researchers, I realized that the disagreements signaled newly emerging views of how the brain ages. Investigators are exploring whether they need to look beyond the brain to the heart to understand what happens to nerve cells over the course of decades. In the process, they are uncovering new roles for the cardiovascular system, including ones that go beyond supplying the brain with plenty of oxygen-rich blood. The findings could suggest useful avenues for delaying dementia or less severe memory problems.

Dementia, of course, is a complex biological phenomenon. Although Alzheimer’s is the most common cause of dementia in older adults, it is not the only cause. Other conditions can contribute to dementia as well, says Eric B. Larson, executive director of the Group Health Research Institute in Seattle. For example, physicians have long known that suffering a stroke, in which blood flow to the brain has been interrupted by a clot or a hemorrhage, can lead to dementia. But research over the past few years has documented the importance of very tiny strokes—strokes so small they can be detected only under a microscope after death—as another possible cause for dementia. Autopsy studies have detected many of these so-called microvascular infarcts, either by themselves or along with the plaques and tangles more typical of Alzheimer’s, in the brains of people who had dementia. These findings suggest that most dementias, even those caused by Alzheimer’s, are triggered by multiple pathological processes and will require more than one treatment.

Proving that cardiovascular treatment is one of those approaches will take some doing. Just because microinfarcts may make dementia worse does not mean that preventing them will delay the brain’s overall deterioration. Maybe severe dementia makes people more vulnerable to microinfarcts. And just because better control of high blood pressure and increased physical activity seem to decrease a person’s risk of stroke, that does not necessarily mean they are less likely to suffer microinfarcts. Correlation, after all, does not necessarily imply causation. That scientific truism was the problem that kept bothering the panel of outside experts put together by the NIH. Thus, the expert panel concluded, with one exception, that “all existing evidence suggests that antihypertensive treatment results in no cognitive benefit.” Data showing the benefits of boosting physical activity in folks with confirmed memory problems were “preliminary.”

Duke scientists look deeper for coal ash hazards

As the U.S. Environmental Protection Agency weighs whether to define coal ash as hazardous waste, a Duke University study identifies new monitoring protocols and insights that can help investigators more accurately measure and predict the ecological impacts of coal ash contaminants. “The take-away lesson is we need to change how and where we look for coal ash contaminants,” says Avner Vengosh, professor of geochemistry and water quality at Duke’s Nicholas School of the Environment. “Risks to water quality and aquatic life don’t end with surface water contamination, but much of our current monitoring does.”

The study, published online this week in the peer-reviewed journal Environmental Science and Technology, documents contaminant levels in aquatic ecosystems over an 18-month period following a massive coal sludge spill in 2008 at a Tennessee Valley Authority power plant in Kingston, Tenn.

By analyzing more than 220 water samples collected over the 18-month period, the Duke team found that high concentrations of arsenic from the TVA coal ash remained in pore water — water trapped within river-bottom sediment — long after contaminant levels in surface waters dropped back below safe thresholds. Samples extracted from 10 centimeters to half a meter below the surface of sediment in downstream rivers contained arsenic levels of up to 2,000 parts per billion – well above the EPA’s thresholds of 10 parts per billion for safe drinking water, and 150 parts per billion for protection of aquatic life.
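
As a rough illustration of the comparison being drawn, the peak pore-water concentration reported above can be checked against the two EPA thresholds; the small helper below is not from the study.

```python
# Thresholds cited above, in parts per billion (ppb).
DRINKING_WATER_LIMIT_PPB = 10   # safe drinking water
AQUATIC_LIFE_LIMIT_PPB = 150    # protection of aquatic life

def exceedance_factors(arsenic_ppb):
    """Return how many times a measured concentration exceeds each threshold."""
    return {
        "drinking_water": arsenic_ppb / DRINKING_WATER_LIMIT_PPB,
        "aquatic_life": arsenic_ppb / AQUATIC_LIFE_LIMIT_PPB,
    }

# Pore-water arsenic at the upper end reported in the study: 2,000 ppb
print(exceedance_factors(2000))  # -> roughly 200x and 13x the respective limits
```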

“It’s like cleaning your house,” Vengosh says of the finding. “Everything may look clean, but if you look under the rugs, that’s where you find the dirt.”

The potential impacts of pore water contamination extend far beyond the river bottom, he explains, because “this is where the biological food chain begins, so any bioaccumulation of toxins will start here.”

The research team, which included two graduate students from Duke’s Nicholas School of the Environment and Pratt School of Engineering, also found that acidity and the loss or gain of oxygen in water play key roles in controlling how arsenic, selenium and other coal ash contaminants leach into the environment. Knowing this will help scientists better predict the fate and migration of contaminants derived from coal ash residues, particularly those stored in holding ponds and landfills, as well as any potential leakage into lakes, rivers and other aquatic systems.

The study comes as the EPA is considering whether to define ash from coal-burning power plants as hazardous waste. The deadline for public comment to the EPA was Nov. 19; a final ruling — what Vengosh calls “a defining moment” — is expected in coming months.

“At more than 3.7 million cubic meters, the scope of the TVA spill is unprecedented, but similar processes are taking place in holding ponds, landfills and other coal ash storage facilities across the nation,” he says. “As long as coal ash isn’t regulated as hazardous waste, there is no way to prevent discharges of contaminants from these facilities and protect the environment.”

Soil microbes define dangerous rates of climate change

The rate of global warming could lead to a rapid release of carbon from peatlands that would further accelerate global warming. Two recent studies published by the Mathematics Research Institute at the University of Exeter highlight the risk that this ‘compost bomb’ instability could pose, and calculate the conditions under which it could occur.

The same Exeter team is now exploring a possible link between the theories described in the studies and last summer’s devastating peatland fires in Russia.

The first paper is published in the European Journal of Soil Science and the second in Proceedings of the Royal Society A.

The first paper, by Catherine Luke and Professor Peter Cox, describes the basic phenomenon. When soil microbes decompose organic matter, they release heat – this is why compost heaps are often warmer than the air around them.

The compost bomb instability is a runaway feedback that occurs when heat is generated by microbes more quickly than it can escape to the atmosphere. This in turn requires that the active decomposing soil layer be thermally insulated from the atmosphere.

Catherine Luke explains: “The compost bomb instability is most likely to occur in drying organic soils covered by an insulating lichen or moss layer”.
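
To make the mechanism concrete, here is a minimal energy-balance sketch with assumed parameter values (it is not the published Exeter model): microbial heat release grows with soil temperature, heat is lost to the air in proportion to the temperature difference, and an insulating moss or lichen cover corresponds to a small heat-loss coefficient.

```python
# Minimal illustrative energy balance for a decomposing soil layer.
# All parameter values are assumptions chosen only to show the feedback.

def simulate(loss_coeff, years=50, dt=0.01):
    T_air = 10.0                 # air temperature, deg C (held fixed here)
    T = T_air                    # soil temperature, deg C
    h0, q10 = 0.5, 2.5           # baseline microbial heat release and its temperature sensitivity
    for _ in range(int(years / dt)):
        heating = h0 * q10 ** ((T - T_air) / 10.0)  # heat released by decomposition
        cooling = loss_coeff * (T - T_air)          # heat escaping to the atmosphere
        T += (heating - cooling) * dt
        if T > 100.0:            # heating has run away
            return "runaway ('compost bomb')"
    return f"settles at about {T:.1f} deg C"

print("well ventilated:", simulate(loss_coeff=0.3))   # heat escapes fast enough
print("insulated layer:", simulate(loss_coeff=0.05))  # heat is trapped and builds up
```

In the rate-dependent version described next, the air temperature itself would be ramped upward, and it is the speed of that ramp, rather than its final level, that determines whether the runaway is triggered.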

The second paper led by Dr Sebastian Wieczorek and Professor Peter Ashwin, also of the University of Exeter, proves there is a dangerous rate of global warming beyond which the compost bomb instability occurs.

This is in contrast to the general belief that tipping points correspond to dangerous levels of global warming.

Sebastian Wieczorek explains: “The compost bomb instability is a novel type of rate-dependent climate tipping point”.

The Exeter team is now modelling the potential impact of the compost bomb instability on future climate change, including the potential link to the Russian peatland fires. It is also working to identify other rate-dependent tipping points.
