Published by
Stanford Medicine



Myths, Nutrition, Obesity

Effects of diet sodas on weight gain remain uncertain

Recent studies suggesting that diet sodas may lead to weight gain have stirred up interest among diet-soda drinkers and non-drinkers alike, confirming suspicions that the “diet” label and zero-calorie contents may be too good to be true. One of these studies, presented to the American Diabetes Association in June, linked diet soda consumption to a waistline increase 70 percent greater than that of non-drinkers. These results, along with those of several related studies, are in line with a growing body of thinking that diet or “light” foods and beverages may contribute to weight gain.

But Loyola University obesity specialist Jessica Bartfield, MD, thinks that we should take these studies with a grain of salt (or, if you prefer, aspartame). A release that came out today quotes her take on the issue:

“I suspect that people are likely drinking those diet sodas to wash down high-fat and high-calorie fast-food or take-out meals, not as a complement to a healthy meal prepared at home or to quench a thirst after a tough workout.”

In other words, it’s not the fake sugar in diet soda that causes weight gain–it’s the lifestyle choices that usually accompany it. Switching from regular soda to zero-calorie diet varieties, she argues, may be tremendously effective as a weight-loss strategy–just as long as users aren’t canceling it out with an otherwise high-calorie diet.

Bartfield also points out the importance, in the case of obesity studies, of taking all factors into account:

“The association studies are significant and provocative, but don’t prove cause and effect,” said Bartfield, who counsels weight-loss patients at the Chicago-area Loyola University Health System. “Although these studies controlled for many factors, such as age, physical activity, calories consumed and smoking, there are still a tremendous number of factors such as dietary patterns, sleep, genetics, and medication use that account for weight gain.”

Dieters looking for a satisfying answer to their weight-loss questions may be annoyed by the back-and-forth on issues like these. Then again, if obesity were a straightforward issue, we’d have solved it already.

Photo by computerjoe

Autism, Genetics, Myths, Neuroscience, Research

Unsung brain-cell population implicated in a variety of autism

Like the late Rodney Dangerfield, and as I once wrote in Stanford Medicine, glial cells “don’t get no respect.” Combined, the three glial cell types – astrocytes, oligodendrocytes, and microglia – constitute a good 90 percent of all the cells in the brain. Yet the remaining 10 percent – the neurons – are so celebrated they’ve lent their name to brain science: neurobiology.

Stanford’s Ben Barres, MD, PhD, a lonely voice in the wilderness, has long advocated paying more attention to glial cells. His experience as a young neurologist in the 1980s convinced him that they’re involved in all sorts of brain pathology.

And, belatedly, glial cells are getting some grudging respect, in appreciation of their increasingly well-characterized roles in everything from directing blood vessels to increase their diameters in the vicinity of busy nerve circuits to determining which synapses will live and which will die.

In a new study just published in Nature Neuroscience, a genetic deficiency known to be responsible for Rett syndrome, the most physically disabling of the autistic disorders, has been shown to wreak many of its damaging effects via astrocytes. These gorgeous star-shaped glial cells, alone, account for almost half of all cells in the human brain (although by volume not so much, as they’re smaller than neurons).

In the study, investigators at Oregon Health and Science University employed a mouse model of Rett syndrome in which the condition’s defining gene defect was present in every cell of every mouse. When the investigators restored a working copy of the gene to the mice’s astrocytes – and only their astrocytes – many of the signature symptoms of the disease cleared up.

Rett syndrome was once assumed to be exclusively a function of damaged neurons. This latest finding, like many others over the past decade, goes to show that glial cells aren’t just a bunch of packing peanuts whose main job is to keep our neurons from jiggling when we jog.

Photo by Ibtrav

Myths, Nutrition, Obesity

The dark side of "light" snacks: study shows substitutes may contribute to weight gain

Yet another get-thin-quick scheme has been debunked: A new study by Purdue University researchers shows that fake fats used in low-calorie snacks may actually contribute to weight gain.

Synthetic fat substitutes are the cornerstone of zero-calorie snack foods that market themselves as “diet” products. But when Purdue researchers put one group of rats on a high-fat diet and another group on a mixed diet containing both fatty products and products containing fat substitutes, they noticed that the mixed-diet rats gained more weight than the high-fat-diet rats.

“But,” I cried, dejectedly stuffing a handful of chips into my face, “how can low-calorie foods cause weight gain?” The release explains:

Food with a sweet or fatty taste usually indicates a large number of calories, and the taste triggers various responses by the body, including salivation, hormonal secretions and metabolic reactions. Fat substitutes can interfere with that relationship when the body expects to receive a large burst of calories but is fooled by a fat substitute.

“Is there any good news?” I whimpered, wiping tears and crumbs off onto my sleeve. Certainly. Synthetic fat substitutes only seem to promote weight gain when consumed as part of a high-fat diet. Low-fat dieters, like some of the rats in Purdue’s study, are safe from fat substitutes’ greasy grasp:

The rats that were fed a low-fat diet didn’t experience significant weight gain from either type of potato chips. However, when those same rats were switched to a high-fat diet, the rats that had eaten both types of potato chips ate more food and gained more weight and body fat than the rats that had eaten only the high-calorie chips.

As always, easy-weight-loss solutions like fat substitutes and artificial sweeteners (which have also been linked to weight gain) can prove disappointing and even dangerous. Looks like healthy foods and exercise are still your best bet.

Photo by bamalibrarylady

Global Health, Health Policy, Medicine and Society, Myths, Patient Care

The English patient meets the British health-care system… eventually

In all respects excluding its expense, health care in the U.S. has been given a bum rap. (See here, here and here.) I’ve kvetched about the shortcomings of the Canadian and French health-care systems, as experienced by my own flesh-and-blood relatives (or theirs).

Now, the latest knock on the British one. (Not that it’s the only thumbs-down on British health care.) A snippet:

Katherine Murphy, the director of the Patients Association, said it had heard from people whose hip or knee replacement had been postponed once or twice without them being offered a new date, leaving them in pain and with their independence compromised.

Previously: Forbes: U.S. still most-innovative country in biomedicine, Rand Corp. study says U.S. health care for elderly superior to UK’s, U.S. health system’s sketchy WHO rating is bogus, says horse’s mouth and Rush to judgment regarding the state of U.S. health care?


Infectious Disease, Myths, Nutrition

Does vitamin C work for the common cold?

Since I spend quite a bit of time traveling and can’t afford to get many colds, the slightest tickle in my throat usually sends me bounding into the kitchen in search of a bolus of vitamin C. There I tear into a small blue packet containing a pastel powder, dump it into a glass of water, and gulp down the resulting fizzy elixir. Thinking rationally, I recognize the vitamin C is unlikely to help much – and yet I still practice this ritual every time I feel unwell.

Now I’ve come across an excellent analysis on Clinical Correlations of whether vitamin C works for the common cold, and it offers another reminder that I’m probably wasting my money. Carolyn Bevan, MD, writes:

…at the end of the day, is there any benefit to taking a daily vitamin C supplement, or for chugging down that fizzy shot of mega-dose vitamin C when you feel a cold coming on? If you are a marathon runner, or if you are planning a winter adventure in the arctic tundra, you should certainly consider a daily dose of vitamin C. For the rest of us, it doesn’t seem to be worth the hassle and expense of adding one more pill to our daily routine.

How Bevan gets to that conclusion is definitely worth reading – and her analysis is even peer reviewed. Thanks to her effort, at least until I replace marathon flights with marathon runs, it seems that I can probably skip the vitamin C.

Aging, Global Health, Health Policy, In the News, Myths, Patient Care

Rand Corp. study says US health care for elderly superior to UK's

I’ve written before – in fact more than once – about statistics purporting to show that the U.S. health care system stinks compared with those of, say, Canada or Europe. It appears these apples-and-oranges comparisons may be full of beans.

The evidence keeps piling up. In a just-out Rand Corporation study, investigators compared death rates from various aging-associated diseases among older American and older English citizens. Older Americans are almost twice as likely to be diagnosed with increasingly common conditions such as type 2 diabetes and high blood pressure. Is this evidence of an inferior U.S. health care system?

Probably not. Those suffering from these conditions in the U.S. are only half as likely to die from them as their British counterparts. As a result, the afflicted Americans live at least as long as the afflicted British. Overall, 65-year-old Americans can expect to outlive their like-aged friends across the pond by about three months, despite their higher likelihood of having or getting an aging-associated illness.

The researchers note two possible explanations for why sick elders live longer in America than in England:

One is that the illnesses studied result in higher mortality in England than in the United States. The second is that the English are diagnosed at a later stage in the disease process than Americans.

Either explanation, they add, implies a more responsive health care system in the U.S. – at least for older people, who have nearly universal access to it. The fault appears to lie not in the health care system we Americans frequent (if you ignore its expense), but in our own lifestyle choices and, perhaps, other factors outside the control of both our health-care system and ourselves.

History, Imaging, Medicine and Society, Myths

Mickey Mouse: "Doctors always know best"

Letters of Note has posted some amusing correspondence from Disney Studios about some harmful X-rays in one of its comic books. Apparently the story caused a great deal of anxiety amongst young patients in Pennsylvania. The letter, from 1932 and signed by Mickey Mouse, was meant to put young patients at ease and remind them that:

. . .when anything gets wrong with ME, the first thing I do is go to a hospital. And then I do whatever the doctor tells me.

‘Cause doctors always know best.

Head over to Letters of Note for the context and for the rest of the letter.


Global Health, Health Policy, In the News, Medicine and Society, Myths, Science Policy

U.S. health system's sketchy WHO rating is bogus, says horse's mouth

Last October I wrote about studies suggesting that America’s reputation for shoddy health care was undeserved.

Now, in a letter to the New England Journal of Medicine, Philip Musgrove, PhD, writes that the U.S.’s much bandied-about #37 rating among the world’s health-care systems is a “zombie number” that should be retired from public discourse.

He should know. Musgrove was editor-in-chief of the 2000 World Health Organization report, Health Systems: Improving Performance, that produced the statistic. Despite his position, Musgrove writes, he had no editorial control over the health-system rankings.

Three years after the report was issued, Musgrove took the unusual step of refuting its methodology in a critique published in The Lancet. The U.S. health-care system’s rank was based on incomplete, and largely imputed rather than observed, data and is “meaningless,” he writes.

Aging, Health and Fitness, History, Myths, Pain, Research, Stanford News

Ain't no cure for the muscle-cramp blues: The sun slowly sets on quinine

On June 8, 1962 – the ninth anniversary of Queen Elizabeth II’s coronation – Commander Walter Edward Whitehead, the bearded pitchman for Schweppes quinine water, was named a member of the Order of the British Empire. (In a fitting display of royal symmetry, a certain mister Gilbey, purveyor of the eponymous gin, was knighted in the same ceremony.)

Those of a certain age will recall television commercials featuring Cdr. Whitehead extolling the virtues of the beverage. It wasn’t hard to imagine that stately gentleman in his younger days, charging through the jungles of one colony or another. Armed with quinine an empire ascended, malaria be damned.

Alas, like the British Empire, quinine has fallen on hard times. Although the bitter substance was indeed the first treatment for malaria – and the only effective one for 300 years – it became largely outmoded with the advent of better drugs after World War II.

Still, until quite recently quinine was also the front-line treatment for a dreaded scourge of the sedentary classes: muscle cramps. They strike without warning.

Most cases of muscle cramps never get reported to public health authorities, so it’s difficult to say how common they are. But you probably know someone who’s had them. You’ve probably had them, too. And the older you get, the more likely you’re having one right now.

But according to a just-published Stanford study whose senior author was neurologist Yuen So, MD, PhD, quinine’s not such a great choice for muscle cramps, either. Not that it’s wholly ineffective – it can reduce symptoms by one-third to one-half – but that comes with a 1-in-25 chance of serious side effects: for example, hematologic disorders. Better not to mess with it unless the cramps are really bad and nothing else works.
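To see why that trade-off gives clinicians pause, a back-of-the-envelope tally using only the figures quoted above (the cohort size of 100 patients is an arbitrary choice for readability, not a study parameter):

```python
# Benefit/harm tally for quinine in muscle cramps, per the study summary:
# ~1-in-25 chance of serious side effects, symptoms cut by 1/3 to 1/2.

patients = 100                          # hypothetical treated cohort
serious_harm_risk = 1 / 25              # reported serious side-effect rate
symptom_cut_low, symptom_cut_high = 1 / 3, 1 / 2   # reported benefit range

# Expected number of serious adverse events per 100 patients treated.
expected_harms = patients * serious_harm_risk

print(f"Serious side effects per 100 treated: {expected_harms:.0f}")
print(f"Symptom reduction: {symptom_cut_low:.0%} to {symptom_cut_high:.0%}")
```

Four serious adverse events per hundred treated, for a partial reduction in symptoms, is exactly the kind of ratio that restricts a drug to cases where nothing else works.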

In fact, at the moment there are no really useful, tried-and-tested prescriptions for this common disorder, the study found. Quinine itself, once freely available over the counter, has been taken off the shelves. (Forget about quinine water. You’d have to drink a few liters of it to get any efficacy for muscle cramps, and by the time you were done you’d probably have cramps in another important organ.)

So, paraphrasing Eddie Cochran, there ain’t no cure for the muscle-cramp blues. There’s a great prescription for getting them, though: Just sit at your desk all day long reading technical articles, combing the Web for news updates, and blogging your brains out.

Cardiovascular Medicine, Health Policy, In the News, Myths, Nutrition, Research

A grain of salt concerning salt-intake reduction

Headlines were made, a month or two ago, by a mathematical-modeling study published in the New England Journal of Medicine suggesting that lowering average U.S. dietary salt intake by 3 grams – about a half-teaspoonful – per day would result in 44,000 to 92,000 fewer deaths. Kindred putative benefits include substantially reduced heart-attack rates and coronary heart disease cases. (Stanford nephrologist Glenn Chertow, MD, MPH, contributed to that report.)
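It helps to translate those headline numbers to a per-person scale. A rough sketch, assuming a U.S. population of about 310 million (my assumption for the era of the study; the modelers’ own population base may differ):

```python
# Per-capita scale of the NEJM modeling estimate. The deaths-averted
# range comes from the study as quoted above; the population figure
# is an assumed round number, not taken from the paper.

us_population = 310_000_000
deaths_averted = (44_000, 92_000)    # projected annual range

# Annual absolute risk reduction per person, low and high estimates.
arr = [d / us_population for d in deaths_averted]

print(f"Fewer deaths per 10,000 people per year: "
      f"{arr[0] * 10_000:.1f} to {arr[1] * 10_000:.1f}")
```

Tens of thousands of deaths averted nationally works out to roughly one to three per ten thousand people per year – a real public-health effect, but small enough per individual that the modeling assumptions matter a great deal, which is the commentary’s point.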

But this week, a commentary (registration may be required) in the Journal of the American Medical Association cautions against swift, sweeping public-health measures with the goal of dropping the country’s collective salt intake. Written by Michael Alderman, MD, of the Department of Epidemiology and Population Health, Albert Einstein College of Medicine, the commentary makes some bracing observations:

Multiple randomized clinical trials… have established that reduction of sodium intake sufficient to lower blood pressure also increases sympathetic nerve activity, decreases insulin sensitivity, activates the renin angiotensin system, and stimulates aldosterone secretion.

And who will watch over the guardians?
The only randomized clinical comparisons of different sodium intakes for which the endpoints were morbidity and mortality – the gold standard – have involved patients with heart failure, Alderman writes. And what happened?

[A] more restricted sodium intake significantly increased mortality and hospitalization. . . . These results are consistent with the view that overzealous restriction of sodium may be harmful for patients with heart failure.

Alderman continues:

Rarely, smoking excepted, do observational studies . . . justify a public health intervention. The 1980 National Dietary Guidelines recommended population-wide reduction of total fat intake. In response to an unanticipated epidemic of obesity and diabetes, to which the authors concluded the 1980 recommendations might have contributed, the 2000 committee withdrew its earlier recommendation. Trans-fat consumption and postmenopausal hormone therapy are other examples of how well-meaning interventions, based on insufficient science, can have hazardous consequences.

He’s certainly right about that. In the January 13 issue of the American Journal of Clinical Nutrition, Ron Krauss, MD, of the Children’s Hospital Oakland Research Institute, and colleagues published a big fat metastudy concluding that there was no link between saturated-fat intake and increased risk of coronary heart disease or cardiovascular disease. As founder of the American Heart Association’s Council on Nutrition, Physical Activity, and Metabolism, Krauss knows a thing or two about cardiovascular disease risk. (He’s done seminal work on the effect of low- versus high-carb diets on LDL particle size, a possibly important heart-disease risk factor.)

About 20 years ago consumer-health advocacy groups such as the Center for Science in the Public Interest began loudly pounding the drum about saturated fats then widely used by fast-food chains. The result was a public-relations-coerced conversion from lard and beef tallow to those delightful replacements known as partially hydrogenated vegetable oils – the source of the (now we know) truly evil trans-fats referenced in the paragraph above. That switch didn’t work out so well.

Right-size those recommendations
Perhaps comparing salt reduction to saturated-fat cold turkey or to trans-fat substitution is unfair. Taking some of the salt out of the processed foods most Americans live on is probably pretty benign. After all, we can all still salt up to the max in the privacy of our kitchens if we so choose. On the other hand, effectively forcing people onto a trans-fat-rich diet by infusing their french fries with galloping globs of it – or miseducating them into what may, in retrospect, have turned out to be a rather ruinous high-carb diet by making them feel guilty every time they bite down on a cheeseburger – can be actively injurious.

That said, surely the ideal way to attack even the most widespread public-health problem is on a personalized basis. Those with low or middling salt intake should be left alone, and those with both high salt intake and evidence of, say, hypertension should certainly be encouraged to try cutting down to see if it makes a difference. But where food, genes and belt sizes mix, one size does not fit all.
