Published by
Stanford Medicine



Myths, Nutrition, Obesity

The dark side of "light" snacks: study shows substitutes may contribute to weight gain

Yet another get-thin-quick scheme has been debunked: A new study by Purdue University researchers shows that fake fats used in low-calorie snacks may actually contribute to weight gain.

Synthetic fat substitutes are the cornerstone of zero-calorie snack foods that market themselves as “diet” products. But when Purdue researchers put one group of rats on a high-fat diet and another group on a mixed diet containing both fatty products and products containing fat substitutes, they noticed that the mixed-diet rats gained more weight than the high-fat-diet rats.

“But,” I cried, dejectedly stuffing a handful of chips into my face, “how can low-calorie foods cause weight gain?” The release explains:

Food with a sweet or fatty taste usually indicates a large number of calories, and the taste triggers various responses by the body, including salivation, hormonal secretions and metabolic reactions. Fat substitutes can interfere with that relationship when the body expects to receive a large burst of calories but is fooled by a fat substitute.

“Is there any good news?” I whimpered, wiping tears and crumbs onto my sleeve. Certainly. Synthetic fat substitutes only seem to promote weight gain when consumed as part of a high-fat diet. Low-fat dieters, like some of the rats in Purdue’s study, are safe from fat substitutes’ greasy grasp:

The rats that were fed a low-fat diet didn’t experience significant weight gain from either type of potato chips. However, when those same rats were switched to a high-fat diet, the rats that had eaten both types of potato chips ate more food and gained more weight and body fat than the rats that had eaten only the high-calorie chips.

As always, easy-weight-loss solutions like fat substitutes and artificial sweeteners (which have also been linked to weight gain) can prove disappointing and even dangerous. Looks like healthy foods and exercise are still your best bet.

Photo by bamalibrarylady

Global Health, Health Policy, Medicine and Society, Myths, Patient Care

The English patient meets the British health-care system… eventually

In all respects excluding its expense, health care in the U.S. has been given a bum rap. (See here, here and here.) I’ve kvetched about the shortcomings of the Canadian and French health-care systems, as experienced by my own flesh-and-blood relatives (or theirs).

Now, the latest knock on the British one. (Not that it’s the only thumbs-down on British health care.) A snippet:

Katherine Murphy, the director of the Patients Association, said it had heard from people whose hip or knee replacement had been postponed once or twice without them being offered a new date, leaving them in pain and with their independence compromised.

Previously: Forbes: U.S. still most-innovative country in biomedicine, Rand Corp. study says U.S. health care for elderly superior to UK’s, U.S. health system’s sketchy WHO rating is bogus, says horse’s mouth and Rush to judgment regarding the state of U.S. health care?


Infectious Disease, Myths, Nutrition

Does vitamin C work for the common cold?

Since I spend quite a bit of time traveling and can’t afford to get many colds, the slightest tickle in my throat usually sends me bounding into the kitchen in search of a bolus of vitamin C. There I tear into a small blue packet containing a pastel powder, dump it into a glass of water, and gulp down the resulting fizzy elixir. Thinking rationally, I recognize the vitamin C is unlikely to help much – and yet I still practice this ritual every time I feel unwell.

Now I’ve come across an excellent analysis on Clinical Correlations of whether or not vitamin C works for the common cold, and it offers another reminder that I’m probably wasting my money. Carolyn Bevan, MD, writes:

…at the end of the day, is there any benefit to taking a daily vitamin C supplement, or for chugging down that fizzy shot of mega-dose vitamin C when you feel a cold coming on? If you are a marathon runner, or if you are planning a winter adventure in the arctic tundra, you should certainly consider a daily dose of vitamin C. For the rest of us, it doesn’t seem to be worth the hassle and expense of adding one more pill to our daily routine.

How Bevan gets to that conclusion is definitely worth reading – and her analysis is even peer reviewed. Thanks to her effort, at least until I replace marathon flights with marathon runs, it seems that I can probably skip the vitamin C.

Aging, Global Health, Health Policy, In the News, Myths, Patient Care

Rand Corp. study says US health care for elderly superior to UK's

I’ve written before – in fact more than once – about statistics purporting to show that the U.S. health care system stinks compared with those of, say, Canada or Europe. It appears these apples-and-oranges comparisons may be full of beans.

The evidence keeps piling up. In a just-out Rand Corporation study, investigators compared death rates from various aging-associated diseases among older Americans and older English citizens. Older Americans are almost twice as likely to be diagnosed with increasingly common conditions such as type 2 diabetes and high blood pressure. Is this evidence of an inferior U.S. health care system?

Probably not. Those suffering from these syndromes in the U.S. are only half as likely to die from them as their British counterparts. As a result, the afflicted Americans live at least as long as the afflicted British. Overall, 65-year-old Americans can expect to outlive their like-aged friends across the pond by about three months, despite their higher likelihood of having or getting an aging-associated illness.
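The arithmetic behind that wash is easy to check. Here is a minimal sketch in Python; the two ratios (twice the diagnoses, half the case fatality) come from the article, but the baseline rates are invented purely for illustration and are not figures from the Rand study:

```python
# Hypothetical baseline rates for English elders (illustrative only,
# NOT figures from the Rand study).
english_diagnosis_rate = 0.10  # fraction diagnosed with an aging-associated disease
english_case_fatality = 0.20   # fraction of the diagnosed who die of it

# The article's ratios: Americans are roughly twice as likely to be
# diagnosed, but only half as likely to die once diagnosed.
american_diagnosis_rate = english_diagnosis_rate * 2
american_case_fatality = english_case_fatality / 2

# Overall mortality from these diseases = diagnosis rate x case fatality,
# so doubling one factor while halving the other leaves the product unchanged.
english_mortality = english_diagnosis_rate * english_case_fatality
american_mortality = american_diagnosis_rate * american_case_fatality

print(english_mortality == american_mortality)  # True -- a wash
```

In other words, a higher diagnosis rate and a lower death rate among the diagnosed cancel out, which is why the afflicted Americans end up living at least as long as their British counterparts.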

The researchers note two possible explanations for why sick elders live longer in America than in England:

One is that the illnesses studied result in higher mortality in England than in the United States. The second is that the English are diagnosed at a later stage in the disease process than Americans.

Either explanation, they add, implies a more responsive health care system in the U.S. – at least for older people, who have nearly universal access to it. The fault appears to lie not in the health care system we Americans frequent (if you ignore its expense), but in our own lifestyle choices and, perhaps, other factors outside the control of both our health-care system and ourselves.

History, Imaging, Medicine and Society, Myths

Mickey Mouse: "Doctors always know best"

Letters of Note has posted some amusing correspondence from Disney Studios about some harmful X-rays in one of its comic books. Apparently the story caused a great deal of anxiety amongst young patients in Pennsylvania. The letter, from 1932 and signed by Mickey Mouse, was meant to put young patients at ease and remind them that:

. . .when anything gets wrong with ME, the first thing I do is go to a hospital. And then I do whatever the doctor tells me.

‘Cause doctors always know best.

Head over to Letters of Note for the context and for the rest of the letter.


Global Health, Health Policy, In the News, Medicine and Society, Myths, Science Policy

U.S. health system's sketchy WHO rating is bogus, says horse's mouth

Last October I wrote about studies suggesting that America’s reputation for shoddy health care was undeserved.

Now, in a letter to the New England Journal of Medicine, Philip Musgrove, PhD, writes that the U.S.’s much bandied-about #37 rating among the world’s health-care systems is a “zombie number” that should be retired from public discourse.

He should know. Musgrove was editor-in-chief of the 2000 World Health Organization report, Health Systems: Improving Performance, that introduced the statistic. Despite his position, Musgrove writes, he had no editorial control over the health-system rankings.

Three years after the report was issued, Musgrove took the unusual step of refuting its methodology in a critique published in The Lancet. The U.S. health-care system’s rank was based on incomplete, and largely imputed rather than observed, data and is “meaningless,” he writes.

Aging, Health and Fitness, History, Myths, Pain, Research, Stanford News

Ain't no cure for the muscle-cramp blues: The sun slowly sets on quinine

On June 8, 1962 – the ninth anniversary of Queen Elizabeth II’s coronation – Commander Walter Edward Whitehead, the bearded pitchman for Schweppes quinine water, was named a member of the Order of the British Empire. (In a fitting display of royal symmetry, a certain Mr. Gilbey, purveyor of the eponymous gin, was knighted in the same ceremony.)

Those of a certain age will recall television commercials featuring Cdr. Whitehead extolling the virtues of the beverage. It wasn’t hard to imagine that stately gentleman in his younger days, charging through the jungles of one colony or another. Armed with quinine, an empire ascended, malaria be damned.

Alas, like the British Empire, quinine has fallen on hard times. Although the bitter substance was indeed the first treatment for malaria – and the only effective one for 300 years – it became largely outmoded with the advent of better drugs after World War II.

Still, until quite recently quinine was also the front-line treatment for a dreaded scourge of the sedentary classes: muscle cramps. They strike without warning.

Most cases of muscle cramps never get reported to public health authorities, so it’s difficult to say how common they are. But you probably know someone who’s had them. You’ve probably had them, too. And the older you get, the more likely you’re having one right now.

But according to a just-published Stanford study whose senior author was neurologist Yuen So, MD, PhD, quinine’s not such a great choice for muscle cramps, either. It’s not wholly ineffective – it can reduce symptoms by one-third to one-half – but that benefit comes with a 1-in-25 chance of serious side effects, such as hematologic disorders. Better not to mess with it unless the cramps are really bad and nothing else works.

In fact, at the moment there are no really useful, tried-and-tested prescriptions for this common disorder, the study found. Quinine itself, once freely available over the counter, has been taken off the shelves. (Forget about quinine water. You’d have to drink a few liters of it to get any efficacy for muscle cramps, and by the time you were done you’d probably have cramps in another important organ.)

So, paraphrasing Eddie Cochran, there ain’t no cure for the muscle-cramp blues. There’s a great prescription for getting them, though: Just sit at your desk all day long reading technical articles, combing the Web for news updates, and blogging your brains out.

Cardiovascular Medicine, Health Policy, In the News, Myths, Nutrition, Research

A grain of salt concerning salt-intake reduction

Headlines were made, a month or two ago, by a mathematical-modeling study published in the New England Journal of Medicine suggesting that lowering average U.S. dietary salt intake by 3 grams – about a half-teaspoonful – per day would result in 44,000 to 92,000 fewer deaths. Kindred putative benefits include substantially reduced heart-attack rates and coronary heart disease cases. (Stanford nephrologist Glenn Chertow, MD, MPH, contributed to that report.)

But this week, a commentary (registration may be required) in the Journal of the American Medical Association cautions against swift, sweeping public-health measures with the goal of dropping the country’s collective salt intake. Written by Michael Alderman, MD, of the Department of Epidemiology and Population Health, Albert Einstein College of Medicine, the commentary makes some bracing observations:

Multiple randomized clinical trials… have established that reduction of sodium intake sufficient to lower blood pressure also increases sympathetic nerve activity, decreases insulin sensitivity, activates the renin-angiotensin system, and stimulates aldosterone secretion.

And who will watch over the guardians?
The only randomized clinical comparisons of different sodium intakes for which the endpoints were morbidity and mortality – the gold standard – have involved patients with heart failure, Alderman writes. And what happened?

[A] more restricted sodium intake significantly increased mortality and hospitalization. . . . These results are consistent with the view that overzealous restriction of sodium may be harmful for patients with heart failure.

Alderman continues:

Rarely, smoking excepted, do observational studies . . . justify a public health intervention. The 1980 National Dietary Guidelines recommended population-wide reduction of total fat intake. In response to an unanticipated epidemic of obesity and diabetes, to which the authors concluded the 1980 recommendations might have contributed, the 2000 committee withdrew its earlier recommendation. Trans-fat consumption and postmenopausal hormone therapy are other examples of how well-meaning interventions, based on insufficient science, can have hazardous consequences.

He’s certainly right about that. In the January 13 issue of the American Journal of Clinical Nutrition, Ron Krauss, MD, of the Children’s Hospital Oakland Research Institute, and colleagues published a big fat metastudy concluding that there was no link between saturated-fat intake and increased risk of coronary heart disease or cardiovascular disease. As founder of the American Heart Association’s Council on Nutrition, Physical Activity, and Metabolism, Krauss knows a thing or two about cardiovascular disease risk. (He’s done seminal work on the effect of low- versus high-carb diets on LDL particle size, a possibly important heart-disease risk factor.)

About 20 years ago consumer-health advocacy groups such as the Center for Science in the Public Interest began loudly pounding the drum about saturated fats then widely used by fast-food chains. The result was a public-relations-coerced conversion from lard and beef tallow to those delightful replacements known as partially hydrogenated vegetable oils – the source of the (now we know) truly evil trans-fats referenced in the paragraph above. That switch didn’t work out so well.

Right-size those recommendations
Perhaps comparing salt reduction to saturated-fat cold turkey or to trans-fat substitution is unfair. Taking some of the salt out of the processed foods most Americans live on is probably pretty benign. After all, we can all still salt up to the max in the privacy of our kitchens if we so choose. On the other hand, effectively forcing people onto a trans-fat-rich diet by infusing their french fries with galloping globs of it – or miseducating them into what may, in retrospect, have turned out to be a rather ruinous high-carb diet by making them feel guilty every time they bite down on a cheeseburger – can be actively injurious.

That said, surely the ideal way to attack even the most widespread public-health problem is on a personalized basis. Those with low or middling salt intake should be left alone, and those with both high salt intake and evidence of, say, hypertension should certainly be encouraged to try cutting down to see if it makes a difference. But where food, genes and belt sizes mix, one size does not fit all.

Autism, Medicine and Society, Myths

Wired tackles the vaccine controversy

This week in Wired magazine, Amy Wallace attempts to inject some science into the controversy over vaccines and autism. She also gives her readers a peek into the life of Paul Offit, a high-profile Philadelphia pediatrician who has been targeted by anti-vaccine advocates for his defense of childhood vaccination. The man receives death threats.

Wallace herself comes out strong in defense of vaccines, writing:

Twelve epidemiological studies have found no data that links the MMR (measles/mumps/rubella) vaccine to autism; six studies have found no trace of an association between thimerosal (a preservative containing ethylmercury that was used in vaccines until 2001) and autism, and three other studies have found no indication that thimerosal causes even subtle neurological problems. The so-called epidemic, researchers assert, is the result of improved diagnosis, which has identified as autistic many kids who once might have been labeled mentally retarded or just plain slow. In fact, the growing body of science indicates that the autistic spectrum – which may well turn out to encompass several discrete conditions – may largely be genetic in origin.

Unsurprisingly, the article has touched off a firestorm of debate in the online comments section. Instead of simply letting the anonymous arguments rage, however, Wired is experimenting: They’ve set up a page where users can post questions about the article’s sources. Wired’s writers and editors will then answer the questions over a period of several weeks. It’s an interesting (and interactive) idea, and I’ll be watching to see whether it cuts down on the sort of misinformation that often rules online comments.

For more on vaccines and autism, see “Why this fear of vaccines?” and the Spring 2009 issue of Stanford Medicine.

Stephanie Pappas is a guest blogger based in Houston, Texas. She was formerly an intern for the Stanford School of Medicine Office of Communication and Public Affairs.

Cardiovascular Medicine, Global Health, Health Policy, Medicine and Society, Myths, Patient Care

Rush to judgment regarding the state of U.S. health care?

“The U.S. health care system, although it’s the costliest in the world, doesn’t even deliver the goods when it comes to delivering health.” That’s the conventional wisdom. But is it true?

At least a few well-researched studies by credentialed and respected experts suggest we might want to pause to consider whether, in the mad dash now underway to fix our ailing health care system, we could inadvertently end up breaking parts of it that work very well.

Money down the drain?
Let’s start with a claim we’ve been hearing a lot lately, summarized as follows: High-tech treatments and procedures (MRI and CT imaging, sweeping use of pricey meds, etc.) cost a fortune, yet produce no substantial treatment gains and carry little preventive payoff.

Here’s a study, by University of Pennsylvania scholars, showing that five-year survival rates in the U.S. for cancer and heart disease are the world’s highest, due not only to earlier detection but also to more-aggressive treatment of these conditions once they’re caught. (Who knew?)

And here’s this National Bureau of Economic Research analysis whose author looked state by state and found that the more a state was characterized by high usage of sophisticated diagnostic imaging, the greater the longevity within that state. According to the study, this was a causal relationship, not an artifact (such as richer states having healthier people and more high-tech equipment).

No doubt there’s a whole lot of prescribing and imaging going on in these United States, and some of that — maybe a lot of it — is wasteful. But there are reasonable ways of dealing with this short of a complete overhaul: Why not ban self-referral (i.e., referral to a diagnostic facility by a physician who owns a piece of said facility)? How about reforming American tort laws, whose financial costs to the health care system far exceed the direct-litigation expenses because medical practitioners prescribe diagnostics and drugs out of fear of malpractice suits, and because specialists’ sky-high malpractice insurance premiums are passed on to patients?

Life in the U.S.A.: Short, nasty, and brutish?
Finally, what about the claim that “U.S. life expectancy falls short of that in other advanced countries, no doubt as a result of our broken healthcare system”? After all, there’s no primary endpoint like death, is there?

This book by health economists Robert Ohsfeldt of Texas A&M and John Schneider of Health Economics Consulting Group, published in 2006, disputes the widely held assumption that Americans’ substandard life expectancy reflects the poor health care served up in this country. The authors found that, once you factor accidents and homicides out of the picture, U.S. longevity is unsurpassed. As reprehensible as our high murder rates are, they can’t be construed as an indictment of the American health care system, can they? (And if they can, what in the bills now worming their way through Congress would change this?)

Also dragging on America’s calculated overall life expectancy is what all acknowledge to be a relatively high infant-mortality rate. The presumption is that this high rate probably stems from an abundance of preterm births attributable, in turn, to mothers deprived of prenatal care. That’s a plausible claim — but one confounded a bit by a recent March of Dimes white paper, which flags high and increasing rates of preterm births in the United States — and in Canada, and in Sweden, and in Denmark, countries seldom accused of neglecting pregnant moms.

An alternative hypothesis is that high preemie rates could reflect older would-be parents’ rising resort, in economically advanced countries, to fertility-assistance techniques. And America’s infamously high reported infant-mortality rate is to no small extent the product of the heroic efforts now routinely made (certainly in the United States, as any neonatologist can attest) to rescue preemies that, in past days, would have been logged not as dead infants but as stillbirths.

Expensive? No question. Inefficient? Maybe. But, just maybe, those bucketloads of bucks we collectively throw at our medical problems aren’t going entirely down the drain.
