Published by
Stanford Medicine



Ebola, In the News, Myths, Science

The slippery slope toward "a dangerous dependence on facts"

The ever-funny Andy Borowitz has written in The New Yorker about a previously unreported challenge in the fight against Ebola: It might make Americans believe in science. He writes:

In interviews conducted across the nation, leading anti-science activists expressed their concern that the American people, wracked with anxiety over the possible spread of the virus, might desperately look to science to save the day.

“It’s a very human reaction,” said Harland Dorrinson, a prominent anti-science activist from Springfield, Missouri. “If you put them under enough stress, perfectly rational people will panic and start believing in science.”

For someone who left science to become a writer specifically to help explain science to the public, this piece is funny and so very not funny at the same time. Almost 20 years after I put down my pipette, Americans are, if anything, less willing to let science guide their health, energy, or environmental decisions than they were back when I started – thus the humor in Borowitz’ piece.

All of this makes me wonder if I could have spared myself several decades of worrying about clever analogies, agonizing about transitions, and racing the clock to make deadlines and done something less stressful with my life. Something fulfilling. Something where at the end of the day, my work would help people live happier, healthier lives rather than producing something people will ignore if it doesn’t fit their ideology.

Matthew Nisbet and Dietram Scheufele have written a number of articles about science communication and its effects on public perception of science. In the American Journal of Botany they write, “Often when the relationship between science and society breaks down, science illiteracy is typically blamed, the absence of quality science coverage is bemoaned, and there is a call put out for ‘more Carl Sagans.’”

In a nutshell, that sums up my career switch. I bemoaned the absence of quality science coverage and fully intended to fill that gap.

Then they go on to shatter my reasons for writing by pointing out that at a time when the public’s regard for science was at its highest – soon after the Sputnik launch – science literacy was abysmal. In one survey at the time, just 12 percent of people understood the scientific method, yet 90 percent believed that science was making their lives better.

What that survey suggests is that even a scientific challenge like Ebola is unlikely to push Americans to be better educated about science. But perhaps with the perfect transition, or a really outstanding analogy, those same scientifically illiterate Americans can be convinced that science is making life better and – I’m really dreaming here – should be funded?

If yes, maybe Borowitz’ fictional anti-science activist will be proved right, and we will head down that slippery slope “in which a belief in science leads to a belief in math, which in turn fosters a dangerous dependence on facts.” One can hope!

Previously: Scientist: Just because someone’s on TV doesn’t mean they’re an expert

Evolution, Genetics, History, Myths, Research, Stanford News

New genetic study: More evidence for modern Ashkenazi Jews’ ancient Hebrew patrimony

I hail from the so-called Ashkenazi branch of Jews, who account for the great majority of all Jews in the world today. Ashkenazis are distinguished by the historical fact that, over the last couple of thousand years or so, they propagated throughout Europe, generating and maintaining tens of thousands of distinctly Jewish communities in diverse countries spanning the entire continent. My dad was born in Lithuania; my mom’s mom came from an Eastern European region that has belonged to any one of about a half-dozen countries, depending on what particular year you happen to be talking about; and my mom’s dad grew up in Russia, near the Black Sea.

Tradition holds, though, that Ashkenazi Jews ultimately trace their origins straight back to ancient Israel, whence most Jews were expelled en masse in 70 CE by their Roman conquerors and sent skittering to all parts of the globe. (Jews who initially fled to Spain and Portugal are referred to as Sephardic. Those who took up residence in Iran, Turkey, Iraq and Northern Africa are designated as Mizrahi.)

But in the late 1970s I read what was then a recent book titled The Thirteenth Tribe, written by polymath Arthur Koestler, advancing a theory that today’s Ashkenazis descend not from the Holy Land but, rather, from Khazaria, a medieval Turkic empire in the Caucasus region whose royals, caught between the rock of Islam and the hard place of Christendom, chose the politically expedient course of converting to Judaism. That hypothesis has become highly politicized, with some groups holding that Ashkenazis, who constitute half of Israel’s current population, are colonialist interlopers with zero historical claim to the land of Israel.

Plausible at the time, the Khazar-origin premise has crumbled under the onslaught of modern molecular genetics. The latest volley: a study published this week in Nature Communications. The study’s senior author, Stanford geneticist Peter Underhill, PhD, works in the lab of Carlos Bustamante, PhD, whose high-resolution techniques have highlighted the historical hopscotch of other migratory peoples.

Underhill, Bustamante and their co-authors analyzed the Y chromosome – a piece of the human genome invariably handed down father-to-son – of a set of Ashkenazi men claiming descent from Levi, the founder of one of the Twelve Tribes of Israel. (Names such as Levy, Levine and Levitt, for example, bespeak a Levite heritage.)

If Ashkenazis were the spawn of Khazar royals, their DNA would show it. But those Y chromosomes were as Levantine as a levant sandwich. The same genetic “signature” popped up on every Levite sampled (as well as a significant number of non-Levite Ashkenazis), strongly implying descent from a single common ancestor who lived in the Fertile Crescent between 1,500 and 2,500 years ago. That signature is absent in the Y chromosomes of modern European non-Jewish men, and in male inhabitants of what was once Khazaria.

Yes, 2,000 years is a long time, and a fellow gets lonely. Genetic studies of mitochondria – tiny intracellular power packs that have their own dollop of DNA and are always inherited matrilineally – have conflicted (contrast this with this) but, in combination with broader studies of entire genomes, suggest that a bit of canoodling transpired between Ashkenazi men and local European women, in particular Italian women, early in that two-millennia European sojourn.

I can relate. My wife is 100 percent Italian by heritage, and my daughter by my first marriage is half-Italian.

Previously: Caribbean genetic diversity explored by Stanford/University of Miami researchers, Stanford study investigates our most-recent common ancestors and Stanford study identifies molecular mechanism that triggers Parkinson’s
Photo by cod_gabriel

In the News, Myths, Sleep

What puts you to sleep? Experts weigh in

A Huffington Post piece today surveys a panel of experts on best practices for getting a good night’s rest. The researchers advise on what worked for them (“Do boring yet challenging math”) and which tips they’ve tried and found to be overrated (“memory-foam mattresses”).

Clete A. Kushida, MD, PhD, medical director of Stanford Sleep Medicine Center, is one of the experts interviewed in the piece, which is a quick and fun read.

Previously: Tips for fighting fatigue after a sleepless night, Exploring the effect of sleep loss on health, More sleeping tips from a Stanford expert and Study estimates Americans’ insomnia costs nation $63 billion annually

In the News, Myths, Public Safety

Pew Research Center: Gun homicide rate has dropped by half since 1993

Man bites dog. As reported on the Wonkblog and elsewhere yesterday, a new analysis indicates that the rate of gun-induced homicide has plummeted by half over the past two decades.

Asked in a March Pew Research Center survey whether crimes involving guns had increased, held steady or declined over the past twenty years, more than half of all respondents said such crimes were on the rise.

Wrong. In 1993 – a year remembered by many of us through a Vaseline-coated lens of nostalgia – the gun-homicide rate in the United States was twice what it is today. The 49 percent drop since then is consistent with a general and steady, if unheralded, drop-off in rates of all violent crimes, as the federal Department of Justice’s Bureau of Justice Statistics confirms.

Actually, the rate of firearm-related homicides began a rapid ascent in the 1960s, peaked in the early 1990s, and has now returned to that of the early 1960s. (Gun-related suicides have also declined, but not as dramatically.)

These statistics do not bring back to life a single innocent person who has been killed, by guns or otherwise, in the past two decades. But they do provide some perspective on what has been an emotion-charged and too-often fact-challenged debate. As I’ve previously written, I fear that the debate leading to the Affordable Care Act a few years ago – now proving famously tough to implement – involved some misconceptions concerning the state of health care in the United States. People on both sides of the current debate on gun-control legislation would be well advised to get the facts straight.

Previously: U.S. health system’s sketchy WHO rating is bogus, says horse’s mouth and Rush to judgment regarding the state of U.S. health care?
Photo by ~Steve Z~

Fertility, Myths, Pediatrics, Pregnancy, Sexual Health, Women's Health

Research supports IUD use for teens

A large body of scientific research supports the safety and effectiveness of intrauterine devices and other forms of long-acting, reversible contraception (LARC) for adolescents, and physicians should offer these birth control methods to young women in their care. That’s the message behind a series of review articles published this week in a special supplemental issue of the Journal of Adolescent Health.

Stanford ob/gyn expert Paula Hillard, MD, who edited the supplement, explained to me that doctors are missing a great opportunity to prevent unwanted pregnancies by not offering young women the LARC birth control methods, which include IUDs and hormonal implants. Not only are the LARC methods very safe, the rate of unintended pregnancy with typical use of these techniques is 20 times lower than for alternate methods such as the Pill or a hormone patch.

But a design flaw in one specific IUD used in the 1970s – the Dalkon Shield – increased women’s risk for pelvic infections and gave all IUDs a bad rap. Use of IUDs among adult American women has been low ever since; it’s even lower in teens.

“Long after it was proven that the Dalkon Shield was particularly bad and newer IUDs were much safer, women were just scared,” Hillard said. “Not only did women stop asking for them, many doctors also stopped using IUDs.”

The new review articles that Hillard edited are targeted at physicians but contain some interesting tidbits for general readers as well. The article titled “Myths and Misperceptions about Long Acting Reversible Contraception (LARC)” provides scientific evidence to refute several common myths, concluding, for instance, that IUDs don’t cause abortions or infertility, don’t increase women’s rates of ectopic pregnancy above the rates seen in the general population, and can be used by women and teens who have never had children.

And, as Hillard put it for me during our conversation, “These birth control methods are very safe and as effective as sterilization but completely reversible. They work better than anything else, and they’re so easy to use.”

Previously: Will more women begin opting for an IUD?, Promoting the use of IUDs in the developing world, and Study shows women may overestimate the effectiveness of common contraceptives
Photo, by ATIS547, shows a public sculpture on the campus of the University of California, Santa Cruz that is affectionately known as the “Flying IUD”

In the News, Myths, Nutrition, Parenting, Pediatrics

Debunking a Halloween myth: Sugar and hyperactivity

Does sugar make children hyperactive? To the surprise of many, particularly parents gearing up for tonight’s Halloween craziness, the answer is no.

A large body of scientific evidence debunks the notion of a cause-and-effect relationship between sugar consumption and children’s hyperactivity. So what’s actually going on? The San Francisco Chronicle interviewed a Stanford nutrition expert today to find out:

Dr. Tom Robinson, director of the Center for Healthy Weight at Lucile Packard Children’s Hospital at Stanford, explains that because so many parents (and thus children) expect eating sweets to make them hyper, it becomes a self-fulfilling prophecy.

“The way we think we should feel has a lot to do with how we do feel,” he said.

The story mentions one of my favorite studies on the subject, in which parents who thought their kids were sugar-sensitive were asked to rate their child’s behavior after the children had consumed soda. Parents who heard that their children received sugar-sweetened sodas rated the youngsters’ behavior significantly worse than those who were told their kids drank artificially-sweetened soda. The catch? All the kids in the study consumed artificially-sweetened sodas.

Several other studies have attacked this question from different angles and reached the same conclusion: Eating sugar doesn’t make children hyperactive. But as Robinson notes in the Chronicle piece, there are plenty of other good reasons, besides hyperactivity, to limit children’s sugar consumption – among them, sugar’s role in promoting obesity and dental cavities.

Myths, Nutrition, Stanford News

Fact or fiction: Talkin' turkey and tryptophan

I’m pretty sure you’ve heard of the so-called turkey coma: Tryptophan, an amino acid present in all that turkey you’re going to eat tomorrow, makes you sleepy. Heck, it was fodder for a whole Seinfeld episode. And, some of you may have even used it as an excuse to get out of doing the dishes on Thanksgiving.

Though scientists have debunked the tryptophan/turkey myth, the urban legend lives on. I decided to turn to Stanford neuroimmunologist Lawrence Steinman, MD, to finally put the turkey talk to rest. Back in 2005, his lab showed that tryptophan plays a pivotal role in the immune system.

So I asked Steinman: If we feel sleepy after eating a big turkey meal on Thanksgiving, is it due to tryptophan (which is allegedly very high in turkey)? He told me:

Humans cannot make tryptophan. Tryptophan is not higher in turkey than in most other muscle tissue from other animals, more commonly known as meats. When we ingest tryptophan, most is metabolized in the liver. However, some tryptophan gets to the brain, where it is metabolized into serotonin and melatonin. This uptake and conversion may take hours. The effects of alcohol are much faster.

It is not the turkey that makes us sleepy at Thanksgiving. It is the massive ingestion of calories, carbohydrates and often alcohol that results in the desire to sleep. Whatever makes you sleepy on the Thanksgiving holiday just enjoy it. Kick off your shoes, stretch out on the couch and watch a football game. Just refrain from snoring or you risk alarming the guests. But please ask someone to wake you from your nap, so you can help with the dishes!

Previously: Wherein we justify eating more cranberry sauce tomorrow and A guide to your Thanksgiving dinner’s DNA
Photo by orphanjones

Addiction, Behavioral Science, Mental Health, Myths, Neuroscience, Women's Health

Research links bulimia to disordered impulse control

Although some consider eating disorders like bulimia to be the over-hyped, Hollywoodian maladies of the wealthy and superficial, the fact is that they are serious psychiatric disorders. Bulimia seems to be particularly complex from a psychological standpoint.

A recent article in the East Bay Express focused on the disorder and discussed research by Stanford’s James Lock, MD, PhD, psychiatric director of the Comprehensive Eating Disorders Program at Lucile Packard Children’s Hospital. Lock’s research suggests that bulimia is an impulse-control disorder (where the impulse is binge eating), a class of disorders that also includes kleptomania and drug addiction:

As young women with bulimia grow older, destructive impulses like bingeing and purging may become more powerful while parts of the brain that govern impulse control may weaken. And according to… studies, the bulimic brain is more likely to succumb to a variety of self-destructive impulses, making the disorder a sort of psychological Hydra. Over time, these impulses may turn into compulsions, or bad habits, much like drug addiction.

Lock, who has been working with eating-disordered youth at Stanford’s clinic for nine years, noticed that his patients often exhibited behavior consistent with impulse-control issues. Such behavior included sexual promiscuity and kleptomania. In a study requiring both healthy and bulimic girls to avoid performing a task, Lock noticed that bulimic girls had significantly more difficulty controlling their impulse to perform the forbidden task. Moreover, Lock noticed increased brain activity in the sections of these girls’ frontal lobes responsible for impulse control. His findings seemed to suggest that the girls’ brains were working overtime to manage impulses that healthy girls had no trouble controlling.

Eating disorders and many other mental disorders are medically elusive, since their physiological causes are practically unknown. Research like Lock’s, which treats disorders like bulimia as serious psychiatric conditions and attempts to link them to other psychological disorders, is a crucial step in solving the mystery.

Previously: KQED health program examines causes and effects of disordered eating

Myths, Nutrition, Obesity

Effects of diet sodas on weight gain remain uncertain

Recent studies suggesting that diet sodas may lead to weight gain have stirred up interest among diet-soda drinkers and non-drinkers alike, confirming suspicions that the “diet” label and zero-calorie contents may be too good to be true. One of these studies, presented to the American Diabetes Association in June, associated diet soda consumption with waistline increases 70 percent greater than those of people who drank none. These results, along with those of several related studies, are in line with the idea that diet or “light” foods and beverages may contribute to weight gain.

But Loyola University obesity specialist Jessica Bartfield, MD, thinks that we should take these studies with a grain of salt (or, if you prefer, aspartame). A release that came out today quotes her take on the issue:

“I suspect that people are likely drinking those diet sodas to wash down high-fat and high-calorie fast-food or take-out meals, not as a complement to a healthy meal prepared at home or to quench a thirst after a tough workout.”

In other words, it’s not the fake sugar in diet soda that causes weight gain – it’s the lifestyle choices that usually accompany it. Switching from regular soda to zero-calorie diet varieties, she argues, may be tremendously effective as a weight-loss strategy – just as long as users aren’t canceling it out with an otherwise high-calorie diet.

Bartfield also points out the importance, in the case of obesity studies, of taking all factors into account:

“The association studies are significant and provocative, but don’t prove cause and effect,” said Bartfield, who counsels weight-loss patients at the Chicago-area Loyola University Health System. “Although these studies controlled for many factors, such as age, physical activity, calories consumed and smoking, there are still a tremendous number of factors such as dietary patterns, sleep, genetics, and medication use that account for weight gain.”

Dieters looking for a satisfying answer to their weight-loss questions may be annoyed by the back-and-forth on issues like these. Then again, if obesity were a straightforward issue, we’d have solved it already.

Photo by computerjoe

Autism, Genetics, Myths, Neuroscience, Research

Unsung brain-cell population implicated in variety of autism

Like the late Rodney Dangerfield, and as I once wrote in Stanford Medicine, glial cells “don’t get no respect.” Combined, the three glial cell types – astrocytes, oligodendrocytes, and microglia – constitute a good 90 percent of all the cells in the brain. Yet the remaining 10 percent – the neurons – are so celebrated they’ve lent their name to brain science: neurobiology.

Stanford’s Ben Barres, MD, PhD, a lonely voice in the wilderness, has long advocated paying more attention to glial cells. His experience as a young neurologist in the 1980s convinced him that they’re involved in all sorts of brain pathology.

And, belatedly, glial cells are getting some grudging respect, in appreciation of their increasingly well-characterized roles in everything from directing blood vessels to increase their diameters in the vicinity of busy nerve circuits to determining which synapses will live and which will die.

In a new study just published in Nature Neuroscience, a genetic deficiency known to be responsible for Rett syndrome, the most physically disabling of the autistic disorders, has been shown to wreak many of its damaging effects via astrocytes. These gorgeous star-shaped glial cells alone account for almost half of all cells in the human brain (although, being smaller than neurons, they account for less by volume).

In the study, investigators at Oregon Health and Science University employed a mouse model of Rett syndrome in which the condition’s defining gene defect was present in every cell of every mouse. When the investigators restored that defective gene to the mice’s astrocytes – and only their astrocytes – many of the signature symptoms of the disease cleared up.

Rett syndrome was once assumed to be exclusively a function of damaged neurons. This latest finding, like many others over the past decade, goes to show that glial cells aren’t just a bunch of packing peanuts whose main job is to keep our neurons from jiggling when we jog.

Photo by Ibtrav
