In the News, Media, Medical Education, Medicine and Society, Myths, Pregnancy, Research

Reality TV influences perspectives on pregnancy, study shows

A new University of Cincinnati study on the influence that television programs have on pregnant women has found that most women are more affected by TV representations of childbirth than they think.

The study, funded by the NSF and conducted by Danielle Bessett, PhD, assistant professor of sociology, followed a diverse group of 64 women over two years and investigated how they understood their television viewing practices as those practices related to pregnancy and birth. It found that class, as measured by education level, had the greatest influence on whether a woman acknowledged television as a significant source of pregnancy-related information. Highly educated women and those who worked outside the home were more likely to dismiss TV, while women with less education who were unemployed or cared for children at home were more likely to report watching and learning from such shows as TLC’s “A Baby Story” and “Maternity Ward” and Discovery Health’s “Birth Day.”

A particularly interesting finding is that TV portrayals affect women’s perceptions even when the women don’t believe those portrayals have any influence. Bessett developed the term “cultural mythologies of pregnancy” to describe how TV, film, other media, and word of mouth create expectations about “the way things are.” Most reality TV and fictionalized programming presents childbirth as more dramatic and more laden with medical interventions than most births really are, and these images made a lasting impression on the women in the study.

As quoted in the press release, Bessett says, “Hearing women – even women who said TV had no influence on them – trace their expectations back to specific television episodes is one of the few ways that we can see the power of these mythologies.” Many women mentioned pregnancy representations they had seen long before they got pregnant.

Women who reported watching TV considered it part of a comprehensive childbirth education program and often evaluated the programs’ reliability, while women who disavowed television dismissed it as entertainment, or as education for children, likely out of a desire to be seen as valuing science and medical expertise.

“If we believe that television works most insidiously or effectively on people when they don’t realize that it has power, then we can actually argue that the more highly educated women who were the most likely to say that television really didn’t have any effect on them, may in the end actually be more subject to the power of television than were women who saw television as an opportunity to learn about birth and recognized TV’s influence,” hypothesizes Bessett.

“This research implies that many women underestimate or under-report the extent to which their expectations of pregnancy and birth are shaped by popular media,” concludes Bessett, suggesting that “scholars must not only focus on patients’ professed methods for seeking information, but also explore the unrecognized role that television plays in their lives.”

Previously: New reality shows shine harsh light on teen pregnancy and Study: TV dramas can influence birth control use
Photo by johnny_zebra

Addiction, In the News, Myths, Patient Care, Public Health, Public Safety

“24/7 Sobriety” program may offer a simple fix for drunken driving

"24/7 Sobriety" program may offer a simple fix for drunken driving

Every now and then I read a story that takes what I think I know about a certain topic and turns it upside down. Today, my understanding of programs to reduce drunken driving was upended by an article written by Keith Humphreys, PhD, professor of psychiatry and behavioral sciences at Stanford.

As Humphreys explains, many people mistakenly believe that no one can overcome a drinking problem without professional treatment. This, he says, is a myth, and the success of the “24/7 Sobriety” program highlights the importance of exploring and adopting new ways to combat drunken driving. From the Wall Street Journal article:

Offenders in 24/7 Sobriety can drive all they want to, but they are under a court order not to drink. Every morning and evening, for an average of five months, they visit a police facility to take a breathalyzer test. Unlike most consequences imposed by the criminal justice system, the penalties for noncompliance are swift, certain and modest. Drinking results in mandatory arrest, with a night or two in jail as the typical penalty.

The results have been stunning. Since 2005, the program has administered more than 7 million breathalyzer tests to over 30,000 participants. Offenders have both showed up and passed the test at a rate of over 99%.

Counties that used the 24/7 Sobriety program also had a 12% decrease in repeat drunken-driving arrests and a 9% drop in domestic-violence arrests, according to a 2013 study.
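
(A quick bit of arithmetic of my own on those figures: 7 million tests across 30,000 participants works out to roughly 230 tests per person, or about four months of twice-daily testing, which squares with the five-month average mentioned above.)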

A possible reason why this program works — where attempts to help people with drinking problems often fail — is that the twice-daily breathalyzer tests have immediate consequences, Humphreys explains. “It turns out that people with drug and alcohol problems are just like the rest of us. Their behavior is affected much more by what is definitely going to happen today than by what might or might not happen far in the future, even if the potential future consequences are more serious.”

Previously: Can the “24/7 sobriety” model reduce drunken disorderly conduct and violence in London?, Alcoholism: Not just a man’s problem and Stopping criminal men from drinking reduces domestic violence
Photo by KOMUnews

Ebola, In the News, Myths, Science

The slippery slope toward "a dangerous dependence on facts"

The ever-funny Andy Borowitz has written in The New Yorker about a previously unreported challenge in the fight against Ebola: It might make Americans believe in science. He writes:

In interviews conducted across the nation, leading anti-science activists expressed their concern that the American people, wracked with anxiety over the possible spread of the virus, might desperately look to science to save the day.

“It’s a very human reaction,” said Harland Dorrinson, a prominent anti-science activist from Springfield, Missouri. “If you put them under enough stress, perfectly rational people will panic and start believing in science.”

For someone who left science to become a writer specifically to help explain science to the public, this piece is funny and, at the same time, very much not. Almost 20 years after I put down my pipette, Americans are, if anything, less willing to let science guide their health, energy, or environmental decisions than they were back when I started – thus the humor in Borowitz’ piece.

All of this makes me wonder whether I could have spared myself several decades of worrying about clever analogies, agonizing over transitions, and racing the clock to make deadlines, and instead done something less stressful with my life. Something fulfilling. Something where, at the end of the day, my work would help people live happier, healthier lives rather than producing something people will ignore if it doesn’t fit their ideology.

Matthew Nisbet and Dietram Scheufele have written a number of articles about science communication and its effects on public perception of science. In the American Journal of Botany they write, “Often when the relationship between science and society breaks down, science illiteracy is typically blamed, the absence of quality science coverage is bemoaned, and there is a call put out for ‘more Carl Sagans.’”

In a nutshell, that sums up my career switch. I bemoaned the absence of quality science coverage and fully intended to fill that gap.

Then they go on to shatter my reasons for writing by pointing out that at a time when the public’s regard for science was at its highest – soon after the Sputnik launch – science literacy was abysmal. In one survey at the time, just 12 percent of people understood the scientific method, yet 90 percent believed that science was making their lives better.

What that survey suggests is that even a scientific challenge like Ebola is unlikely to push Americans to become better educated about science. But perhaps with the perfect transition, or a really outstanding analogy, those same scientifically illiterate Americans can be convinced that science is making life better and – I’m really dreaming here – should be funded?

If yes, maybe Borowitz’ fictional anti-science activist will be proved right, and we will head down that slippery slope “in which a belief in science leads to a belief in math, which in turn fosters a dangerous dependence on facts.” One can hope!

Previously: Scientist: Just because someone’s on TV doesn’t mean they’re an expert

Evolution, Genetics, History, Myths, Research, Stanford News

New genetic study: More evidence for modern Ashkenazi Jews’ ancient Hebrew patrimony

I hail from the so-called Ashkenazi branch of Jews, who account for the great majority of all Jews in the world today. Ashkenazis are distinguished by the historical fact that, over the last couple of thousand years or so, they propagated throughout Europe, generating and maintaining tens of thousands of distinctly Jewish communities in diverse countries spanning the entire continent. My dad was born in Lithuania; my mom’s mom came from an Eastern European region that has belonged to any one of about a half-dozen countries, depending on what particular year you happen to be talking about; and my mom’s dad grew up in Russia, near the Black Sea.

Tradition holds, though, that Ashkenazi Jews ultimately trace their origins straight back to ancient Israel, whence most Jews were expelled en masse in 70 CE by their Roman conquerors and sent skittering to all parts of the globe. (Jews who initially fled to Spain and Portugal are referred to as Sephardic. Those who took up residence in Iran, Turkey, Iraq and Northern Africa are designated as Mizrahi.)

But in the late 1970s I read what was then a recent book titled The Thirteenth Tribe, written by polymath Arthur Koestler, advancing a theory that today’s Ashkenazis descend not from the Holy Land but, rather, from Khazaria, a medieval Turkic empire in the Caucasus region whose royals, caught between the rock of Islam and the hard place of Christendom, chose the politically expedient course of converting to Judaism. That hypothesis has become highly politicized, with some groups holding that Ashkenazis, who constitute half of Israel’s current population, are colonialist interlopers with zero historical claim to the land of Israel.

Plausible at the time, the Khazar-origin premise has crumbled under the onslaught of modern molecular genetics. The latest volley: a study published this week in Nature Communications. The study’s senior author, Stanford geneticist Peter Underhill, PhD, works in the lab of Carlos Bustamante, PhD, whose high-resolution techniques have highlighted the historical hopscotch of other migratory peoples.

Underhill, Bustamante and their co-authors analyzed the Y chromosome – a piece of the human genome invariably handed down father-to-son – of a set of Ashkenazi men claiming descent from Levi, the founder of one of the Twelve Tribes of Israel. (Names such as Levy, Levine and Levitt, for example, bespeak a Levite heritage.)

If Ashkenazis were the spawn of Khazar royals, their DNA would show it. But those Y chromosomes were as Levantine as a levant sandwich. The same genetic “signature” popped up in every Levite sampled (as well as in a significant number of non-Levite Ashkenazis), strongly implying descent from a single common ancestor who lived in the Fertile Crescent between 1,500 and 2,500 years ago. That signature is absent in the Y chromosomes of modern European non-Jewish men, and in male inhabitants of what was once Khazaria.

Yes, 2,000 years is a long time, and a fellow gets lonely. Genetic studies of mitochondria – tiny intracellular power packs that have their own dollop of DNA and are always inherited matrilineally – have conflicted (contrast this with this) but, in combination with broader studies of entire genomes, suggest that a bit of canoodling transpired between Ashkenazi men and local European women, in particular Italian women, early in that two-millennia European sojourn.

I can relate. My wife is 100 percent Italian by heritage, and my daughter by my first marriage is half-Italian.

Previously: Caribbean genetic diversity explored by Stanford/University of Miami researchers, Stanford study investigates our most-recent common ancestors and Stanford study identifies molecular mechanism that triggers Parkinson’s
Photo by cod_gabriel

In the News, Myths, Sleep

What puts you to sleep? Experts weigh in

A Huffington Post piece today surveys a panel of experts on best practices for getting a good night’s rest. The experts advise on what worked for them (“Do boring yet challenging math”) and which tips they’ve tried and found to be overrated (“memory-foam mattresses”).

Clete A. Kushida, MD, PhD, medical director of Stanford Sleep Medicine Center, is one of the experts interviewed in the piece, which is a quick and fun read.

Previously: Tips for fighting fatigue after a sleepless night, Exploring the effect of sleep loss on health, More sleeping tips from a Stanford expert and Study estimates Americans’ insomnia costs nation $63 billion annually

In the News, Myths, Public Safety

Pew Research Center: Gun homicide rate has dropped by half since 1993

Man bites dog. As reported on Wonkblog and elsewhere yesterday, a new analysis indicates that the U.S. gun-homicide rate has plummeted by half over the past two decades.

Asked in a March Pew Research Center survey whether crimes involving guns had increased, held steady or declined over the past twenty years, more than half of all respondents said such crimes were on the rise.

Wrong. In 1993 – a year remembered by many of us through a Vaseline-coated lens of nostalgia – the gun-homicide rate in the United States was twice what it is today. The 49 percent drop since then is consistent with a general and steady, if unheralded, drop-off in rates of all violent crimes, as the federal Department of Justice’s Bureau of Justice Statistics confirms.

Actually, the rate of firearm-related homicides began a rapid ascent in the 1960s, peaked in the early 1990s, and has now returned to that of the early 1960s. (Gun-related suicides have also declined, but not as dramatically.)

These statistics do not bring back to life a single innocent person who has been killed, by guns or otherwise, in the past two decades. But they do provide some perspective in what has been an emotion-charged and too-often fact-challenged debate. As I’ve previously written, I fear that the debate that led a few years ago to the Affordable Care Act – now proving famously tough to implement – involved some misconceptions concerning the state of health care in the United States. People on both sides of the current debate on gun-control legislation would be well advised to get the facts straight.

Previously: U.S. health system’s sketchy WHO rating is bogus, says horse’s mouth and Rush to judgment regarding the state of U.S. health care?
Photo by ~Steve Z~

Fertility, Myths, Pediatrics, Pregnancy, Sexual Health, Women's Health

Research supports IUD use for teens

A large body of scientific research supports the safety and effectiveness of intrauterine devices and other forms of long-acting, reversible contraception (LARC) for adolescents, and physicians should offer these birth control methods to young women in their care. That’s the message behind a series of review articles published this week in a special supplemental issue of the Journal of Adolescent Health.

Stanford ob/gyn expert Paula Hillard, MD, who edited the supplement, explained to me that doctors are missing a great opportunity to prevent unwanted pregnancies by not offering young women LARC birth control methods, which include IUDs and hormonal implants. Not only are LARC methods very safe, but the rate of unintended pregnancy with typical use is 20 times lower than with alternative methods such as the Pill or a hormone patch.
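
(Some rough arithmetic of my own to put that multiple in perspective: the typical-use failure rate most often cited for the Pill is about 9 percent per year, and one-twentieth of that is roughly 0.45 percent per year, in line with the well-under-1-percent annual failure rates reported for IUDs and implants.)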

But a design flaw in one specific IUD used in the 1970s – the Dalkon Shield – increased women’s risk for pelvic infections and gave all IUDs a bad rap. Use of IUDs among adult American women has been low ever since; it’s even lower in teens.

“Long after it was proven that the Dalkon Shield was particularly bad and newer IUDs were much safer, women were just scared,” Hillard said. “Not only did women stop asking for them, many doctors also stopped using IUDs.”

The new review articles that Hillard edited are targeted at physicians but contain some interesting tidbits for general readers as well. The article titled “Myths and Misperceptions about Long Acting Reversible Contraception (LARC)” provides scientific evidence to refute several common myths, concluding, for instance, that IUDs don’t cause abortions or infertility, don’t increase women’s rates of ectopic pregnancy above the rates seen in the general population, and can be used by women and teens who have never had children.

And, as Hillard put it for me during our conversation, “These birth control methods are very safe and as effective as sterilization but completely reversible. They work better than anything else, and they’re so easy to use.”

Previously: Will more women begin opting for an IUD?, Promoting the use of IUDs in the developing world, and Study shows women may overestimate the effectiveness of common contraceptives
Photo, by ATIS547, shows a public sculpture on the campus of the University of California, Santa Cruz that is affectionately known as the “Flying IUD”

In the News, Myths, Nutrition, Parenting, Pediatrics

Debunking a Halloween myth: Sugar and hyperactivity

Does sugar make children hyperactive? To the surprise of many, particularly parents gearing up for tonight’s Halloween craziness, the answer is no.

A large body of scientific evidence debunks the notion of a cause-and-effect relationship between sugar consumption and children’s hyperactivity. So what’s actually going on? The San Francisco Chronicle interviewed a Stanford nutrition expert today to find out:

Dr. Tom Robinson, director of the Center for Healthy Weight at Lucile Packard Children’s Hospital at Stanford, explains that because so many parents (and thus children) expect eating sweets to make them hyper, it becomes a self-fulfilling prophecy.

“The way we think we should feel has a lot to do with how we do feel,” he said.

The story mentions one of my favorite studies on the subject, in which parents who thought their kids were sugar-sensitive were asked to rate their child’s behavior after the children had consumed soda. Parents who heard that their children received sugar-sweetened sodas rated the youngsters’ behavior significantly worse than those who were told their kids drank artificially sweetened soda. The catch? All the kids in the study consumed artificially sweetened sodas.

Several other studies have attacked this question from different angles and reached the same conclusion: Eating sugar doesn’t make children hyperactive. But as Robinson notes in the Chronicle piece, there are plenty of other good reasons, besides hyperactivity, to limit children’s sugar consumption, among them sugar’s role in promoting obesity and dental cavities.

Myths, Nutrition, Stanford News

Fact or fiction: Talkin' turkey and tryptophan

I’m pretty sure you’ve heard of the so-called turkey coma: Tryptophan, an amino acid present in all that turkey you’re going to eat tomorrow, makes you sleepy. Heck, it was fodder for a whole Seinfeld episode. And, some of you may have even used it as an excuse to get out of doing the dishes on Thanksgiving.

Though scientists have debunked the tryptophan/turkey myth, the urban legend lives on. I decided to turn to Stanford neuroimmunologist Lawrence Steinman, MD, to finally put the turkey talk to rest. Back in 2005, his lab showed that tryptophan plays a pivotal role in the immune system.

So I asked Steinman: If we feel sleepy after eating a big turkey meal on Thanksgiving, is it due to tryptophan (which is allegedly very high in turkey)? He told me:

Humans cannot make tryptophan. Tryptophan is not higher in turkey than in most other muscle tissue from other animals, more commonly known as meats. When we ingest tryptophan, most is metabolized in the liver. However, some tryptophan gets to the brain, where it is metabolized into serotonin and melatonin. This uptake and conversion may take hours. The effects of alcohol are much faster.

It is not the turkey that makes us sleepy at Thanksgiving. It is the massive ingestion of calories, carbohydrates and often alcohol that results in the desire to sleep. Whatever makes you sleepy on the Thanksgiving holiday just enjoy it. Kick off your shoes, stretch out on the couch and watch a football game. Just refrain from snoring or you risk alarming the guests. But please ask someone to wake you from your nap, so you can help with the dishes!

Previously: Wherein we justify eating more cranberry sauce tomorrow and A guide to your Thanksgiving dinner’s DNA
Photo by orphanjones

Addiction, Behavioral Science, Mental Health, Myths, Neuroscience, Women's Health

Research links bulimia to disordered impulse control

Although some consider eating disorders like bulimia to be the over-hyped, Hollywoodian maladies of the wealthy and superficial, the fact is that they are serious psychiatric disorders. Bulimia seems to be particularly complex from a psychological standpoint.

A recent article in the East Bay Express focused on the disorder and discussed research by Stanford’s James Lock, MD, PhD, psychiatric director of the Comprehensive Eating Disorders Program at Lucile Packard Children’s Hospital. Lock’s research suggests that bulimia is an impulse-control disorder (where the impulse is binge eating), a class of disorders that also includes kleptomania and drug addiction:

As young women with bulimia grow older, destructive impulses like bingeing and purging may become more powerful while parts of the brain that govern impulse control may weaken. And according to… studies, the bulimic brain is more likely to succumb to a variety of self-destructive impulses, making the disorder a sort of psychological Hydra. Over time, these impulses may turn into compulsions, or bad habits, much like drug addiction.

Lock, who has been working with eating-disordered youth at Stanford’s clinic for nine years, noticed that his patients often exhibited behavior consistent with impulse-control issues, including sexual promiscuity and kleptomania. In a study that required both healthy and bulimic girls to refrain from performing a task, Lock found that the bulimic girls had significantly more difficulty controlling the impulse to perform the forbidden task. Moreover, he observed increased brain activity in the sections of these girls’ frontal lobes responsible for impulse control. His findings seemed to suggest that the girls’ brains were working overtime to manage impulses that healthy girls had no trouble controlling.

Eating disorders and many other mental disorders remain medically elusive, since their physiological causes are largely unknown. Research like Lock’s, which treats disorders like bulimia as serious psychiatric conditions and attempts to link them to other psychological problems, is a crucial step in solving the mystery.

Previously: KQED health program examines causes and effects of disordered eating
