Published by Stanford Medicine

Category: Ethics

Ethics, Genetics, In the News, Research

Cautious green light for CRISPR use in embryos in the U.K.; Stanford’s Hank Greely weighs in

Big news out of the United Kingdom today about the gene editing technology known as CRISPR/Cas9. Stanford law professor Hank Greely, JD, posted a brief take on his blog this morning applauding the move by the British Human Fertilisation and Embryology Authority to allow researcher Kathy Niakan, PhD, of the Francis Crick Institute to conduct gene editing experiments in early human embryos.

The BBC News and Nature each have good summaries of the science side of the ruling. Greely, who directs Stanford’s Center for Law and the Biosciences, breaks down the ethics. From his post:

This is important research that can only be done with human embryos, it is being done with surplus IVF embryos whose prospective parents agreed to this kind of use, and the researchers are forbidden to try to produce human gene-edited babies.

Niakan’s experiments, tailored to increase our understanding of the very earliest stages of human development, will allow the modified embryos to develop for only 14 days, or until they consist of just a few hundred cells. She hopes that her findings will shed light on infertility and miscarriage.

Previously: Using CRISPR to investigate pancreatic cancer, CRISPR marches forward: Stanford scientists optimize use in human blood cells and CRISPR critters and CRISPR conundrums
Image by OpenClipartVectors

Ethics, Health Policy, Patient Care

Small number of physicians account for many malpractice claims

A small number of physicians account for a disproportionately large number of malpractice claims in the United States, Stanford medical and law researchers found after examining 10 years of medical data.

The ability to identify these claim-prone physicians early would be invaluable, the researchers write in a paper published today in The New England Journal of Medicine.

David Studdert, ScD, professor of law and of medicine, and Michelle Mello, JD, PhD, professor of law and of health research and policy — who are also core faculty members of Stanford Health Policy — conducted the study in collaboration with researchers from the University of Melbourne, Australia.

The team found that just 1 percent of practicing physicians accounted for 32 percent of paid malpractice claims over a decade. The study also found that claim-prone physicians had a number of distinctive characteristics. Studdert, lead author of the study, explains:

The degree to which the claims were concentrated among a small group of physicians was really striking. But the fact that these frequent flyers looked quite different from their colleagues — in terms of specialty, gender, age, and several other characteristics — was the most exciting finding. It suggests that it may be possible to identify problem physicians before they accumulate troubling track records, and then do something to stop that happening.

Male physicians had a 35 percent higher risk of recurring claims than female physicians, and the risk of recurrence among physicians younger than 35 years old was about one-third the risk among their older colleagues, the study found.

Continue Reading »

Bioengineering, Ethics, Genetics, In the News, Research, Science

Are at-home gene splicing kits a good idea? Stanford researchers weigh in

As demonstrated by the Foldscope, the uber-affordable microscope developed by Stanford bioengineer Manu Prakash, PhD, there is real fervor for bringing easy, do-it-yourself science to the masses. But what if that at-home science allows novices to dabble in some serious stuff, like splicing genes?

One Bay Area scientist has done just that: He’s marketing a $130 gene-editing kit that could bring the popular technology CRISPR into kitchens, basements and garages nationwide.

This particular kit isn’t particularly dangerous, according to a recent article in the San Jose Mercury News:

The kit has limited applications. His altered bacteria and yeast, quite harmless, lead brief and fairly dull lives. They can’t do much except change color, fragrance or live in inhospitable places. Then they die.

But two Stanford experts — infectious disease researcher David Relman, MD, and bioethicist Hank Greely, JD — agree it could place powerful technology in the hands of people who might not use it responsibly.

“I do not think that we want an unregulated, non-overseen community of freelance practitioners of this technology,” Relman told the Mercury News.

Regulation, or control, might not be possible, though, Greely cautioned. “You’ve got guys with B.S. degrees, in a garage,” he said in the article.

Kit developer Josiah Zayner doesn’t have a garage. But one version of the kit has already sold out.

Previously: CRISPR critters and CRISPR conundrums, Foldscope inventor named one of the world’s top innovators under 35 by Technology Review and Manu under the microscope
Image by WRPIgeek

Applied Biotechnology, Ethics, Medicine and Society, Public Safety, Science Policy, Stanford News

Stanford experts slam government’s myopic biosecurity oversight

Just because we can, does that mean we should?

In a hard-hitting editorial in Science, three Stanford thinkers – Stanford microbe wizard David Relman, MD; synthetic biologist Megan Palmer, PhD, of Stanford’s Center for International Security and Cooperation; and political theorist Francis Fukuyama, PhD, of the Freeman Spogli Institute for International Studies – have issued a scathing wake-up call to the scientific community and the federal government, sternly questioning the latter’s current plans for ensuring biosafety and biosecurity in the United States.

“Our strategies and institutions for managing biological risk in emerging technologies have not matured much in the last 40 years,” they write, adding:

With the advent of recombinant-DNA technology, scientific leaders resorted to halting research when confronted with uncertainty and public alarm about the risks of their work. To determine a framework for managing risk, they gathered at the now-fabled 1975 Asilomar meeting. Their conclusions led to the recombinant DNA guidelines still used today, and Asilomar is often invoked as a successful model for scientific self-governance.

But, the authors suggest, Asilomar’s legacy may not be all it’s cracked up to be:

Asilomar created risky expectations: that leading biological scientists are best suited for and wholly capable of designing their own systems of governance and that emerging issues can be treated as primarily technical matters.

“Unfortunately,” the editorial goes on to say, “today’s leadership on biological risk reflects Asilomar’s risky legacy: prioritizing scientific and technical expertise over expertise in governance, risk management, and organizational behavior.” Political leaders have largely ceded a strategic leadership role, leaving it up to the scientific community itself to judge the ethical and social implications of its own work.

“Leadership biased toward those that conduct the work in question can promote a culture dismissive of outside criticism and embolden a culture of invincibility” regarding emerging biotechnology risks, the authors write.

The world of today is not the world of 1975. Since then, the scope and scale of biological science and technology have changed radically. To wit: The increased ease of reading and writing genetic information means that securing materials in a handful of established labs is not feasible, the editorial states. Like it or not, the tools for putting potentially dangerous knowledge into practice are increasingly portable.

For a scary scenario of what such new facility portends, please see this article I wrote a couple of years ago, which begins with the rhetorical question: “What if nuclear bombs could reproduce?”

With so much at stake, we may not want to restrict oversight of scientific advances to those who are making the advances. There’s knowledge, and there’s wisdom.

Previously: How-to manual for making bioweapons found on captured Islamic State computer, Microbial mushroom cloud: How real is the threat of bioterrorism? (Very) and Stanford bioterrorism expert comments on new review of anthrax case
Photo by Mirko Tobias Schafer

Ethics, Mental Health, Stanford News

Stanford psychiatrist’s varied pursuits garner her a top ethics prize

Imagine being terrified you might kill yourself. Then imagine driving 300 miles to the nearest city for psychiatric care because you’re even more afraid someone in your town will find out about your depression. Or worse yet, being so afraid of being labeled “crazy” that you don’t seek care at all.

Psychiatrists like Laura Roberts, MD, deal with situations like this frequently. Roberts, who chairs the Department of Psychiatry and Behavioral Sciences, has learned not to underestimate the strength, or pervasiveness, of the stigma that surrounds mental health issues.

“You can build resources, but stigma is such a strong and visible barrier to care. If we do not work on that issue,” we are not able to effectively treat patients, Roberts said.

Battling stigma is just one of Roberts’ passions. She is known for her work with vulnerable and often neglected populations. She has conducted extensive research on improving mental health care in isolated rural and tribal communities in New Mexico and Alaska. And she has a long history of championing patients who have been marginalized, including victims of sexual trauma, sex workers, veterans, and elders with dementia.

Roberts is also committed to improving the well-being of her fellow psychiatrists and the health of the field of psychiatry as a whole. But while her scholarly and research interests are varied, they are all united by one common focus: ethics.

Her mentor, Mark Siegler, MD, has known Roberts since she was a medical student at the University of Chicago. “From the beginning of her medical career, her central focus and commitment has been on improving the care and outcomes of her patients,” Siegler said. “She is a brilliant patient-centered physician whose scholarly work has improved the care not just of her own patients but of sick and vulnerable patients everywhere.”

He was in attendance when Roberts recently received the 2015 MacLean Center Prize in Clinical Ethics and Health Outcomes, the largest prize in clinical ethics, which includes a $50,000 award.

“The prize honors these individuals who teach us about medicine, and about life, as they face great sorrow and injustices with courage and generosity,” Roberts said in a release.

The award was given during the Dorothy J. MacLean Fellows Conference on ethics in medicine at the University of Chicago.

Previously: How people with mental illness get left out of medical research studies, “Every life is touched by suicide:” Stanford psychiatrist on the importance of prevention and Starting a new career in academic medicine? Here’s a bible for the bedside: The Academic Medicine Handbook
Photo by Bruce Powell

Ethics, FDA, Genetics, In the News, Science Policy

CRISPR critters and CRISPR conundrums

There’s much ado about the gene-editing technique CRISPR/Cas9 this week, with a multinational summit in Washington, D.C. on human gene editing, plus the clock ticking down on congressional appropriations bills, one of which would prohibit the Food and Drug Administration from spending money to evaluate research on or conduct clinical trials of gene editing in human embryos. The American Journal of Bioethics, edited by David Magnus, PhD, director of the Stanford Center for Biomedical Ethics, wades into the fray with a special issue on the ethics of CRISPR.

CRISPR is an unusually precise, fast and cheap way of snipping out and replacing genes. It has implications for preventing and treating genetic diseases, engineering new versions of the plants and animals we eat, and knocking out genes in insects so they can’t carry viruses that could kill us. The most controversial possibility is altering human sperm, eggs or embryos, because such germline changes would be heritable in future generations of offspring.

“The overriding question is when, if ever, we will want to use gene editing to change human inheritance,” said summit chair David Baltimore, PhD, of Caltech, as he kicked off this week’s meeting. Ultimately, summit participants released a statement that left the door open for human germline editing and advocated for ongoing international discussion.

Indeed, because of the low cost of CRISPR and the variability of research ethics across the globe, an international ban or moratorium would be difficult to enforce, argued then-undergraduates Niklaus Evitt and Shamik Mascharak in a paper they wrote for a Stanford class co-taught by professor Russ Altman, MD, PhD. They and Altman turned it into an article for the special issue of the bioethics journal.

They propose a model regulatory framework for CRISPR human germline editing that includes vetting research for necessity and reversibility, establishing the safety and efficacy of the treatment in multigenerational animal models and conducting clinical trials over a 15-year period. “We seek concrete policies that responsibly phase in therapeutic uses of CRISPR-Cas genome editing at a pace amenable to ethical inquiry,” they write.

Continue Reading »

Clinical Trials, Ethics, Patient Care, Research

When medical knowledge is at a crossroads, how research can take patient preferences into account

Let’s say you have high blood pressure that can be treated with one of two medications.

Neither drug is experimental; both are within the standard of care. Your doctor doesn’t have any medical reason to recommend one or the other. But she’d like to help the medical community figure out which one works best, and she’s wondering if you would, too. If so, you can enroll in a randomized study, effective immediately.

The question is, what steps are necessary to ensure you understand and consent to the research?

One possibility is that your doctor informs you about the purpose of the research, its risks and benefits, and your alternatives to participating; she then documents the conversation and your decision in your medical record. A patient-friendly consent process is essential to the success of this type of research on clinical practices, say researchers at the Stanford Center for Biomedical Ethics (SCBE), and most patients are comfortable skipping a written consent form when the form might prove so cumbersome that the research couldn’t go forward. Patients also prefer to talk about participating in these studies with their own doctors, not with researchers.

But draft guidance from the federal Office for Human Research Protections (OHRP), if finalized, would require formal written consent in any study that is designed to assess a risk, even if the same risk exists in ordinary clinical care. In other words, whether you get Hypertension Medication A vs. Hypertension Medication B is characterized as a research risk, even though the risk is inherent in the doctor visit whether you participate in the study or not.

“They make it seem really risky,” said Stephanie Alessi Kraft, an SCBE clinical ethics fellow and the lead author of a paper in the American Medical Association Journal of Ethics that argues for reconsideration of the draft guidance. “We’re talking about cases of genuine uncertainty and equipoise. In our research, we use the example of a gumball machine. You know you’re going to get a gumball. You know you’re going to get a medication that works. You just don’t know what color.”

Continue Reading »

Ethics, Research, Science, Stanford News

Clues could help identify fraudulent research before publication

Liars leave behind evidence, researchers have found, whether they’re bluffing at poker or fabricating financial reports. Now, a study published in the Journal of Language and Social Psychology has identified clues left by researchers who falsify their work.

The study’s authors examined 253 primarily biomedical papers that were retracted from journals for fraud and compared them to papers from the same journals, time periods and publication topics. They developed an “obfuscation index,” which included abstract language, jargon, positive emotional terms, causal language and a reading difficulty score. Fraudulent papers had higher scores than accurate papers, the team found.
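
For readers curious how a composite linguistic score like this can be put together, here is a minimal, hypothetical sketch in Python. It is not the study's actual method: the word lists, weights and readability formula below are stand-ins invented for illustration, and a real index would use validated dictionaries and be calibrated on labeled papers. The sketch only shows the general idea of folding jargon rate, emotional language and reading difficulty into a single number.

import re

# Toy word lists standing in for the validated dictionaries a real
# linguistic analysis would use (illustrative assumption, not the study's).
JARGON = {"methodology", "paradigm", "modality", "utilize", "heterogeneity"}
POSITIVE_EMOTION = {"remarkable", "novel", "robust", "striking", "exciting"}

def count_syllables(word):
    # Crude vowel-group heuristic; good enough for a toy example.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(n_words, n_sentences, n_syllables):
    # Standard Flesch formula: lower scores mean harder-to-read text.
    return 206.835 - 1.015 * (n_words / n_sentences) - 84.6 * (n_syllables / n_words)

def obfuscation_score(text):
    words = re.findall(r"[A-Za-z']+", text.lower())
    if not words:
        return 0.0
    n_sentences = max(1, len(re.findall(r"[.!?]+", text)))
    n_syllables = sum(count_syllables(w) for w in words)
    jargon_rate = sum(w in JARGON for w in words) / len(words)
    emotion_rate = sum(w in POSITIVE_EMOTION for w in words) / len(words)
    difficulty = 100.0 - flesch_reading_ease(len(words), n_sentences, n_syllables)
    # Arbitrary equal-ish weighting; a calibrated index would learn these weights.
    return 100 * jargon_rate + 100 * emotion_rate + difficulty / 100

print(obfuscation_score("We utilize a novel methodology to probe sample heterogeneity."))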

A Stanford Report article explains:

“We believe the underlying idea behind obfuscation is to muddle the truth,” said graduate student David Markowitz, the lead author on the paper. “Scientists faking data know that they are committing a misconduct and do not want to get caught. Therefore, one strategy to evade this may be to obscure parts of the paper. We suggest that language can be one of many variables to differentiate between fraudulent and genuine science.”

The results showed that fraudulent retracted papers scored significantly higher on the obfuscation index than papers retracted for other reasons. For example, fraudulent papers contained approximately 1.5 percent more jargon than unretracted papers.

“Fraudulent papers had about 60 more jargon-like words per paper compared to unretracted papers,” Markowitz said in the article. “This is a non-trivial amount.”

Previously: New Stanford Medicine magazine explores bioethics, Using social media in clinical research: Case studies address ethical gray areas and “U.S. effect” leads to publication of biased research, says Stanford’s John Ioannidis
Photo by Alan Cleaver

Ethics, In the News, Parenting, Patient Care, Pediatrics, Stanford News

Parents now help doctors decide what care is right for the sickest babies

Today, NPR’s Morning Edition featured an in-depth story on the evolution of decision-making in neonatal intensive care units – hospital nurseries for the sickest infants. Parents now have much more say in their babies’ care than in the past, and Stanford experts who were on the front lines of the change, including William Benitz, MD, chief of neonatology at Lucile Packard Children’s Hospital Stanford, explained how it happened.

As medical care for premature and other at-risk babies advanced in the 1970s and early 1980s, doctors gained the ability to save many infants who would once have died soon after birth. But some children in the new category of survivors had lifelong disabilities, with lasting implications for them and their families.

At first, doctors did not realize that this change would affect parents’ desire to participate in planning medical decisions for fragile infants:

“It never occurred to anyone that that might be a reasonable conversation to have,” Benitz says. “We were in unexplored territory.”

As technology improved and doctors tried to save sicker babies, and some born even earlier in gestation, there were new decisions to make: Should the health team put the tiny child on a ventilator? Attempt heart surgery? Those interventions helped many infants survive. Others did not fare as well.

“A lot of them ended up with significant impairments,” Benitz recalls. And doctors started to get pushback. “In the mid-80s we began to hear from families that maybe that wasn’t consistent with their goals for their children.”

As a result, neonatologists began having in-depth conversations with parents about the possible outcomes of different treatments for their infants. The practice is now widespread, and it means a lot to parents like Karin and Chris Belluomini, whose daughter, Joy, was born in May 2015 with Down syndrome, several heart defects and fluid around her lungs.

Continue Reading »

Clinical Trials, Ethics, Health Policy, Public Health, Stanford News

Using social media in clinical research: Case studies address ethical gray areas

If a public-health researcher is reviewing Facebook profiles of 14-year-old males for firearm references and discovers photos or words referencing a potentially threatening situation, should the researcher intervene? What levels of privacy should these children expect in the online world?

These are the kinds of difficult questions that ethics consultants are faced with as they attempt to provide moral and legal guidance to researchers gathering health-related data from the Internet.

To help researchers with these nascent ethics issues, the Clinical Research Ethics Consultation Collaborative, a group of almost 50 bioethicists who provide free or low-cost ethics consultations across the United States, has begun publishing case studies on its most ethically challenging cases. Thus far they’ve posted 40 case studies in the categories of behavioral/social science research, clinical trials, genetics, pediatrics, research misconduct and surrogate decision making. The site also includes information on how to participate in educational webinars and collaborative case discussions.

This effort is being led by Benjamin Wilfond, MD, at Seattle Children’s Research Institute and University of Washington, and Mildred Cho, PhD, at the Stanford Center for Biomedical Ethics.

“Our bioethics consortium has learned a great deal from the complex ethics consultations that we’ve been providing since 2005,” said Cho. “Now we have a strategy for sharing these best practices with others, to provide moral and legal guidance to researchers across the country and to better inform policymakers on evolving ethical gray areas.”

More information on the collaborative, and instructions for requesting a consult, can be found on this website.

Previously: The challenge – and opportunity – of regulating new ideas in science and technology, Social media brings up questions, ethical unknowns for doctors and Build it (an easy way to join research studies) and the volunteers will come
Photo by NLshop/Shutterstock
