Health Costs, Health Policy, In the News, NIH, Public Health, Science Policy

Research investment needed now, say top scientists

Top scientists made the case for continued investment in basic science and engineering earlier this week by unveiling a new report, “Restoring the Foundation: The Vital Role of Research in Preserving the American Dream,” from the American Academy of Arts and Sciences.

Here’s why this is important: Federal investment is needed to power innovation engines like Stanford’s School of Medicine, and if that money gets funneled to roads, the military, Medicare, or any of a variety of other uses, fewer jobs and fewer discoveries could result. From the report:

Unless basic research becomes a higher government priority than it has been in recent decades, the potential for fundamental scientific breakthroughs and future technological advances will be severely constrained.

Compounding this problem, few mechanisms currently exist at the federal level to enable policy-makers and the research community to set long-term priorities in science and engineering research, bring about necessary reforms of policies that impede progress, or facilitate stronger cooperation among the many funders and performers of research…

Stanford President John Hennessy, PhD; biochemist Peter S. Kim, PhD; and physicist (and former U.S. Secretary of Energy) Steven Chu, PhD, are among the scientific rock-stars who co-authored the report.

For an excellent piece on the political debate surrounding the report’s release, check out the coverage in Science here. NPR also recently aired a series that colorfully illustrates the effects of research cutbacks, including a piece on a patient suffering from ALS, and a profile of several underemployed scientists.

Becky Bach is a former park ranger who now spends her time writing or practicing yoga. She’s a science writing intern in the Office of Communications and Public Affairs. 

Previously: More attention, funding needed for headache care, “Bold and game-changing” federal report calls for $4.5 billion in brain-research funding, Federal investments in research and higher education key to U.S. maintaining innovation edge

Cancer, In the News, NIH, Research, Stanford News, Women's Health

NIH Director highlights Stanford research on breast cancer surgery choices

The director of the NIH, Francis Collins, MD, PhD, this morning weighed in on a topic that has garnered much attention lately: the type of surgery that women diagnosed with breast cancer choose. The post, found at the NIH Director’s Blog, describes a recent study by Stanford researchers, published earlier this month in the Journal of the American Medical Association, that examined survival rates after three types of breast cancer surgery for women diagnosed with cancer in one breast: a lumpectomy (removal of just the affected tissue, usually followed by radiation therapy), a single mastectomy (removal of the whole affected breast), and a double mastectomy (removal of the unaffected breast along with the affected one).

In a previous post we wrote in detail about the study and its finding that the number of double mastectomies in California has increased dramatically. However, except for women with mutations in the BRCA1 or BRCA2 genes, the procedure does not appear to improve survival compared with other types of breast cancer surgery. Collins notes:

It isn’t clear exactly what prompted this upsurge in double mastectomy, which is more expensive, risky, and prone to complications than the other two surgical approaches. But [researchers] Kurian and Gomez suggest that when faced with a potentially life-threatening diagnosis of cancer in one breast—and fears about possibly developing cancer in the other—women may assume that the most aggressive surgery is the best. The researchers also said it’s possible that new plastic surgery techniques that achieve breast symmetry through bilateral reconstruction may make double mastectomy more appealing to some women.

Despite its recent upsurge in popularity, double mastectomy conferred no survival advantage over the less aggressive approach of lumpectomy followed by radiation, the study found.

Collins also points out that the slightly worse survival rates of women who undergo single mastectomies probably reflect the fact that poorer women were more likely to have this surgery, evidence of yet another health disparity linked to economic status.

Previously: Breast cancer patients are getting more bilateral mastectomies – but not any survival benefit

Autoimmune Disease, Genetics, NIH, Research, Science

Tiny hitchhikers, big health impact: Studying the microbiome to learn about disease

I don’t know about you, but I’m fascinated with the idea of the “microbiome.” If you’re unfamiliar with the term, it describes the millions upon millions of tiny, non-human hitchhikers that live on and in you (think bacteria, viruses, fungi and other microscopic life). Although the exact composition of these microbial roommates can vary from person to person, they aren’t freeloaders. Many are vitally important to your metabolism and health.

We’ve reported here on the Human Microbiome Project, launched in 2007 and supported by the National Institutes of Health’s Common Fund. Phase 2 of the project started last fall, with grants to three groups around the country to study how the composition of a person’s microbiome might affect the onset of diseases such as type 2 diabetes and inflammatory bowel disease, as well as its role in pregnancy and preterm birth. Now the researchers, who include Stanford geneticist Michael Snyder, PhD, have published an article in Cell Host & Microbe detailing what data will be gathered and how it will be shared.

As explained in a release by the National Human Genome Research Institute:

“We’re producing an incredibly rich array of data for the community from the microbiomes and hosts in these cohorts, so that scientists can evaluate for themselves with these freely available data which properties are the most relevant for understanding the role of the microbiome in the human host,” said Lita M. Proctor, Ph.D., program director of the Human Microbiome Project at NIH’s National Human Genome Research Institute (NHGRI).

“The members of the Consortium can take advantage of each other’s expertise in dealing with some very complex science in these projects,” she said. “We’re generating these data as a community resource and we want to describe this resource in enough detail so people can anticipate the data that will be produced, where they can find it and the analyses that will come out of the Consortium’s efforts.”

As I’ve recently blogged, data-sharing among researchers and groups is particularly important for research efficiency and reproducibility. And I’m excited to hear what the project will discover. More from the release:

For years the number of microbial cells on or in each human was thought to outnumber human cells by 10 to 1. This now seems a huge understatement. Dr. Proctor noted that the 10-to-1 estimate was based only on bacterial cells, but the microbiome also includes viruses, protozoa, fungi and other forms of microscopic life. “So if you really look at the entire microbial community, you’re probably looking at more like a 100-to-1 ratio,” she said.

Although thousands of bacterial species may make their homes with human beings, each individual person is host to only about 1,000 species at a time, according to the findings of the Human Microbiome Project’s first phase in 2012.

In addition, judging from the array of common functions of bacterial genes, if the bacteria are healthy, each individual’s particular suite of species appear to come together to perform roughly the same biological functions as another healthy individual. In fact, researchers found that certain bacterial metabolic pathways were always present in healthy people, and that many of those pathways were often lost or altered in people who were ill.

Stanford’s Snyder will join forces with researchers in the laboratory of George Weinstock, PhD, of the Jackson Laboratory for Genomic Medicine in Connecticut to investigate the effect of the microbiome on the onset of type 2 diabetes. Snyder may be uniquely positioned to investigate the causes of the condition. In 2012, he made headlines when he performed the first-ever ‘omics’ profile of himself (an analysis that involves whole-genome DNA sequencing with repeated measurements of the levels of RNA, proteins and metabolites in a person’s blood over time). During the process, he learned that he was on the cusp of developing type 2 diabetes. He was able to halt the progression of the disease with changes in exercise and diet.

Previously: Stanford team awarded NIH Human Microbiome Project grant, Elite rugby players may have more diverse gut microbiota, study shows and Could gut bacteria play a role in mental health?

Chronic Disease, NIH, Patient Care, Research

NIH network designed to diagnose, develop possible treatments for rare, unidentified diseases

Vertigo, nausea, headache, fatigue, confusion. For years someone close to me has experienced severe and periodic bouts of these symptoms. It’s clear something is wrong and yet, despite countless tests and visits with specialists in cardiology, neurology, ophthalmology, pulmonology, otolaryngology, and immunology, no one has been able to figure out what that something is. At one of his last appointments – to the great disappointment of this patient and (perhaps even more so) his worried and frustrated wife – my loved one was gently told that he may have to face the very real possibility that he’ll never get a definitive diagnosis.

Unfortunately, this patient is far from alone: Plenty of people are living with mysterious symptoms that affect their quality of life (or worse), and it’s not uncommon for patients with rare diseases to wait years for a diagnosis. With this in mind, the National Institutes of Health launched its Undiagnosed Diseases Program in 2008, a pilot program designed to “provide answers to patients with mysterious conditions that have long eluded diagnosis” and “advance medical knowledge about rare and common diseases.” (Since that time, 600 children and adults have been evaluated, and approximately 100 patients have received a diagnosis.)

Now, the program is being expanded into the Undiagnosed Diseases Network, with the NIH announcing last week that six medical centers – including Stanford – will be joining and contributing local medical expertise. The NIH will work with experts from these centers (including Euan Ashley, MD, PhD, Stanford’s principal investigator) to, as described in a release, “select from the most difficult-to-solve medical cases and together develop effective approaches to diagnose them.” The physicians will “collect and share high-quality clinical and laboratory data, including genomic information, clinical observations and documentation of environmental exposures,” and they’ll “benefit from common protocols designed to improve the level of diagnosis and care for patients with undiagnosed diseases.”

In our online story on the network and the $7.2 million grant that Stanford received, Matthew Wheeler, MD, medical director for the grant, notes that “Stanford was chosen for our informatics expertise, our experience with clinical interpretation of whole-exome and whole-genome data, and our scientific potential to follow up any lead.” As my colleague Erin Digitale further explained:

The team will use cutting-edge genomics and medical phenotyping techniques to diagnose patients, and will also aim to understand the underlying biology of patients’ conditions so they can generate targets for new therapies, Wheeler said. “We aim to make a deep dive into each patient’s biology,” he added.

By the summer of 2017, each new clinical site is expected to see 50 or more patients per year. Referring clinicians can submit applications on behalf of undiagnosed patients on the program website.

Previously: Using crowdsourcing to diagnose medical mysteries, New search engine designed to help physicians and the public in diagnosing rare diseases and The road to diagnosis: How to be insistent, persistent and consistent
Photo by Adrian Clark

Cancer, NIH, Public Health, Research, Stanford News, Videos

NIH associate director for data science on the importance of “data to the biomedicine enterprise”

The 2014 Big Data in Biomedicine conference was held here last month, and interviews with keynote speakers, panelists, moderators and attendees are now available on the Stanford Medicine YouTube channel. To continue the discussion of how big data can be harnessed to benefit human health, we’ll be featuring a selection of the videos this month on Scope.

During his keynote speech at Big Data in Biomedicine 2014, Philip Bourne, PhD, the first permanent associate director for data science at the National Institutes of Health, shared how the federal agency hopes to capitalize on big data to accelerate biomedical discovery, address scientific questions with potential societal benefit and promote open science.

In the above video, he talks about how data “is becoming increasingly important to the biomedical enterprise” and about the NIH’s effort to coordinate strategies related to computation and informatics in biomedicine across its 27 institutes and centers, whose work underlies improvements in health care for every major medical condition. “Our goal is to create interoperability between these entities,” he says in the interview. “We see data as the catalyst to create this cross talk across these respective institutes.”

Previously: Rising to the challenge of harnessing big data to benefit patients, Discussing access and transparency of big data in government and U.S. Chief Technology Officer kicks off Big Data in Biomedicine

Big data, Events, FDA, NIH, Stanford News, Technology

Discussing access and transparency of big data in government

The 2014 Big Data in Biomedicine conference continued today with discussion of how troves of information are being stored, organized, accessed and applied in a way that’s useful to stakeholders across health care.

Yesterday afternoon, Stanford bioengineer Russ Altman, PhD, introduced keynote speaker Philip Bourne, PhD, who earlier this year began his post as the first permanent associate director for data science at the National Institutes of Health. Altman served on the search committee that selected Bourne for the post, which was created as part of an initiative by NIH Director Francis Collins, MD, PhD, to make better use of biomedical research datasets and to coordinate the effective use of big data.

Bourne discussed some of the factors motivating thinking on big data at the NIH, including open access to information, which was also a focus of U.S. Chief Technology Officer Todd Park‘s conference presentation. Bourne noted that currently 70 percent of research that’s funded cannot be reproduced – a statistic “of great concern to the NIH” that’s driving ongoing reproducibility studies there. But what worried him most, he said, is sustainability: How can growing databases be accommodated within the NIH’s flat budget? (“We can’t go on like this,” he said.) How can labs retain talent when competing with industry’s larger salaries offered to top scientists? (“It’s a loss to the field if you spend money making a biomedical scientist and they leave the field.”) Bourne also seeks to address “broken” areas of scholarship – a paper with “16,000 citations” that no one reads – and the reward system.

Among his solutions are applying business models to promote sustainability of research, introducing policies to ensure funding is allocated where it is most needed, sharing infrastructure where possible and treating biomedical scientists more like tenured academics. Bourne also described an NIH data commons to provide Dropbox-type storage and a collaborative compute environment for scientists.

Cooperation and data-sharing were key themes this morning as the conference audience heard from Taha Kass-Hout, MD, the U.S. Food and Drug Administration’s first chief health informatics officer. He described the importance of big data to the regulatory agency’s mission “to protect and promote the public health” and its role in promoting information-sharing with transparency and protection of privacy. The new, scalable search and big-data analytics platform openFDA comprises more than 100 public-access data sets within the FDA and allows users to access data and run queries through APIs. “It’s not just about the data,” Kass-Hout told the audience. Ask rather, “How can you build a community around that data?”
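
For readers curious what querying openFDA actually looks like, here is a minimal sketch, assuming a plain HTTP call to the public drug adverse-event endpoint; the endpoint, field names and query parameters shown are illustrative examples rather than anything presented at the conference, and should be checked against the current openFDA documentation.

```python
# Minimal sketch of an openFDA API query (illustrative; verify against the openFDA docs).
# Counts adverse-event reports received during 2013, grouped by reported reaction.
import json
import urllib.request

url = (
    "https://api.fda.gov/drug/event.json"
    "?search=receivedate:[20130101+TO+20131231]"       # restrict to reports received in 2013
    "&count=patient.reaction.reactionmeddrapt.exact"   # group results by reaction term
    "&limit=5"                                          # return only the top five terms
)

with urllib.request.urlopen(url) as response:
    results = json.load(response)["results"]

for entry in results:
    print(f'{entry["term"]}: {entry["count"]} reports')
```

Per the openFDA documentation, light use like this needs no API key; heavier use can register for one.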

Previously: U.S. Chief Technology Officer kicks off Big Data in Biomedicine, Big Data in Biomedicine conference kicks off tomorrow, Big Data in Biomedicine technical showcase to feature companies’ innovations related to big data and Euan Ashley discusses harnessing big data to drive innovation for a healthier world
Photo of Bourne by Saul Bromberger

Big data, NIH, Public Health, Research, Stanford News

NIH Director: “Big Data should inspire us”

NIH Director: "Big Data should inspire us"

Stanford systems-medicine chief Atul Butte, MD, PhD, is an intrepid data miner who firmly believes that analyzing vast reservoirs of public health information is the “fastest, least costly, most effective path to improving people’s health.” His latest paper shows how he and colleagues combed through mountains of medical information to identify new links among genes, diseases and traits. My colleague Bruce Goldman summed up the findings in a previous Scope post:

… by cross-referencing voluminous genetic data implicating particular gene variants in particular diseases with equally voluminous data associating the same gene variants with other, easily measured traits typically considered harmless, Butte and his associates were able to pick out a number of such connections, which they then explored further by accessing anonymized electronic medical records from Stanford Hospital and Clinics, Columbia University, and Mount Sinai School of Medicine.

A recent entry on the NIH Director’s Blog singled out Butte’s findings as an example of how “Big Data may provide priceless raw material for the next era of biomedical research.” Francis Collins, MD, PhD, director of the National Institutes of Health, writes:

What I find most noteworthy about this work is not the specific findings, but how the researchers demonstrate the feasibility of mining vast troves of existing data—genetic, phenotypic, and clinical—to test new hypotheses.

Indeed, we are at a point in history where Big Data should not intimidate, but inspire us. We are in the midst of a revolution that is transforming the way we do biomedical research. In some cases, rather than posing a question, designing experiments to answer that question, and then gathering data, we already have the needed data in hand—we just have to devise creative ways to sift through this mountain of data and make sense of it.

As a reminder, Butte and others from academia, industry and government are gathering here on May 21-23 for the Big Data in Biomedicine conference. Registration information can be found on the conference website.

Previously: Odd couples: Resemblances at molecular level connect diseases to unexpected, predictive traits, Nature/nurture study of type 2 diabetes risk unearths carrots as potential risk reducers, Mining medical discoveries from a mountain of ones and zeroes and Newly identified type-2 diabetes gene’s odds of being a false finding equal one in 1 followed by 19 zeroes

Genetics, NIH, Research, Science, Stanford News

Tissue-specific gene expression focus of Stanford research, grant

It’s abundantly clear by now that the sequence of our genes can be very important to our health. Mutations in some key areas can lead to the development of diseases such as cancer. However, gene sequence isn’t everything. It’s also necessary to know when, and at what levels, a mutated gene is expressed in the body’s cells and tissues.

This analysis is complicated by the fact that most of us have two copies of every gene – one from our father and one from our mother (the sex chromosomes X and Y are one exception; the abnormal chromosome numbers seen in conditions like Down syndrome are another). The two copies, called alleles, are not always expressed in the same way (a phenomenon called allele-specific expression). In particular, structural changes or other modifications to the alleles, or to the RNA that is made from them, can significantly affect levels of expression. This matters when one copy has a mutation that could cause a disease like cancer. That mutation could be very important if that allele is preferentially expressed, or less important if its partner is favored.

Understanding relative levels of allele expression is therefore critical to determining the effect of particular mutations in our genome. But it’s been very difficult to accomplish – in part because allele-specific expression can vary among our body’s tissues.

Recently, Stanford researchers Stephen Montgomery, PhD, an assistant professor of pathology and genetics, and Jin Billy Li, PhD, an assistant professor of genetics, devised a way to use microfluidic and deep-sequencing technology to measure the relative levels of expression of each allele in various tissues. (The research was published – subscription required – in January in Nature Methods.) Now they’ve taken the research one step further to look at the varying expression of potentially damaging alleles across ten tissues from a single individual. As Montgomery explained in an e-mail to me:

We were able to learn that as many as one-third of personal genome variants (that is, potentially damaging mutations that would be detected by genome sequencing within an individual) can be modified by allele-specific expression in ways that could influence individual outcomes. Therefore, just knowing a variant exists is only one step towards predicting clinical outcome in an individual. It is also necessary to know the context of that variant. Is the damaging allele in a gene that is abundantly expressed within and across an individual’s tissues?
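
To make the idea concrete, here is a toy sketch of how allele-specific expression might be summarized from RNA-seq read counts at a single heterozygous site across a few tissues; the tissue names and counts are invented for illustration, and this is not the researchers’ actual microfluidics-and-sequencing pipeline.

```python
# Toy illustration of allele-specific expression (invented numbers, not real data).
# At a heterozygous site, compare the RNA-seq reads supporting each allele per tissue.

read_counts = {
    # tissue: (reads supporting the reference allele, reads supporting the variant allele)
    "liver": (180, 20),
    "muscle": (95, 105),
    "skin": (30, 170),
}

for tissue, (ref_reads, alt_reads) in read_counts.items():
    total = ref_reads + alt_reads
    variant_fraction = alt_reads / total  # ~0.5 means both alleles are expressed evenly
    print(f"{tissue}: variant allele accounts for {variant_fraction:.0%} "
          f"of expression ({alt_reads}/{total} reads)")
```

A strongly skewed ratio in a given tissue would suggest that one allele, possibly the one carrying a damaging variant, dominates expression there, which is exactly the kind of context Montgomery describes above.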

Montgomery and Li published their most recent findings in today’s issue of PLOS Genetics. They were recently awarded a grant from the National Human Genome Research Institute to study allele-specific expression in thousands of tissues from 100 donors during the next three years. The grant is part of the institute’s Genotype-Tissue Expression effort, or GTEx.

Previously: We are what we….aren’t? Cataloging deletions and insertions in the human genome

Aging, Genetics, NIH, Research

Sequencing a supercentenarian’s genome to unlock the secrets of longevity

In an effort to determine the genetic underpinnings of longevity, scientists at Stanford and elsewhere are mapping the genomes of supercentenarians, individuals who have lived beyond the age of 110.

A recent entry on the NIH Director’s blog offers an in-depth overview of one such project involving a 115-year-old Dutch woman named Hendrikje “Hennie” van Andel-Schipper, who died in 2005 and donated her body to medical research. Scientists examined the genome of her blood and brain tissue and analyzed the number of somatic mutations, the type of DNA mutations that are acquired over the course of a lifetime rather than inherited. The results raised some interesting questions:

You might imagine that someone who reaches the extreme age of 115 may have a low number of somatic mutations because his or her cells have exceptional protection against DNA damage. [Scientists] rather expected this to be the case for Hennie, particularly because she’d never had leukemia, lymphoma, or any other type of blood cancer. To the researchers’ surprise, the DNA sequencing results showed that Hennie’s blood cells had accumulated about 450 mutations since she was born. That is consistent with a mutation rate of about four mutations per year of life, which is in line with previous work suggesting that laboratory-grown cells derived from younger, healthy people acquire about five mutations annually.

Recognizing that circulating blood cells are derived from a large pool of stem cells in the bone marrow, and that each stem cell may have acquired a different set of mutations during life, researchers thought it would be challenging to detect any mutations in a collection of millions of blood cells. After all, in healthy adults, bone marrow contains about 11,000 hematopoietic stem cells, of which about 1,300 are actively dividing and replenishing our blood cells. If just one of those stem cells had undergone a mutation of an A to a T, the sensitivity of current DNA sequencing technology would be very unlikely to discover it.

However, further study of Hennie’s blood genome revealed that most of her circulating white blood cells were derived from just two hematopoietic stem cells. Not only did that make the process of detecting Hennie’s somatic mutations much easier, it raised fascinating questions about how the aging process affects bone marrow. While the work still must be reproduced in other older people, the researchers speculate that as we age, the pool of hematopoietic stem cells may shrink, until all of our white blood cells are clones of just a few parent cells.

Previously: She’s so 19th century: Women pushing their hundred-and-teens and California’s oldest person helping geneticists uncover key to aging
Photo by Duncan Hull

Clinical Trials, NIH, Nutrition, Obesity, Research, Stanford News

Stanford seeks participants for weight-loss study

Should diets come in different shapes and sizes? Stanford researchers are exploring that question and are seeking participants for a year-long weight-loss study that aims to understand why people may respond differently to the same diet. Titled “One Diet Does Not Fit All,” the study will examine how factors such as genetic influences and eating and sleeping habits have an impact on a diet’s effectiveness.

From a release:

Participants will be assigned randomly to either a very low-fat or very low-carbohydrate diet for 12 months. They will be required to attend weekly classes at Stanford for the first three months, once every other week for the following three months, and once a month for the remainder of the study. Participants must also be willing to have fasting blood samples drawn four times during the 12-month period and participate in online and written surveys. They will receive all test results at the end of the study.

The study is part of a five-year project funded by the National Institutes of Health and the Nutrition Science Initiative. Following last year’s enrollment of 200 participants, this spring researchers hope to enroll at least 135 more men and women, ages 18 to 50, who are overweight or obese, generally in good health and, in the case of women, pre-menopausal.

For a complete list of inclusion criteria, click here. To determine eligibility for this study, complete a brief online survey. For more information, contact Jennifer Robinson at nutrition@stanford.edu.

Previously: How physicians address obesity may affect patients’ success in losing weight, To meet weight loss goals, start exercise and healthy eating programs at the same time, The trouble with the current calorie-counting system, Smaller plates may not be helpful tools for dieters, study suggests and Losing vitamins – along with weight – on a diet
