The importance of the physician-scientist is the focus of a new Perspective piece in the New England Journal of Medicine. Writing that an increasing number of MDs have moved away from the laboratory and into clinical practice, and calling the shortfall of new physician-researchers a “national, if not global, concern,” Michael M. Gottesman, MD, outlines how the National Institutes of Health is working to reverse the trend. And he notes that the awarding of the 2012 Nobel Prize in Chemistry (which went to Robert Lefkowitz, MD, and Stanford’s Brian Kobilka, MD, both trained in cardiology) “should remind us of the critical role that clinician-scientists have played in formulating the seminal concepts that govern modern biomedical science.”
For those who have read about studies involving zebrafish and wondered how such a tiny fish can help advance research on human health, National Institutes of Health Director Francis Collins, MD, PhD, has your answer. From a post yesterday on the NIH Director’s Blog:
…Well, it turns out that more than 75% of the genes that have been implicated in human diseases have counterparts in the zebrafish. So, if we discover a mutation in a human, we can make the corresponding mutation in the zebrafish gene—and often get a pretty good idea of how the gene works, how the mutation causes havoc, and how it causes disease in humans. We can even use the zebrafish to test potential drug candidates, to see whether they can alter or fix the symptoms before moving on to mice or humans.
I was excited last week to find myself writing about an entirely new way that cells communicate in the developing embryo. The work happened like this: Geneticist and developmental biologist Maria Barna, PhD, and her colleagues wanted to use advanced, high-resolution microscopy to investigate how cells in developing chick and mouse embryos send signals to one another across relatively large distances. When they looked at individual cells, they stumbled upon a previously invisible structure that resembles long, very thin fingers that burrow through densely packed cells to reach neighbors several cell-lengths away. (Conventional fixation and imaging techniques destroy these ‘specialized filopodia’.) They then watched as the cells used these structures to deliver and receive payloads of signaling molecules to one another. Their research is published in the current issue of Nature.
The seeming specificity of the interaction contrasts starkly with the commonly held notion that signaling molecules are released from one cell and float, or diffuse, through the intercellular space to their targets. While this finding does not rule out diffusion as a signaling method, it identifies a surprising new avenue of long-distance cellular communication.
I can’t stop marveling at how scientists are still discovering entirely new parts of a cell. Apparently others feel the same: The work was featured this week on the Los Angeles Times’ health and science blog (including a cool video of the filopodia grasping one another) and by the California Institute for Regenerative Medicine. Because the work was conducted while Barna was a faculty member at the University of California-San Francisco, UCSF wrote about it as well.
Third grade is a critical year for learning arithmetic facts, but while math comes easily to some children, others struggle to master the basics.
Now, researchers at Stanford have new insight into what separates adept young math students from those who have difficulty. The difference, described in a paper published today in the Proceedings of the National Academy of Sciences, can’t be detected with traditional intelligence measures such as IQ tests. But it shows up clearly on brain scans, as the new study’s senior author explained in our press release:
“What was really surprising was that intrinsic brain measures can predict change — we can actually predict how much a child is going to learn during eight weeks of math tutoring based on measures of brain structure and connectivity,” said Vinod Menon, PhD, the study’s senior author and a professor of psychiatry and behavioral sciences.
Menon’s research team conducted structural and functional MRI brain scans before third-grade students received eight weeks of individualized math tutoring. The tutoring followed a well-validated format, combining instruction on math concepts with practice of math problems emphasizing speed. All the children who received tutoring improved their math performance, but the gains varied widely, from 8 percent to 198 percent.
A few specific brain characteristics were particularly good at predicting which kids would benefit most from tutoring. In particular, a larger and better-wired hippocampus predicted performance improvements. The brain structures highlighted in the study are implicated in forming memories, and differ from the portions of the brain that adults use when they are learning about math. The fact that these systems are involved helps to explain why the combination of conceptual explanations and sped-up practice that the study’s tutors used is effective, Menon explained:
“Memory resources provided by the hippocampal system create a scaffold for learning math in the developing brain,” Menon said. “Our findings suggest that, while conceptual knowledge about numbers is necessary for math learning, repeated, speeded practice and testing of simple number combinations is also needed to encode facts and encourage children’s reliance on retrieval — the most efficient strategy for answering simple arithmetic problems.” Once kids are able to pull up answers to basic arithmetic problems automatically from memory, their brains can tackle more complex problems.
Next, the researchers plan to examine how brain wiring changes over the course of tutoring. The new findings could also help educators understand the basis for math learning disabilities, and may even provide a foundation for figuring out what kind of instruction could help children overcome these problems.
In response to recent questions about the integrity of results reported in the biomedical literature, three medical researchers from Stanford and Duke University are pointing to the need for increased access to clinical research data.
In a viewpoint article published online today in JAMA Internal Medicine, the authors, Robert Califf, MD, and Jonathan McCall of Duke, and Robert Harrington, MD, of Stanford, write that it’s time for both industry and academia to “catch up to other areas of society:”
The liberation of information once held in secret has toppled regimes and transformed societal expectations regarding progress and possibilities. Access to data from clinical research should be truly democratized.
The goal of clinical research should be to add to the body of evidence that can guide decisions about personal health and health policies, the authors write – but things like selective omission of important findings, inaccuracies in published studies, and the use of unreliable data systems are all hindering this. Harrington and his colleagues outline the critical issues that need to be addressed – issues that concern “(1) the value of the research question, (2) the quality of the execution of the research, and (3) the complete and balanced presentation of all relevant data in the publication” – and sound a hopeful note:
The good news is that powerful tools exist to address and potentially surmount these issues. These tools are the ClinicalTrials.gov registry and the ongoing movement toward data transparency.
ClinicalTrials.gov was originally created to provide researchers, physicians, and the public with ready access to information on clinical trials. More recently, legal requirements to register studies have expanded to encompass the reporting of results, including adverse events, within 1 year of ascertainment of the last primary end point. These requirements are designed to ensure that findings from almost all trials relevant to US medical practice that involve a drug or device are available in a single, accessible public registry. If the requirements of ClinicalTrials.gov and other international registries are maintained and strengthened in areas where they are currently deficient, the benefits should be substantial…
I love this: A cutely named blog that celebrates the accomplishments of older women who have worked in a STEM field (science, technology, engineering and mathematics).
Harvey Mudd College mathematician Rachel Levy, PhD, said her inspiration for starting the Grandma Got Stem blog stemmed (ha, ha) from growing “tired of hearing people say ‘how would you explain that to your grandmother?’ when they probably mean something like ‘How would you explain the idea in a clear, compelling way so that people without a technical background can understand you?’” (Er, I admit that I’m guilty of having used a similar line – just substitute “elderly aunt” for “grandmother” – in the past.) Levy’s idea was to show that plenty of women have experience in technical fields by launching “public awareness/art projects using grandmothers’ pictures+names+connections to STEM.”
In the early 1940s, Ruth Guttman applied to Cornell’s School of Agriculture but was rejected because the school did not accept women. According to family lore, Ruth wrote to First Lady Eleanor Roosevelt, who may have interceded on her behalf: when Ruth re-applied, she was accepted.
And I also love what her granddaughter wrote, calling Guttman both an inspiration and “one of the reasons I’ve always grown up assuming that science and medicine isn’t that unusual a field for women to work in.”
If you have suggestions for other “geeky grannies” (Levy’s words!) to be profiled on the blog, Levy is looking for submissions.
Stanford psychiatrist and bioengineer Karl Deisseroth, MD, PhD, spent much of this century’s first decade developing a revolutionary method for studying the brain: optogenetics. In 2010, Nature Methods heralded optogenetics as its “method of the year.”
It looks as though lightning has struck the Deisseroth lab again.
Suppose, just for a moment, that you’re conducting espionage on a heavily guarded multi-story building strongly suspected to be an advanced nuclear-weapons facility. The building quickly proves utterly inaccessible. Fortunately, you manage (through methods too covert to be revealed here) to procure a floor plan. Nice going. Now, you know a lot about the floors themselves and a bit of cross-sectional detail on the bases of whatever’s sitting on them. Better than nothing.
Now, imagine – in fantasyland, anything goes – that you can don goggles enabling you to peer right through the building’s outer walls and directly observe its three-dimensional structure, including its concealed laboratories and the instruments and manufacturing machinery inside them. Payday!
An analogous technique developed by Deisseroth promises to revolutionize cell biology. Exploring connections among, and contents within, the billions of cells in a chunk of tissue often involves slicing the chunk into ultra-thin sections, exposing each slice’s top and bottom surfaces for microscopy or histochemical and electrical manipulation. Sophisticated computation can stitch the slices back together (virtually), roughly reconstructing the sample’s three-dimensional structure. (That’s the floor plan I mentioned earlier.)
Unfortunately, all this sawing disrupts key connections within the tissue and distorts its constituent cells’ geography. Plus, while those sections are thin, they’re not infinitely thin. Light and chemicals can penetrate only so far. Volumes of valuable information about their innards remain concealed.
Deisseroth’s paradigm-shifting method, called CLARITY, renders tissue transparent while leaving it structurally intact, yet accessible to the large “detective” molecules scientists use to gain information about cells’ surface features and genetic contents. In a study just published in Nature, a group led by Deisseroth (who discusses his work in the video above) converted an entire adult mouse brain into an optically transparent, histochemically permeable replica of itself. The position and structure of proteins embedded in the membranes of cells and their intracellular organelles remained intact.
Okay, step back with me for a minute. Essentially, all cells are liquid-filled bubbles of oil. (Nerve cells are better visualized as long, branching, liquid-filled tubes whose walls are made of fat.) These oil/fat (in science-speak, “lipid”) bubbles and walls (“membranes”) both house and compartmentalize their contents, so operations inside them can be carried out in relative isolation. Dotting membranes’ surfaces are all kinds of proteins performing innumerable activities key to the health of the cells they enclose and the tissues those cells compose.
Evolution designed lipid membranes to be mostly impermeable to large molecules, and they happen to be opaque (or else we’d all be transparent). In a feat of chemical engineering, Deisseroth’s team replaced the lipids with, for all intents and purposes, clear plastic. After their work, you could literally read a newspaper through the mouse’s brain. Formerly membrane-bound proteins remained anchored in the membranes’ doppelgangers, retaining their structures (a big deal, as a protein’s structure determines its function). The tissue was also nanoporous: It permitted bulky “reporter” molecules such as stain-carrying antibodies and strips of DNA to flow deep into the transformed tissue sample and out again.
Obviously you wouldn’t want to try this on yourself, although Plastic Man certainly seems to have worked out the kinks.
Stanford professor Jonathan Osborne, PhD, believes there’s a grave disconnect between how scientists learn and study the world around them and how science is taught to students in the classroom. “In science, people argue for their ideas, in terms of the evidence that they have,” Osborne said in a Stanford Report story published today. “There should be more opportunities [in science education] to look at why some ideas are wrong, as well as what the right ideas are.”
Strong advocates of using the “argumentation” model in science education, Osborne and colleagues are working to develop new science curriculum standards to help students better understand “science’s major ideas, why they are important and how they are justified.” More about their research from the article:
In 2007 they launched a large-scale, two-year study at four schools in the United Kingdom. Because individual teachers come and go, Osborne and his colleagues sought to embed the new method in the faculty of an entire school. “Lead teachers” learned argumentation from training videos and passed their training along to their colleagues. Researchers tracked students’ reasoning ability, argumentation skills, views about knowledge and engagement with science.
The results, recently published in the Journal of Research in Science Teaching, were surprising – measures of students’ skill and understanding did not significantly improve.
“This is more of a challenge than we thought it was,” he said.
He and his co-authors have a few theories. Perhaps even two school years is insufficient time to see a significant effect. Or maybe the assessments of students’ skills were insufficient.
“Measuring students’ skills in argumentation is something which, as a field, we have not developed,” Osborne said.
Or their teacher training methods may need re-thinking. Osborne is now working to help teachers implement argumentation hands-on during summer sessions.
In case you haven’t yet seen it, the front page of today’s New York Times features a piece on the world of “pseudo-academia,” where less-than-reputable journals and conferences “masquerade” as highly competitive ones and stand to fool both scientists and consumers. Gina Kolata writes:
Steven Goodman, a dean and professor of medicine at Stanford and the editor of the journal Clinical Trials, which has its own imitators, called this phenomenon “the dark side of open access,” the movement to make scholarly publications freely available.
The number of these journals and conferences has exploded in recent years as scientific publishing has shifted from a traditional business model for professional societies and organizations built almost entirely on subscription revenues to open access, which relies on authors or their backers to pay for the publication of papers online, where anyone can read them.
But some researchers are now raising the alarm about what they see as the proliferation of online journals that will print seemingly anything for a fee. They warn that nonexperts doing online research will have trouble distinguishing credible research from junk. “Most people don’t know the journal universe,” Dr. Goodman said. “They will not know from a journal’s title if it is for real or not.”
Freezers storing blood from thousands of generous research volunteers who donate samples when they are healthy – years or even decades before they might develop cancer, diabetes or other chronic diseases – can be found across the country. For scientists, these “pre-diagnostic” blood samples are likely to contain new biological clues of disease, perhaps molecular flags that cancerous cells are multiplying, or immunological rumblings as the immune system responds to the first signs of disease. Finding these signals is critical to future prevention, as they could represent the basis for blood tests or other means of ultra-early detection of disease.
The statistics involved in gathering enough pre-diagnostic blood samples to make them useful to research are daunting, though. For example, to study the blood of 100 women who go on to develop ovarian cancer in the next year, more than 200,000 samples from healthy women must first be stockpiled.
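The cohort-size arithmetic above can be sketched in a few lines. Note the ~1-in-2,000 annual incidence figure used here is an assumption for illustration (chosen so the numbers match the article’s example), not a statistic reported in the study:

```python
# Back-of-envelope cohort sizing for pre-diagnostic sample banking.
# Assumption: roughly 1 in 2,000 healthy women in the cohort develop
# ovarian cancer in a given year (illustrative figure only).
annual_incidence = 1 / 2000   # assumed annual incidence among cohort members
target_cases = 100            # pre-diagnostic cases researchers want to study

# To expect `target_cases` new diagnoses within a year, the biobank must
# hold samples from this many healthy women up front:
samples_needed = target_cases / annual_incidence
print(f"Healthy samples to bank: {samples_needed:,.0f}")  # 200,000
```

The same arithmetic explains why banking efforts recruit tens of thousands of participants: the rarer the disease, the larger the healthy cohort needed to yield a usable number of pre-diagnostic cases.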
This month, Stanford’s partner, the Cancer Prevention Institute of California, along with colleagues at the City of Hope National Medical Center and UC Irvine in Southern California, embarks on an epic research effort: asking more than 50,000 female teachers, retired teachers and school administrators all over California – participants for the last 16 years in the long-term follow-up California Teachers Study – to provide a blood sample to be stored away for future research. This is no small logistical feat. First, teachers aged 50 to 79 from all over the state will be asked to participate and provide a convenient time and place for a phlebotomist to visit them for a blood draw. The samples will then be express shipped to a state-of-the-art biobank, where they will be frozen in large banks of closely monitored freezers, alongside similar samples from other long-term studies.
The Teachers Study will continue its long-standing routines for tracking the health outcomes of each participant by continuously linking their names and other identifying information to California health databases, including death certificates, cancer registries and hospitalization discharge summaries. With time, the stored blood samples will turn into scientific gold, as we learn which of them were drawn from women who later developed cancer. In addition to looking for early proteomic markers of breast, ovarian and other cancers, the samples of women who ultimately developed cancer will undergo intense testing for chemical pollutant levels.
DNA will also be extracted from the blood, and from saliva samples donated by mail from teachers who live too far from the phlebotomists’ routes, or who volunteer to participate in that way. These DNA samples will likely be analyzed with others from very large prospective studies, like the ongoing study of more than 100,000 Northern California Kaiser Permanente members, whose saliva samples have been banked.
Some new clues to cancer can only be discovered when scientists study massive numbers of samples at the same time. To date, gene hunting has yielded a few blockbuster findings – most famously, rare mutations in the BRCA1 and BRCA2 genes that confer very high risk of breast cancer – but no common genes or gene combinations amenable to broader risk profiling. This may be because past efforts didn’t have the statistical power to find the most likely culprits: subtle combinations of many gene mutations that together may provide some meaningful differentiator of risk. Very large datasets, containing not thousands but millions of genomes, will be required to establish reliable genomic markers of disease.
Genomic prediction for chronic disease and ultra-early blood tests for cancer aren’t here yet, but they’re getting closer. And when they do arrive, we can thank the volunteers with the foresight to file away their precious blood samples in many, many freezers.