Published by
Stanford Medicine

Neuroscience, Podcasts, Research, Stanford News

Young mouse to old mouse: “It’s all in the blood, baby”

A few days after his latest research hit the press, I sat with neurologist Tony Wyss-Coray, PhD, for a 1:2:1 podcast. He laughed when I mentioned the range of news headlines touting his Nature Medicine study (subscription required) that found blood plasma from young mice improves the memory and learning of old mice. One headline declared: “The Fountain of Youth is Filled with Blood.” Another flashed: “Vampires Delight? Young Blood Recharges Brains of Old Mice.”

Serendipitously, Wyss-Coray’s paper coincided with the release of two similar studies from Harvard teams on the rejuvenating power of young blood. For the science press, it was a perfect confluence of red.

My colleague Bruce Goldman has followed Wyss-Coray’s research for several years. He’s also written about prior studies by Thomas Rando, MD, PhD, showing that the blood of young mice could stimulate old stem cells and rejuvenate aging tissue. Rando’s work paved the way for Wyss-Coray’s investigations.

Perhaps there’s something here that will prove significant for human beings and actually lead to breakthroughs in treatments for a range of neurological disorders like Alzheimer’s. Wyss-Coray is circumspect: It’s a tall leap from mice to human beings, but he’s eager to make the jump in clinical trials.

Previously: The rechargeable brain: Blood plasma from young mice improves old mice’s memory and learning, Red light, green light: Simultaneous stop and go signals on stem cells’ genes may enable fast activation, provide “aging clock”, Old blood + young brain = old brain, Old blood makes young brains act older, and vice versa and Freshen up those stem cells with young blood

Neuroscience, Stanford News, Stroke, Surgery, Videos

Raising awareness of moyamoya disease

Today isn’t just May 6, it’s also World Moyamoya Day. Well, not officially – but one patient is trying to change that.

Moyamoya, a rare cerebrovascular disease, is often overlooked by neurologists, and its symptoms confused with those of chronic migraines. Tara MacInnes spent most of her childhood suffering from excruciatingly painful headaches and bouts of numbness and tingling in her hands, face and legs. Like those of many others with moyamoya disease, her episodes were overlooked by her pediatric neurologists. By age 16, when an especially bad episode led to an MRI and, eventually, a correct diagnosis, both sides of her brain had already suffered damage from strokes.

But MacInnes was lucky: She happened to live close to Stanford, where Gary Steinberg, MD, PhD, one of the world’s leading experts on moyamoya treatment, practiced. And like many patients, MacInnes needed more than just surgery – she needed a sense of belonging and the ability to interact with others who had gone through a similar experience.

Shortly after her surgery here, MacInnes began volunteering at the Stanford Moyamoya Center, talking with patients and their families. The more people she met, the clearer it became that it wasn’t just the general public that knew little about the disease: Many medical professionals had never heard of it. Now, 10 years after her successful surgery, MacInnes has become a devoted advocate determined to raise awareness of the disease; you can sign her petition to help spread the word and make World Moyamoya Day official.

Previously: How patients use social media to foster support systems, connect with physicians

Bioengineering, Genetics, Neuroscience, Pregnancy, Research, Stanford News

Step away from the DNA? Circulating *RNA* in blood gives dynamic information about pregnancy, health

I read a lot of scientific papers. And while they’re all interesting, they don’t all make me snap to attention like the latest from Stanford bioengineer Stephen Quake, PhD. I even remarked to my husband that it’s rare to get the immediate sense that a discovery will significantly change clinical care.

If anyone’s going to shake up the status quo, however, it would be Quake. You may remember that Quake has made waves before with his pioneering discoveries involving the analysis of tiny bits of DNA circulating in our blood. His 2008 discovery that it’s possible to non-invasively detect fetal chromosomal abnormalities with a maternal blood sample has revolutionized prenatal care in this country. It’s estimated that, in 2013, hundreds of thousands of pregnant women used a version of this test to learn more about the health of their fetuses. And, in 2012, Quake showed it’s possible to sequence an entire fetal genome from a maternal blood sample.

Now he and his lab have gone one step further by turning their attention to another genetic material in the blood, RNA. Although information conveyed in the form of DNA sequences is mostly static (the nucleotide sequence of genes, for example), RNA levels and messages change markedly among tissues over time and at various developmental points. The difference in available information is somewhat like comparing a still photo with a high-resolution video when it comes to sussing out what the body is actually doing at any point in time.

The study was published today in the Proceedings of the National Academy of Sciences. As I explain in my release:

In the new study, the researchers used a technique previously developed in Quake’s lab to identify which circulating RNA molecules in a pregnant woman were likely to have come from her fetus, and which were from her own organs. They found they were able to trace the development of specific tissues, including the fetal brain and liver, as well as the placenta, during the three trimesters of pregnancy simply by analyzing blood samples from the pregnant women over time.

Quake and his colleagues believe the technique could also be broadly useful as a diagnostic tool by detecting distress signals from diseased organs, perhaps even before any clinical symptoms are apparent. In particular, they found they could detect elevated levels of neuronal-specific RNA messages in people with Alzheimer’s disease as compared with the healthy participants.

Quake and the lead authors, graduate students Winston Koh and Wenying Pan, liken their technique to a “molecular stethoscope.” They believe it could be broadly useful in the clinic. More from my release:

“We’ve moved beyond just detecting gene sequences to really analyzing and understanding patterns of gene activity,” said Quake. “Knowing the DNA sequence of a gene in the blood has been shown to be useful in a few specific cases, like cancer, pregnancy and organ transplantation. Analyzing the RNA enables a much broader perspective of what’s going on in the body at any particular time.”

Previously: Whole-genome fetal sequencing recognized as one of the year’s “10 Breakthrough Technologies” and Better know a bioengineer: Stephen Quake
Photo by Alden Chadwick

Aging, Mental Health, Neuroscience, Research, Stanford News, Stem Cells

The rechargeable brain: Blood plasma from young mice improves old mice’s memory and learning

“Maybe Ponce de Leon should have considered becoming a vampire,” I noted here a few years ago. In a related Stanford Medicine article, I elaborated on that point (i.e. Dracula may have been on to something):

Count Dracula may have been bloodthirsty, but nobody ever called him stupid. If that practitioner of what you could call “the Transylvanian transfusion” knew then what we know now, it’s a good bet he was keeping his wits as sharp as his teeth by restricting his treats to victims under the age of 30.

I was referring then to an amazing discovery by Stanford brain-degeneration expert Tony Wyss-Coray, PhD, and his then-graduate student Saul Villeda, PhD, who now has his own lab at the University of California-San Francisco. They’d found that something in an old mouse’s blood could somehow exert an aging effect on the capabilities of a young mouse’s brain, and you know that ain’t good. They’d even pinpointed one specific substance (eotaxin) behind this effect, implying that inhibiting this naturally produced and sometimes very useful chemical’s nefarious action – or, if you’re a vampire, laying off the old juice and getting your kicks from preteens when available – might be beneficial to aging brains.

But I was premature. While the dynamic duo had shown that old blood is bad for young brains and had also demonstrated that old mice’s brains produce more new nerve cells (presumably a good thing) once they’ve had continuous exposure to young mice’s blood, the researchers hadn’t yet definitively proven that the latter translated into improved intellectual performance.

This time out they’ve gone and done just that, in a study (subscription required) published online yesterday in Nature Medicine. First they conducted tricky, sophisticated experiments to show that when the old mice were continuously getting blood from young mice, an all-important region in a mouse’s brain (and yours) called the hippocampus perks up biochemically, anatomically and physiologically: It looks and acts more like a younger mouse’s hippocampus. That’s big, because the hippocampus is not only absolutely essential to the formation of new memories but also the first brain region to go when the early stirrings of impending dementia such as Alzheimer’s start subtly eroding brain function, long before outwardly observable symptoms appear.

Critically, when Wyss-Coray, Villeda and their comrades then administered a mousey IQ test (a standard battery of experiments measuring mice’s ability to learn and remember) to old mice who’d been injected with plasma (the cell-free part of blood) from healthy young mice, the little codgers far outperformed their peers who got crummy old-mouse plasma instead.

Slam dunk.

“This could have been done 20 years ago,” Wyss-Coray told me when I was assembling my release on this study. “You don’t need to know anything about how the brain works. You just give an old mouse young blood and see if the animal is smarter than before. It’s just that nobody did it.”

Previously: When brain’s trash collectors fall down on the job, neurodegeneration risk picks up, Brain police: Stem cells’ fecund daughters also boss other cells around, Old blood + young brain = old brain and Might immune response to viral infections slow birth of new nerve cells in brain?
Photo by Takashi Hososhima

Neuroscience, Nutrition, Obesity, Research

Changing views on dietary fiber’s role in weight loss

As the brain-gut connection comes into sharper focus, new insights into obesity are emerging. A recent study has found that dietary fiber’s role in weight loss, commonly attributed to releasing appetite-suppressing hormones in the gut, may be a matter of the mind. As Nature News reports, researchers from the UK and Spain showed in a study in mice how a product of fiber fermentation reduced food intake by influencing a region of the brain.

From the piece:

[The researchers] fed mice fibre labelled with carbon-13, which has an additional neutron from the more common carbon-12 that gives its nuclei a magnetic spin and therefore makes it easy to track as it progresses through the body’s chemical reactions. The fibre was fermented as usual into acetate, which turned up not only in the gut, but also in the hypothalamus, a part of the brain known to be involved in regulating appetite. There, the researchers found, it was metabolized through the glutamine-glutamate cycle, which is involved in controlling the release of neurotransmitters associated with appetite control. The same model has been proposed for acetate metabolism after drinking alcohol.

The mice fed with large doses of fermentable fibre ate less food, and ended up weighing less than control mice that were fed unfermentable fibre.

The article notes the researchers plan to investigate enriching fiber with acetate to aid digestion and appetite control. “It’s sort of a way of having your cake, and not eating it,” said Jimmy Bell, PhD, one of the study’s researchers and a biochemist at Imperial College London.

Previously: Examining how microbes may affect mental health, Could gut bacteria play a role in mental health? and Animal study shows a protein in the brain may regulate appetite
Photo by Harmon

Neuroscience, Research, Stanford News, Technology

This is your brain on a computer chip

Here are some numbers that blew me away when I heard them last week. Your brain is using just a few watts of power right now as it sees and processes these words, hears and sorts through the sounds around you and makes mental notes about grocery lists or dry cleaning that needs picking up. By contrast, a computer uses about 40,000 times more power and runs about 9,000 times slower just to model a mouse brain, and a human brain is about 1,000 times more complex. Given that, it’s no surprise several groups are hard at work trying to create a computer chip with brain-like efficiency.
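Those multipliers imply some eye-watering wattage. Here’s a quick back-of-envelope sketch using the figures above, with the brain’s draw assumed to be roughly 3 watts (the article says only “a few”), so the absolute numbers are illustrative rather than measured:

```python
# Back-of-envelope sketch of the figures above. The 3 W brain-power
# figure is an assumption; the two multipliers come from the article.
BRAIN_POWER_W = 3                    # assumed power draw of a brain, watts
MOUSE_MODEL_POWER_FACTOR = 40_000    # computer power vs. brain, per the article
HUMAN_VS_MOUSE_COMPLEXITY = 1_000    # human brain ~1,000x a mouse brain

mouse_model_power_w = BRAIN_POWER_W * MOUSE_MODEL_POWER_FACTOR
human_model_power_w = mouse_model_power_w * HUMAN_VS_MOUSE_COMPLEXITY

print(f"Mouse-brain model: ~{mouse_model_power_w / 1e3:.0f} kW")   # ~120 kW
print(f"Human-brain model: ~{human_model_power_w / 1e6:.0f} MW")   # ~120 MW
```

Under those assumptions, simulating a human-scale brain the conventional way lands in power-plant territory, which is exactly why brain-like chip efficiency matters.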

Stanford bioengineer Kwabena Boahen, PhD, and his graduate student Ben Varkey Benjamin have announced a milestone in this effort: They’ve modeled one million neurons in real time on a circuit board called Neurogrid that contains sixteen chips called Neurocores. Their publication, in the Proceedings of the IEEE, goes into more detail about exactly how they are using electronic parts to mirror our own intricate collection of cells, as does this story about the work.

What I found most interesting are the possible uses of such a chip. Obviously, it could make our personal electronics smaller, smarter and less power hungry. But the chip can also, for the first time, model how our brain works and how it fails to work in some diseases. This is something that once required supercomputing capabilities, plus lots of time and power. Now anyone can do it.

The chip also makes possible the dream of interpreting signals from the brain and, in real time, using those signals to drive robotic limbs for paralyzed people. As things are now, a person would be tethered to a computer and a power supply to interpret brain signals, and the limb wouldn’t move in real time. A Neurocore-like chip could conceivably be implanted, interpreting signals and driving robots in real time with minimal power needs. Boahen is working with his Clark Center neighbor and fellow Bio-X affiliate Krishna Shenoy, PhD, who is professor of electrical engineering and neurobiology, on making that dream a reality.

This video by my colleague Kurt Hickman shows where the team is now in working with Neurogrid to drive robot movement.

Photo by Kurt Hickman

Bioengineering, In the News, Neuroscience, Stanford News, Technology

New York Times profiles Stanford’s Karl Deisseroth and his work in optogenetics

Rockefeller University neurobiologist Cori Bargmann, PhD, is quoted in today’s New York Times as saying optogenetics is “the most revolutionary thing that has happened in neuroscience in the past couple of decades.” The article is a profile piece of Karl Deisseroth, MD, PhD, the Stanford researcher who helped create the field of optogenetics, and it reveals how a clinical rotation in psychiatry led him to this line of work:

It was eye-opening, he said, “to sit and talk to a person whose reality is different from yours” — to be face to face with the effects of bipolar disorder, “exuberance, charisma, love of life, and yet, how destructive”; of depression, “crushing — it can’t be reasoned with”; of an eating disorder literally killing a young, intelligent person, “as if there’s a conceptual cancer in the brain.”

He saw patient after patient suffering terribly, with no cure in sight. “It was not as if we had the right tools or the right understanding.” But, he said, the desperate need for such tools made the specialty more interesting to him. He stayed with psychiatry but adjusted his research course, getting in on the ground floor of a new bioengineering department at Stanford. He is now a professor of both bioengineering and psychiatry.

Previously: A federal push to further brain research, An in-depth look at the career of Stanford’s Karl Deisseroth, “a major name in science”, Lightning strikes twice: Optogenetics pioneer Karl Deisseroth’s newest technique renders tissues transparent, yet structurally intact, The “rock star” work of Stanford’s Karl Deisseroth and Nature Methods names optogenetics its “Method of the Year”
Related: Head lights
Photo in featured-entry box by Linda Cicero/Stanford News Service

Neuroscience, Research, Stanford News

Thoughts light up with new Stanford-designed tool for studying the brain

When I talk to neuroscientists about how they study the brain I get a lesson (usually filled with acronyms) in the various ways scientists go about trying to read minds. Some of the tools they use can detect when general regions of the brain are active, but can’t detect individual nerves. Others record the activity of individual nerves, one nerve at a time, but can’t detect networks of nerves firing together. Still another tool can report the afterglow of a signal that has been sent across networks of neurons.

There hasn’t been any one way of seeing when a nerve fires and which neighbors it connects to.

I wrote recently about a new tool to do just that, developed by bioengineer Michael Lin, MD, PhD, and biologist and applied physicist Mark Schnitzer, PhD. They’ve each developed proteins that light up when a nerve sends a signal. They can put their proteins into a group of nerves in one part of the brain, then watch those signals spread across the network of neurons as the cells interact.

In my story I quote Lin: “You want to know which neurons are firing, how they link together and how they represent information. A good probe to do that has been on the wish list for decades.”

The proteins could be widely used to better understand the brain or develop drugs:

With these tools scientists can study how we learn, remember, navigate or any other activity that requires networks of nerves working together. The tools can also help scientists understand what happens when those processes don’t work properly, as in Alzheimer’s or Parkinson’s diseases, or other disorders of the brain.

The proteins could also be inserted in neurons in a lab dish. Scientists developing drugs, for example, could expose human nerves in a dish to a drug and watch in real time to see if the drug changes the way the nerve fires. If those neurons in the dish represent a disease, like Parkinson’s disease, a scientist could look for drugs that cause those cells to fire more normally.

Now that I’ve written about the invention of this new tool I’m looking forward to hearing more about how scientists start using it to understand our brain or develop drugs.

3D rendered illustration of a nerve cell by Sebastian Kaulitzki/Shutterstock

Aging, Genetics, Neuroscience, Podcasts, Research, Stanford News

The state of Alzheimer’s research: A conversation with Stanford neurologist Michael Greicius

My colleague Bruce Goldman recently wrote an expansive blog entry and article based on research by Mike Greicius, MD, about how the ApoE4 variant doubles the risk of Alzheimer’s for women. I followed up Goldman’s pieces in a podcast with Greicius, who’s the medical director of the Stanford Center for Memory Disorders.

I began the conversation by asking about the state of research for Alzheimer’s: essentially, what do we know? As an aging baby boomer, I’m interested in the difference between normal, age-related cognitive decline and the kind of cognitive decline that signals an emerging disease. Greicius said people tend to begin losing cognitive skills around middle age:

Every cognitive domain we can measure starts to decline around 40. Semantic knowledge – knowledge about the world – tends to stay pretty stable and even goes up a bit. Everything else… working memory, short term memory all tends to go down on this linear decline. The difference with something like Alzheimer’s is that the decline isn’t linear. It’s like you fall off a cliff.

Greicius’ most recent research looks at the increased Alzheimer’s risk that ApoE4 confers on women. As described by Goldman:

Accessing two huge publicly available national databases, Greicius and his colleagues were able to amass medical records for some 8,000 people and show that initially healthy ApoE4-positive women were twice as likely to contract Alzheimer’s as their ApoE4-negative counterparts, while ApoE4-positive men’s risk for the syndrome was barely higher than that for ApoE-negative men.
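“Twice as likely” here is a relative risk: the rate of disease among carriers divided by the rate among non-carriers. As a minimal sketch with entirely made-up counts (the article gives the cohort size, roughly 8,000 records, but not the actual case numbers), the calculation looks like this:

```python
# Relative risk: risk in the exposed group over risk in the unexposed group.
# All counts below are hypothetical, chosen only to illustrate the ratios
# the study reported; they are NOT the study's data.
def relative_risk(cases_exposed, n_exposed, cases_unexposed, n_unexposed):
    """Return (cases_exposed/n_exposed) / (cases_unexposed/n_unexposed)."""
    return (cases_exposed / n_exposed) / (cases_unexposed / n_unexposed)

# ApoE4-positive vs. ApoE4-negative women: roughly 2.0, "twice as likely"
rr_women = relative_risk(200, 1000, 100, 1000)

# ApoE4-positive vs. ApoE4-negative men: roughly 1.1, "barely higher"
rr_men = relative_risk(110, 1000, 100, 1000)

print(rr_women, rr_men)
```

The point of the comparison is that the same gene variant produces a very different risk ratio in the two sexes, which is what made the finding notable.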

In addition to the increased risk of Alzheimer’s for women with the ApoE4 variant, I asked Greicius how he advises patients coming into the clinic who ask about staving off memory loss. At this point, he concedes, effective traditional medication isn’t really at hand. “Far and away our strongest recommendations bear on things like lifestyle and particularly exercise,” he said. “We know, in this case from good animal models, that physical exercise, particularly aerobic exercise, helps brain cells do better and can stave off various insults.” So remember: a heart-smart diet along with aerobic exercise.

One last question for Greicius: What about those cognitive-memory games marketed to the elderly and touted as salves for memory loss – do they have any benefit? He’s riled now: “I get asked that all the time, and smoke starts coming out of my ears.” He says the games are nothing more than snake oil. His advice when he gets asked the question: “Give that money to the Alzheimer’s Association or save it and get down on the floor with your grandkids and build Legos. That’s also a great cognitive exercise and more emotionally rewarding.”

Previously: Having a copy of ApoE4 gene variant doubles Alzheimer’s risk for women but not for men, Common genetic Alzheimer’s risk factor disrupts healthy older women’s brain function, but not men’s and Hormone therapy halts accelerated biological aging seen in women with Alzheimer’s genetic risk factor

Neuroscience, Research, Technology, Videos

Using Google Glass to improve quality of life for Parkinson’s patients

Researchers at Newcastle University are exploring ways that Google Glass could improve Parkinson’s patients’ quality of life by assisting them in placing phone calls, reminding them to take their medications or giving them behavioral prompts, such as speaking louder. In the video above, Roisin McNaney, a PhD student in the university’s Digital Interaction Group, explains how using Glass could ease patients’ anxiety about encountering a symptom-related problem while in public, raise patients’ confidence and, ultimately, make them more independent.

Previously: Abraham Verghese uses Google Glass to demonstrate how to begin a patient exam, Revealed: The likely role of Parkinson’s protein in the healthy brain and Stanford study identifies molecular mechanism that triggers Parkinson’s
Via Medgadget
