Published by
Stanford Medicine



Evolution, Genetics, History, Research, Science, Stanford News

On the hunt for ancient DNA, Stanford researchers improve the odds


On the surface, it’s perfect Halloween fodder: ancient Peruvian mummies, Bronze and Iron Age human teeth from Bulgaria and a thousands-of-years-old hair sample from Denmark. In fact, one attendee of Stanford geneticist Carlos Bustamante’s talk this morning at the annual conference of the American Society of Human Genetics in Boston quipped that his introduction sounded like “the start of a joke.”

But really old human DNA (we’re talking thousands of years) holds amazing secrets about our distant past. What did we look like? Where did our ancestors come from? What diseases may we have had? Unfortunately, it’s much more difficult than it seems to unlock these mysteries.

Stanford postdoctoral scholar Meredith Carpenter, PhD, explained the problem in an e-mail to me yesterday:

From Neandertals to mammoths to Otzi the Iceman, discoveries in ancient DNA sequencing have been making headlines.  But what you might not realize is that most of the ancient genomes sequenced to date have come from exceptionally well-preserved specimens – Otzi, for example, was literally frozen in ice for 5000 years.

Ancient DNA specimens from temperate environments, in contrast, are much trickier to sequence because they contain high levels of environmental contamination, primarily derived from bacteria and other microbes inhabiting the ancient bone. This contamination often makes it too expensive to sequence the tiny amounts of endogenous DNA (which degrades over the years due to exposure to the elements) remaining in a sample.

Now, Carpenter, Bustamante and their colleagues have hit upon a way to enrich an environmental sample – that is, to increase the proportion of ancient human DNA it contains – from about 1.2 percent to nearly 60 percent, rendering it vastly easier to sequence and analyze. They do so by exposing the sample to a genome-wide panel of human-specific RNA molecules to which the degraded human DNA in the sample can bind. The effect is somewhat like stirring a pile of iron-rich dirt with a powerful magnet to isolate the metal from the soil.
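
For a rough sense of why that jump matters, here’s a hedged back-of-the-envelope sketch – not the authors’ pipeline, and the function and read counts are purely illustrative – of how the endogenous-DNA fraction drives the amount of sequencing needed:

```python
# Illustrative only: how the fraction of human ("endogenous") DNA in a
# sample determines how many total reads must be sequenced to recover a
# given number of human-derived reads. The percentages are from the post.

def reads_needed(target_human_reads, endogenous_fraction):
    """Total reads to sequence so the expected number of human-derived
    reads hits the target, assuming reads are sampled in proportion to
    the endogenous fraction."""
    return target_human_reads / endogenous_fraction

before = reads_needed(1_000_000, 0.012)  # ~1.2% human DNA, pre-enrichment
after = reads_needed(1_000_000, 0.60)    # ~60% human DNA, post-enrichment
print(f"sequencing reduced ~{before / after:.0f}-fold")  # ~50-fold
```

Fifty times less raw sequencing for the same human coverage – under these toy assumptions – is roughly what turns a prohibitively expensive sample into a tractable one.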

This isn’t Bustamante’s first foray into the secrets of ancient DNA. Last year he published very interesting results showing that the ancestors of the famous Iceman likely came not from mainland Italy, as previously thought, but instead from the islands of Corsica or Sardinia. This new technique should enable researchers to learn even more about our ancestors, including those oh-so-intriguing mummies.

According to Carpenter:

We hope that this new method will enable ancient DNA researchers to more cheaply sequence a larger number of specimens, providing broader insight into historical populations rather than just a few well-preserved individuals.

The research is published online today in the American Journal of Human Genetics. If you’re interested in following tweets from the conference, which goes through tomorrow, you can do so by following hashtag #ASHG2013.

Previously: Iceman’s origins discovered at Stanford, Stanford study investigates our most-recent common ancestors and Recent shared ancestry between Southern Europe and North Africa identified by Stanford researchers.
Photo by Official US Navy Imagery

Evolution, In the News, Science

419-million-year-old fish fossil may reveal origins of the human jaw

As a kid, I used to ponder the origins of my unusually square jawline while looking in the bathroom mirror. After reading this story in Nature News, I wonder if I should have pondered my jaw’s origins while walking the halls of the nearest aquarium instead. Researchers from the Chinese Academy of Sciences in Beijing discovered a well-preserved 419-million-year-old fossil that suggests the armor-plated fish, called Entelognathus primordialis, may be the earliest known species with a jaw like ours.

From Nature News:

Entelognathus primordialis is a new addition to the placoderms, a class of armour-plated fishes that lived from about 430 million to 360 million years ago. Like most vertebrates, including mammals, placoderms had a bony skull and jaw, but most of them had simple beak-like jaws built out of bone plates.

If you’re stifling a yawn, consider this: Researchers have seen fossils of this fish species before, but they were in less than pristine shape. Based on these fossils, most scientists concluded that this species was unrelated to humans.

Now, the discovery of this well-preserved fossil described in the journal Nature by paleontologist Min Zhu, PhD, and his team may upend this view of our family tree. Eliot Barford of Nature News explains:

There is a serious possibility that the modern bony visage originated with E. primordialis’s ancestors. This would mean that humans look more like the last common ancestor of living jawed vertebrates than we thought, and that sharks are less primitive than palaeontologists assumed, having done away with their bones as an adaptation.

However, the rearranged family tree is not yet quite conclusive, write the authors of a related News & Views article. There remains a chance that E. primordialis evolved its jaw independently from the bony fish, so that we did not inherit it, and the resemblance is an illusion.

Holly MacCormick is a writing intern in the medical school’s Office of Communication & Public Affairs. She is a graduate student in ecology and evolutionary biology at University of California-Santa Cruz.

Previously: Stanford study investigates our most-recent common ancestors, Recent shared ancestry between Southern Europe and North Africa identified by Stanford researchers and Stanford engineer studies bones that aid hearing
Photo by cuatrok77

Evolution, otolaryngology, Research, Stanford News

Stanford engineer studies bones that aid hearing


What distinguishes us from the dinosaurs? Three middle ear bones, for starters. Stanford mechanical engineer Sunil Puria, PhD, studies inner- and middle-ear biomechanics and the role of bone conduction in hearing, and he’s among a number of scientists who are curious why we have the tiny malleus, incus and stapes but reptiles and birds don’t.

Read more from Stanford Report on what Puria’s research on bone conduction hearing could mean for treating hearing loss, enhancing sound technology, and understanding evolutionary biology.

Previously: Battling hearing loss on and off the battlefield and Hearing loss patient discusses why Stanford research gives her hope for an eventual cure

Evolution, Genetics, Research, Stanford News

Stanford study investigates our most-recent common ancestors


Genealogy buffs know the thrill that comes with identifying ancestors further and further up the family tree. Reaching back through the sands of time to learn when, where and how our relatives lived is like a very personal treasure hunt.

Now geneticist Carlos Bustamante, PhD, and his colleagues have done something similar for our Y-chromosomal “Adam” and mitochondrial “Eve”: two individuals who have passed down a portion of their genomes to the vast expanse of humanity. These people are known as our most-recent common ancestors, or MRCAs. From our release:

“Previous research has indicated that the male MRCA lived much more recently than the female MRCA,” said Bustamante… “But now our research shows that there’s no discrepancy.” Previous estimates for the male MRCA ranged from 50,000 to 115,000 years ago.

Bustamante’s research indicates the two MRCAs roughly overlapped during evolutionary time: between 120,000 and 156,000 years ago for the man, and between 99,000 and 148,000 years ago for the woman. More from the release:

Despite the Adam and Eve monikers, which evoke a single couple whose children peopled the world, it is extremely unlikely that the male and female MRCAs were exact contemporaries. And they weren’t the only man and woman alive at the time, or the only people to have present-day descendants. These two individuals simply had the good fortune of successfully passing on specific portions of their DNA, called the Y chromosome and the mitochondrial genome, through the millennia to most of us, while the corresponding sequences of others have largely died out due to natural selection or a random process called genetic drift.

The researchers used high-throughput sequencing technology to sequence the Y chromosomes of 69 men from nine globally distinct regions. They identified about 11,000 differences among the sequences, which allowed them to determine phylogenetic relationships and timelines with unprecedented accuracy. As graduate student and study co-author David Poznik described:

Essentially, we’ve constructed a family tree for the Y chromosome. Prior to high-throughput sequencing, the tree was based on just a few hundred variants. Although these variants had revealed the main topology, we couldn’t say much about the length of any branch — the number of variants shared by all of its descendants. We now have a more complete structure, including meaningful branch lengths, which are proxies for the periods of time between specific branching events.
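
Poznik’s point about branch lengths can be made concrete with a hedged molecular-clock sketch. This is not the study’s actual method, and the mutation rate, callable-site count and variant count below are all illustrative assumptions, not the paper’s values:

```python
# Toy molecular clock: if mutations accrue at a roughly constant rate,
# the number of variants on a branch is a proxy for the time it spans.

def branch_age_years(variants, callable_sites, rate_per_site_per_year):
    # Expected variants = rate * sites * years  =>  years = variants / (rate * sites)
    return variants / (rate_per_site_per_year * callable_sites)

# Illustrative inputs: 1,000 variants over 10 Mb of callable Y-chromosome
# sequence, at an assumed rate of 7.6e-10 mutations per site per year.
age = branch_age_years(1_000, 10_000_000, 7.6e-10)
print(f"branch spans roughly {age:,.0f} years")
```

With numbers in this ballpark the branch comes out at roughly 130,000 years – comfortably inside the 120,000-to-156,000-year window reported for the male MRCA – which is why meaningful branch lengths, and not just tree topology, matter so much.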

Previously: Recent shared ancestry between Southern Europe and North Africa identified by Stanford researchers, Cracking the code of 1000 (make that 1092!) genomes and Blond hair evolved more than once-and why it matters
Photo by tup wanders

Evolution, Research, Stanford News

Mammals can "choose" the sex of their offspring, Stanford study finds



A Stanford researcher and his colleagues have produced a surprising new study showing that mammals can effectively “choose” the sex of their offspring – and they’re doing it for their own selfish reasons: so they can have more grandchildren.

The researchers used data on more than 2,300 animals at the San Diego Zoo to track three generations for nearly 200 mammal species. Their numbers confirm what evolutionary biologists have theorized for decades – that females are strategically evaluating their options and making decisions that will serve their own reproductive interests, said Joseph Garner, PhD, associate professor of comparative medicine at Stanford and senior author of the study. Garner told me:

You can think of this as being girl power at work in the animal kingdom. We like to think of reproduction as being all about the males competing for females, females dutifully picking the winner. But in reality females have much more invested than males, and they are making highly strategic decisions about their reproduction based on the environment, their condition, and the quality of their mate. Amazingly, the female is somehow picking the sperm that will produce the sex that will serve her interests the most: the sperm are really just pawns in a game that plays out over generations.

What the researchers found is that females could strategically choose to give birth to sons, if those sons would be of high enough quality to give them more grandchildren. In fact, when females did have mostly sons, those sons had 2.7 times more children per capita than the sons of females who had equal numbers of male and female offspring. The same was true for males, with the study showing that when grandfathers produced mostly sons, those sons on average had 2.4 times more children per capita. But again, it’s most likely the females controlling the process, Garner said.

How they do it remains a bit of a mystery, he said, though one theory is that females control the sperm as it moves through the reproductive tract, selectively slowing down or speeding up the sperm they want to select.

Garner noted that there have been studies showing some similar patterns among humans. For instance, one found that billionaires are more likely to have sons than daughters because sons tend to retain the family’s wealth, or so the theory goes.

To read more on this tantalizing subject, see the study, which appears online today in PLOS ONE.

Photo by EvaSwensen

Evolution, Genetics, Immunology, In the News, Infectious Disease, Research

All in the family: Uncovering the genetic history of the world's most lethal pathogens

They’re tiny terrors that are best known for the millions of people they’ve killed. Few of us would want to meet them, or their relatives. But for researchers like Verena J. Schuenemann, PhD, and Johannes Krause, PhD, uncovering the pedigrees of the world’s most lethal pathogens is an important step in combating the diseases we avoid like the plague.

Earlier this week in Forbes, writer John Farrell told the tale of the two scientists from the University of Tübingen in Germany, and how they hunt down some of the most notorious pathogens in history.

Studies such as the work done by Schuenemann, Krause and a team of international researchers on potato blight, the Black Death and (most recently) leprosy are changing our understanding of diseases that were once buried in the past. As Ann Gibbons of Science recently wrote (subscription required):

Awash in data, several labs are racing neck-and-neck to cull DNA from a Most Wanted list of legendary killers: tuberculosis (TB), plague, cholera, Leishmania, leprosy, the potato blight, and AIDS. They gather traces of these culprits from ancient teeth, bones, hair, feces, and—in the case of potato blight—from skin and leaves, then unleash the sequencers. The work, which began in earnest 3 years ago, adds a new dimension to our understanding of historical events, revealing the true nature of the villains responsible for humanity’s worst epidemics. “There are a lot of diseases described in the historical record that we don’t know what the pathogen is,” says molecular anthropologist Anne Stone of Arizona State University, Tempe.

One persistent question is how the deadliest pathogens evolve over time. As Farrell outlines in his piece, Schuenemann’s team found that the bacterium (Yersinia pestis) responsible for plagues in Africa today is genetically similar to the bacterium that unleashed the Black Death on Europeans in 1347. But today’s plagues are less lethal – which suggests that evolution knocked a few teeth out of the Black Death’s lethal bite.

Understanding when and how evolution changes a pathogen’s virulence is important because the plague is a re-emerging disease and history could repeat itself.

Holly MacCormick is a writing intern in the medical school’s Office of Communication & Public Affairs. She is a graduate student in ecology and evolutionary biology at University of California-Santa Cruz.

Previously: A journalist’s experience with tuberculosis, the “greatest infectious killer in human history”, Image of the Week: Leprosy bacteria and interferon-beta and Tropical disease treatments need more randomized, controlled trials, say Stanford researchers
Image of the skull of a 25-year-old woman with leprosy courtesy of Ben Krause-Kyora, PhD, of Kiel University. The genetic material extracted from the skeleton enabled the decoding of the genome of the leprosy pathogen.

Behavioral Science, Evolution, Neuroscience, Research, Stanford News

We've got your number: Exact spot in brain where numeral recognition takes place revealed


Your brain and my brain are shaped slightly differently. But, it’s a good bet, in almost the identical spot within each of them sits a clump of perhaps 1 to 2 million nerve cells that gets much more excited at the sight of numerals (“5,” for example) than when we see their spelled-out equivalents (“five”), lookalike letters (“5” versus “S”) or scrambled symbols composed of rearranged components of the numerals themselves.

Josef Parvizi, MD, PhD, director of Stanford’s Human Intracranial Cognitive Electrophysiology Program, and his colleagues identified this numeral-recognition module by recording electrical activity directly from the brain surfaces of epileptic volunteers. Their study describing these experiments was just published in The Journal of Neuroscience.

As I explained in my release about the work:

[A]s a first step toward possible surgery to relieve unremitting seizures that weren’t responding to therapeutic drugs, [the patients had] had a small section of their skulls removed and electrodes applied directly to the brain’s surface. The procedure, which doesn’t destroy any brain tissue or disrupt the brain’s function, had been undertaken so that the patients could be monitored for several days to help attending neurologists find the exact location of their seizures’ origination points. While these patients are bedridden in the hospital for as much as a week of such monitoring, they are fully conscious, in no pain and, frankly, a bit bored.

Seven patients, in whom electrodes happened to be positioned near the area Parvizi’s team wanted to explore, gave the researchers permission to perform about an hour’s worth of tests. In the first, the patients watched a laptop screen on which appeared a rapid-fire random series of letters or numerals, scrambled versions of them, or foreign number symbols with which the experimental subjects were unfamiliar. In a second test, the experimental subjects viewed, again in thoroughly mixed-up sequence, numerals along with words for them as well as words that sounded the same (“1,” “one,” “won,” “2,” “two,” “too,” etc.).

A region within a part of the brain called the inferior temporal gyrus showed activity in response to all kinds of squiggly lines, angles and curves. But within that area a small spot measuring about one-fifth of an inch across lit up preferentially in response to numerals compared with all the other stimuli.

The fact that this spot is embedded in a larger brain area generally responsive to lines, angles, and curves testifies to the human brain’s “plasticity”: its ability to tailor its form and function according to the dictates of experience.

“Humans aren’t born with the ability to recognize numbers,” says Parvizi. He thinks evolution may have generated, in the brains of our tree-dwelling primate ancestors, a brain region particularly adept at computing lines, angles and curves, facilitating snap decisions required for swinging quickly from one branch to the next.

Apparently, one particular spot within that larger tree-branch-intersection recognition area is easily diverted to the numeral-recognition activity constantly rewarded by parents and teachers during the numeracy boot camp called childhood.

Nobody can say those little monkeys don’t learn anything in kindergarten.

Previously: Metamorphosis: At the push of a button, a familiar face becomes a strange one and Why memory and math don’t mix: They require opposing states of the same brain circuitry
Photo by qthomasbower

Evolution, Immunology, Infectious Disease, Pediatrics, Research, Science, Stanford News

Deja Vu: Adults' immune systems "remember" microscopic monsters they've never seen before


Probably every human whose age runs to two digits has at one time or another experienced a case of deja vu, the uncanny sense of having been through this (whatever “this” may be) before.

Well, it turns out that (as the scary narrator of a kitschy sci-fi TV series I inhaled with both nostrils as a kid might say about UFOs and the like), “We’re not alone…” Our own immune systems, among whose chief functions is fighting off invading pathogens, also entertain “memories” of infectious microbes they’ve never, ever encountered. And that’s a lucky thing.

A human has only 20,000 or so genes, so it’s tough to imagine just how our immune systems are able to recognize potentially billions of differently shaped microbial body parts (or “epitopes” in immune-speak). Stanford immunologist Mark Davis, PhD, tore the cover off of immunology in the early 1980s by solving that riddle.

Now, in a just-published study in Immunity, Davis and his team have used an advanced technique developed in his lab in the 1990s to show that a surprising percentage of adult humans’ workhorse immune cells targeting one or another microbial epitope are poised to pounce on the particular epitope they target (and the bug it rode in with) despite never having come across it before. This hypervigilant configuration, called the “memory” state, was previously thought to be limited to immune cells that had already had a run-in with the epitope of interest.

Davis thinks he’s got the dirt on what’s behind the phenomenon: dirt. He reasons that the kind of immune cells in question have more flexibility than has been thought, so each of them can “cross-react” with a small set of similarly, but not identically, shaped “lookalike” epitopes it’s never experienced. Our daily exposures to ubiquitous, mostly harmless microorganisms that dwell in dirt, on doorknobs and in our diets gradually produce an aggregate immune “memory” of not only these microbes’ body parts, but those of other bugs as well – including some nasty ones like HIV (the virus that causes AIDS), herpesvirus and more.

Because cells in the “memory” configuration can react much, much faster to an infectious pathogen than “naive” cells targeting the exact same pathogen, this eerie foreknowledge can spell the difference between life and death.

But this immune memory still depends on having been exposed to something. In the study, the immune cells in blood from newborns’ umbilical cords showed no “memory” of anything at all.

As I wrote in my release on the new findings:

[This discovery] could explain why young children are so much more vulnerable to infectious diseases than adults. Moreover, the findings suggest a possible reason why vaccination against a single pathogen, measles, appears to have reduced overall mortality among African children more than can be attributed to the drop in measles deaths alone.

“It may even provide an evolutionary clue about why kids eat dirt,” Davis told me.

Previously: Immunology escapes from the mouse trap, Age-related drop in immune responsiveness may be reversible and Common genetic Alzheimer’s risk factor disrupts healthy older women’s brain function, but not men’s
Photo by Damian Gadal

Evolution, Genetics, Immunology, Infectious Disease, Pregnancy, Research, Stanford News

Revealed: Epic evolutionary struggle between reproduction and immunity to infectious disease


Can’t blame us if our feet hurt. We humans have been walking erect for well over 3 million years. 

That new style of locomotion necessitated “considerable anatomical changes that altered the size and shape of the human female pelvis and the dimensions of the birth canal,” write Stanford immunologist and evolutionary theorist Peter Parham, PhD, and pathologist Ashley Moffett, MD, of the University of Cambridge, in a just-released review article in Nature Reviews Immunology.

Walking upright, along with the development of our bigger brains, allowed us to head out of Africa into Eurasia. Successive migration events of this nature have occurred a number of times since – leading to the emergence of Neanderthals in Europe around 600,000 years ago and the arrival there of anatomically modern humans (that’s us) a scant 67,000 years ago, give or take a few weeks.

But those bigger brains caused problems, too, the authors write:

The size of the human baby’s head increased until it reached the limit defined by the birth canal… [A]t full term, a modern human baby’s head just fits into the birth canal… [I]n the course of human evolution, birthing became a difficult, dangerous and frequently fatal process…

Bigger brains need more nourishment in utero, putting greater demands on the blood supply to the placenta. Plus, insufficient blood supply to the placenta can lead to pre-eclampsia, stillbirth or low birth weight. But if an emerging baby’s head is too big, it could kill both mother and child on the way out of the womb. It’s a delicate balancing act.

To the rescue come specialized immune cells called natural killer (or NK) cells, which play an important role in our front-line defense against infectious pathogens. NK cells play a key role in reproduction, too, by carefully regulating the development of placental blood vessels. This keeps fetal growth in bounds.

NK cells feature a particular surface molecule that comes in two versions. One of these versions turns out to be somewhat better suited for the task of managing fetal growth, the other for fighting infectious pathogens. Both of these versions are found in every human population ever studied, suggesting that a group’s survival over evolutionary time is favored by some optimal balance between the two.

Parham and Moffett conjure up a vision of just how such a compromise between these two versions might arise:

When an epidemic infection passed through a population, causing disease, death (particularly of the young) and social disruption, selection favored [one version]. When the epidemic subsided, the surviving and now smaller population was immune to further infection and enriched for [that version]. At this juncture, survival of the current generation was no longer the issue, and the priority became production of the next generation. [This favored the evolutionary selection of] factors that enhance the generation of larger and more robust progeny… [H]uman history has always involved successive cycles of [this] type…

Thus, all human populations maintain a mix of both surface-molecule versions. Any distinction between safe, efficient reproduction and vulnerability to infectious disease may seem nonexistent to people who consider babies to be invading organisms – which I suppose they are, in a way. (But cute, too.)

Previously: Our species’ twisted family tree, Humans owe important disease-fighting genes to trysts with cavemen and Humans share history – and a fair amount of genetic material – with Neanderthals
Photo by Lord Jim

Scope will be on a limited publishing schedule in honor of Martin Luther King, Jr. Day. We’ll resume our normal schedule tomorrow.

Evolution, Genetics, Research, Stanford News

Dumb, dumber and dumbest? Stanford biologist suggests humans on a downward slide


I laughed out loud when I saw the many news reports today about the latest articles by Stanford developmental biologist Gerald Crabtree, MD. Not because the reports were wrong or because the research is poor. It’s just that the topic is guaranteed to make nearly anyone either laugh or cry — particularly someone (ahem, ME) who has recently been feeling less and less smart with each birthday.

Crabtree hypothesized in two articles published today in Trends in Genetics that humans are slowly accumulating genetic mutations that will have a deleterious effect on both our intellect and emotional stability. The reason, he believes, is the relative lack of selective pressure during the past 3,000 years. He begins boldly:

I would wager that if an average citizen from Athens of 1000 BC were to appear suddenly among us, he or she would be among the brightest and most intellectually alive of our colleagues and companions, with a good memory, a broad range of ideas, and a clear-sighted view of important issues. Furthermore, I would guess that he or she would be among the most emotionally stable of our friends and colleagues.

It’s an intriguing point. I’ve recently been re-educating myself about ancient Greek history, and I’ve been newly amazed by the breadth and depth of the philosophy and ideas propounded by people thousands of years ago.

Crabtree calculates that between 2,000 and 5,000 genes are likely required to maintain optimal intellectual acuity and emotional well-being. Extending his theory, he suggests it’s likely that we’ve each accumulated at least two harmful mutations during the intervening millennia. (Case in point? I just had to look up how to spell that last word.) Why? Well, according to Crabtree:

It is also likely that the need for intelligence was reduced as we began to live in supportive societies that made up for lapses of judgment or failures of comprehension. Community life would, I believe, tend to reduce the selective pressure placed on every individual, every day of their life. Indeed that is why I prefer to live in such a society.

Don’t we all? But although it’s disheartening to think that we’re locked in a downward spiral, it’s way too soon to panic. Crabtree emphasizes that our demise will be slow:

However, if such a study found accelerating rates of accumulation of deleterious alleles over the past several thousand years then we would have to think about these issues more seriously. But we would not have to think too fast. One does not need to imagine a day when we could no longer comprehend the problem, or counteract the slow decay in the genes underlying our intellectual fitness, or have visions of the world population docilely watching reruns on televisions they can no longer build. It is exceedingly unlikely that a few hundred years will make any difference for the rate of change that might be occurring.
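
Crabtree’s back-of-the-envelope arithmetic can be sketched in a few lines. Everything here is an illustrative assumption – the per-gene mutation rate and generation time are placeholders, not the paper’s parameters – and the point is only that harmful hits across thousands of genes accumulate linearly with generations:

```python
# Illustrative only: expected new deleterious mutations across a set of
# genes, accumulating linearly over generations.

def expected_harmful_mutations(n_genes, per_gene_rate, generations):
    """Expected deleterious hits = genes * per-gene deleterious mutation
    rate per generation * number of generations."""
    return n_genes * per_gene_rate * generations

# e.g. 3,000 intellect-related genes (mid-range of Crabtree's 2,000 to
# 5,000), an assumed 5e-6 deleterious mutations per gene per generation,
# and ~120 generations in 3,000 years (25-year generations)
hits = expected_harmful_mutations(3_000, 5e-6, 120)
print(f"expected harmful mutations: ~{hits:.1f}")  # ~1.8
```

Plug in numbers of roughly this size and you land near the “at least two harmful mutations” figure – which is how a small per-gene rate, multiplied over thousands of genes and a hundred-plus generations, becomes noticeable.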


Photo by CollegeDegrees360
