Evolution, Otolaryngology, Research, Stanford News

Stanford engineer studies bones that aid hearing

What distinguishes us from the dinosaurs? Three middle ear bones, for starters. Stanford mechanical engineer Sunil Puria, PhD, studies inner- and middle-ear biomechanics and the role of bone conduction in hearing, and he’s among a number of scientists who are curious why we have the tiny malleus, incus and stapes but reptiles and birds don’t.

Read more from Stanford Report on what Puria’s research on bone conduction hearing could mean for treating hearing loss, enhancing sound technology, and understanding evolutionary biology.

Previously: Battling hearing loss on and off the battlefield and Hearing loss patient discusses why Stanford research gives her hope for an eventual cure

Evolution, Genetics, Research, Stanford News

Stanford study investigates our most-recent common ancestors

Genealogy buffs know the thrill that comes with identifying ancestors further and further up the family tree. Reaching back through the sands of time to learn when, where and how our relatives lived is like a very personal treasure hunt.

Now geneticist Carlos Bustamante, PhD, and his colleagues have done something similar for our Y-chromosomal “Adam” and mitochondrial “Eve”: two individuals who have passed down a portion of their genomes to the vast expanse of humanity. These people are known as our most-recent common ancestors, or MRCAs. From our release:

“Previous research has indicated that the male MRCA lived much more recently than the female MRCA,” said Bustamante… “But now our research shows that there’s no discrepancy.” Previous estimates for the male MRCA ranged from 50,000 to 115,000 years ago.

Bustamante’s research indicates the two MRCAs roughly overlapped during evolutionary time: between 120,000 and 156,000 years ago for the man, and between 99,000 and 148,000 years ago for the woman. More from the release:

Despite the Adam and Eve monikers, which evoke a single couple whose children peopled the world, it is extremely unlikely that the male and female MRCAs were exact contemporaries. And they weren’t the only man and woman alive at the time, or the only people to have present-day descendants. These two individuals simply had the good fortune of successfully passing on specific portions of their DNA, called the Y chromosome and the mitochondrial genome, through the millennia to most of us, while the corresponding sequences of others have largely died out due to natural selection or a random process called genetic drift.

The researchers used high-throughput sequencing technology to sequence the Y chromosomes of 69 men from nine globally distinct regions. They identified about 11,000 differences among the sequences, which allowed them to determine phylogenetic relationships and timelines with unprecedented accuracy. As graduate student and study co-author David Poznik described:

Essentially, we’ve constructed a family tree for the Y chromosome. Prior to high-throughput sequencing, the tree was based on just a few hundred variants. Although these variants had revealed the main topology, we couldn’t say much about the length of any branch — the number of variants shared by all of its descendants. We now have a more complete structure, including meaningful branch lengths, which are proxies for the periods of time between specific branching events.
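
For readers who want a feel for the arithmetic behind those branch lengths, here is a minimal sketch in Python. It assumes a constant mutation rate and a fixed amount of comparable Y-chromosome sequence – round, made-up numbers chosen purely for illustration, not the calibration the study actually used – and converts a count of variants on a branch into a rough age.

```python
# Illustrative only: a back-of-the-envelope conversion of Y-chromosome
# variant counts into time. The rate and sequence length below are assumed
# round numbers, not the values or calibration used in the study.

MUTATION_RATE = 1e-9         # mutations per base pair per year (assumed)
CALLABLE_BASES = 10_000_000  # base pairs of Y chromosome compared (assumed)

def branch_age_years(variants_on_branch: int) -> float:
    """Roughly how long a branch would take to accumulate its variants."""
    mutations_per_year = MUTATION_RATE * CALLABLE_BASES
    return variants_on_branch / mutations_per_year

# A hypothetical branch carrying 1,200 variants works out to ~120,000 years.
print(f"{branch_age_years(1200):,.0f} years")
```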

Previously: Recent shared ancestry between Southern Europe and North Africa identified by Stanford researchers, Cracking the code of 1000 (make that 1092!) genomes and Blond hair evolved more than once – and why it matters
Photo by tup wanders

Evolution, Research, Stanford News

Mammals can “choose” the sex of their offspring, Stanford study finds

Mammals can "choose" the sex of their offspring, Stanford study finds

A Stanford researcher and his colleagues have produced a surprising new study showing that mammals can effectively “choose” the sex of their offspring – and that they’re doing it for their own selfish reasons: to have more grandchildren.

The researchers used data on more than 2,300 animals at the San Diego Zoo to track three generations for nearly 200 mammal species. Their numbers confirm what evolutionary biologists have theorized for decades – that females are strategically evaluating their options and making decisions that will serve their own reproductive interests, said Joseph Garner, PhD, associate professor of comparative medicine at Stanford and senior author of the study. Garner told me:

You can think of this as being girl power at work in the animal kingdom. We like to think of reproduction as being all about the males competing for females, females dutifully picking the winner. But in reality females have much more invested than males, and they are making highly strategic decisions about their reproduction based on the environment, their condition, and the quality of their mate. Amazingly, the female is somehow picking the sperm that will produce the sex that will serve her interests the most: the sperm are really just pawns in a game that plays out over generations.

What the researchers found is that grandparents could strategically choose to give birth to sons, if those sons would be of high enough quality to give them more grandchildren. In fact, when females had mostly sons, those sons had 2.7 times more children per capita than the sons of females who had equal numbers of male and female offspring. The same was true for male grandparents: when grandfathers produced mostly sons, those sons on average had 2.4 times more children per capita. But again, it’s most likely the females controlling the process, Garner said.

How they do it remains a bit of a mystery, he said, though one theory is that females control the sperm as it moves through the reproductive tract, selectively slowing down or speeding up the sperm they want to select.

Garner noted that there have been studies showing some similar patterns among humans. For instance, one found that billionaires are more likely to have sons than daughters because sons tend to retain the family’s wealth, or so the theory goes.

To read more on this tantalizing subject, see the study, which appears online today in PLOS ONE.

Photo by EvaSwensen

Evolution, Genetics, Immunology, In the News, Infectious Disease, Research

All in the family: Uncovering the genetic history of the world’s most lethal pathogens

They’re tiny terrors that are best known for the millions of people they’ve killed. Few of us would want to meet them, or their relatives. But for researchers like Verena J. Schuenemann, PhD, and Johannes Krause, PhD, uncovering the pedigrees of the world’s most lethal pathogens is an important step in combating the diseases we avoid like the plague.

Earlier this week in Forbes, writer John Farrell told the tale of the two scientists from the University of Tübingen in Germany, and how they hunt down some of the most notorious pathogens in history.

Studies such as the work done by Schuenemann, Krause and a team of international researchers on potato blight, the Black Death and (most recently) leprosy are changing our understanding of diseases that were once buried in the past. As Ann Gibbons of Science recently wrote (subscription required):

Awash in data, several labs are racing neck-and-neck to cull DNA from a Most Wanted list of legendary killers: tuberculosis (TB), plague, cholera, Leishmania, leprosy, the potato blight, and AIDS. They gather traces of these culprits from ancient teeth, bones, hair, feces, and—in the case of potato blight—from skin and leaves, then unleash the sequencers. The work, which began in earnest 3 years ago, adds a new dimension to our understanding of historical events, revealing the true nature of the villains responsible for humanity’s worst epidemics. “There are a lot of diseases described in the historical record that we don’t know what the pathogen is,” says molecular anthropologist Anne Stone of Arizona State University, Tempe.

One persistent question is how the deadliest pathogens evolve over time. As Farrell outlines in his piece, Schuenemann’s team found that the bacteria (Yersinia pestis) responsible for plagues in Africa today are genetically similar to the bacteria that unleashed the Black Death on Europeans in 1347. But, today’s plagues are less lethal – which suggests that evolution knocked a few teeth out of Black Death’s lethal bite.

Understanding when and how evolution changes a pathogen’s virulence is important because the plague is a re-emerging disease and history could repeat itself.

Holly MacCormick is a writing intern in the medical school’s Office of Communication & Public Affairs. She is a graduate student in ecology and evolutionary biology at University of California-Santa Cruz.

Previously: A journalist’s experience with tuberculosis, the “greatest infectious killer in human history”, Image of the Week: Leprosy bacteria and interferon-beta and Tropical disease treatments need more randomized, controlled trials, say Stanford researchers
Image of the skull of a 25-year-old woman with leprosy, courtesy of Ben Krause-Kyora, PhD, of Kiel University. The genetic material extracted from the skeleton enabled the decoding of the genome of the leprosy pathogen.

Behavioral Science, Evolution, Neuroscience, Research, Stanford News

We’ve got your number: Exact spot in brain where numeral recognition takes place revealed

Your brain and my brain are shaped slightly differently. But, it’s a good bet, in almost the identical spot within each of them sits a clump of perhaps 1 to 2 million nerve cells that gets much more excited at the sight of numerals (“5,” for example) than when we see their spelled-out equivalents (“five”), lookalike letters (“5” versus “S”) or scrambled symbols composed of rearranged components of the numerals themselves.

Josef Parvizi, MD, PhD, director of Stanford’s Human Intracranial Cognitive Electrophysiology Program, and his colleagues identified this numeral-recognition module by recording electrical activity directly from the brain surfaces of epileptic volunteers. Their study describing these experiments was just published in The Journal of Neuroscience.

As I explained in my release about the work:

[A]s a first step toward possible surgery to relieve unremitting seizures that weren’t responding to therapeutic drugs, [the patients had] had a small section of their skulls removed and electrodes applied directly to the brain’s surface. The procedure, which doesn’t destroy any brain tissue or disrupt the brain’s function, had been undertaken so that the patients could be monitored for several days to help attending neurologists find the exact location of their seizures’ origination points. While these patients are bedridden in the hospital for as much as a week of such monitoring, they are fully conscious, in no pain and, frankly, a bit bored.

Seven patients, in whom electrodes happened to be positioned near the area Parvizi’s team wanted to explore, gave the researchers permission to perform about an hour’s worth of tests. In the first, they watched a laptop screen on which appeared a rapid-fire random series of letters or numerals, scrambled versions of them, or foreign number symbols with which the experimental subjects were unfamiliar. In a second test, the experimental subjects viewed, again in thoroughly mixed-up sequence, numerals along with words for them as well as words that sounded the same (“1”, “one”, “won”, “2”, “two”, “too”, etc.).

A region within a part of the brain called the inferior temporal gyrus showed activity in response to all kinds of squiggly lines, angles and curves. But within that area a small spot measuring about one-fifth of an inch across lit up preferentially in response to numerals compared with all the other stimuli.

The fact that this spot is embedded in a larger brain area generally responsive to lines, angles, and curves testifies to the human brain’s “plasticity”: its ability to tailor its form and function according to the dictates of experience.

“Humans aren’t born with the ability to recognize numbers,” says Parvizi. He thinks evolution may have generated, in the brains of our tree-dwelling primate ancestors, a brain region particularly adept at computing lines, angles and curves, facilitating snap decisions required for swinging quickly from one branch to the next.

Apparently, one particular spot within that larger tree-branch-intersection recognition area is easily diverted to the numeral-recognition activity constantly rewarded by parents and teachers during the numeracy boot camp called childhood.

Nobody can say those little monkeys don’t learn anything in kindergarten.

Previously: Metamorphosis: At the push of a button, a familiar face becomes a strange one and Why memory and math don’t mix: They require opposing states of the same brain circuitry
Photo by qthomasbower

Evolution, Immunology, Infectious Disease, Pediatrics, Research, Science, Stanford News

Deja Vu: Adults’ immune systems “remember” microscopic monsters they’ve never seen before

Probably every human whose age consists of two digits has at one time or another experienced a case of deja vu, the uncanny sense of having been through this (whatever “this” may be) before.

Well, it turns out that (as the scary narrator of a kitschy sci-fi TV series I inhaled with both nostrils as a kid might say about UFOs and the like), “We’re not alone…” Our own immune systems, one of whose chief functions is to fight off invading pathogens, also entertain “memories” of infectious microbes they’ve never, ever encountered. And that’s a lucky thing.

A human has only 20,000 or so genes, so it’s tough to imagine just how our immune systems are able to recognize potentially billions of differently shaped microbial body parts (or “epitopes” in immune-speak). Stanford immunologist Mark Davis, PhD, tore the cover off of immunology in the early 1980s by solving that riddle.

Now, in a just-published study in Immunity, Davis and his team have used an advanced technique developed in his lab in the 1990s to show that a surprising percentage of adult humans’ workhorse immune cells targeting one or another microbial epitope are poised to pounce on the particular epitope they target (and the bug it rode in with) despite having never come across it before. This hypervigilant configuration, called the “memory” state, was previously thought to be limited to immune cells that had already had a run-in with the epitope of interest.

Davis thinks he’s got the dirt on what’s behind the phenomenon: Dirt. He reasons that the kind of immune cells in question have more flexibility than has been thought, so each of them can “cross-react” to a small set of similarly, but not identically, shaped “lookalike” epitopes it’s never experienced. Our daily exposures to ubiquitous, mostly harmless micro-organisms that dwell in dirt, on doorknobs and in our diets gradually produce an aggregate immune “memory” of not only these microbes’ body parts, but those of other bugs as well – including some nasty ones like HIV (the virus that causes AIDS), herpesvirus, and more.

Because cells in the “memory” configuration can react much, much faster to an infectious pathogen than “naive” cells targeting the exact same pathogen, this eerie foreknowledge can spell the difference between life and death.

But this immune memory still depends on having been exposed to something. In the study, the immune cells in blood from newborns’ umbilical cords showed no “memory” of anything at all.

As I wrote in my release on the new findings:

[This discovery] could explain why young children are so much more vulnerable to infectious diseases than adults. Moreover, the findings suggest a possible reason why vaccination against a single pathogen, measles, appears to have reduced overall mortality among African children more than can be attributed to the drop in measles deaths alone.

“It may even provide an evolutionary clue about why kids eat dirt,” Davis told me.

Previously: Immunology escapes from the mouse trap, Age-related drop in immune responsiveness may be reversible and Common genetic Alzheimer’s risk factor disrupts healthy older women’s brain function, but not men’s
Photo by Damian Gadal

Evolution, Genetics, Immunology, Infectious Disease, Pregnancy, Research, Stanford News

Revealed: Epic evolutionary struggle between reproduction and immunity to infectious disease

Can’t blame us if our feet hurt. We humans have been walking erect for well over 3 million years. 

That new style of locomotion necessitated “considerable anatomical changes that altered the size and shape of the human female pelvis and the dimensions of the birth canal,” write Stanford immunologist and evolutionary theorist Peter Parham, PhD, and pathologist Ashley Moffett, MD, of the University of Cambridge, in a just-released review article in Nature Reviews Immunology.

Walking upright, along with the development of our bigger brains, allowed us to head out of Africa into Eurasia. Successive migration events of this nature have occurred a number of times since – leading to the emergence of Neanderthals in Europe around 600,000 years ago and the arrival there of anatomically modern humans (that’s us) a scant 67,000 years ago, give or take a few weeks.

But those bigger brains caused problems, too, the authors write:

The size of the human baby’s head increased until it reached the limit defined by the birth canal… [A]t full term, a modern human baby’s head just fits into the birth canal… [I]n the course of human evolution, birthing became a difficult, dangerous and frequently fatal process…

Bigger brains need more nourishment in utero, putting greater demands on the blood supply to the placenta. Plus, insufficient blood supply to the placenta can lead to pre-eclampsia, stillbirth or low birth weight. But if an emerging baby’s head is too big, it could kill both mother and child on the way out of the womb. It’s a delicate balancing act.

To the rescue come specialized immune cells called natural killer (or NK) cells, which play an important role in our front-line defense against infectious pathogens. NK cells play a key role in reproduction, too, by carefully regulating the development of placental blood vessels. This keeps fetal growth in bounds.

NK cells feature a particular surface molecule that comes in two versions. One of these versions turns out to be somewhat better suited for the task of managing fetal growth, the other for fighting infectious pathogens. Both of these versions are found in every human population ever studied, suggesting that a group’s survival over evolutionary time is favored by some optimal balance between the two.

Parham and Moffett conjure up a vision of just how such a compromise between these two versions might arise:

When an epidemic infection passed through a population, causing disease, death (particularly of the young) and social disruption, selection favored [one version]. When the epidemic subsided, the surviving and now smaller population was immune to further infection and enriched for [that version]. At this juncture, survival of the current generation was no longer the issue, and the priority became production of the next generation. [This favored the evolutionary selection of] factors that enhance the generation of larger and more robust progeny… [H]uman history has always involved successive cycles of [this] type…

Thus, all human populations maintain a mix of both surface-molecule versions. Any distinction between safe, efficient reproduction and vulnerability to infectious disease may seem nonexistent to people who consider babies to be invading organisms – which I suppose they are, in a way. (But cute, too.)

Previously: Our species’ twisted family tree, Humans owe important disease-fighting genes to trysts with cavemen and Humans share history – and a fair amount of genetic material – with Neanderthals
Photo by Lord Jim

Scope will be on a limited publishing schedule in honor of Martin Luther King, Jr. Day. We’ll resume our normal schedule tomorrow.

Evolution, Genetics, Research, Stanford News

Dumb, dumber and dumbest? Stanford biologist suggests humans on a downward slide

I laughed out loud when I saw the many news reports today about the latest articles by Stanford developmental biologist Gerald Crabtree, MD. Not because the reports were wrong or because the research is poor. It’s just that the topic is guaranteed to make nearly anyone either laugh or cry — particularly someone (ahem, ME) who has recently been feeling less and less smart with each birthday.

Crabtree hypothesized in two articles published today in Trends in Genetics that humans are slowly accumulating genetic mutations that will have a deleterious effect on both our intellect and emotional stability. The reason, he believes, is the relative lack of selective pressure during the past 3,000 years. He begins boldly:

I would wager that if an average citizen from Athens of 1000 BC were to appear suddenly among us, he or she would be among the brightest and most intellectually alive of our colleagues and companions, with a good memory, a broad range of ideas, and a clear-sighted view of important issues. Furthermore, I would guess that he or she would be among the most emotionally stable of our friends and colleagues.

It’s an intriguing point. I’ve recently been re-educating myself about ancient Greek history, and I’ve been newly amazed by the breadth and depth of the philosophy and ideas propounded by people thousands of years ago.

Crabtree calculates that between 2,000 and 5,000 genes are likely required to maintain optimal intellectual acuity and emotional well-being. Extending his theory, it’s likely that we’ve each accumulated at least two harmful mutations during the intervening millennia. (Case in point? I just had to look up how to spell that last word.) Why? Well, according to Crabtree:

It is also likely that the need for intelligence was reduced as we began to live in supportive societies that made up for lapses of judgment or failures of comprehension. Community life would, I believe, tend to reduce the selective pressure placed on every individual, every day of their life. Indeed that is why I prefer to live in such a society.

Don’t we all? But although it’s disheartening to think that we’re locked in a downward spiral, it’s way too soon to panic. Crabtree emphasizes that our demise will be slow:

However, if such a study found accelerating rates of accumulation of deleterious alleles over the past several thousand years then we would have to think about these issues more seriously. But we would not have to think too fast. One does not need to imagine a day when we could no longer comprehend the problem, or counteract the slow decay in the genes underlying our intellectual fitness, or have visions of the world population docilely watching reruns on televisions they can no longer build. It is exceedingly unlikely that a few hundred years will make any difference for the rate of change that might be occurring.

Whew.

Photo by CollegeDegrees360

Evolution, Genetics, Research, Stanford News

How the cheetah gets its… stripes? Stanford geneticist cracks the code

Cheetahs with stripes? Tabby cats with blotches? Researchers in the laboratory of Stanford geneticist Greg Barsh, MD, PhD, have pinpointed the cause of the distinctive coat patterns that give big and little cats the familiar periodic markings that allow even small children to distinguish a tiger from a cheetah. The research will be published tomorrow in Science. From our release:

The scientists found that the two felines share a biological mechanism responsible for both the elegant stripes on the tabby cat and the cheetah’s normally dappled coat. Dramatic changes to the normal patterns occur when this pathway is disrupted: The resulting house cat has swirled patches of color rather than orderly stripes, and the normally spotted cheetah sports thick, dark lines down its back.

“Mutation of a single gene causes stripes to become blotches, and spots to become stripes,” said [Barsh], emeritus professor of genetics and of pediatrics at Stanford and an investigator at the HudsonAlpha Institute.

The differences are so pronounced that biologists at first thought that cheetahs with the mutated gene belonged to an entirely different species. The rare animals became known as “king cheetahs,” while affected tabby cats received the less-regal moniker of “blotched.” (The more familiar, striped cat is known as a mackerel tabby.)

The Stanford researchers collaborated with colleagues at the National Cancer Institute and the HudsonAlpha Institute for Biotechnology in Huntsville, Alabama, to conduct the study on feral cats in Northern California and on captive and wild cheetah populations in North America, South Africa and Namibia. The results have implications beyond natural history. Again from our release:

Barsh and his lab members have spent decades investigating how traditional laboratory animals such as mice develop specific coat colors. His previous work identified a variety of biologically important pathways that control more than just hair or skin color, and have been linked to brain degeneration, anemia and bone marrow failure. But laboratory mice don’t display the pattern variation seen in many mammals.

Photo by Greg Barsh

Evolution, Public Health, Research

Do placebos provide a mental cue to kickstart the immune system?

British scientists have uncovered some interesting insights into the body’s healing process that could help explain the placebo effect.

In a new study (subscription required), researchers developed a computer model to test a decades-old theory (.pdf) originally put forth by former London School of Economics psychologist Nicholas Humphrey, PhD. Humphrey proposed that patients subconsciously respond to treatment, even if it’s a sugar pill, because of their strong belief in the medicine’s ability to fight the infection and aid in recovery without further depleting the body’s resources.

A New Scientist story published today describes the latest findings and how the new evidence supports Humphrey’s earlier paper:

It all starts with the observation that something similar to the placebo effect occurs in many animals, says Peter Trimmer, a biologist at the University of Bristol, UK. For instance, Siberian hamsters do little to fight an infection if the lights above their lab cage mimic the short days and long nights of winter. But changing the lighting pattern to give the impression of summer causes them to mount a full immune response.

Likewise, those people who think they are taking a drug but are really receiving a placebo can have a response which is twice that of those who receive no pills (Annals of Family Medicine, doi.org/cckm8b). In Siberian hamsters and people, intervention creates a mental cue that kick-starts the immune response.

Trimmer’s simulation is built on this assumption – that animals need to spend vital resources on fighting low-level infections. The model revealed that, in challenging environments, animals lived longer and sired more offspring if they endured infections without mounting an immune response. In more favourable environments, it was best for animals to mount an immune response and return to health as quickly as possible (Evolution and Human Behavior, doi.org/h8p). The results show a clear evolutionary benefit to switching the immune system on and off depending on environmental conditions.
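
To see why the better strategy flips with the environment, here is a toy Python sketch – emphatically not Trimmer’s model, and built on invented numbers – in which an animal either pays a large one-time energy cost to clear a low-level infection or tolerates a small chronic drain, with reproduction scaling with its remaining condition.

```python
# Toy illustration of the trade-off described above -- not Trimmer's model.
# All numbers are invented; the point is only that the better strategy
# flips as the environment's resource income changes.

def lifetime_offspring(resource_income: float, fight_infection: bool) -> float:
    energy = 3.0                   # starting reserves (arbitrary units)
    offspring = 0.0
    infected = True
    for _ in range(50):            # 50 time steps of "life"
        energy += resource_income  # food gathered this step
        if infected:
            if fight_infection:
                energy -= 4.0      # costly immune response...
                infected = False   # ...which clears the infection
            else:
                energy -= 0.5      # chronic low-level drain
        if energy <= 0:            # reserves exhausted: no more offspring
            break
        offspring += 0.1 * energy  # reproduction scales with condition
    return offspring

for income in (0.6, 2.0):          # harsh vs. favourable environment
    endure = lifetime_offspring(income, fight_infection=False)
    fight = lifetime_offspring(income, fight_infection=True)
    print(f"income {income}: endure {endure:.1f}, fight {fight:.1f}")
```

In the lean environment, the up-front immune cost is ruinous and the tolerant animal leaves more offspring; in the rich one, clearing the infection quickly pays off – the same qualitative pattern the published model reports.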

Previously: Debating the placebo as treatment, The puzzling powers of the placebo effect and The increasing power of the placebo
Photo by Lucy Reynell
