Category: Science News

  • Brain regulates social behavior differently in males and females, study reveals

    {The brain regulates social behavior differently in males and females, according to a new study published in the Proceedings of the National Academy of Sciences.}

    A team of researchers led by Dr. Elliott Albers, director of the Center for Behavioral Neuroscience and Regents’ Professor of Neuroscience at Georgia State University, and graduate student Joseph I. Terranova, has discovered that serotonin (5-HT) and arginine-vasopressin (AVP) act in opposite ways in males and females to influence aggression and dominance. Because dominance and aggressiveness have been linked to stress resistance, these findings may influence the development of more effective gender-specific treatment strategies for stress-related neuropsychiatric disorders.

    “These results begin to provide a neurochemical basis for understanding how the social brain works quite differently in males and females,” said Albers.

    Prominent sex differences occur in the incidence, development and clinical course of many neuropsychiatric disorders. Women, for example, have higher rates of depression and anxiety disorders such as posttraumatic stress disorder (PTSD), while men more frequently suffer from autism and attention deficit disorder. Despite profound sex differences in the expression of social behavior and the incidence of these psychiatric disorders, little is known about how the brain mechanisms underlying these phenomena differ in females and males. Further, limited knowledge exists regarding sex differences in the efficacy of treatments for these disorders. As a result, current treatment strategies are largely the same for both sexes.

    In this study conducted in hamsters, the researchers investigated the hypothesis that 5-HT promotes and AVP inhibits aggression and dominance in females and that 5-HT inhibits and AVP promotes aggression and dominance in males. Their data show strong support for this hypothesis with the discovery that 5-HT and AVP act in opposite ways within the hypothalamus to regulate dominance and aggression in females and males.

    This study also found that administration of the 5-HT reuptake inhibitor fluoxetine, one of the most commonly prescribed drugs for psychiatric disorders, increased aggression in females and inhibited aggression in males. These studies raise the possibility that stress-related neuropsychiatric disorders such as PTSD may be more effectively treated with 5-HT-targeted drugs in women and with AVP-targeted drugs in men.

    The research team involved in this discovery included Dr. Zhimin Song, Tony E. Larkin, Nathan Hardcastle, Alisa Norvelle and Ansa Riaz from Georgia State’s Neuroscience Institute.

    The next step will be to investigate whether there are sex differences in the efficacy of 5-HT- and AVP-active drugs in reducing social stress.

  • Brain volume predicts successful weight loss in the elderly

    {If you’re trying to lose weight, what are your chances of success? Your brain may hold the key. Scientists at Wake Forest Baptist Medical Center believe they may have found a way to predict who will be successful in their weight-loss efforts with a quick, non-invasive brain scan.}

    In findings from a small study published in the current online issue of the journal Obesity, the researchers were able to predict weight loss success with 78 percent accuracy based on the brain volume of the study participants. “A simple test that can predict intentional weight loss success using structural brain characteristics could ultimately be used to tailor treatment for patients,” said Jonathan Burdette, M.D., professor of radiology at Wake Forest School of Medicine, part of Wake Forest Baptist, and co-author of the study.

    “For example, people identified at high risk for failure might benefit from intensive treatment and close guidance. People identified as having a high probability for success might best respond to less intensive treatment.”

    In the study, 52 participants, ages 60 to 79, were recruited from the Cooperative Lifestyle Interventions Programs II (CLIP-II) project. The participants were overweight or obese (BMI greater than 28 and less than 42) and had a history of either cardiovascular disease or metabolic syndrome. All participants had a baseline MRI scan and then were randomized to one of three groups — diet only, diet plus aerobic exercise training or diet plus resistance exercise training. The goal of the 18-month diet and exercise program was a weight loss of 7 to 10 percent of body mass.

    Basic brain structure information garnered from the MRIs was classified using a support vector machine, a type of computerized predictive algorithm. Predictions were based on baseline brain gray and white matter volume from the participants’ MRIs and compared to the study participants’ actual weight loss after the 18 months. Brain gray matter volume provided higher prediction accuracy compared with white matter, and the combination of the two outperformed either one alone, Burdette said.
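
    Since the classifier is only named in passing, here is a minimal, hypothetical sketch (not the authors' code) of how a support vector machine can be cross-validated on gray- and white-matter volume features; the random data, feature layout and linear kernel are assumptions for illustration only.

      # Hypothetical sketch of SVM-based prediction from structural brain volumes.
      # The data are random placeholders; the study used real MRI-derived gray- and
      # white-matter volumes from its 52 participants.
      import numpy as np
      from sklearn.svm import SVC
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(0)
      X = rng.normal(size=(52, 2))     # columns: gray-matter volume, white-matter volume
      y = rng.integers(0, 2, size=52)  # 1 = met the 7-10 percent weight-loss goal, 0 = did not

      model = make_pipeline(StandardScaler(), SVC(kernel="linear"))
      accuracy = cross_val_score(model, X, y, cv=5).mean()
      print(f"Cross-validated accuracy: {accuracy:.2f}")  # the paper reports ~78% on real data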

    The study’s small sample size was a limitation, Burdette said, but the researchers hope to include more people in follow-up studies and broaden the types of interventions to help improve the predictive nature of the test. “Future studies will investigate whether functional brain networks in association with patterns of brain anatomy may improve prediction, as our recent research has demonstrated that brain circuits are associated with food craving and the self-regulation of eating behavior,” he said.

  • Brain’s multi-track road to long-term memory

    {Our brain has a tough task every time we experience something new — it must be flexible enough to take in new information instantly, but also stable enough to store it for a long time. At the same time, new memories must not alter or overwrite old ones. The brain solves this problem by storing new information in two separate places: the hippocampus, a short-term storage site with high plasticity and capacity that can absorb information quickly, and a part of the cerebral cortex, the neocortex, which is slower to take in information but protects it for the long term and does not allow it to be overwritten. Researchers from the Institute of Medical Psychology and Behavioral Neurobiology at the University of Tübingen have been working with colleagues from Munich to discover how these two systems work together as we learn. Their findings have been published in the latest issue of PNAS.}

    The hippocampus has been the focus of intense scrutiny by memory researchers since the late 1950s, when it was surgically removed from a patient known as H.M. — who was thereafter unable to form new memories. It was largely unknown what role the neocortex played in memory or how the two regions interacted. In their experiments, the Tübingen researchers placed test subjects in front of a computer screen and sent them through a virtual maze, where they had to find hidden objects. The longer the participants moved through the maze, the better they became at understanding how it was laid out and where the hidden objects were. While the test subjects were carrying out the task, their brain activity was recorded by an MRI scanner.

    In order to identify the brain region responsible for spatial memory, the researchers used a special trick. During one part of the experiment the maze did not change. This enabled the participants to slowly build up a spatial representation of it in their memories. In another part of the experiment, the maze changed constantly, so that the test subjects could not recognize it or learn a set path through it. “The comparison of the MRI images from the two mazes reveals which brain regions were specifically contributing to the formation of spatial memories,” says Svenja Brodt, a doctoral candidate at the Graduate Training Center of Neuroscience and lead author of the study. “We were surprised that the activity of the precuneus, a region at the back of the neocortex, steadily increased, while the activity in the hippocampus steadily fell,” Brodt explains. Communication between the two regions also fell during the learning process, according to Brodt.

    “These results enable us to demonstrate that the long-term, neocortical traces of memory are formed right when the information is first gathered,” says Dr. Monika Schönauer, who supervised the study. She said the pace of this process was astounding. Researchers had always assumed that the process took place very slowly, lasting weeks or months. Professor Steffen Gais explains: “The astonishing thing is that the hippocampus ceases to participate in learning after such a short time.” The number of repetitions appeared to have a key influence on how quickly a long-term, stable memory was formed in the neocortex.

    “An independent representation of the memory is formed in the precuneus,” according to Brodt. “When the MRI showed activity in the precuneus of a test subject, we could predict whether the person would find one of the hidden objects in the maze or not.” These latest findings provide important information about which regions store long-term memory. This could help doctors in the future to come up with better treatments for patients with dementia or disorders of the hippocampus. “But even for school situations, these results are important when it comes to learning straightforward material, such as vocab or times tables, both quickly and for the long term. According to our findings, there is no getting around the frequent repetition of material to be learned,” Brodt says.

  • Nanobionic spinach plants can detect explosives

    {After sensing dangerous chemicals, the carbon-nanotube-enhanced plants send an alert.}

    Spinach is no longer just a superfood: By embedding leaves with carbon nanotubes, MIT engineers have transformed spinach plants into sensors that can detect explosives and wirelessly relay that information to a handheld device similar to a smartphone.

    This is one of the first demonstrations of engineering electronic systems into plants, an approach that the researchers call “plant nanobionics.”

    “The goal of plant nanobionics is to introduce nanoparticles into the plant to give it non-native functions,” says Michael Strano, the Carbon P. Dubbs Professor of Chemical Engineering at MIT and the leader of the research team.

    In this case, the plants were designed to detect chemical compounds known as nitroaromatics, which are often used in landmines and other explosives. When one of these chemicals is present in the groundwater sampled naturally by the plant, carbon nanotubes embedded in the plant leaves emit a fluorescent signal that can be read with an infrared camera. The camera can be attached to a small computer similar to a smartphone, which then sends an email to the user.

    “This is a novel demonstration of how we have overcome the plant/human communication barrier,” says Strano, who believes plant power could also be harnessed to warn of pollutants and environmental conditions such as drought.

    Strano is the senior author of a paper describing the nanobionic plants in the Oct. 31 issue of Nature Materials. The paper’s lead author is Min Hao Wong, an MIT graduate student who has started a company called Plantea to further develop this technology.

    Environmental monitoring

    Two years ago, in the first demonstration of plant nanobionics, Strano and former MIT postdoc Juan Pablo Giraldo used nanoparticles to enhance plants’ photosynthesis ability and to turn them into sensors for nitric oxide, a pollutant produced by combustion.

    Plants are ideally suited for monitoring the environment because they already take in a lot of information from their surroundings, Strano says.

    “Plants are very good analytical chemists,” he says. “They have an extensive root network in the soil, are constantly sampling groundwater, and have a way to self-power the transport of that water up into the leaves.”

    Strano’s lab has previously developed carbon nanotubes that can be used as sensors to detect a wide range of molecules, including hydrogen peroxide, the explosive TNT, and the nerve gas sarin. When the target molecule binds to a polymer wrapped around the nanotube, it alters the tube’s fluorescence.

    In the new study, the researchers embedded sensors for nitroaromatic compounds into the leaves of spinach plants. Using a technique called vascular infusion, which involves applying a solution of nanoparticles to the underside of the leaf, they placed the sensors into a leaf layer known as the mesophyll, which is where most photosynthesis takes place.

    They also embedded carbon nanotubes that emit a constant fluorescent signal that serves as a reference. This allows the researchers to compare the two fluorescent signals, making it easier to determine if the explosive sensor has detected anything. If there are any explosive molecules in the groundwater, it takes about 10 minutes for the plant to draw them up into the leaves, where they encounter the detector.

    To read the signal, the researchers shine a laser onto the leaf, prompting the nanotubes in the leaf to emit near-infrared fluorescent light. This can be detected with a small infrared camera connected to a Raspberry Pi, a $35 credit-card-sized computer similar to the computer inside a smartphone. The signal could also be detected with a smartphone by removing the infrared filter that most camera phones have, the researchers say.
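
    As a rough illustration of how the constant reference nanotubes can be used to normalize the sensor reading, the hypothetical sketch below compares the mean brightness of two regions in a captured infrared image; the file names, region coordinates and alert threshold are placeholders, not values from the study.

      # Hypothetical sketch: normalize the explosive-sensor fluorescence against the
      # constant reference-nanotube signal in an infrared image of the leaf.
      # File names, ROI coordinates and the alert threshold are illustrative only.
      import numpy as np
      from PIL import Image

      def fluorescence_ratio(image_path, sensor_roi, reference_roi):
          """Mean brightness of the sensor region divided by the reference region."""
          img = np.asarray(Image.open(image_path).convert("L"), dtype=float)
          return img[sensor_roi].mean() / img[reference_roi].mean()

      sensor_roi = (slice(100, 200), slice(100, 200))
      reference_roi = (slice(300, 400), slice(100, 200))
      baseline = fluorescence_ratio("leaf_before.png", sensor_roi, reference_roi)
      current = fluorescence_ratio("leaf_now.png", sensor_roi, reference_roi)

      # A large deviation from baseline would trigger the alert (e.g., the email to the user).
      if abs(current - baseline) / baseline > 0.1:  # a 10 percent change is an arbitrary example
          print("Sensor signal changed relative to reference: possible nitroaromatic detection")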

    “This setup could be replaced by a cell phone and the right kind of camera,” Strano says. “It’s just the infrared filter that would stop you from using your cell phone.”

    Using this setup, the researchers can pick up a signal from about 1 meter away from the plant, and they are now working on increasing that distance.

    “A wealth of information”

    In the 2014 plant nanobionics study, Strano’s lab worked with a common laboratory plant known as Arabidopsis thaliana. However, the researchers wanted to use common spinach plants for the latest study, to demonstrate the versatility of this technique. “You can apply these techniques with any living plant,” Strano says.

    So far, the researchers have also engineered spinach plants that can detect dopamine, which influences plant root growth, and they are now working on additional sensors, including some that track the chemicals plants use to convey information within their own tissues.

    “Plants are very environmentally responsive,” Strano says. “They know that there is going to be a drought long before we do. They can detect small changes in the properties of soil and water potential. If we tap into those chemical signaling pathways, there is a wealth of information to access.”

    These sensors could also help botanists learn more about the inner workings of plants, monitor plant health, and maximize the yield of rare compounds synthesized by plants such as the Madagascar periwinkle, which produces drugs used to treat cancer.

    “These sensors give real-time information from the plant. It is almost like having the plant talk to us about the environment they are in,” Wong says. “In the case of precision agriculture, having such information can directly affect yield and margins.”

  • Making sense of the senses: ‘Context’ matters when the brain interprets sounds

    {The brain’s interpretation of sound is influenced by cues from other senses, explaining more precisely how we interpret what we hear at a particular moment, according to a report published in Nature Neuroscience online Oct. 31.}

    In the new study in mice, researchers at NYU Langone Medical Center found that nerve cells dedicated to hearing also rely on surrounding context to properly interpret and react to familiar sounds.

    “What the brain ‘hears’ depends on what is ‘seen’ in addition to specific sounds, as the brain calculates how to respond,” says study senior investigator and neuroscientist Robert Froemke, PhD, an assistant professor at NYU Langone and its Skirball Institute of Biomolecular Medicine.

    Froemke says his team’s latest findings reveal that while mammals recognize sounds in the auditory cortex of their brains, the signaling levels of nerve cells in this brain region are simultaneously being strengthened or weakened in response to surrounding context.

    “Our study shows how the same sound can mean different things inside the brain depending on the situation,” says Froemke. “We know, for instance, that people learn to respond without alarm to the honk of a car horn if heard from the safety of their homes, but are startled to hear the same honk while crossing a busy street.”

    If further experiments find similar activity in human brains, the researchers say their work may lead to precise explanations of situation-specific behaviors, such as anxiety brought on during math exams; sudden post-traumatic stress among combat veterans hearing a car backfire; and the ability of people with dementia to better remember certain events when they hear a familiar voice or see a friend’s face.

    To map how the same sense can be perceived differently in the brain, the NYU Langone team, led by postdoctoral fellow Kishore Kuchibhotla, PhD, monitored nerve circuit activity in mice when the animals expected, and did not expect, to get a water reward through a straw-like tube (which they could see) after the ringing of a familiar musical note.

    When mice were exposed to specific auditory cues, researchers observed patterns based on a basic divide in the nature of nerve cells. Each nerve cell “decides” whether a message travels onward in a nerve pathway. Nerve cells that emit chemicals which tell the next cell in line to amplify a message are excitatory; those that stop messages are inhibitory. Combinations of the two strike a counterbalance critical to the function of the nervous system, with inhibitory cells sculpting “noise” from excitatory cells into the arrangements behind thought and memory.

    Furthermore, the processing of incoming sensory information is achieved in part by adjusting signaling levels through each type of nerve cell. Theories hold that the brain may attach more importance to a given signal by turning up or down excitatory signals, or by doing the same with inhibitory nerve cells.

    In the current study, the researchers found to their surprise that most of the excitatory neurons in the auditory cortex (those that stimulate brain activity) signaled less (had “weaker” activity) when the mice expected and got a reward. By contrast, a second set of excitatory neurons showed greater signaling activity when the mice expected a reward based on exposure to the two sensory cues and got one.

    Further tests showed that the activation of specific classes of inhibitory neurons (those expressing parvalbumin, somatostatin, or vasoactive intestinal peptide) was responsible for these changes and was in turn controlled by the chemical messenger, or neurotransmitter, acetylcholine. Chemically shutting down acetylcholine activity cut in half the number of times mice successfully went after their water reward when prompted by a ring tone. Some studies in humans have linked acetylcholine depletion to higher rates of Alzheimer’s disease.

    Froemke, who is also a faculty scholar at the Howard Hughes Medical Institute, says the team next plans to assess how the hormones noradrenaline and dopamine affect auditory cortex neurons under different situations.

    “If we can sort out the many interactions between these chemicals and brain activity based on sensory perception and context, then we can possibly target specific excitatory and inhibitory neurological pathways to rebalance and influence behaviors,” says Froemke.

    "What the brain 'hears' depends on what is 'seen' in addition to specific sounds, as the brain calculates how to respond," says study senior investigator and neuroscientist Robert Froemke, PhD, an assistant professor at NYU Langone and its Skirball Institute of Biomolecular Medicine.
  • Treadmill running with heavier shoes tied to slower race times

    {3,000-meter runners ran roughly 1 percent slower when shoes were just 3.5 ounces heavier.}

    It makes sense that running with heavier shoes on will cause you to exert more energy than running with lighter shoes. That was proven several decades ago.

    But does using more energy while running with heavier shoes translate into slower running times? That’s also a yes, say University of Colorado Boulder researchers from the Department of Integrative Physiology, who designed a clever study to show that running times slow when running shoe weight is increased, even if only by a few ounces.

    For the study, the researchers brought 18 runners into CU Boulder’s Locomotion Laboratory, directed by Professor Rodger Kram, a study co-author. To measure running economy, each participant ran on a treadmill using three pairs of nearly identical shoes, with subtle differences.

    Unbeknownst to the runners, the researchers added small lead pellets inside the tongues of two of the three pairs of shoes to be used by each runner. While one pair was normal, each shoe of another pair was made 100 grams heavier and a third pair was loaded with 300 grams of lead pellets per shoe. For comparison, an apple or deck of cards is about 100 grams, or 3.5 ounces.

    Each of the runners — all sub-20-minute 5K performers — ran treadmill tests in which oxygen consumption and carbon dioxide production were measured with all three differently-weighted shoe pairs. The treadmill tests compared well with previous treadmill evaluations, showing that the runners’ energy costs rose by about 1 percent with each extra 100 grams of shoe weight.

    Later, the runners ran 3,000-meter (about 2-mile) time trials on a CU Boulder indoor track in each of the three shoe pairs once a week for three weeks. Unaware of the differences in shoe weight (the researchers insisted on putting on and taking off the shoes for the test subjects), the runners ran roughly 1 percent slower for each 100 grams of lead added to the shoes in the 3,000-meter race.

    “For me, both as a runner and as a scientist, the most interesting part of this study is that our data show that changes we can reliably measure in the lab translate to similar changes in running performance,” said CU Boulder postdoctoral researcher Wouter Hoogkamer, who led the study.

    “Our results indicate that to evaluate the effects of equipment or technique changes, athletes don’t need to run several races at maximum intensity — we can predict performance based on just a few five-minute bouts of less-than-maximum running effort in the lab on a treadmill,” Hoogkamer said.

    A paper on the subject was published online in the journal Medicine & Science in Sports & Exercise, the flagship journal of the American College of Sports Medicine. Other co-authors include CU Boulder graduate student Shalaya Kipp (a former CU Boulder All-American and Olympian in the steeplechase) and Barry Spiering of the Nike Sport Research Lab. Nike Inc. funded the study, and Kram is a paid consultant for Nike.

    “In exercise physiology class, I learned the classic theory that oxygen delivery determines endurance performance,” said Kipp. “It’s cool to be able to show that the theory actually holds true in the lab and on the track.”

    One interesting implication, said the researchers, is that elite marathon runners wearing shoes 100 grams lighter than normal could potentially run about 57 seconds faster. The current men’s world record is 2:02:57, set by Dennis Kimetto of Kenya in 2014 while wearing shoes that weighed about 230 grams — just over eight ounces.
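
    For readers who want to check the arithmetic, the short sketch below converts the record time to seconds and compares the study’s 57-second estimate with a naive scaling in which race time shrinks one-for-one with the roughly 1 percent energy saving per 100 grams; the authors’ figure comes from their own modelling, not from this simple rule of thumb.

      # Back-of-the-envelope arithmetic for the marathon example above.
      record_s = 2 * 3600 + 2 * 60 + 57  # 2:02:57 -> 7377 seconds
      naive_savings = 0.01 * record_s    # if race time scaled one-for-one with a 1% energy saving
      reported_savings = 57              # seconds, the estimate quoted in the article

      print(f"World record: {record_s} s")
      print(f"Naive 1% scaling:  {naive_savings:.0f} s saved")
      print(f"Reported estimate: {reported_savings} s saved "
            f"({100 * reported_savings / record_s:.2f}% of the record time)")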

    However, reducing shoe mass by compromising cushioning, for example, does not necessarily mean you will run faster, said Hoogkamer. Prior studies in Kram’s lab have shown that proper cushioning also reduces the energetic cost of running. So when selecting footwear, be aware of this trade-off between shoe mass and cushioning.

    “Lighter is not always better,” said Hoogkamer.

  • The transition from daylight saving time to standard time leads to depression

    {“The year has 16 months: November, December, January, February, March, April, May, June, July, August, September, October, November, November, November, November,” writes the Danish poet Henrik Nordbrandt in a disheartening comment on the month we are about to enter.}

    And Nordbrandt is not the only one suffering in November. A recently published study documents that the number of people who are diagnosed with depression at psychiatric hospitals in Denmark increases immediately after the transition from daylight saving time to standard time. More specifically, the number of depression diagnoses during the month after the transition from daylight saving time is approximately eight per cent higher than expected based on the trend in diagnoses leading up to the transition.
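
    To make the “higher than expected” comparison concrete, here is a minimal sketch (not the authors’ statistical model) that fits a linear trend to invented pre-transition monthly counts and expresses a post-transition month as a percentage above the extrapolated trend.

      # Illustrative sketch: compare an observed monthly diagnosis count with the value
      # extrapolated from the pre-transition trend. All numbers are invented.
      import numpy as np

      pre_counts = np.array([820, 845, 860, 880, 905, 915])  # pre-transition monthly counts
      observed_after = 1015                                   # first post-transition month

      months = np.arange(len(pre_counts))
      slope, intercept = np.polyfit(months, pre_counts, 1)    # linear trend over the pre-period
      expected_after = slope * len(pre_counts) + intercept

      excess = 100 * (observed_after - expected_after) / expected_after
      print(f"Observed count is {excess:.1f}% above the trend-based expectation")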

    The study is based on analysis of 185,419 depression diagnoses registered in The Central Psychiatric Research Register between 1995 and 2012.

    According to Associate Professor Søren D. Østergaard from Aarhus University Hospital in Risskov, which is part of The Department of Clinical Medicine at Aarhus University, the increase in depression rates is too pronounced to be coincidental.

    Søren D. Østergaard is one of the five researchers behind the study, which is the result of a collaboration between departments of psychiatry and political science at the universities of Aarhus, Copenhagen and Stanford.

    “We are relatively certain that it is the transition from daylight saving time to standard time that causes the increase in the number of depression diagnoses and not, for example, the change in the length of the day or bad weather. In fact, we take these phenomena into account in our analyses,” says Søren D. Østergaard.

    He also points out that even though the study is based on analysis of relatively severe depressions diagnosed at psychiatric hospitals, there is no reason to believe that the time transition only affects the propensity to develop more severe forms of depression.

    “We expect that the entire spectrum of severity is affected by the transition from daylight saving time to standard time, and since depression is a highly prevalent illness, an increase of eight per cent corresponds to many cases,” says Søren D. Østergaard.

    The study does not identify the underlying mechanism triggering the marked increase, but the researchers point to some possible causes. In Denmark, the transition from daylight saving time to standard time ‘moves’ one hour of daylight from the afternoon, between 5:00 and 6:00 pm, to the morning, between 7:00 and 8:00 am.

    “We probably benefit less from the daylight in the morning between seven and eight, because many of us are either in the shower, eating breakfast or sitting in a car or bus on the way to work or school. When we get home and have spare time in the afternoon, it is already dark,” explains Søren D. Østergaard.

    “Furthermore, the transition to standard time is likely to be associated with a negative psychological effect as it very clearly marks the coming of a period of long, dark and cold days,” says Søren D. Østergaard.

    Why are the results of the study important? The Aarhus University researcher has no doubt:

    “Our results should give rise to increased awareness of depression in the weeks following the transition to standard time. This is especially true for people with a tendency towards depression — as well as their relatives. Furthermore, the healthcare professionals who diagnose and treat depression should also take our results into consideration,” says Søren D. Østergaard.

  • Underfed worms program their babies to cope with famine

    {Findings are consistent with decades-old hypothesis for humans.}

    Going hungry at an early age can cause lifelong health problems. But the extent of malnutrition’s damage depends on mom’s diet too — at least in worms.

    A Duke University study of the tiny nematode worm C. elegans finds that young worms that don’t get anything to eat in the first few days of life are buffered from early starvation’s worst effects if their mothers had also been underfed.

    The study appeared October 26 in the journal PLOS Genetics.

    Millimeter-long C. elegans worms live in soils and rotting vegetation, where they feed on microbes such as bacteria. A team led by Duke assistant professor L. Ryan Baugh fed one group of pregnant worms a normal diet of bacterial broth, and another group of expectant worms a watered-down version.

    The researchers then reared the offspring of both groups without food for the first eight days of life, and monitored their growth and fertility over their lifespan. As expected, after eight days of starvation, the deprived larvae grew slower and were less fertile than worms that had a healthier start in life.

    But surprisingly, starved worms whose mothers ate watered-down food during pregnancy weren’t as stunted as the offspring of well-fed mothers.

    The differences lasted throughout their lifetimes. Baby worms born to underfed moms continued to make a better recovery long after the famine ended.

    “They didn’t completely escape the adverse effects of early life starvation, but they were buffered from them,” Baugh said.

    The results are consistent with an idea from research on humans called the thrifty phenotype hypothesis. First proposed more than 20 years ago to explain the rise in type 2 diabetes in Western countries, the hypothesis holds that pregnant women who don’t get enough to eat program their babies with “thrifty” metabolisms that are good at rationing nutrients and storing fat.

    Thrifty metabolisms can spell trouble later in life if mothers raised in scarcity have children raised in abundance, as is thought to be the case for many modern humans. But the Duke study shows that reprogramming an unborn baby’s metabolic thermostat can be a good thing for babies that grow up in the same scarce conditions their mothers did.

    In Baugh’s study, the effects of prenatal nutrition were only apparent for worms born in lean times. Baby worms reared in plentiful conditions did well whether their moms got enough to eat during pregnancy or not.

    “These animals are able to anticipate adverse conditions based on their mothers’ experience,” Baugh said. “But if the environment actually changes for the better, the worms are able to sense and respond to good conditions and make the best of it.”

    The molecular mechanisms behind the buffering effects of maternal diet are still unclear.

    Dwindling food supplies during pregnancy seem to trigger worm mothers to make bigger, better-provisioned eggs for the lean times that may lie ahead, the study shows.

    One possibility is that hunger during pregnancy slows the rate of ovulation, so that the developing egg has more time to grow before it gets fertilized.

    It’s also possible that maternal diet causes changes in gene expression that are passed down to her offspring.

    Baugh’s team is now focusing on a network of genes known from other C. elegans studies to regulate the movement of fats between eggs and other cells in the worm’s body.

    “Mom somehow provisions the embryo, or programs it,” Baugh said.

  • Autism spectrum disorder linked to mutations in some mitochondrial DNA

    {Children diagnosed with autism spectrum disorder (ASD) have greater numbers of harmful mutations in their mitochondrial DNA than family members, report Zhenglong Gu of Cornell University in Ithaca, New York, and colleagues, in a study published October 28th, 2016 in PLOS Genetics.}

    Increasingly, studies point to malfunctions in mitochondria — the powerhouses of the cell — as a cause of autism spectrum disorder, but the biological basis for this relationship is unclear. To see if a genetic link exists between mitochondrial malfunction and ASD, the scientists analyzed mitochondrial DNA sequences from 903 children with ASD, along with their unaffected siblings and mothers. They discovered a unique pattern of heteroplasmic mutations, where both mutant and normal mitochondrial DNA sequences exist in a single cell. Children with ASD had more than twice as many potentially harmful mutations as their unaffected siblings, and 1.5 times as many mutations that would alter the resulting protein. The researchers went on to show that these mutations can either be inherited from the mother or arise as spontaneous mutations during development.

    The scientists noted that the risk associated with these mutations is most pronounced in children with lower IQ and poor social behavior compared to their unaffected siblings. Carrying harmful mutations in mitochondrial DNA is also associated with increased risk of neurological and developmental problems among children with ASD. Because mitochondria play a central role in metabolism, these findings may help explain the metabolic disorders commonly associated with ASD and other neurodevelopmental disorders. Evaluating mutations in the mitochondrial DNA of high-risk families could help improve the diagnosis and treatment of these diseases.

    Zhenglong Gu says, “The result of our study synergizes with recent work on ASD, calling attention to children diagnosed with ASD who have one or more developmental abnormalities or related co-morbid clinical conditions for further testing on mitochondrial DNA and mitochondrial function. Since many neurodevelopmental disorders and related childhood disorders show abnormalities that converge upon mitochondrial dysfunction, and may have mtDNA defects as a common harbinger, future research is needed to elucidate the mitochondrial mechanisms underpinning these diseases. Ultimately, understanding the energetic aspects of neurodevelopmental disorders may lead to entirely new kinds of treatments, and preventative strategies that would target mitochondria.”

  • Genome sequencing reveals ancient interbreeding between chimpanzees and bonobos

    {For the first time, scientists have revealed ancient gene mixing between chimpanzees and bonobos, humankind’s closest relatives, showing parallels with Neanderthal mixing in human ancestry. Published in the journal Science, the study from scientists at the Wellcome Trust Sanger Institute and their international collaborators showed that one percent of chimpanzee genomes are derived from bonobos.}

    The study also showed that genomics could help reveal the country of origin of individual chimpanzees, which has strong implications for chimpanzee conservation.

    Chimpanzees and bonobos are great apes found only in tropical Africa. They are endangered species and are supposedly fully protected by law, yet many chimpanzees and bonobos are captured and held illegally.

    To aid the conservation effort, researchers analysed the whole genome sequences of 75 chimpanzees and bonobos from 10 African countries, crucially including 40 new wild-born chimpanzees from known geographic locations. They discovered a strong link between a chimpanzee’s genetic sequence and its geographic origin.

    Dr Chris Tyler-Smith, from the Wellcome Trust Sanger Institute, said: “This is the largest analysis of chimpanzee genomes to date and shows that genetics can be used to locate quite precisely where in the wild a chimpanzee comes from. This can aid the release of illegally captured chimpanzees back into the right place in the wild and provide key evidence for action against the captors.”

    Chimpanzees and bonobos are the closest living relatives of human beings. They diverged from a common ancestor between 1.5 and 2 million years ago and live in different areas of tropical Africa. Until now, it was thought that gene flow between the species would have been impossible, as they were physically separated by the Congo River.

    The study confirmed a main separation between chimpanzees and bonobos approximately 1.5 million years ago, and the presence of four chimpanzee subspecies in different regions. However, the researchers also found there were two additional gene flow events between the chimpanzee and bonobo populations, indicating that at least some individuals found their way across the river.

    Dr Yali Xue, from the Sanger Institute, said: “We found that central and eastern chimpanzees share significantly more genetic material with bonobos than the other chimpanzee subspecies. These chimpanzees have at least 1% of their genomes derived from bonobos. This shows that there wasn’t a clean separation, but that the initial divergence was followed by occasional episodes of mixing between the species.”

    The study, which also included researchers from Spain, Copenhagen Zoo and the University of Cambridge, showed that there have been at least two phases of secondary contact: one 200,000-550,000 years ago and another around 150,000 years ago, mirroring what is believed to have happened during the last 100,000 years of human evolution.

    Dr Tomàs Marquès-Bonet, leader of the study from the Institute of Biological Evolution (University Pompeu Fabra and CSIC), Barcelona, said: “This is the first study to reveal that ancient gene flow events happened amongst the living species closest to humans — the bonobos and chimpanzees. It implies that successful breeding between close species might actually have been widespread in the ancestors of humans and living apes.”
