Category: Science News

  • Earth may be home to one trillion species

    {Earth could contain nearly 1 trillion species, with only one-thousandth of 1 percent now identified, according to a study from biologists at Indiana University.}

    The estimate, based on the intersection of large datasets and universal scaling laws, appears May 2 in the Proceedings of the National Academy of Sciences. The study’s authors are Jay T. Lennon, associate professor in the IU Bloomington College of Arts and Sciences’ Department of Biology, and Kenneth J. Locey, a postdoctoral fellow in the department.

    The IU scientists combined microbial, plant and animal community datasets from government, academic and citizen science sources, resulting in the largest compilation of its kind. Altogether, these data represent over 5.6 million microscopic and nonmicroscopic species from 35,000 locations across all the world’s oceans and continents, except Antarctica.

    “Estimating the number of species on Earth is among the great challenges in biology,” Lennon said. “Our study combines the largest available datasets with ecological models and new ecological rules for how biodiversity relates to abundance. This gave us a new and rigorous estimate for the number of microbial species on Earth.

    “Until recently, we’ve lacked the tools to truly estimate the number of microbial species in the natural environment,” he added. “The advent of new genetic sequencing technology provides an unprecedentedly large pool of new information.”

    The work is funded by a National Science Foundation effort to transform understanding of the scope of life on Earth by 2020 by filling major gaps in humanity’s knowledge of the planet’s biodiversity.

    “This research offers a view of the extensive diversity of microbes on Earth,” said Simon Malcomber, director of the NSF’s Dimensions of Biodiversity program. “It also highlights how much of that diversity still remains to be discovered and described.”

    Microbial species are all forms of life too small to be seen with the naked eye, including all single-celled organisms, such as bacteria and archaea, as well as certain fungi. Many earlier attempts to estimate the number of species on Earth simply ignored microorganisms or were informed by older datasets that were based on biased techniques or questionable extrapolations, Lennon said.

    “Older estimates were based on efforts that dramatically under-sampled the diversity of microorganisms,” he added. “Before high-throughput sequencing, scientists would characterize diversity based on 100 individuals, when we know that a gram of soil contains up to a billion organisms, and the total number on Earth is over 20 orders of magnitude greater.”

    The realization that microorganisms were significantly under-sampled caused an explosion in new microbial sampling efforts over the past several years, including the collection of human-related microorganisms by the National Institutes of Health’s Human Microbiome Project; marine microorganisms by the Tara Oceans Expedition; and aquatic, terrestrial and host-related microorganisms by the Earth Microbiome Project.

    These data sources — and many others — were compiled to create the inventory in the IU study, which pulls together 20,376 sampling efforts on bacteria, archaea and microscopic fungi and 14,862 sampling efforts on communities of trees, birds and mammals. All of these sources were either publicly available or made accessible to IU.

    “A massive amount of data has been collected from these new surveys,” said Locey, whose work included programming required to compile the inventory. “Yet few have actually tried to pull together all the data to test big questions.

    “We suspected that aspects of biodiversity, like the number of species on Earth, would scale with the abundance of individual organisms,” he added. “After analyzing a massive amount of data, we observed simple but powerful trends in how biodiversity changes across scales of abundance. One of these trends is among the most expansive patterns in biology, holding across all magnitudes of abundance in nature.”

    Scaling laws, like those discovered by the IU scientists, are known to accurately predict species numbers for plant and animal communities. For example, the number of species scales with the area of a landscape.

    “Until now, we haven’t known whether aspects of biodiversity scale with something as simple as the abundance of organisms,” Locey said. “As it turns out, the relationships are not only simple but powerful, resulting in the estimate of upwards of 1 trillion species.”
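
    As a rough illustration of how a scaling relationship of this kind can be extrapolated, the sketch below fits a power law of the form S = c * N^z to survey-style data in log-log space and projects it to a much larger total abundance. This is not the study’s actual model, and every number in it is invented for demonstration only.

    ```python
    # Illustrative sketch only (not the IU study's actual model):
    # fit a power-law scaling S = c * N**z in log-log space, then
    # extrapolate to a much larger abundance. All numbers are made up.
    import numpy as np

    # Hypothetical surveys: total individuals sampled (N) and species found (S)
    N = np.array([1e3, 1e5, 1e7, 1e9, 1e11])
    S = np.array([4e1, 6e2, 1e4, 1.5e5, 2.5e6])

    # Fit log10(S) = log10(c) + z * log10(N)
    z, log_c = np.polyfit(np.log10(N), np.log10(S), 1)

    # Extrapolate to a hypothetical global abundance of 1e30 individuals
    S_global = 10**log_c * (1e30)**z
    print(f"fitted exponent z = {z:.2f}, extrapolated richness = {S_global:.2e} species")
    ```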

    The study’s results also suggest that actually identifying every microbial species on Earth is an almost unimaginably huge challenge. To put the task in perspective, the Earth Microbiome Project — a global multidisciplinary project to identify microscopic organisms — has so far cataloged fewer than 10 million species.

    “Of those cataloged species, only about 10,000 have ever been grown in a lab, and fewer than 100,000 have classified sequences,” Lennon said. “Our results show that this leaves 100,000 times more microorganisms awaiting discovery — and 100 million to be fully explored. Microbial biodiversity, it appears, is greater than ever imagined.”

    This research was also supported in part by the U.S. Army Research Office.

    Rendering of a bacterium. The Earth Microbiome Project -- a global multidisciplinary project to identify microscopic organisms -- has so far cataloged fewer than 10 million species of the estimated one trillion living on Earth.
  • Bird Brain? Crows Can Be as Clever as Chimps

    {Despite their small brains, ravens and crows may be just as clever as chimps, research suggests.}

    Study shows how these birds parallel great apes in motor self-regulation.

    A study led by researchers at Lund University in Sweden suggests that ravens can be as clever as chimpanzees despite having much smaller brains, indicating that neuronal density and the structure of birds’ brains, rather than brain size, play an important role in their intelligence.

    “Absolute brain size is not the whole story. We found that corvid birds performed as well as great apes, despite having much smaller brains,” says Can Kabadayi, doctoral student in Cognitive Science.

    Intelligence is difficult to test, but one aspect of being clever is inhibitory control: the ability to override animal impulses and choose a more rational behaviour.

    Researchers at Duke University, USA, conducted a large-scale study in 2014, where they compared the inhibitory control of 36 different animal species, mainly primates and apes. The team used the established cylinder test, where food is placed in a transparent tube with openings on both sides. The challenge for the animal is to retrieve the food using the side openings, instead of trying to reach for it directly. To succeed, the animal has to show constraint and choose a more efficient strategy for obtaining the food.

    The large-scale study concluded that great apes performed the best, and that absolute brain size appeared to be key when it comes to intelligence. However, they didn’t conduct the cylinder test on corvid birds.

    Can Kabadayi, together with researchers from the University of Oxford, UK and the Max Planck Institute for Ornithology in Germany, therefore had ravens, jackdaws and New Caledonian crows perform the same cylinder test to better understand their inhibitory control.

    The team first trained the birds to obtain a treat from an opaque tube with a hole at each end. Then they repeated the test with a transparent tube. The natural impulse would be to reach straight for the food once it was visible through the tube wall; however, all of the ravens chose to enter the tube from the ends on every try. The performance of the jackdaws and the crows came very close to 100%, comparable to that of bonobos and gorillas.

    “This shows that bird brains are quite efficient, despite having a smaller absolute brain size. As indicated by the study, there might be other factors apart from absolute brain size that are important for intelligence, such as neuronal density,” says Can Kabadayi, and continues:

    “There is still so much we need to understand and learn about the relationship between intelligence and brain size, as well as the structure of a bird’s brain, but this study clearly shows that bird brains are not simply birdbrains after all!”

    Crow.
  • Newly discovered titanosaurian dinosaur from Argentina, Sarmientosaurus

    {An approximately 95-million-year-old complete sauropod skull has been examined, revealing possibly exceptional sensory capabilities.}

    Scientists have discovered Sarmientosaurus musacchioi, a new species of titanosaurian dinosaur, based on a complete skull and partial neck fossil unearthed in Patagonia, Argentina, according to a study published April 26, 2016 in the open-access journal PLOS ONE by Rubén Martínez from the Laboratorio de Paleovertebrados of the Universidad Nacional de la Patagonia San Juan Bosco (UNPSJB), Argentina, and colleagues.

    Titanosaurs, a type of sauropod, ranged in size from the weight of a cow to that of the largest sperm whale. These plant-eaters had long necks and tails and may have been the most common large herbivores in the Southern Hemisphere landmasses during the Cretaceous. Despite their abundance, the skulls of these animals, critical to deciphering certain aspects of their biology, are exceedingly rare. Of the 60-plus named titanosaurs, only four are represented by nearly complete or semi-complete skulls. Using computerized tomography (CT) imaging, the authors of this study closely examined well-preserved, anatomically ‘primitive’ skull and neck fossils from Sarmientosaurus.

    The researchers found that the Sarmientosaurus brain was small relative to its enormous body, typical of sauropods. However, they also found evidence of greater sensory capabilities than most other sauropods. They suggest that Sarmientosaurus had large eyeballs and good vision, and that the inner ear may have been better tuned for hearing low-frequency airborne sounds compared to other titanosaurs. Moreover, the balance organ of the inner ear indicates that this dinosaur may have habitually held its head with the snout facing downward, possibly to feed primarily on low-growing plants. “Discoveries like Sarmientosaurus happen once in a lifetime,” says study leader Rubén Martínez. “That’s why we studied the fossils so thoroughly, to learn as much about this amazing animal as we could.”

    Sarmientosaurus musacchioi is named for the town of Sarmiento in Chubut Province, which is close to the discovery site. The species name also honors the late Dr. Eduardo Musacchio, a paleontologist and professor at the UNPSJB and friend to Dr. Martínez and other team members.

    Sarmientosaurus head posture, brain & eye (WitmerLab): Digital renderings of the skull and reconstructed brain endocast and eye of the new titanosaurian dinosaur species Sarmientosaurus musacchioi.
  • Investigating world’s oldest human footprints with software designed to decode crime scenes

    {Software has unearthed new information about Laetoli’s lost tracks, revealing hints of a previously undiscovered fourth track-maker at the site. The Laetoli tracks were discovered by Mary Leakey in 1976 and are thought to be around 3.6 million years old.}

    Researchers at Bournemouth University have developed a new software technique to uncover ‘lost’ tracks, hidden in plain sight at the world’s oldest human footprint site in Laetoli (Tanzania). The software has revealed new information about the shape of the tracks and has found hints of a previously undiscovered fourth track-maker at the site.

    The software was developed as part of a Natural Environment Research Council (NERC) Innovation Project awarded to Professor Matthew Bennett and Dr Marcin Budka in 2015 for forensic footprint analysis. They have been developing techniques to enable modern footwear evidence to be captured in three dimensions and analysed digitally to improve crime scene practice.

    Footprints reveal much about the individuals who made them: their body mass, height and walking speed. “Footprints contain information about the way our ancestors moved,” explains Professor Bennett. “The tracks at Laetoli are the oldest in the world and show a line of footprints from our early ancestors, preserved in volcanic ash. They provide a fascinating insight into how early humans walked. The techniques we have been developing for use at modern crime scenes can also reveal something new about these ancient track sites.”

    The Laetoli tracks were discovered by Mary Leakey in 1976 and are thought to be around 3.6 million years old. There are two parallel trackways on the site, where two ancient hominins walked across the surface. One of these trackways was obscured when a third person followed the same path. The merged trackway has largely been ignored by scientists over the last 40 years and the fierce debate about the walking style of the track-makers has predominantly focused on the undisturbed trackway.

    By using the software developed through the NERC Innovation Project, Professor Bennett and his colleagues have been able to decouple the tracks of this merged trail and reveal for the first time the shape of the tracks left by this mysterious third track-maker. There is also an intriguing hint of a fourth track-maker at the site.

    “We’re really pleased that we can use our techniques to capture new data from these extremely old footprints,” says Dr Marcin Budka who developed the software used in the study.

    “It means that we have effectively doubled the information that the palaeo-anthropological community has available for study of these hominin track-makers,” continues Dr Reynolds, one of the co-authors of the study.

    “As well as making new discoveries about our early ancestors, we can apply this science to help modern society combat crime. By digitising tracks at a crime scene we can preserve, share and study this evidence more easily,” says Sarita Morse who helped conceive the original analysis.

  • First happiness genes have been located

    {For the first time in history, researchers have isolated the parts of the human genome that could explain the differences in how humans experience happiness. These are the findings of a large-scale international study in over 298,000 people, conducted by VU Amsterdam professors Meike Bartels (Genetics and Wellbeing) and Philipp Koellinger (Genoeconomics). The researchers found three genetic variants for happiness, two variants that can account for differences in symptoms of depression, and eleven locations on the human genome that could account for varying degrees of neuroticism. The genetic variants for happiness are mainly expressed in the central nervous system and the adrenal glands and pancreatic system. The results were published in the journal Nature Genetics.}

    {{Genetic influences on happiness}}

    Prior twin and family research using information from the Netherlands Twin Register and other sources has shown that individual differences in happiness and well-being can be partially ascribed to genetic differences between people. Happiness and wellbeing are the topics of an increasing number of scientific studies in a variety of academic disciplines. Policy makers are increasingly focusing on wellbeing, drawing primarily on the growing body of evidence suggesting that wellbeing is a factor in mental and physical health.

    VU Amsterdam professor Meike Bartels explains: “This study is both a milestone and a new beginning: a milestone because we are now certain that there is a genetic aspect to happiness, and a new beginning because the three variants that we know are involved account for only a small fraction of the differences between human beings. We expect that many variants will play a part. Locating these variants will also allow us to better study the interplay between nature and nurture, as the environment is certainly responsible — to some extent — for differences in the way people experience happiness.”

    {{Further research is now possible}}

    These findings, which resulted from a collaborative project with the Social Science Genetic Association Consortium, are available for follow-up research. This will create an increasingly clear picture of what causes differences in happiness. Professor Bartels points out that “the genetic overlap with depressive symptoms that we have found is also a breakthrough. This shows that research into happiness can also offer new insights into the causes of one of the greatest medical challenges of our time: depression.” The research effort headed by professors Bartels and Koellinger is the largest ever study into the genetic variants for happiness. It was successfully completed thanks to the assistance of 181 researchers from 145 scientific institutes, including medical centres in Rotterdam, Groningen, Leiden and Utrecht, and the universities of Rotterdam and Groningen.

    Researchers have found three genetic variants for happiness, two variants that can account for differences in symptoms of depression, and eleven locations on the human genome that could account for varying degrees of neuroticism.
  • Risks of harm from spanking confirmed by analysis of 5 decades of research

    {The more children are spanked, the more likely they are to defy their parents and to experience increased anti-social behavior, aggression, mental health problems and cognitive difficulties, according to a new meta-analysis of 50 years of research on spanking by experts at the University of Texas at Austin and the University of Michigan.}

    The study, published in this month’s Journal of Family Psychology, looks at five decades of research involving over 160,000 children. The researchers say it is the most complete analysis to date of the outcomes associated with spanking, and more specific to the effects of spanking alone than previous papers, which included other types of physical punishment in their analyses.

    “Our analysis focuses on what most Americans would recognize as spanking and not on potentially abusive behaviors,” says Elizabeth Gershoff, an associate professor of human development and family sciences at The University of Texas at Austin. “We found that spanking was associated with unintended detrimental outcomes and was not associated with more immediate or long-term compliance, which are parents’ intended outcomes when they discipline their children.”

    Gershoff and co-author Andrew Grogan-Kaylor, an associate professor at the University of Michigan School of Social Work, found that spanking (defined as an open-handed hit on the behind or extremities) was significantly linked with 13 of the 17 outcomes they examined, all in the direction of detrimental outcomes.

    “The upshot of the study is that spanking increases the likelihood of a wide variety of undesired outcomes for children. Spanking thus does the opposite of what parents usually want it to do,” Grogan-Kaylor says.

    Gershoff and Grogan-Kaylor tested for some long-term effects among adults who were spanked as children. The more they were spanked, the more likely they were to exhibit anti-social behavior and to experience mental health problems. They were also more likely to support physical punishment for their own children, which highlights one of the key ways that attitudes toward physical punishment are passed from generation to generation.

    The researchers looked at a wide range of studies and noted that spanking was associated with negative outcomes consistently and across all types of studies, including those using the strongest methodologies such as longitudinal or experimental designs. As many as 80 percent of parents around the world spank their children, according to a 2014 UNICEF report. Gershoff notes that this persistence of spanking is in spite of the fact that there is no clear evidence of positive effects from spanking and ample evidence that it poses a risk of harm to children’s behavior and development.

    Both spanking and physical abuse were associated with the same detrimental child outcomes in the same direction and nearly the same strength.

    “We as a society think of spanking and physical abuse as distinct behaviors,” she says. “Yet our research shows that spanking is linked with the same negative child outcomes as abuse, just to a slightly lesser degree.”

    Gershoff also noted that the study results are consistent with a report released recently by the Centers for Disease Control and Prevention that called for “public engagement and education campaigns and legislative approaches to reduce corporal punishment,” including spanking, as a means of reducing physical child abuse. “We hope that our study can help educate parents about the potential harms of spanking and prompt them to try positive and non-punitive forms of discipline.”

  • ‘Dirty’ mice better than lab-raised mice for studying human disease

    {Immune system behaves differently for rodents kept in sterile environments.}

    Don’t blame lab mice for shortfalls in their ability to mimic human immune systems — blame their upbringing.

    Mice with more experience fighting pathogens have immune system reactions more like humans’, conclude two studies published online April 20. “Dirty” mice bought from pet stores or caught in the wild have more humanlike immune systems than clean lab mice do, researchers report in Nature. And in Cell Host & Microbe, scientists find that infecting lab mice with a series of viruses and parasites alters their immune responses to be similar to those of dirty mice and humans.

    In recent years, scientists have debated whether mice are adequate stand-ins for humans. Some say mice are poor substitutes, and that money should instead be spent on bolstering human studies (SN: 3/23/13, p. 10). Others look at the same data and conclude that mice do a pretty good job of representing humans (SN: 9/20/14, p. 14). Plus, many important studies could not be done with humans, so mice are a necessity.

    But even mouse fans recognize there is room for improvement. “All science is an approximation of the real situation,” says immunologist Andrew Macpherson of the University Hospital of Bern, Switzerland, who relies on mouse models. “I don’t think anybody doubts that the models don’t always accurately recapitulate what is happening in humans.” The new papers show where mice fall short and suggest ways to improve their ability to mimic people, he says.

    Lab mice’s immune system responses “really do look different” from those of humans, says immunologist David Masopust, coauthor of both studies. Masopust, of the University of Minnesota in Minneapolis, and colleagues wondered whether those dissimilarities are due to irreconcilable differences in the genetic makeup of mice and humans or whether the environment plays a role.

    His group counted immune cells in blood from adult lab mice, adult humans and human umbilical cords. Of special interest were “memory CD8+ T cells,” which cull body cells that are infected with viruses or bacteria or that have become cancerous. Lab mice and human infants have few of these memory cells, while adult humans have a plethora. That indicates that lab mice have inexperienced immune systems, much like human babies.

    The finding “is one of those things that once you know it, it’s incredibly obvious,” says E. John Wherry, an immunologist at the University of Pennsylvania. “Mice are like humans raised in a bubble.”

    Masopust agrees. “They live a preposterously hygienic existence.” Even mice with severe immune deficiencies can thrive in immaculately clean labs.

    Ultraclean lab mice can’t emulate the sort of history most human immune systems experience, says Tiffany Reese, a viral immunologist at the University of Texas Southwestern Medical Center in Dallas. Adults carry an average of eight to 12 chronic viruses, such as Epstein-Barr virus (the cause of mononucleosis). Worm parasites infect about 2 billion people worldwide. And by adulthood, people have usually fought off multiple colds, flus and other infections.

    Masopust’s team found that the memory T cell profiles of wild and pet-store mice more closely resembled those of adult humans than lab mice’s did. Housing lab mice next to pet-store mice for a month caused their immune systems to change, making the lab mice resemble the dirty mice, the researchers reported in Nature. In discrepancies between studies of lab mice and humans, “the mouse may not be at fault,” Masopust says. “It’s the way that they are cared for.”

    An experienced immune system not only looks different, it also works differently from an inexperienced one, Reese and colleagues report in Cell Host & Microbe. Reese infected lab mice with two types of herpesviruses, gave them the flu and inoculated them with an intestinal parasite. She then compared how uninfected mice reacted to a yellow fever vaccine with how chronically infected mice reacted. Uninfected mice made more antibodies against the vaccine. The result might help explain why some vaccines that look promising in animal studies don’t pan out in human trials.

    Controlled infections may increase understanding of how pathogens interact with each other, with friendly microbes that live in the body and with the host’s immune system, says Reese’s coauthor Herbert Virgin, a viral immunologist at Washington University School of Medicine in St. Louis.

    Researchers have a bias that mice are not humans, says Virgin, “but I think that’s too simplistic a view. We shouldn’t be asking whether the mouse is a perfect model for humans, but whether we can make the mouse emulate more closely the basic nature of human physiology.”

    EXPERIENCE MATTERS  Wild mice (one shown left) and pet-store mice have dealt with infections, which trained their immune systems to react similarly to an adult human’s. Lab mice (one shown right) are kept in sterile environments. Their immune systems react more like a newborn baby’s or a person in a hermetically sealed room.
  • Researchers pinpoint part of the brain that recognizes facial expressions

    {Researchers at The Ohio State University have pinpointed the area of the brain responsible for recognizing human facial expressions.}

    It’s on the right side of the brain behind the ear, in a region called the posterior superior temporal sulcus (pSTS).

    In a paper published today in the Journal of Neuroscience, the researchers report that they used functional magnetic resonance imaging (fMRI) to identify a region of pSTS as the part of the brain activated when test subjects looked at images of people making different facial expressions.

    Further, the researchers have discovered that neural patterns within the pSTS are specialized for recognizing movement in specific parts of the face. One pattern is tuned to detect a furrowed brow, another is tuned to detect the upturn of lips into a smile, and so on.

    “That suggests that our brains decode facial expressions by adding up sets of key muscle movements in the face of the person we are looking at,” said Aleix Martinez, a cognitive scientist and professor of electrical and computer engineering at Ohio State.

    Martinez said that he and his team were able to create a machine learning algorithm that uses this brain activity to identify what facial expression a person is looking at based solely on the fMRI signal.

    “Humans use a very large number of facial expressions to convey emotion, other non-verbal communication signals and language,” Martinez said.

    “Yet, when we see someone make a face, we recognize it instantly, seemingly without conscious awareness. In computational terms, a facial expression can encode information, and we’ve long wondered how the brain is able to decode this information so efficiently.

    “Now we know that there is a small part of the brain devoted to this task.”

    Using this fMRI data, the researchers developed a machine learning algorithm that has about a 60 percent success rate in decoding human facial expressions, regardless of the facial expression and regardless of the person viewing it.

    “That’s a very powerful development, because it suggests that the coding of facial expressions is very similar in your brain and my brain and most everyone else’s brain,” Martinez said.

    The study doesn’t say anything about people who exhibit atypical neural functioning, but it could give researchers new insights, said study co-author Julie Golomb, assistant professor of psychology and director of the Vision and Cognitive Neuroscience Lab at Ohio State.

    “This work could have a variety of applications, helping us not only understand how the brain processes facial expressions, but ultimately how this process may differ in people with autism, for example,” she said.

    Doctoral student Ramprakash Srinivasan, Golomb and Martinez placed 10 college students into an fMRI machine and showed them more than 1,000 photographs of people making facial expressions. The expressions corresponded to seven different emotional categories: disgusted, happily surprised, happily disgusted, angrily surprised, fearfully surprised, sadly fearful and fearfully disgusted.

    While some of the expressions were positive and others negative, they all had some commonalities among them. For instance, “happily surprised,” “angrily surprised” and “fearfully surprised” all include raised eyebrows, though other parts of the face differ when we express these three emotions.

    fMRI detects increased blood flow in the brain, so the research group was able to obtain images of the part of the brain that was activated when the students recognized different expressions. Regardless of the expression they were looking at, all the students showed increased activity in the same region — the pSTS.

    Then the research group used a computer to cross-reference the fMRI images with the different facial muscle movements shown in the test photographs. They were able to create a map of regions within the pSTS that activated for different facial muscle groups, such as the muscles of the eyebrows or lips.

    First, they constructed maps using the fMRIs of nine of the participants. Then they fed the algorithm the fMRI images from the 10th student and asked it to identify the expressions that student was looking at. They repeated the experiment, each time creating the map from scratch with data from nine of the students and holding out a different student as the 10th subject.

    About 60 percent of the time, the algorithm was able to accurately identify the facial expression that the 10th person was looking at, based solely on that person’s fMRI image.
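
    The hold-one-student-out procedure described above is, in effect, a leave-one-group-out cross-validation. The article does not say which classifier or features the team used, so the Python sketch below only illustrates the scheme on synthetic data with a generic linear classifier; the feature construction, labels and model are assumptions, not the study’s actual pipeline.

    ```python
    # Minimal sketch of leave-one-subject-out decoding on synthetic data.
    # Everything here is illustrative; it is not the Ohio State pipeline.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import LeaveOneGroupOut

    rng = np.random.default_rng(0)
    n_subjects, trials_per_subject, n_voxels, n_expressions = 10, 70, 200, 7

    # Fake "fMRI patterns": one feature vector per viewing trial
    X = rng.normal(size=(n_subjects * trials_per_subject, n_voxels))
    y = rng.integers(0, n_expressions, size=len(X))                # expression labels
    groups = np.repeat(np.arange(n_subjects), trials_per_subject)  # subject IDs

    accuracies = []
    for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups):
        clf = LogisticRegression(max_iter=1000).fit(X[train_idx], y[train_idx])
        accuracies.append(accuracy_score(y[test_idx], clf.predict(X[test_idx])))

    print(f"mean leave-one-subject-out accuracy: {np.mean(accuracies):.2f}")
    ```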

    Martinez called the results “very positive,” and said that they indicate that the algorithm is making strides toward an understanding of what happens in that region of the brain.

    The researchers will continue the work, which was funded by the National Institutes of Health and the Alfred P. Sloan Foundation.

    Test subjects in an Ohio State University study were shown a series of photographs of different facial expressions. Researchers pinpointed an area of the brain that is specifically attuned to picking up key muscle movements (here, labeled AU for 'action units') that combine to express emotion.
  • Children of older mothers do better

    {The benefits associated with being born in a later year outweigh the biological risks associated with being born to an older mother.}

    Children of older mothers are healthier, taller and obtain more education than the children of younger mothers. The reason is that in industrialized countries educational opportunities are increasing, and people are getting healthier by the year. In other words, it pays off to be born later.

    Most previous research suggests that the older women are when they give birth, the greater the health risks are for their children. Childbearing at older ages is understood to increase the risk of negative pregnancy outcomes such as Down syndrome, as well as increase the risk that the children will develop Alzheimer’s disease, hypertension, and diabetes later in life.

    However, despite the risks associated with delaying childbearing, children may also benefit from mothers delaying childbearing to older ages. These are the findings of a new study conducted by Mikko Myrskylä, the director of the Max Planck Institute for Demographic Research (MPIDR), and his colleague Kieron Barclay at the London School of Economics, published today in Population and Development Review.

    Both public health and social conditions have been improving over time in many countries. Previous research on the relationship between maternal age and child outcomes has ignored the importance of these macro-level environmental changes over time. From the perspective of any individual parent, delaying childbearing means having a child with a later birth year. For example, a ten-year difference in maternal age is accompanied by a decade of changes to social and environmental conditions. Taking this perspective, this new MPIDR study shows that when women delay childbearing to older ages their children are healthier, taller, and more highly educated. It shows that despite the risks associated with childbearing at older ages, which are attributable to aging of the reproductive system, these risks are either counterbalanced or outweighed by the positive changes to the environment in the period during which the mother delayed her childbearing.

    For example, a woman born in 1950 who had a child at the age of 20 would have given birth in 1970. If that same woman had a child at 40, she would have given birth in 1990. “Those twenty years make a huge difference,” explains Mikko Myrskylä. A child born in 1990, for example, had a much higher probability of going to a college or university than somebody born 20 years earlier.

    Barclay and Myrskylä used data from over 1.5 million Swedish men and women born between 1960 and 1991 to examine the relationship between maternal age at the time of birth, and height, physical fitness, grades in high school, and educational attainment of the children. Physical fitness and height are good proxies for overall health, and educational attainment is a key determinant of occupational achievement and lifetime opportunities.

    They found that when mothers delayed childbearing to older ages, even as old as 40 or older, they had children who were taller, had better grades in high school, and were more likely to go to university. For example, comparing two siblings born to the same mother decades apart, on average the child born when the mother was in her early 40s spends more than a year longer in the educational system than his or her sibling born when the mother was in her early 20s.

    In their statistical analyses, Barclay and Myrskylä compared siblings who share the same biological mother and father. Siblings share 50% of their genes, and also grow up in the same household environment with the same parents. “By comparing siblings who grew up in the same family it was possible for us to pinpoint the importance of maternal age at the time of birth independent of the influence of other factors that might bias the results” said Kieron Barclay.
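
    A common way to implement this kind of within-family comparison is a mother fixed-effects regression, in which a separate intercept for each mother absorbs everything siblings share, so the maternal-age coefficient is estimated only from differences between siblings. The sketch below illustrates the idea on simulated data; it is not the authors’ actual specification, and all variable names and numbers are invented.

    ```python
    # Illustrative within-family ("sibling") comparison via mother fixed effects.
    # Simulated data only; not the specification used by Barclay and Myrskylä.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n_mothers, kids_per_mother = 500, 2

    mother_id = np.repeat(np.arange(n_mothers), kids_per_mother)
    maternal_age = rng.uniform(20, 45, size=n_mothers * kids_per_mother)
    family_effect = np.repeat(rng.normal(0, 1, size=n_mothers), kids_per_mother)
    # Simulated outcome: children of older mothers get slightly more schooling
    education_years = (12 + 0.05 * maternal_age + family_effect
                       + rng.normal(0, 1, size=n_mothers * kids_per_mother))

    df = pd.DataFrame({"education_years": education_years,
                       "maternal_age": maternal_age,
                       "mother_id": mother_id})

    # C(mother_id) adds one intercept per mother, so maternal_age is identified
    # only from differences between siblings of the same mother.
    fit = smf.ols("education_years ~ maternal_age + C(mother_id)", data=df).fit()
    print(fit.params["maternal_age"])
    ```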

    “The benefits associated with being born in a later year outweigh the individual risk factors arising from being born to an older mother. We need to develop a different perspective on advanced maternal age. Expectant parents are typically well aware of the risks associated with late pregnancy, but they are less aware of the positive effects” said Myrskylä.

  • How much do we really see?

    {Glance out the window and then close your eyes. What did you see? Maybe you noticed it’s raining and there was a man carrying an umbrella. What color was it? What shape was its handle? Did you catch those details? Probably not. Some neuroscientists would say that, even though you perceived very few specifics from the window scene, your eyes still captured everything in front of you. But there are flaws to this logic, MIT researchers argue in an Opinion published April 19, 2016 in Trends in Cognitive Sciences. It may be that our vision only reflects the gist of what we see.}

    “A ton of work supports that this perception that our visual experience is so rich and vivid is just totally wrong,” says first author Michael A. Cohen, a postdoctoral fellow in the Nancy Kanwisher Lab at MIT’s McGovern Institute for Brain Research. “But even if we can just see a handful of items, we definitely have an understanding of the world around us — a sense of what kind of scene we’re in.”

    A staple study researchers use to quantify our visual consciousness involves showing people flashes of different shapes or objects on a computer screen and asking how many details they can remember. In most cases, subjects report back four or five correct answers. The exception is when subjects are primed to look for something in advance, which changes what they pay attention to. This selective focus is part of why cognitive scientists can’t agree on what we actually “see,” because sight should not be so variable.

    For Cohen, however, consciousness is a combination of several processes, including focus and memory, that helps us make decisions about future actions. He points to studies that suggest that our brains are hardwired to quickly take in large objects and scenes (e.g., a highway, a park, a store) within fractions of a second. Glimpse out that window and you take in the depth, navigability, openness, and temperature of the surroundings. The brain does capture some details — for example, you don’t just see a man and an umbrella, but that the man is carrying the umbrella. But most of our visual perception may quite literally be focused on the “big picture.”

    “One of the useful things about this field of study is that there are many instances in which your subjective experience is misguided and science can reveal a bunch of things about your own consciousness that you weren’t necessarily aware of,” Cohen says. “There are many experiments in which people are very much surprised by the limits of their own cognitive experiences.”

    If we see less than we think that we do, the other senses likely follow similar rules. There’s evidence that audio perception also relies on gists of all of the sounds that we hear. From the window, you take in the sounds of the falling rain, singing birds, and car engines, but what you’re tuning out is the hum of streetlamps or the conversation taking place on the sidewalk. Again, the ears only capture the gist of the environment.

    Other researchers will likely disagree with how Cohen and co-authors — Kanwisher and Tufts University cognitive scientist Daniel Dennett — limit consciousness to the bandwidth of memory and decision making. Nor can the authors disprove that we unconsciously “see” everything in view.

    “It’s very difficult to measure consciousness objectively without conflating reportability with subjective experience,” Cohen says. “I think this paper gives us hope that we can bridge the gap between what we as scientists can quantify and the subjective impressions that people have when they open their eyes.”

    In this figure, the red crosses represent the fixation spot and the focus of attention for these particular snapshots. The left column shows two images that have not been doctored in any way (‘Original image’). The middle column illustrates that the world does not simply fade into darkness when it is not fixated or focally attended. Instead, as the right column shows, the parts of the world that are not fixated and attended are represented as an ensemble statistic or “gist.”