Category: Science News

  • Technological and cultural innovations amongst early humans not sparked by climate change

    {Environmental records obtained from archaeological sites in South Africa’s southern Cape suggest climate may not have been directly linked to cultural and technological innovations of Middle Stone Age humans in southern Africa after all.}

    A study published July 6, 2016 in the open-access journal PLOS ONE by an international team of researchers, led by Dr Patrick Roberts from the University of Oxford and including researchers from the Evolutionary Studies Institute at Wits University, shows that while climate shifts may have influenced human subsistence strategies, they may not have been the driving factor behind innovation.

    The Middle Stone Age marked a period of dramatic change amongst early humans in southern Africa, and climate change has been postulated as a primary driver for the appearance of technological and cultural innovations such as bone tools, ochre production, and personal ornamentation.

    While some researchers suggest that climate instability may have directly inspired technological advances, others postulate that environmental stability may have provided a stable setting that allowed for experimentation. However, the disconnection of palaeoenvironmental records from archaeological sites makes it difficult to test these alternatives.

    The authors of this study analysed animal remains, shellfish taxa, and stable carbon and oxygen isotope measurements in ostrich eggshell from two archaeological sites, Blombos Cave and Klipdrift Shelter, spanning 98,000 to 73,000 years ago and 72,000 to 59,000 years ago respectively, to acquire data on possible palaeoenvironmental conditions in southern Africa at the time.

    For instance, ostrich eggshell carbon and oxygen stable isotope levels may reflect vegetation and water consumption, which in turn vary with rainfall seasonality and amount in this region.
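
    As a rough illustration (not the study's own code), isotope results like these are conventionally reported in "delta" notation: the sample's isotope ratio is compared with that of a reference standard and expressed in parts per thousand (per mil). The reference value below is indicative only.

    ```python
    def delta_per_mil(r_sample: float, r_standard: float) -> float:
        """Delta value in per mil: (R_sample / R_standard - 1) x 1000."""
        return (r_sample / r_standard - 1.0) * 1000.0

    # Illustrative only: a hypothetical 13C/12C ratio measured in eggshell carbonate,
    # compared against an approximate VPDB reference ratio.
    R_VPDB_13C = 0.011180
    print(round(delta_per_mil(0.011050, R_VPDB_13C), 1))  # about -11.6 per mil
    ```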

    The researchers found that climatic and environmental variation, reflected in ostrich eggshell stable isotope measurements, faunal records, and shellfish indicators, may not have occurred in phase with Middle Stone Age human technological and cultural innovation at these two sites.

    “While acknowledging that climate and environmental shifts may have influenced human subsistence strategies, the research suggests climate change may not have been the driving factor behind cultural and technological innovations in these localities, and encourages context-specific evaluation of the role of climate change in driving early human experimentation,” says Professor Chris Henshilwood, one of the lead researchers from Wits University.

    “Our results suggest that although climate and environmental changes occurred, they were not coincident with cultural innovations, including personal ornamentation, or the appearance of complex tool-types. This suggests that we have to consider that other factors drove human innovation at this stage in our species’ evolution,” says Dr Patrick Roberts.

    {{About Blombos Cave}}

    Blombos Cave is an archaeological site located in the Blombosfontein Nature Reserve, about 300 km east of Cape Town on the southern Cape coastline of South Africa. The cave contains Middle Stone Age deposits currently dated at between c. 100,000 and 70,000 years before present (BP), and a Late Stone Age sequence dated at between 2,000 and 300 years BP. The cave site was first excavated in 1991, and fieldwork has been conducted there regularly since 1997 and is ongoing. The excavations at Blombos Cave have yielded important new information on the behavioural evolution of modern humans. The archaeological record from this cave site has been central to the ongoing debate on the cognitive and cultural origins of early humans, and to the current understanding of when and where key behavioural innovations emerged among Homo sapiens in southern Africa during the Late Pleistocene.

    {{About Klipdrift Shelter}}

    Klipdrift Shelter is one of two Middle Stone Age sites situated in the De Hoop Nature Reserve where Professor Chris Henshilwood is leading new excavations; the other is Klipdrift Cave Lower. In 2011, deposits from the Howiesons Poort period (c. 66,000 to 60,000 years ago) were discovered at the shelter, and an age of about 70,000 years is possible at Klipdrift Cave Lower. These projects are contributing, and will continue to contribute, significantly to the international debate on the origins of what is considered modern human behaviour.

    Researchers working inside the Klipdrift complex.
  • Middle-age memory decline a matter of changing focus

    {Research sheds new light on what constitutes healthy aging of the brain}

    The inability to remember details, such as the location of objects, begins in early midlife (the 40s) and may be the result of a change in what information the brain focuses on during memory formation and retrieval, rather than a decline in brain function, according to a study by McGill University researchers.

    Senior author Natasha Rajah, Director of the Brain Imaging Centre at McGill University’s Douglas Institute and Associate Professor in McGill’s Department of Psychiatry, says this reorientation could impact daily life. “This change in memory strategy with age may have detrimental effects on day-to-day functions that place emphasis on memory for details such as where you parked your car or when you took your prescriptions.”

    Brain changes associated with dementia are now thought to arise decades before the onset of symptoms. So a key question in current memory research concerns which changes to the aging brain are normal and which are not. But Dr. Rajah says most of the work on aging and memory has concentrated on understanding brain changes later in life. “So we know little about what happens at midlife in healthy aging and how this relates to findings in late life. Our research was aimed at addressing this issue.”

    In this study, published in the journal NeuroImage, 112 healthy adults ranging in age from 19 to 76 years were shown a series of faces. Participants were then asked to recall where a particular face appeared on the screen (left or right) and when it appeared (least or most recently). The researchers used functional MRI to analyze which parts of the brain were activated during recall of these details.

    Rajah and colleagues found that young adults activated their visual cortex while successfully performing this task. As she explains, “They are really paying attention to the perceptual details in order to make that decision.” On the other hand, middle-aged and older adults didn’t show the same level of visual cortex activation when they recalled the information. Instead, their medial prefrontal cortex was activated. That’s a part of the brain known to be involved with information having to do with one’s own life and introspection.

    Even though middle-aged and older participants didn’t perform as well as younger ones in this experiment, Rajah says it may be wrong to regard the response of the middle-aged and older brains as impairment. “This may not be a ‘deficit’ in brain function per se, but reflects changes in what adults deem ‘important information’ as they age.” In other words, the middle-aged and older participants were simply focusing on different aspects of the event compared to those in the younger group.

    Rajah says that middle-aged and older adults might improve their recall abilities by learning to focus on external rather than internal information. “That may be why some research has suggested that mindfulness meditation is related to better cognitive aging.”

    Rajah is currently analyzing data from a similar study to discern if there are any gender differences in middle-aged brain function as it relates to memory. “At mid-life women are going through a lot of hormonal change. So we’re wondering how much of these results is driven by post-menopausal women.”

    This research was supported by the Canadian Institutes of Health Research and by a grant from the Alzheimer’s Society of Canada.

    When middle-aged and older adults were shown a series of faces, red regions of the brain were more active; these include an area in the medial prefrontal cortex that is associated with self-referential thinking. In young adults, by contrast, blue regions, which include areas important for memory and attention, were more active during this task.
  • Monkeys in Brazil ‘have used stone tools for hundreds of years at least’

    {Was early human behavior influenced by their observations of monkeys using stone tools?}

    New archaeological evidence suggests that Brazilian capuchins have been using stone tools to crack open cashew nuts for at least 700 years. Researchers say that, to date, these are the earliest archaeological examples of monkey tool use found outside of Africa. In their paper, published in Current Biology, they suggest it raises questions about the origins and spread of tool use in New World monkeys and, perhaps controversially, prompts us to ask whether early human behaviour was influenced by observations of monkeys using stones as tools. The research was led by Dr Michael Haslam of the University of Oxford, who in previous papers presented archaeological evidence showing that wild macaques in coastal Thailand have used stone tools for decades at least to open shellfish and nuts.

    This latest paper involved a team from Oxford and the University of São Paulo in Brazil, who observed groups of modern capuchins at Serra da Capivara National Park in northeast Brazil, and combined this with archaeological data from the same site. Researchers watched wild capuchins use stones as hand-held hammers and anvils to pound open hard foods such as seeds and cashew nuts, with young monkeys learning from older ones how to do the same. The capuchins created what the researchers describe as ‘recognisable cashew processing sites’, leaving stone tools in piles at specific places like the base of cashew trees or on tree branches after use. They found that capuchins picked their favourite tools from stones lying around, selecting those most suitable for the task. Stones used as anvils were over four times heavier than hammer stones, and hammers four times heavier than average natural stones. The capuchins also chose particular materials, using smooth, hard quartzite stones as hammers, while flat sandstones became anvils.

    Using archaeological methods, the researchers excavated a total of 69 stones to see if this tool technology had developed at all over time. They dug to a depth of 0.7 metres at a site close to cashew trees where they had seen modern capuchins frequently using their stone tools. They identified the tools by inspecting the size and shape of the stones, as well as the distinctive damage on the stone surfaces caused by capuchin pounding. Through mass spectrometry, the researchers were able to confirm that dark-coloured residues on the tools came specifically from cashew nuts. They also carbon-dated small pieces of charcoal discovered with the stones, establishing that the oldest were at least 600 to 700 years old, meaning the tools predate the arrival of Europeans in the New World.
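
    For readers unfamiliar with radiocarbon dating, the sketch below (a generic illustration, not the team's workflow) shows how a conventional radiocarbon age follows from the measured fraction of modern carbon-14; converting that age into calendar years then requires a calibration curve, which is beyond this example.

    ```python
    import math

    LIBBY_MEAN_LIFE = 8033.0  # years, the conventional value used for radiocarbon ages

    def conventional_radiocarbon_age(fraction_modern: float) -> float:
        """Conventional age in years BP from the measured fraction of modern 14C."""
        return -LIBBY_MEAN_LIFE * math.log(fraction_modern)

    # Illustrative input only: a sample retaining about 92% of modern 14C
    print(round(conventional_radiocarbon_age(0.92)))  # roughly 670 years BP
    ```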

    In the paper, the researchers estimate that around 100 generations of capuchins have used this tradition of stone tools. They compared tools used by modern capuchins with the oldest excavated examples, finding they are similar in terms of weight and materials chosen. This apparent lack of change over hundreds of years suggests monkeys are ‘conservative’, preferring not to change the technology used, unlike humans living in the same region, says the paper.

    Lead author Dr Michael Haslam, from the School of Archaeology at the University of Oxford, said: ‘Until now, the only archaeological record of pre-modern, non-human animal tool use comes from a study of three chimpanzee sites in Cote d’Ivoire in Africa, where tools were dated to between 4,300 and 1,300 years old. Here, we have new evidence that suggests monkeys and other primates out of Africa were also using tools for hundreds, possibly thousands of years. This is an exciting, unexplored area of scientific study that may even tell us about the possible influence of monkeys’ tool use on human behaviour. For example, cashew nuts are native to this area of Brazil, and it is possible that the first humans to arrive here learned about this unknown food through watching the monkeys and their primate cashew-processing industry.’

    Tool use by monkeys has featured in other research led by Dr Haslam in recently published papers. In a study in the Journal of Human Evolution (published in June 2016), the team noted how groups of macaques in the marine national park on Piak Nam Yai Island, Thailand, selected stones as tools to crush marine snails, nuts and crabs. They also identified 10 tools in excavations at the site, which they dated as between 10 and 50 years old. In another research paper detailing fieldwork at the same site, they say the modern macaques typically moved their tools a metre or less from where they picked them up, but the longest distance they observed was around 87m. The macaques ate nine oysters at a time, on average, and generally carried the same tool over short distances. In one case, however, the researchers saw a hungry macaque eat 63 oysters one after the other, using the same stone tool to open all the shells, says the paper.

    Capuchin monkey
  • Changes in primate teeth linked to rise of monkeys

    {Discovery of inherited dental trait allows tracking of monkey, ape and human evolution}

    Searching for simple inherited dental characteristics that could lead to genes controlling tooth development, researchers have uncovered an easy-to-measure trait that tracks primate evolution over the last 20 million years, shedding light on the mysterious decline of apes and the rise of monkeys 8 million years ago. The research shows that monkeys diversified and took over the dentition niche of the majority of apes. Apes with outlying dentition, including human ancestors, remained.

    University of California, Berkeley paleontologists have identified distinctive features of primate teeth that allow them to track the evolution of our ape and monkey ancestors, shedding light on a mysterious increase in monkey species that occurred during a period of climate change 8 million years ago.

    The inherited dental features will also help the researchers track down the genes that control tooth development, assisting scientists intent on regrowing rather than replacing teeth.

    The features were discovered after detailed study of the shapes of molars and premolars inherited by baboons in a long-studied colony at the Southwest National Primate Research Center in San Antonio, Texas. Once it became clear that the relative lengths of the molars and premolars are an inherited trait much like eye color, the researchers measured these traits in the teeth of other primates, sifting through museum collections around the world.

    The measurement data prove that the feature is inherited in a similar way in all primates — humans included — and varies across different species and genera in a way that mirrors the evolutionary relationships worked out earlier by analyzing bones and comparing genes.

    “This shows that we can use the power of evolutionary history to unlock what is going on genetically in animals on whom you can’t experiment, such as humans,” said study leader Leslea Hlusko, a UC Berkeley associate professor of integrative biology. “We found two inherited traits, but identifying the traits is only the first step. We now have to figure out what the genome sequences are that underlie these traits, which will enable us to figure out what caused these evolutionary changes in dentition.”

    Hlusko and her UC Berkeley colleagues — former postdoctoral fellow Christopher Schmitt, now at Boston University, and graduate students Tesla Monson and Marianne Brasil — along with Michael Mahaney of the University of Texas Rio Grande Valley in Brownsville, will publish their analysis next week in the Proceedings of the National Academy of Sciences.

    The rise of monkeys and the decline of apes

    When Hlusko and her colleagues looked at how the two newly identified traits changed in primates over the last 20 million years, they noticed an unusual shift in tooth shape at the same time apes began to die out and monkeys to proliferate. This took place about 8 million years ago, in the Miocene epoch, as Earth began to warm, the Mediterranean Sea dried up and Africa’s thick forests transitioned to grasslands and savannah. At the same time, numerous species of apes, which had lived across Africa and southern Europe, began to disappear, and monkeys evolved more lineages. Today there are 19 monkey genera, while apes have dwindled to only six: humans, chimpanzees, bonobos, gorillas, orangutans and gibbons.

    “If you go back into the Miocene, it was an ape world with essentially no or very few monkeys,” Hlusko said. “Now it is exactly the opposite: we have only a handful of apes and a whole lot of monkeys.”

    The change in dentition suggests that monkeys took over a niche that apes previously occupied, though whether that was a dietary niche or had more to do with primate life cycle remains to be figured out, she said. Dentition can evolve not only because of changes in the types of food animals chew, but also as a result of changes in when and how teeth erupt during development.

    “Monkeys moved into that ape niche, in terms of a dental pattern, but what exactly that means I don’t know yet,” said Hlusko, a member of UC Berkeley’s Human Evolution Research Center. “It may be something more complicated than just diet. Maybe it was the kind of food they ate at different stages of their lives, or it might have to do with the timing of major life history events that can influence the timing of tooth eruption, such as when they reached sexual maturity.”

    Seeking simple inherited traits

    Hlusko’s interest in the evolution of bones and teeth, and the genes that control them, comes out of her interest in paleontology and human origins. Over millions of years, our human and primate ancestors left a trail of teeth that she and other paleontologists have followed for clues to our evolutionary history.

    Unfortunately, what is known about the 300 or more genes that control tooth development comes mostly from studies of mice, whose teeth are quite different from those of primates, from which they diverged about 70 million years ago.

    “Mice are a good model for understanding how teeth are made, but not a good model for understanding the finer details of how teeth evolved,” she said. “This is important from a dentistry perspective, because people are really interested in regenerating teeth so we don’t have to have fake teeth implanted into our jaws. To induce growth of a new one, however, it would be really useful to know how they grow naturally to begin with.”

    In her search for genes important in dental development, Hlusko uses a classical quantitative genetics approach that applies the rules of heredity, discovered 150 years ago in Gregor Mendel’s studies of simple traits in peas, to not-so-simple traits like the sizes and shapes of teeth. Geneticists classify traits like tooth size and shape as “complex” because they are affected by DNA variations within many genes.

    “Although biologically real, the individual effects of many of those genes, or variations within them, are quite small,” said Mahaney. “One of Hlusko’s goals is to detect and identify a few of these genes with major effects on variation in tooth size and shape in general. Such key genes likely would serve as signposts to biological pathways important to normal and disordered dental development.”

    Several years ago, Hlusko used thousands of pedigreed mouse skeletons in UC Berkeley’s Museum of Vertebrate Zoology to show that the front of the mouth — the canines and incisors — is under totally separate genetic control from the back of the mouth — the molars and premolars. This means that, in adapting to changes in environment or diet, teeth in the front of the mouth can evolve independently of teeth in the back, or vice versa.

    Now, with the help of 632 members of the baboon colony, she has found two measures of teeth in the back of the mouth that seem to be under the control of just a few genes. One is the ratio of the length of the third molar to that of the first molar; the other is the ratio of the length of the second molar to that of the fourth premolar.

    “Instead of thinking of each tooth as a separate unit, we found that the relative size of the first compared to last molar is an inherited trait, like earlobe attachment. The same goes for the relative size of the last premolar relative to the molars,” she said. “We’ve turned the dentition into simple traits, and from that, elucidated a major diversity shift in primate evolution. The next step is to look back to the genome and see if we can find the genes that underlie this.”
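
    A minimal sketch of the two ratio traits described above, assuming tooth lengths measured in millimetres; the function and labels here are hypothetical, not taken from the paper.

    ```python
    def dental_ratio_traits(p4: float, m1: float, m2: float, m3: float) -> dict:
        """Two inherited ratio traits from premolar and molar lengths (all in mm)."""
        return {
            "third_molar_to_first_molar": m3 / m1,
            "second_molar_to_fourth_premolar": m2 / p4,
        }

    # Illustrative measurements only
    print(dental_ratio_traits(p4=7.1, m1=10.4, m2=11.0, m3=11.8))
    ```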

    Based on measurements of teeth from 723 extant Old World monkeys from Africa and Asia — many of them measured by some 50 UC Berkeley undergraduate premed or predental students — she found that this group of monkeys also inherits the two traits as simply as eye color. When she looked further — measuring the teeth of 199 extant apes, 56 fossil Old World monkeys and 165 fossil apes — the pattern persisted. In all, the species she sampled represented 20 genera and 37 species of living and extinct monkeys and apes.

    The analysis revealed that 20 million years ago, apes varied enormously in these dental traits. But monkeys began to evolve some of the same dental traits as apes, and replaced apes that had similar traits. Only a small number of apes with dental traits unlike today’s monkeys survived, including our ancestors.

    “Only those apes with a specific range of dental features had a shot, and one of them was our ancestor,” she said. “That shift was an important first step in our lineage’s evolutionary history, and it set the stage for the evolution of standing up on two legs and pair bonding. And once you start pair bonding you open the door for all kinds of things, like extended childhood and more complex social interactions — all the kinds of things that humans do today.”

    Hlusko has expanded her study to look at a broad range of mammal teeth — including many dire wolves from the La Brea tar pits in Los Angeles — to see whether these two traits are independently inherited in all animals.

    The work was supported by the National Science Foundation. The Texas colony of Hamadryas baboons (Papio hamadryas) is funded in part by the National Institutes of Health.

    A collage of mandibles from extant and extinct Old World monkeys and apes that were included in the study.
  • Super-sniffer mice could detect land mines and decode human olfactory system

    {Researchers at Hunter College, part of the City University of New York, have created super-sniffer mice that have an increased ability to detect a specific odor, according to a study published July 7 in Cell Reports. The mice, which can be tuned to have different levels of sensitivity to any smell by using mouse or human odor receptors, could be used as land-mine detectors or as the basis for novel disease sensors.}

    The technology, a transgenic approach to engineering the mouse genome, could also provide researchers with a way to study human odor receptors. “This is one of our five basic senses, yet we have almost no clue how odors are coded by the brain,” says lead investigator Paul Feinstein, an associate professor of biological sciences at Hunter. “It’s still a black box.”

    The nature of the odor receptors was discovered in 1991, a Nobel Prize winning feat, but exactly how the olfactory system wires itself still isn’t well understood. The noses of mammals contain a collection of sensory neurons, each equipped with a single chemical sensor called a receptor that detects a specific odor. In mice, as in humans, each neuron selects only one receptor. Collectively, neurons choose an even distribution of receptors, so each of the thousand distinct receptors is represented in about 0.1 percent of neurons.

    In an effort to understand the mechanism these neurons use to choose a specific receptor, Feinstein tinkered with the mouse genome. He introduced the DNA for an odor receptor gene transgenically, by injection into the nucleus of a fertilized egg cell. He also added an extra string of DNA to the gene sequence to see if it would alter the probability of the gene being chosen. After a few attempts, he found a string that, when copied four or more times, worked.

    More copies of this extra string of DNA resulted in a series of super-sniffer mice with increasing numbers of neurons expressing the selected receptor, a well-characterized receptor that detects acetophenone, which has a sweet smell similar to jasmine. The mice still maintain a relatively even distribution of other odor receptors. “We don’t know how the neuron performs singular gene choice yet, but we can increase the probability of a given choice occurring,” says Feinstein.

    In parallel, post-doctoral researcher Charlotte D’Hulst was trying to replace a mouse receptor gene with a human one. Even though such gene swapping is standard practice in other fields, it did not work. It wasn’t the first time researchers had been stymied by olfaction. Repeated attempts in the field to study odor receptors by growing them in cells in Petri dishes have also led to dead ends.

    As a result, human olfaction receptors are poorly understood. “Without understanding how odors bind to receptors, people have no rational way of designing new odors,” says D’Hulst. “They also have no way of boosting the diminished smell capacity in patients with diseases such as Parkinson’s.”

    So D’Hulst abandoned gene swapping and tried Feinstein’s transgenic super-sniffer technique to insert a human receptor gene into the mouse. It worked. “We have developed a system where we can study human odor receptors and finally determine how human odor coding works,” says Feinstein.

    The team validated that the mice do indeed have an amplified sense of smell for the given receptor. They first used fluorescent imaging in live mice to trace the activation of the amplified odor receptor in response to the receptor’s corresponding odor. These tests gave visual confirmation that the receptors are functional and present in greater numbers than others.

    In a standard behavioral test in which animals were trained to avoid an odor known to bind the transgenic receptor, the super-sniffer mice were able to detect the presence of this unpleasant odor in water at levels two orders of magnitude lower than those detectable by mice without super-sniffer abilities. “The animals could smell the odor better because of the increased presence of the receptor,” says D’Hulst.

    The team is now working towards commercializing their technology and has founded a company called MouSensor, LLC. The Feinstein lab has received funding from the Department of Defense to develop super-sniffing rats that can be trained to detect TNT and potentially find land mines. The researchers also envision applications of the MouSensor for developing a type of nose-on-a-chip as a means of diagnosing disease using chemical detection profiling. “We have these millions-of-years-old receptors that are highly tuned to detect chemicals,” says Feinstein. “We think we can develop them into tools and use them to detect disease.”

    Red fluorescence represents super-sniffer receptors connecting to the brain in the mouse olfactory system while the green fluorescence marks all other odor receptor populations.
  • Surprising planet with three suns discovered

    {A team of astronomers have used the SPHERE instrument on ESO’s Very Large Telescope to image the first planet ever found in a wide orbit inside a triple-star system. The orbit of such a planet had been expected to be unstable, probably resulting in the planet being quickly ejected from the system. But somehow this one survives. This unexpected observation suggests that such systems may actually be more common than previously thought. The results will be published online in the journal Science on 7 July 2016.}

    Luke Skywalker’s home planet, Tatooine, in the Star Wars saga, was a strange world with two suns in the sky, but astronomers have now found a planet in an even more exotic system, where an observer would either experience constant daylight or enjoy triple sunrises and sunsets each day, depending on the seasons, which last longer than human lifetimes.

    This world has been discovered by a team of astronomers led by the University of Arizona, USA, using direct imaging at ESO’s Very Large Telescope (VLT) in Chile. The planet, HD 131399Ab, is unlike any other known world — its orbit around the brightest of the three stars is by far the widest known within a multi-star system. Such orbits are often unstable, because of the complex and changing gravitational attraction from the other two stars in the system, and planets in stable orbits were thought to be very unlikely.

    Located about 320 light-years from Earth in the constellation of Centaurus (The Centaur), HD 131399Ab is about 16 million years old, making it also one of the youngest exoplanets discovered to date, and one of very few directly imaged planets. With a temperature of around 580 degrees Celsius and an estimated mass of four Jupiter masses, it is also one of the coldest and least massive directly-imaged exoplanets.

    “HD 131399Ab is one of the few exoplanets that have been directly imaged, and it’s the first one in such an interesting dynamical configuration,” said Daniel Apai, from the University of Arizona, USA, and one of the co-authors of the new paper.

    “For about half of the planet’s orbit, which lasts 550 Earth-years, three stars are visible in the sky; the fainter two are always much closer together, and change in apparent separation from the brightest star throughout the year,” adds Kevin Wagner, the paper’s first author and discoverer of HD 131399Ab.

    Kevin Wagner, who is a PhD student at the University of Arizona, identified the planet among hundreds of candidate planets and led the follow-up observations to verify its nature.

    The planet also marks the first discovery of an exoplanet made with the SPHERE instrument on the VLT. SPHERE is sensitive to infrared light, allowing it to detect the heat signatures of young planets, and it incorporates sophisticated features that correct for atmospheric disturbances and block out the otherwise blinding light of the host stars.

    Although repeated and long-term observations will be needed to precisely determine the planet’s trajectory among its host stars, observations and simulations seem to suggest the following scenario: the brightest star, dubbed HD 131399A, is estimated to be about eighty percent more massive than the Sun, and is itself orbited, at about 300 au, by the less massive stars B and C (one au, or astronomical unit, equals the average distance between the Earth and the Sun). All the while, B and C twirl around each other like a spinning dumbbell, separated by a distance roughly equal to that between the Sun and Saturn (about 10 au).

    In this scenario, planet HD 131399Ab travels around star A in an orbit with a radius of about 80 au, roughly twice the size of Pluto’s orbit in the Solar System, which brings it to about one third of the separation between star A and the B/C star pair. The authors point out that a range of orbital scenarios is possible, and the verdict on the long-term stability of the system will have to wait for planned follow-up observations that will better constrain the planet’s orbit.
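
    As a rough plausibility check (not the authors' dynamical modelling), a simple two-body Kepler estimate, treating the planet as orbiting star A alone and ignoring the B/C pair, gives a period close to the roughly 550-year orbit quoted above.

    ```python
    import math

    def kepler_period_years(semi_major_axis_au: float, stellar_mass_solar: float) -> float:
        """Two-body orbital period in years: P = sqrt(a^3 / M), a in au, M in solar masses."""
        return math.sqrt(semi_major_axis_au ** 3 / stellar_mass_solar)

    # An ~80 au orbit around a star about 1.8 times the Sun's mass
    print(round(kepler_period_years(80.0, 1.8)))  # roughly 530 years
    ```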

    “If the planet was further away from the most massive star in the system, it would be kicked out of the system,” Apai explained. “Our computer simulations have shown that this type of orbit can be stable, but if you change things around just a little bit, it can become unstable very quickly.”

    Planets in multi-star systems are of special interest to astronomers and planetary scientists because they provide an example of how planet formation operates in these more extreme scenarios. While such systems may seem exotic to us, orbiting our solitary star, multi-star systems are in fact just as common as single stars.

    “It is not clear how this planet ended up on its wide orbit in this extreme system, and we can’t say yet what this means for our broader understanding of the types of planetary systems, but it shows that there is more variety out there than many would have deemed possible,” concludes Kevin Wagner. “What we do know is that planets in multi-star systems have been studied far less often, but are potentially just as numerous as planets in single-star systems.”

    This artist's impression shows a view of the triple star system HD 131399 from close to the giant planet orbiting in the system. The planet is known as HD 131399Ab and appears at the lower-left of the picture.
  • Does one drink always lead to two, three or four? Scientists have identified the neurons in the brain that (should) tell us to stop drinking

    {No one ever seems to go for ‘just one drink’.}

    And scientists claim to have discovered the reason why.

    According to a new study, the human brain contains particular neurons, called D2, that tell us to stop drinking.

    But problematically, D2 neurons tend to become deactivated when we drink too much alcohol, the Texas A&M Health Science Center College of Medicine study concludes.
    This deactivation means we drink more, in a self-perpetuating cycle.

    And according to lead author Professor Jun Wang, binge drinking balanced out by a dry spell only serves to weaken these neurons.

    However, researchers hope the findings could provide insight into another mechanism underlying the complicated disease of alcoholism, leading to more specific targeted treatment.

    By activating particular neurons, scientists may be able to influence drinking behavior.
    ‘At least from the addiction point of view, D2 neurons are good,’ said Jun Wang, the corresponding author on the paper and assistant professor in the Department of Neuroscience and Experimental Therapeutics at the Texas A&M College of Medicine.

    ‘When they are activated, they inhibit drinking behavior, and therefore activating them is important for preventing problem drinking behavior.’

    The study, backed by the National Institute on Alcohol Abuse and Alcoholism (NIAAA), comes after Dr Wang identified another type of neuron, called D1, which incites cravings.

    The previous study, published in September 2015, found alcohol consumption alters the physical structure and function of neurons, called medium spiny neurons, in the dorsomedial striatum.

    Essentially, they found that activation of one type of neuron, called D1, determines whether one drink leads to two.

    Each neuron has one of two types of dopamine receptors – D1 or D2 – and so can be thought of as either D1 or D2 neurons.

    D1 neurons are informally called part of a ‘go’ pathway in the brain. Therefore, they are the ones that incite alcohol cravings.

    D2 neurons, meanwhile, are in the ‘no-go’ pathway, meaning that when D2 neurons are activated, they discourage action – telling you to do nothing.

    Dr Wang and his team tested the theory on animals.

    They found that in animal models, repeated cycles of excessive alcohol intake, followed by abstaining from alcohol, changed the strength of these neuronal connections.

    In other words, regular binge drinking separated by dry spells actually trained the brain to crave drinking.

    ‘Think of the binge drinking behavior of so many young adults,’ Wang said.

    ‘Essentially they are probably doing the same thing that we’ve shown leads to inhibition of these so-called “good” neurons and contributes to greater alcohol consumption.’

    Just one drink? A study says the human brain contains particular neurons, called D2, that tell us to stop drinking - but these neurons are weakened by alcohol and stop working when we drink too much
  • Cannibalism among late Neanderthals in northern Europe

    {Grisly evidence has been uncovered by researchers that Neanderthals butchered their own kind some 40,000 years ago, opening up many possibilities regarding the way late Neanderthals dealt with their dead in this last period before they died out.}

    Tübingen researchers working in an international team have uncovered grisly evidence that Neanderthals butchered their own kind some 40,000 years ago.

    Neanderthal bones from an excavation in Belgium have yielded evidence of intentional butchering. The findings, from the Goyet caves near Namur, are the first evidence of cannibalism among Neanderthals north of the Alps. The skeletal remains were radiocarbon-dated to an age of around 40,500 to 45,500 years. Remarkably, this group of late Neanderthals also used the bones of their own kind as tools for shaping other tools of stone.

    Professors Hervé Bocherens and Johannes Krause of Tübingen’s Senckenberg Center for Human Evolution and Palaeoenvironment, along with Cosimo Posth and Christoph Wissing, also of the University of Tübingen, took part in the investigations. A review of the finds from the Troisième caverne of Goyet combined results from various disciplines; it identified 99 previously uncertain bone fragments as Neanderthal bones. That means Goyet has yielded the greatest amount of Neanderthal remains north of the Alps.

    By making a complete analysis of the mitochondrial DNA of ten Neanderthals, the researchers doubled the existing genetic data on this species of humans, which died out some 30,000 years ago. They confirmed earlier studies’ results, which showed relatively little genetic variation among late European Neanderthals; in other words, they were closely related to one another. The findings have been published in the journal Scientific Reports.

    The Troisième caverne of Goyet was excavated nearly 150 years ago. Today, researchers are able to extract vast amounts of information using current methods — such as precise digital measurement and categorization of the bones, examination of the conditions in which the bone fragments were preserved, as well as isotopic and genetic analysis.

    Some Neanderthal remains from Goyet have been worked by human hands, as evidenced by cut marks, pits and notches. The researchers see this as an indication that the bodies from which they came were butchered. This appears to have been done thoroughly; the remains indicate processes of skinning, cutting up, and extraction of the bone marrow. “These indications allow us to assume that Neanderthals practised cannibalism,” says Hervé Bocherens. But he adds that it is impossible to say whether the remains were butchered as part of some symbolic act, or whether the butchering was carried out simply for food. “The many remains of horses and reindeer found in Goyet were processed the same way,” Bocherens says. Researchers have long debated the evidence of cannibalism among Neanderthals, which until now focused on the sites of El Sidrón and Zafarraya in Spain and two French sites, Moula-Guercy and Les Pradelles. The Troisième caverne of Goyet is the first example of this phenomenon from more northern parts of Europe.

    Four bones from Goyet clearly indicate that Neanderthals used their deceased relatives’ bones as tools; one thigh bone and three shinbones were used to shape stone tools. Animal bones were frequently used as knapping tools. “That Neanderthal bones were used for this purpose — that’s something we had seen at very few sites, and nowhere as frequently as in Goyet,” Bocherens says.

    The new findings open up many possibilities regarding the way late Neanderthals dealt with their dead in this last period before they died out. Bocherens says none of the other Neanderthal sites in the region have yielded indications that the dead were dealt with as they were in Goyet. On the contrary, they have yielded burials. Researchers say that, in addition, other northern European Neanderthal sites had a greater variety and various arsenals of stone tools. “The big differences in the behavior of these people on the one hand, and the close genetic relationship between late European Neanderthals on the other, raise many questions about the social lives and exchange between various groups,” says Bocherens.

    Archaeozoologist Cédric Beauval of Archéosphère (left), palaeoanthropologist Hélène Rougier from California State University Northridge (center) and Isabelle Crevecoeur of the University of Bordeaux (right) identify human remains in the collection of finds from the Goyet caves.
  • Warming pulses in ancient climate record link volcanoes, asteroid impact and dinosaur-killing mass extinction

    {A new reconstruction of Antarctic ocean temperatures around the time the dinosaurs disappeared 66 million years ago supports the idea that one of the planet’s biggest mass extinctions was due to the combined effects of volcanic eruptions and an asteroid impact.}

    Two University of Michigan researchers and a Florida colleague found two abrupt warming spikes in ocean temperatures that coincide with two previously documented extinction pulses near the end of the Cretaceous Period. The first extinction pulse has been tied to massive volcanic eruptions in India, the second to the impact of an asteroid or comet on Mexico’s Yucatan Peninsula.

    Both events were accompanied by warming episodes, the U-M-led team found, after analyzing the chemical composition of fossil shells using a recently developed technique called the carbonate clumped isotope paleothermometer.

    The new technique, which avoids some of the pitfalls of previous methods, showed that Antarctic ocean temperatures jumped about 14 degrees Fahrenheit during the first of the two warming events, likely the result of massive amounts of heat-trapping carbon dioxide gas released from India’s Deccan Traps volcanic region. The second warming spike was smaller and occurred about 150,000 years later, around the time of the Chicxulub impact in the Yucatan.

    “This new temperature record provides a direct link between the volcanism and impact events and the extinction pulses — that link being climate change,” said Sierra Petersen, a postdoctoral researcher in the U-M Department of Earth and Environmental Sciences.

    “We find that the end-Cretaceous mass extinction was caused by a combination of the volcanism and meteorite impact, delivering a theoretical ‘one-two punch,’” said Petersen, first author of a paper scheduled for online publication July 5 in the journal Nature Communications.

    The cause of the Cretaceous-Paleogene (KPg) mass extinction, which wiped out the non-avian dinosaurs and roughly three-quarters of the planet’s plant and animal species about 66 million years ago, has been debated for decades. Many scientists believe the extinction was caused by an asteroid impact; some think regional volcanism was to blame, and others suspect it was due to a combination of the two.

    Recently, there’s been growing support for the so-called press-pulse mechanism. The “press” of gradual climatic change due to Deccan Traps volcanism was followed by the instantaneous, catastrophic “pulse” of the impact. Together, these events were responsible for the KPg extinction, according to the theory.

    The new record of ancient Antarctic ocean temperatures provides strong support for the press-pulse extinction mechanism, Petersen said. Pre-impact climate warming due to volcanism “may have increased ecosystem stress, making the ecosystem more vulnerable to collapse when the meteorite hit,” concluded Petersen and co-authors Kyger Lohmann of U-M and Andrea Dutton of the University of Florida.

    To create their new temperature record, which spans 3.5 million years at the end of the Cretaceous and the start of the Paleogene Period, the researchers analyzed the isotopic composition of 29 remarkably well-preserved shells of clam-like bivalves collected on Antarctica’s Seymour Island.

    These mollusks lived 65.5-to-69 million years ago in a shallow coastal delta near the northern tip of the Antarctic Peninsula. At the time, the continent was likely covered by coniferous forest, unlike the giant ice sheet that is there today.

    As the 2-to-5-inch-long bivalves grew, their shells incorporated atoms of the elements oxygen and carbon of slightly different masses, or isotopes, in ratios that reveal the temperature of the surrounding seawater.

    The isotopic analysis showed that seawater temperatures in the Antarctic in the Late Cretaceous averaged about 46 degrees Fahrenheit, punctuated by two abrupt warming spikes.

    “A previous study found that the end-Cretaceous extinction at this location occurred in two closely timed pulses,” Petersen said. “These two extinction pulses coincide with the two warming spikes we identified in our new temperature record, which each line up with one of the two ‘causal events.’”

    Unlike previous methods, the clumped isotope paleothermometer technique does not rely on assumptions about the isotopic composition of seawater. Those assumptions thwarted previous attempts to link temperature change and ancient extinctions on Seymour Island.
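
    To give a sense of how a clumped-isotope measurement translates into temperature, published calibrations take the general form Δ47 = A/T² + B with T in kelvin; the sketch below inverts that relation using placeholder constants, not the calibration actually used in the study.

    ```python
    def temperature_from_clumped_isotopes(delta_47: float,
                                          a: float = 0.0392e6,  # placeholder calibration slope
                                          b: float = 0.154) -> float:  # placeholder intercept
        """Invert delta_47 = a / T**2 + b to return temperature in degrees Celsius."""
        t_kelvin = (a / (delta_47 - b)) ** 0.5
        return t_kelvin - 273.15

    # Illustrative value only: a delta_47 of 0.65 per mil gives roughly 8 C (about 46 F)
    print(round(temperature_from_clumped_isotopes(0.65), 1))
    ```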

    The preservation of Cretaceous mollusk fossils from Seymour Island is excellent, with shells preserving original mother-of-pearl material as in these two specimens of Eselaevitrigonia regina.
  • A host of common chemicals endanger child brain development

    {In a new report, dozens of scientists, health practitioners and children’s health advocates are calling for renewed attention to the growing evidence that many common and widely available chemicals endanger neurodevelopment in fetuses and children of all ages.}

    The chemicals that are of most concern include lead and mercury; organophosphate pesticides used in agriculture and home gardens; phthalates, found in pharmaceuticals, plastics and personal care products; flame retardants known as polybrominated diphenyl ethers; and air pollutants produced by the combustion of wood and fossil fuels, said University of Illinois comparative biosciences professor Susan Schantz, one of dozens of individual signatories to the consensus statement.

    Polychlorinated biphenyls, once used as coolants and lubricants in transformers and other electrical equipment, also are of concern. PCBs were banned in the U.S. in 1977, but can persist in the environment for decades, she said.

    The new report, “Project TENDR: Targeting Environmental NeuroDevelopment Risks,” appears in the journal Environmental Health Perspectives.

    “These chemicals are pervasive, not only in air and water, but in everyday consumer products that we use on our bodies and in our homes,” Schantz said. “Reducing exposures to toxic chemicals can be done, and is urgently needed to protect today’s and tomorrow’s children.”

    Schantz is a faculty member in the College of Veterinary Medicine and in the Beckman Institute for Advanced Science and Technology at the U. of I.

    “The human brain develops over a very long period of time, starting in gestation and continuing during childhood and even into early adulthood,” Schantz said. “But the biggest amount of growth occurs during prenatal development. The neurons are forming and migrating and maturing and differentiating. And if you disrupt this process, you’re likely to have permanent effects.”

    Some of the chemicals of concern, such as phthalates and PBDEs, are known to interfere with normal hormone activity. For example, most pregnant women in the U.S. will test positive for exposure to phthalates and PBDEs, both of which disrupt thyroid hormone function.

    “Thyroid hormone is involved in almost every aspect of brain development, from formation of the neurons to cell division, to the proper migration of cells and myelination of the axons after the cells are differentiated,” said Schantz. “It regulates many of the genes involved in nervous system development.”

    Schantz and her colleagues at Illinois are studying infants and their mothers to determine whether prenatal exposure to phthalates and other endocrine disruptors leads to changes in the brain or behavior. This research, along with parallel studies in older children and animals, is a primary focus of the Children’s Environmental Health Research Center at Illinois, which Schantz directs.

    Phthalates also interfere with steroid hormone activity. Studies link exposure to certain phthalates with attention deficits, lower IQ and conduct disorders in children. “Phthalates are everywhere; they’re in all kinds of different products. We’re exposed to them every day,” Schantz said.

    The report criticizes current regulatory lapses that allow chemicals to be introduced into people’s lives with little or no review of their effects on fetal and child health.

    “For most chemicals, we have no idea what they’re doing to children’s neurodevelopment,” Schantz said. “They just haven’t been studied.

    “And if it looks like something is a risk, we feel policymakers should be willing to make a decision that this or that chemical could be a bad actor and we need to stop its production or limit its use,” she said. “We shouldn’t have to wait 10 or 15 years — allowing countless children to be exposed to it in the meantime — until we’re positive it’s a bad actor.”

    The National Institute of Environmental Health Sciences at the National Institutes of Health and the U.S. Environmental Protection Agency fund the Children’s Environmental Health Research Center at the University of Illinois.

    The above post is reprinted from materials provided by the University of Illinois at Urbana-Champaign. The original item was written by Diana Yates.

    In addition to mercury and lead, flame retardants, air pollutants and chemicals found in many plastics, cosmetics and food containers endanger child brain health, researchers say.