Category: Science News

  • Identical twins, not-so-identical stem cells

    {Salk scientists and collaborators have shed light on a long-standing question about what leads to variation in stem cells by comparing induced pluripotent stem cells (iPSCs) derived from identical twins. Even iPSCs made from the cells of twins, they found, have important differences, suggesting that not all variation between iPSC lines is rooted in genetics, since the twins have identical genes.}

    Because they can differentiate into almost any cell type in the body, stem cells have the potential to be used to create healthy cells to treat a number of diseases. Stem cells come in two varieties: embryonic stem cells (ESCs), which are isolated from embryos, and iPSCs, which are created in the lab by reprogramming adult cells with a cocktail of reprogramming factors. iPSCs are a promising tool for understanding disease and developing new treatments.

    Although iPSCs resemble ESCs in most ways, scientists have found that iPSCs often have variations in their epigenetics — methyl marks on the DNA that dictate when genes are expressed. These epigenetic markers aren’t the same between iPSCs and ESCs, or even between different lines of iPSCs. In the past, it’s been hard to determine what drives these differences.

    “When we reprogram cells, we see small differences when we compare them to stem cells that come from an embryo. We wanted to understand what types of differences are always there, what is causing them, and what they mean,” says Juan Carlos Izpisua Belmonte, a professor in Salk’s Gene Expression Laboratory and co-senior author, with Kelly Frazer of the University of California, San Diego, on the new paper, which was published in Cell Stem Cell in April 2017. A better understanding of these differences will help researchers refine stem cell-based treatments for disease.

    Izpisua Belmonte and Frazer, along with co-first authors of the paper Athanasia Panopoulos, formerly a postdoctoral fellow at Salk and now at the University of Notre Dame, and Erin Smith of UCSD, turned to twins to help sort it out.

    Although identical twins have the same genes as each other, their epigenomes — the collection of methyl marks studding their DNA — are different by the time they reach adulthood due in part to environmental factors. Reprogramming the skin cells of adult identical twins to their embryonic state eliminated most of these differences, the researchers found when they studied cells from three sets of twins. However, there were still key epigenetic differences between twins in terms of how the iPSCs compared to ESCs.

    When the team looked more closely at the genomic regions where this variation in methylation tended to show up between twins, they found that the variable sites often fell near binding sites for a regulatory protein called MYC.

    “In the past, researchers had found lots of sites with variations in methylation status, but it was hard to figure out which of those sites had variation due to genetics,” says Panopoulos. “Here, we could focus more specifically on the sites we know have nothing to do with genetics.” That new focus, she says, is what allowed them to home in on the MYC binding sites.
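    The paper’s actual pipeline is not described here, but the question Panopoulos raises, whether twin-discordant methylation sites cluster near MYC binding sites more often than chance would predict, maps onto a standard enrichment test. Below is a minimal sketch in Python; every count in it is an invented, illustrative assumption, not a figure from the study.

        # Hypothetical enrichment test: do variably methylated sites fall near
        # MYC binding motifs more often than expected by chance?
        # All counts are illustrative assumptions, not figures from the paper.
        from scipy.stats import hypergeom

        total_sites = 450_000      # CpG sites assayed genome-wide (assumed)
        near_myc = 30_000          # sites within some window of a MYC motif (assumed)
        variable_sites = 2_000     # sites differing between twin-derived iPSC lines (assumed)
        variable_near_myc = 400    # of those, the number near a MYC motif (assumed)

        # One-sided probability of seeing >= variable_near_myc overlaps by chance
        p = hypergeom.sf(variable_near_myc - 1, total_sites, near_myc, variable_sites)
        print(f"enrichment p-value ~ {p:.2e}")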

    The MYC protein — which is one of the molecules used to reprogram iPSCs from adult cells — likely plays a role in dictating which sites in the genome are randomly methylated during the reprogramming process, the researchers hypothesized.

    “The twins enabled us to ask questions we couldn’t ask before,” says Panopoulos. “You’re able to see what happens when you reprogram cells with identical genomes but divergent epigenomes, and figure out what is happening because of genetics, and what is happening due to other mechanisms.”

    The findings help scientists better understand the processes involved in reprogramming cells and the differences between iPSCs and ESCs, which has implications for future studies aiming to understand the specific causes and consequences of these changes, as well as for the way iPSCs are used in research and therapeutics.

    A new twin study sheds light on epigenetic patterns in stem cells.

    Source: Science Daily

  • Closer look at brain circuits reveals important role of genetics

    {Scientists at The Scripps Research Institute (TSRI) in La Jolla have revealed new clues to the wiring of the brain. A team led by Associate Professor Anton Maximov found that neurons in brain regions that store memory can form networks in the absence of synaptic activity.}

    “Our results imply that assembly of neural circuits in areas required for cognition is largely controlled by intrinsic genetic programs that operate independently of the external world,” Maximov explained.

    A similar phenomenon was observed by the group of Professor Nils Brose at the Max Planck Institute for Experimental Medicine in Germany. The two complementary studies were co-published as cover stories in the April 19, 2017, issue of the journal Neuron.

    {{The “Nature vs. Nurture” Question}}

    Experience makes every brain unique by changing the patterns and properties of neuronal connections. Vision, hearing, smell, taste and touch play particularly important roles during early postnatal life, when the majority of synapses are formed. New synapses also appear in the adult brain during learning. These activity-dependent changes in neuronal wiring are driven by chemical neurotransmitters that relay signals from one neuron to another. Yet animals and humans have innate behaviors whose features are consistent across generations, suggesting that some synaptic connections are genetically predetermined.

    The notion that neurons do not need to communicate to develop networks has also been supported by earlier discoveries of synapses in mice that lacked transmitter secretion in the entire brain. These studies were performed in the laboratory of Professor Thomas Südhof, who won the 2013 Nobel Prize in Physiology or Medicine.

    “We thought these experiments were quite intriguing,” Maximov said, “but they also had a major limitation: mice with completely disabled nervous systems became paralyzed and died shortly after birth, when circuitry in the brain is still rudimentary.”

    The TSRI team set out to investigate whether neurons can form and maintain connections with appropriate partners in genetically engineered animals that live into adulthood with virtually no synaptic activity in the hippocampus, a brain region that is critical for learning and memory storage. “While the idea may sound crazy at first glance,” Maximov continued, “several observations hinted that this task is technically feasible.” Indeed, mammals can survive with injuries and developmental abnormalities that result in a massive loss of brain tissue.

    Inspired by these examples, Richard Sando, a graduate student in the Maximov lab, generated mice whose hippocampus permanently lacked secretion of glutamate, a neurotransmitter that activates neurons when a memory is formed. Despite an apparent inability to learn and remember, these animals could eat, walk around, groom, and even engage in rudimentary social interactions.

    Working closely with Professor Mark Ellisman, who directs the National Center for Microscopy and Imaging Research at the University of California, San Diego, Sando and his co-workers then examined the connectivity in permanently inactive areas. Combining contemporary genetic and imaging tools was fruitful: the collaborative team found that several key stages of neural circuit development widely believed to require synaptic activity were remarkably unaffected in their mouse model.

    The outcomes of the ultrastructural analyses were particularly surprising: it turns out that neurotransmission is unnecessary for the assembly of the basic building blocks of individual synaptic connections, including the so-called dendritic spines that recruit the signaling complexes that enable neurons to sense glutamate.

    Maximov emphasized that the mice could not function normally. In a way, their hippocampus can be compared to a computer that goes through the assembly line but never gets plugged into a power source and loaded with software. As a next step, the team aims to use new chemical-genetic approaches to test whether intrinsically formed networks can support learning.

    A serial electron microscopy reconstruction of a single synaptic connection.

    Source: Science Daily

  • Why animals have evolved to favor one side of the brain

    {Most left-handers can rattle off a list of their eminent comrades-in-arms: Oprah Winfrey, Albert Einstein, and Barack Obama, to name just three, but they may want to add cockatoos, “southpaw” squirrels, and some house cats. “Handedness,” or left-right asymmetry, is prevalent throughout the animal kingdom, including in pigeons and zebrafish. But why do people and animals naturally favor one side over the other, and what does it teach us about the brain’s inner workings? Researchers explore these questions in a Review published April 19 in Neuron.}

    “Studying asymmetry can provide the most basic blueprints for how the brain is organized,” says lead author Onur Güntürkün, of the Institute of Cognitive Neuroscience at Ruhr-University Bochum, in Germany. “It gives us an unprecedented window into the wiring of the early, developing brain that ultimately determines the fate of the adult brain.” Because asymmetry is not limited to human brains, a number of animal models have emerged that can help unravel both the genetic and epigenetic foundations for the phenomenon of lateralization.

    Güntürkün says that brain lateralization serves three purposes. The first of those is perceptual specialization: the more complex a task, the more it helps to have a specialized area for performing that task. For example, in most people, the right side of the brain focuses on recognizing faces, while the left side is responsible for identifying letters and words.

    The next is motor specialization, which brings us to the southpaw. “What you do with your hands is a miracle of biological evolution,” he says. “We are the master of our hands, and by funneling this training to one hemisphere of our brains, we can become more proficient at that kind of dexterity.” Natural selection likely provided an advantage that resulted in a proportion of the population, about 10 percent, favoring the opposite hand. The third purpose is parallel processing, which enables us to do two things that use different parts of the brain at the same time.

    Brain asymmetry is present in many vertebrates and invertebrates. “It is, in fact, an invention of nature, which evolved because many animals have the same needs for specialization that we do,” says Güntürkün, who is also currently a visiting fellow at the Stellenbosch Institute for Advanced Study in South Africa. Studies have shown that birds, like chickens, use one eye to distinguish grain from pebbles on the ground while at the same time using the other eye to keep watch for predators overhead.

    Research on pigeons has shown that this specialization often is a function of environmental influences. When a pigeon chick develops in the shell, its right eye turns toward the outside, leaving its left eye to face its body. When the right eye is exposed to light coming through the shell, it triggers a series of neuronal changes that allow the two eyes to ultimately have different jobs.

    A zebrafish model of lateralization, meanwhile, has enabled researchers to delve into the genetic aspects of asymmetrical development. Studies of important developmental pathways, including the Nodal signaling pathway, are uncovering details about how, very early in an embryo’s development, the cilia act to shuffle gene products to one side of the brain or the other. By manipulating the genes in Nodal and other pathways, researchers can study the effects of these developmental changes on zebrafish behaviors.

    Güntürkün says that this research can provide insight into the effects of asymmetry on brain conditions in humans. “There are almost no disorders of the human brain that are not linked to brain asymmetries,” he says. “If we understand the ontogeny of lateralization, we can make a great leap to see how brain wiring early in the developmental process may go wrong in these pathological cases.”

    Why do people and animals naturally favor one side over the other, and what does it teach us about the brain's inner workings?

    Source: Science Daily

  • Homing pigeons share our human ability to build knowledge across generations

    {Homing pigeons may share the human capacity to build on the knowledge of others, improving their navigational efficiency over time, a new Oxford University study has found.}

    The ability to gather, pass on and improve on knowledge over generations is known as cumulative culture. Until now, humans and, arguably, some other primates were the only species thought to be capable of it.

    Takao Sasaki and Dora Biro, Research Associates in the Department of Zoology at Oxford University, conducted a study testing whether homing pigeons can gradually improve their flight paths over time. They removed and replaced individuals in pairs of birds that were given a specific navigational task. Ten chains of birds were released from the same site, and generational succession was simulated by continuously replacing birds familiar with the route with inexperienced birds that had never flown the course before. The idea was that experienced individuals could pass their knowledge of the route down to the next generation of pairs, enabling the collective intelligence of the group to continuously improve the route’s efficiency.

    The findings, published in Nature Communications, suggest that over time the student does indeed become the teacher. The pairs’ homing performance improved consistently over generations: they streamlined their routes to be more direct. Later-generation groups eventually outperformed both individuals that flew solo and groups that never changed membership. Homing routes were also found to be more similar in consecutive generations of the same chain of pigeon pairs than across different chains, demonstrating cross-generational knowledge transfer, or a “culture” of homing routes.

    Takao Sasaki, co-author and Research Fellow in the Department of Zoology said: ‘At one stage scientists thought that only humans had the cognitive capacity to accumulate knowledge as a society. Our study shows that pigeons share these abilities with humans, at least to the extent that they are capable of improving on a behavioural solution progressively over time. Nonetheless, we do not claim that they achieve this through the same processes.’

    When people share and pass knowledge down through generations, our culture tends to become more complex over time; there are many good examples of this from manufacturing and engineering. By contrast, when the process occurs between homing pigeons, the end result is an increase in the efficiency of the behaviour (in this case, navigational efficiency), but not necessarily in its complexity.

    Takao Sasaki added: ‘Although they have different processes, our findings demonstrate that pigeons can accumulate knowledge and progressively improve their performance, satisfying the criteria for cumulative culture. Our results further suggest that cumulative culture does not require sophisticated cognitive abilities as previously thought.’

    This study shows that collective intelligence, which is typically studied in terms of one-off performance, can emerge from the accumulation of knowledge over time.

    Dora Biro, co-author and Associate Professor of Animal Behaviour concludes: ‘One key novelty, we think, is that the gradual improvement we see is not due to new ‘ideas’ about how to improve the route being introduced by individual birds. Instead, the necessary innovations in each generation come from a form of collective intelligence that arises through pairs of birds having to solve the problem together — in other words through ‘two heads being better than one’.’

    Moving forward, the team intend to build on the study by investigating if a similar style of knowledge sharing and accumulation occurs in other multi-generational species’ social groups. Many animal groups have to solve the same problems repeatedly in the natural world, and if they use feedback from past outcomes of these tasks or events, this has the potential to influence, and potentially improve, the decisions the groups make in the future.

    Group of pigeons flying.

    Source: Science Daily

  • Cave-in: How blind species evolve

    {Why do animals that live in caves become blind? This question has long intrigued scientists and been the subject of hot debate.}

    Clearly, blindness has evolved repeatedly across the animal kingdom. There are thousands of underground and cave-dwelling species, from naked mole rats to bats, and many of them have lost their sense of sight.

    Charles Darwin originally suggested that eyes could be lost by “disuse” over time. Now, Reed Cartwright, an ASU evolutionary biologist in the School of Life Sciences and researcher at the Biodesign Institute, wants to get to the heart of the matter — and in a recent publication in the journal BMC Evolutionary Biology, may be proving Darwin wrong.

    “We think that blindness in cavefish is indeed Darwinian, but ultimately this disproves Darwin’s original hypothesis of ‘disuse’,” said Cartwright. In the new research, Cartwright explains that eyes are lost not through disuse but through Darwin’s fundamental mechanism of natural selection at work, with blindness selected as the favorable, fittest trait for living in a cave.

    {{Go Fish}}

    For their work, the research team chose to model a well-studied blind cavefish, the Mexican tetra (Astyanax mexicanus), a small, docile, pink-hued fish just a few centimeters long that could easily make its home in an aquarium.

    The tetra has inhabited caves for 2-3 million years, giving it roughly 5 million generations’ worth of time to evolve blindness. Cartwright’s group chose this species because there is also a surface-dwelling form that has retained its sight, and for scientists this built-in comparative power makes it a good choice for further exploration: they have two populations to study that can interbreed yet are polar opposites for physical traits.

    So Cartwright’s group decided to use computational power to investigate how multiple evolutionary mechanisms interact to shape the fish that live in caves.

    “The problem we have in these caves is that they are connected to the surface, and fish that can see immigrate into the cave and bring genes for sight with them,” said Cartwright. “Under these conditions, we don’t typically expect to find such a difference in traits between surface and cave populations. Unless selection was really, really strong.”

    How strong? In their model, the selection for blindness would need to be about 48 times stronger than the immigration rate for Mexican tetras to evolve blindness in caves. Cartwright’s group estimates that a measure of fitness for blindness, called the selection coefficient, in the tetra is between 0.5 percent and 50 percent.

    These coefficients are high enough that laboratory experiments should have detected a difference between surface and cave forms of the fish; however, none have to date.

    {{Blinded by the light}}

    Cartwright’s team turned to a hypothesis going all the way back to a 1925 letter to the editor of Nature by E. Ray Lankester, which essentially stated that caves harbor blind fish because the fish that can see simply leave.

    “If sighted fish swim towards the light, the only fish that stay in the cave are blind fish. They aren’t trying to get to the light anymore because they can’t see it. Which actually is a form of selection, and thus, Darwinian evolution in action,” said Cartwright.

    According to Cartwright, explaining a fitness difference as big as 10 percent between sighted and blind fish may be difficult: “Losing eyes might not give you 10 percent more offspring. However, if 10 percent of your sighted fish leave the cave, the migration rate is reasonably low, and that could be enough.”

    If, over time, enough sighted fish are systematically removed from the cave, they are also removed from the gene pool, and that could be enough to drive the evolutionary process.

    It could be this sort of habitat preference that maintains the local blind fish population, with the fish that can see preferentially moving out of the cave. “We found that even a low level of preferential emigration, e.g. two percent, would provide a significant boost to local adaptation and the evolution of blindness in caves.”
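    As a rough illustration of the balance Cartwright describes (and not the paper’s actual model), a few lines of Python can iterate a simple haploid allele-frequency recursion that combines selection for blindness, immigration of sighted surface fish, and Lankester-style preferential emigration; all parameter values below are illustrative assumptions.

        # Toy migration-selection model for a "blind" allele in a cave population.
        # Not the published model; all rates are illustrative assumptions.
        def blind_allele_equilibrium(m=0.01, s=0.10, e=0.0, generations=5000):
            """m: immigration rate of sighted surface fish per generation
            s: selection coefficient favoring blindness inside the cave
            e: extra emigration rate of sighted fish (Lankester's hypothesis)"""
            p = 0.5  # starting frequency of the blind allele in the cave
            for _ in range(generations):
                p = p / (p + (1 - p) * (1 - s))  # selection: sighted fitness is 1 - s
                p = p / (p + (1 - p) * (1 - e))  # sighted fish preferentially leave
                p = (1 - m) * p                  # immigrants carry no blind alleles
            return p

        for e in (0.0, 0.02):
            print(f"sighted emigration {e:.0%}: blind-allele frequency settles near "
                  f"{blind_allele_equilibrium(e=e):.2f}")

    Even the two percent emigration rate quoted above visibly raises the equilibrium frequency of blindness in this toy model, consistent with the paper’s point that a modest behavioral bias can boost local adaptation.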

    Cartwright’s team hopes that field biologists will begin to consider Lankester’s 90-year-old hypothesis when studying cavefish. “It would be great if someone could develop a study to test Lankester’s hypothesis and whether it is driving the evolution of blindness in caves. That would really help answer one of the questions that have intrigued biologists for over a century.”

    A well-studied blind cavefish (bottom), the Mexican tetra (Astyanax mexicanus), is a small, docile, pink-hued fish just a few centimeters long that could easily make its home in an aquarium. It has inhabited caves for 2-3 million years, giving it roughly 5 million generations’ worth of time to evolve blindness. ASU evolutionary biologist Reed Cartwright chose this Mexican tetra because there is also a surface-dwelling form (top) that has retained its sight.

    Source: Science Daily

  • Device pulls water from dry air, powered only by the sun

    {Metal-organic framework sucks up water from air with humidity as low as 20 percent}

    Imagine a future in which every home has an appliance that pulls all the water the household needs out of the air, even in dry or desert climates, using only the power of the sun.

    That future may be around the corner, with the demonstration this week of a water harvester that uses only ambient sunlight to pull liters of water out of the air each day in conditions as low as 20 percent humidity, a level common in arid areas.

    The solar-powered harvester, reported in the journal Science, was constructed at the Massachusetts Institute of Technology using a special material — a metal-organic framework, or MOF — produced at the University of California, Berkeley.

    “This is a major breakthrough in the long-standing challenge of harvesting water from the air at low humidity,” said Omar Yaghi, one of two senior authors of the paper, who holds the James and Neeltje Tretter chair in chemistry at UC Berkeley and is a faculty scientist at Lawrence Berkeley National Laboratory. “There is no other way to do that right now, except by using extra energy. Your electric dehumidifier at home ‘produces’ very expensive water.”

    The prototype, under conditions of 20-30 percent humidity, was able to pull 2.8 liters (3 quarts) of water from the air over a 12-hour period, using one kilogram (2.2 pounds) of MOF. Rooftop tests at MIT confirmed that the device works in real-world conditions.
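    For a sense of scale, the prototype’s quoted yield extrapolates with simple arithmetic; the Python sketch below does so, with the household demand figure an illustrative assumption rather than a number from the study.

        # Back-of-envelope scaling from the prototype figures quoted above:
        # 2.8 liters of water per 12-hour cycle per kilogram of MOF at 20-30% humidity.
        PROTOTYPE_YIELD_L_PER_KG_PER_CYCLE = 2.8

        def mof_mass_needed(liters_per_day, cycles_per_day=1):
            """Kilograms of MOF needed to meet a daily demand at the prototype's rate."""
            return liters_per_day / (PROTOTYPE_YIELD_L_PER_KG_PER_CYCLE * cycles_per_day)

        # Example: ~20 L/day of drinking and cooking water for a household (assumed)
        print(f"{mof_mass_needed(20):.1f} kg of MOF")  # ~7.1 kg at one cycle per day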

    “One vision for the future is to have water off-grid, where you have a device at home running on ambient solar for delivering water that satisfies the needs of a household,” said Yaghi, who is the founding director of the Berkeley Global Science Institute, a co-director of the Kavli Energy NanoSciences Institute and the California Research Alliance by BASF. “To me, that will be made possible because of this experiment. I call it personalized water.”

    Yaghi invented metal-organic frameworks more than 20 years ago, combining metals like magnesium or aluminum with organic molecules in a tinker-toy arrangement to create rigid, porous structures ideal for storing gases and liquids. Since then, more than 20,000 different MOFs have been created by researchers worldwide. Some hold chemicals such as hydrogen or methane: the chemical company BASF is testing one of Yaghi’s MOFs in natural gas-fueled trucks, since MOF-filled tanks hold three times the methane that can be pumped under pressure into an empty tank.

    Other MOFs are able to capture carbon dioxide from flue gases, catalyze the reaction of adsorbed chemicals or separate petrochemicals in processing plants.

    In 2014, Yaghi and his UC Berkeley team synthesized a MOF — a combination of zirconium metal and adipic acid — that binds water vapor, and he suggested to Evelyn Wang, a mechanical engineer at MIT, that they join forces to turn the MOF into a water-collecting system.

    The system Wang and her students designed consisted of more than two pounds of dust-sized MOF crystals compressed between a solar absorber and a condenser plate, placed inside a chamber open to the air. As ambient air diffuses through the porous MOF, water molecules preferentially attach to the interior surfaces. X-ray diffraction studies have shown that the water vapor molecules often gather in groups of eight to form cubes.

    Sunlight entering through a window heats up the MOF and drives the bound water toward the condenser, which is at the temperature of the outside air. The vapor condenses as liquid water and drips into a collector.

    “This work offers a new way to harvest water from air that does not require high relative humidity conditions and is much more energy efficient than other existing technologies,” Wang said.

    This proof of concept harvester leaves much room for improvement, Yaghi said. The current MOF can absorb only 20 percent of its weight in water, but other MOF materials could possibly absorb 40 percent or more. The material can also be tweaked to be more effective at higher or lower humidity levels.

    “It’s not just that we made a passive device that sits there collecting water; we have now laid both the experimental and theoretical foundations so that we can screen other MOFs, thousands of which could be made, to find even better materials,” he said. “There is a lot of potential for scaling up the amount of water that is being harvested. It is just a matter of further engineering now.”

    Yaghi and his team are at work improving their MOFs, while Wang continues to improve the harvesting system to produce more water.

    “To have water running all the time, you could design a system that absorbs the humidity during the night and evolves it during the day,” he said. “Or design the solar collector to allow for this at a much faster rate, where more air is pushed in. We wanted to demonstrate that if you are cut off somewhere in the desert, you could survive because of this device. A person needs about a Coke can of water per day. That is something one could collect in less than an hour with this system.”

    This is the water harvester built at MIT with MOFs from UC Berkeley. Using only sunlight, the harvester can pull liters of water from low-humidity air over a 12-hour period.

    Source: Science Daily

  • Performance of earthquake early warning systems

    {The future of earthquake early warning systems may lie in smartphones, as well as in vehicles, “smart” appliances and the growing number of everyday objects embedded with sensors and communication chips that connect them to a global network.}

    At a presentation at the 2017 Seismological Society of America’s (SSA) Annual Meeting, Benjamin Brooks of the U.S. Geological Survey and colleagues will share data from a recent project in Chile that provided early detection, estimates and locations for earthquakes using a network of sensor boxes equipped with smartphones and consumer-quality GPS chips. Data collected by the sensor boxes is transmitted through an Android app developed by the researchers and analyzed to produce earthquake source models, which in turn can be used to create ground shaking forecasts and local tsunami warnings.
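    The presentation does not detail the project’s detection algorithms, but a minimal, hypothetical sketch of the kind of first-stage trigger such a sensor box might run, flagging when recent GPS displacement exceeds a noise floor, could look like the Python below; the threshold, window size and data layout are all assumptions.

        # Hypothetical first-stage trigger for a GPS-equipped sensor box:
        # flag when recent horizontal displacement exceeds an assumed noise floor.
        from dataclasses import dataclass

        @dataclass
        class GpsSample:
            t: float      # time in seconds
            east: float   # meters east of a reference position
            north: float  # meters north of a reference position

        NOISE_THRESHOLD_M = 0.05  # assumed noise floor for consumer GPS chips

        def triggered(samples, window=10):
            """True if displacement across the last `window` samples exceeds the floor."""
            if len(samples) < window:
                return False
            first, last = samples[-window], samples[-1]
            de, dn = last.east - first.east, last.north - first.north
            return (de * de + dn * dn) ** 0.5 > NOISE_THRESHOLD_M

    In a real deployment the interesting work happens downstream, in the source models and shaking forecasts described above; a local trigger like this would only decide what is worth transmitting.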

    The sensor stations have successfully detected three magnitude 5 or larger earthquakes since December 2016, with no false alarms. Although the smartphone-based sensors in the study are distributed in a fixed network, Brooks and colleagues say, it may be possible to someday harness individual smartphones and “smart” appliances into a crowd-sourced network for earthquake early warning.

    On the U.S. West Coast, seismologists at the University of Washington are expanding and testing the capabilities of earthquake early warning systems already under development, such as the G-FAST system in the Pacific Northwest and ShakeAlert in California. Brendan Crowell and colleagues will discuss the performance of G-FAST as tested against 1,300 simulated megathrust earthquakes of magnitudes between 7.5 and 9.5 in the Cascadia region. Renate Hartog will present data suggesting that the algorithms behind ShakeAlert can be configured to work for the Pacific Northwest as well as California, suggesting that a West Coast-wide earthquake early warning system could be closer to reality.

    In other presentations at the SSA Annual Meeting, researchers will also discuss how earthquake early warning systems are developing ways to improve real-time ground motion alerts. Many early warning systems perform best when asked to pinpoint the magnitude and location of earthquakes, but ground motion warnings are also key to predicting and preventing infrastructure damage and destruction.

    Source: Science Daily

  • College students study best later in the day, study shows

    {Students learn more effectively between 11 a.m. and 9:30 p.m. than at other times of the day}

    A new cognitive research study used two new approaches to determine the ranges of start times that optimize functioning for undergraduate students. Drawing on a sample of first- and second-year university students, researchers at the University of Nevada, Reno and The Open University in the United Kingdom used a survey-based empirical model and a neuroscience-based theoretical model to analyse each student’s learning patterns and determine the times when cognitive performance can be expected to be at its peak.

    “The basic thrust is that the best times of day for learning for college-age students are later than standard class hours begin,” Mariah Evans, associate professor of sociology at the University of Nevada, Reno and co-author of the study, said. “Especially for freshmen and sophomores, we should be running more afternoon and evening classes as part of the standard curriculum.”

    Prior research has demonstrated that late starts are optimal for most high school students, and this study extends that analysis to freshmen and sophomores in college. The analysis by Evans, Jonathan Kelley, fellow University sociology professor, and Paul Kelley, honorary associate of sleep, circadian and memory neuroscience at The Open University, assessed the preferred sleeping times of the participants and asked them to rate their fitness for cognitive activities in each hour of the 24-hour day.

    “Neuroscientists have documented the time shift using biological data: on average, teens’ biologically ‘natural’ day begins about two hours later than is optimal for prime-age adults,” Evans said. “The surveys we present here support that for college students, but they also show that when it comes to optimal performance, no one time fits all.”

    The study showed that much later start times, after 11 a.m. or noon, result in the best learning. It also revealed that those who saw themselves as “evening” people outnumbered the “morning” people two to one, and it concluded that every start time disadvantages one or more of the chronotypes (the propensity of an individual to be alert and cognitively active at a particular time in the 24-hour day).

    “Thus, the science supports recent moves by the University to encourage evening classes as part of the standard undergraduate curriculum,” Evans said. “It also supports increasing the availability of asynchronous online classes to enable students to align their academic work times with their optimal learning times.”

    The results, “Identifying the best times for cognitive functioning using new methods: Matching university times to undergraduate chronotypes,” were published in Frontiers in Human Neuroscience on March 31, 2017.

    “This raises the question as to why conventional universities start their lectures at 9 a.m. or even earlier when our research reveals that this limits the performance of their students,” Kelley said. “This work is very helpful for asynchronous online learning as it allows for the student to target their study time to align with their personal rhythm and at the time of day when they know they are most effective.”

    A new University of Nevada, Reno and The Open University study used two new approaches to determine the ranges of the best times of day for learning for college-age students.

    Source: Science Daily

  • Ant agricultural revolution began 30 million years ago in dry, desert-like climate

    {World’s first sustainable, industrial-scale agriculture began when crops became dependent on their ant farmers}

    Millions of years before humans discovered agriculture, vast farming systems were thriving beneath the surface of the Earth. The subterranean farms, which produced various types of fungi, were cultivated and maintained by colonies of ants, whose descendants continue practicing agriculture today.

    By tracing the evolutionary history of these fungus-farming ants, scientists at the Smithsonian’s National Museum of Natural History have learned about a key transition in the insects’ agricultural evolution. This transition allowed the ants to achieve higher levels of complexity in farming, rivaling the agricultural practices of humans: the domestication of crops that became permanently isolated from their wild habitats and thereby grew dependent on their farmers for their evolution and survival.

    In the April 12 issue of Proceedings of the Royal Society B, scientists led by entomologist Ted Schultz, the museum’s curator of ants, report that the transition likely occurred when farming ants began living in dry climates, where moisture-loving fungi could not survive on their own. The finding comes from a genetic analysis that charts the evolutionary relationships of farming and non-farming ants from wet and dry habitats throughout the Neotropics.

    About 250 species of fungus-farming ants have been found in tropical forests, deserts and grasslands in the Americas and the Caribbean, and these species fall into two different groups based on the level of complexity of their farming societies: lower and higher agriculture. All farming ants start new fungal gardens when a queen’s daughter leaves her mother’s nest to go off and found her own nest, taking with her a piece of the original colony’s fungus to start the next colony’s farm.

    In the lower, primitive forms of ant agriculture — which largely occur in wet rain forests — fungal crops occasionally escape from their ant colonies and return to the wild. Lower ants also occasionally regather their farmed fungi from the wild and bring them back to their nests to replace faltering crops. These processes allow wild and cultivated fungi to interbreed and limit the degree of influence the lower ants have over the evolution of their crops.

    But, as with certain crops that have been so heavily modified by human breeders that they can no longer reproduce and live on their own in the wild, some fungal species have become so completely dependent on their relationship with farming ants that they are never found living independent of their farmers. These higher agricultural ants cultivate highly “domesticated” crops, enabling them to live in vast communities and to work together through division of labor to fertilize their fungal crops, haul away waste, keep pathogens at bay and maintain ideal growing conditions.

    “These higher agricultural-ant societies have been practicing sustainable, industrial-scale agriculture for millions of years,” Schultz said. “Studying their dynamics and how their relationships with their fungal partners have evolved may offer important lessons to inform our own challenges with our agricultural practices. Ants have established a form of agriculture that provides all the nourishment needed for their societies using a single crop that is resistant to disease, pests and droughts at a scale and level of efficiency that rivals human agriculture.”

    Today, many agricultural ant species are threatened by habitat destruction, and as part of his studies, Schultz has been collecting specimens from the field and preserving them in the museum’s cryogenic biorepository for future genomic studies. In the current study, he and his colleagues compared the genomes of 119 modern ant species, most of which were collected during his decades of field expeditions.

    Using powerful new genomic tools, the scientists compared DNA sequences at each of more than 1,500 genome sites for 78 fungus-farming species and 41 non-fungus-farming species. Their data-rich analysis gave the team a great deal of confidence in the evolutionary relationships they were able to map, Schultz said.

    Their analysis clarifies the closest living non-farming relative of today’s fungus-growing ants and allows Schultz and his team to begin to look at the geographic backgrounds of these species and deduce when, where and under what conditions particular traits emerged. In this study, the team was interested in learning when ants began practicing higher agriculture — that is, when some fungal crops came to be dependent on the ant-fungus relationship for survival.

    According to the evolutionary tree they constructed, the first ants to transition to higher agriculture likely lived in a dry or seasonally dry climate. The transition appears to have occurred around 30 million years ago — a time when the planet was cooling, and dry areas were becoming more prevalent.

    Fungi that had evolved to live in wet forests would have been poorly equipped to survive independently in this changing climate. “But if your ant farmer evolves to be better at living in a dry habitat, and it brings you along and it sees to all your needs, then you’re going to be doing okay,” Schultz said.

    Just as humans living in a dry or temperate climate might raise tropical plants in a greenhouse, agricultural ants carefully maintain the humidity within their fungal gardens. “If things are getting a little too dry, the ants go out and get water and they add it,” Schultz said. “If they’re too wet, they do the opposite.” So even when conditions above the surface become inhospitable, fungi can thrive inside the underground, climate-controlled chambers of an agricultural ant colony.

    In this situation, fungi can become dependent on their ant farmers — unable to escape the nest and return to the wild. “If you’ve been carried into a dry habitat, your fate is going to match the fate of the colony you’re in,” Schultz said. “At that point, you’re bound in a relationship with those ants that you were not bound in when you were in a wet forest.”

    Schultz said the conditions present during this evolutionary transition illustrate how an organism can become domesticated even if its farmers are not consciously selecting for desirable traits as human breeders might do. Ants that moved their fungi into new habitats would have isolated the organism from its wild relatives, just as humans do when they domesticate a crop. This isolation creates an opportunity for the farmed species to evolve independently from species in the wild, adopting new traits.

    Funding for this study was provided by the Smithsonian and the National Science Foundation.

    Left panel: Ted Schultz (left) and Jeffrey Sosa-Calvo (right) excavate a primitive, lower fungus-farming ant nest in the seasonally dry Brazilian Cerrado (savanna) near Brasilia in 2009. Photo credit: Cauê Lopes. Center and right panels: The underground garden chamber of a primitive, lower fungus-farming ant colony revealed by excavation. Photo credits: Ted Schultz, Smithsonian. Lower, primitive fungus-farming ant colonies and their agricultural behaviors are smaller in scale and simpler than those of higher fungus-farming ants.

    Source: Science Daily

  • Surprising brain change appears to drive alcohol dependence

    {A new study led by scientists at The Scripps Research Institute (TSRI) could help researchers develop personalized treatments for alcoholism and alcohol use disorder.}

    The research reveals a key difference between the brains of alcohol-dependent versus nondependent rats. When given alcohol, both groups showed increased activity in a region of the brain called the central amygdala (CeA) — but this activity was due to two completely different brain signaling pathways.

    TSRI Professor Marisa Roberto, senior author of the new study, said the findings could help researchers develop more personalized treatments for alcohol dependence, as they evaluate how a person’s brain responds to different therapeutics.

    The findings were published recently online ahead of print in The Journal of Neuroscience.

    {{Researchers Find Brain’s Alcohol Response ‘Switch’}}

    The new research builds on the Roberto lab’s previous discovery that alcohol increases neuronal activity in the CeA. The researchers found increased activity in both nondependent, or naïve, rats and alcohol-dependent rats.

    As they investigated this phenomenon in the new study, Roberto and her colleagues were surprised to find that the mechanisms underlying this increased activity differed between the two groups.

    By giving naïve rats a dose of alcohol, the researchers engaged proteins called calcium channels and increased neuronal activity. Neurons fired as the specific calcium channels at play, called L-type voltage-gated calcium channels (LTCCs), boosted the release of a neurotransmitter called GABA. Blocking these LTCCs reduced voluntary alcohol consumption in naïve rats.

    But in alcohol-dependent rats, the researchers found decreased abundance of LTCCs on neuronal cell membranes, disrupting their normal ability to drive a dose of alcohol’s effects on CeA activity. Instead, increased neuronal activity was driven by a stress hormone called corticotropin-releasing factor (CRF) and its type 1 receptor (CRF1). The researchers found that blocking CeA CRF1s reduced voluntary alcohol consumption in the dependent rats.

    Studying these two groups shed light on how alcohol functionally alters the brain, Roberto explained.

    “There is a switch in the molecular mechanisms underlying the CeA’s response to alcohol (from LTCC- to CRF1-driven) as the individual transitions to the alcohol-dependent state,” she said.

    The cellular and molecular experiments were led by TSRI Research Associate and study first author Florence Varodayan. The behavioral tests were conducted by TSRI Research Associate Giordano de Guglielmo in the lab of TSRI Associate Professor Olivier George.

    Roberto hopes the findings lead to better ways to treat alcohol dependence. Alcohol use disorder appears to have many different root causes, but the new findings suggest doctors could analyze certain symptoms or genetic markers to determine which patients are likely to have CRF-CRF1 hyperactivation and benefit from the development of a novel drug that blocks that activity.

    This study sheds light on how alcohol functionally alters the brain.

    Source: Science Daily