Category: Science News

  • Plugged-in parenting: How parental smartphone use may affect kids

    {A parent gets home from work just as a new email “dings” on his or her phone. At the same time, the toddler is calling out for a snack or because big brother isn’t sharing, and he would really like to show off his Lego creation.}

    Meanwhile, the phone keeps buzzing — more emails, social media notifications, a breaking news alert, an “urgent” text.

    As smartphones and tablets blur the lines between work, home and social life, parents are struggling to balance it all, a small new study suggests. Parents’ use of mobile technology around young children may be causing internal tension, conflicts and negative interactions with their kids, according to the qualitative study in the Journal of Developmental & Behavioral Pediatrics.

    It’s a challenge both parents and health care providers should tune in to.

    “Parents are constantly feeling like they are in more than one place at once while parenting. They’re still ‘at work.’ They’re keeping up socially. All while trying to cook dinner and attend to their kids,” says lead author Jenny Radesky, M.D., a child behavior expert and pediatrician at University of Michigan C.S. Mott Children’s Hospital who conducted the study with colleagues from Boston Medical Center.

    “It’s much harder to toggle between mom or dad brain and other aspects of life because the boundaries have all blurred together. We wanted to understand how this was affecting parents emotionally. We found that parents are struggling to balance family time and the desire to be present at home with technology-based expectations like responding to work and other demands.”

    The study involved in-depth interviews with 35 caregivers, including moms, dads and grandmothers.

    {{‘The whole world is in your lap’}}

    Participants consistently described an internal struggle: multitasking between mobile technology, work and children; coping with information overload; and feeling emotional tension when technology disrupted family routines such as mealtime. As one mom in a focus group described it, “the whole world is in your lap.”

    Some parents also reported a trickle-down effect. Their emotional response to whatever they were reading on their mobile device — whether it was a work email or bad news — sometimes affected how they responded to their children, for example. Parents also described more attention-seeking behavior from children when the parents were heavily focused on their mobile devices, which prompted negative interactions such as snapping at the kids.

    At the same time, caregivers said that mobile technology provided “an escape” from the boredom and stress of parenting and the demands of home life. One mom said that after long days with the kids, plugging into the outside world was a reminder, “I have a life beyond this.”

    Other boons included a greater ability to work from home (when the digital connection to work could be kept in check); easier communication with estranged family members, by allowing a more “filtered” view of their lives; and a tool for keeping peace and quiet in the house.

    “You don’t have to be available to your children 100 percent of the time — in fact, it’s healthy for them to be independent. It’s also important for parents to feel relevant at work and other parts of their lives,” Radesky says.

    “However, we are seeing parents overloaded and exhausted from being pulled in so many different directions.”

    Parents are estimated to use mobile devices such as tablets, smartphones and wearables nearly three hours a day. But few studies have explored the role these technologies play in family interactions.

    Radesky and colleagues wanted to explore the issue further after an observational study of caregivers eating with young children in fast food restaurants. In that study and in subsequent videotaped research, her team found that parents’ mobile device use is associated with fewer verbal and nonverbal interactions with their children.

    “Technology has transformed the way parents use digital media around their children,” Radesky says. “Compared to traditional distractions like books, mobile technology is described as much more commanding of attention that is unpredictable and requires a greater emotional investment.

    “Kids require a lot of different types of thinking, so multitasking between them and technology can be emotionally and mentally draining. As clinicians, we have an opportunity to start conversations with parents and help them manage this conflict with ideas on how to unplug and set boundaries.”

    With all this in mind, physicians can recommend some ideas for families struggling to stay unplugged. Some suggestions from Dr. Radesky:

    {{Get screen time under control}}

    {{Set boundaries.}} Create a family plan that includes unplugged spaces or times of day. For example, you might ban tech use at dinnertime or bedtime, or right when you get home and your kids are excited to see you. Or maybe you plug in your device in a certain room and only use it there, or agree not to use it in certain areas of the house (e.g., kids’ bedrooms).

    {{Track your mobile use.}} Consider creating a filter or block on your device to avoid the temptation of tech use at home. Apps like “Moment” and “Quality Time” may also help you track mobile use and see where you may be spending too much time. If 90 percent of your time is on Facebook or work email, for example, you can think of ways to cut down technology time for these purposes.

    {{Identify top device stressors.}} Think about which parts of your mobile device use are most stressful for you. If it’s reading the news or checking work email, for example, reserve these tasks for times when you know your kids are occupied. This way, you have your own time and space to process the information rather than interrupting time with kids who may react to your negative emotions with their own negativity.

  • Mapping free-fall styles of solid objects within fluids

    {By carefully observing scenes as simple as leaves falling from trees or dandelion seeds blowing in the wind, we can see diverse “falling styles” that include tumbling, fluttering or spiraling.}

    James Clerk Maxwell conducted some of the first documented studies of free-falling objects during the mid-1800s, when the physicist analyzed the tumbling motion of a freely falling plate. But much remains unknown about the phenomena.

    Maxwell’s work inspired a team of researchers from National University of Singapore and Nanjing University of Aeronautics and Astronautics in China to conduct a numerical study to explore the patterns made by 2-D rectangular plates falling freely within water. They report their findings this week in Physics of Fluids, from AIP Publishing.

    The team’s goal was to view and determine the plates’ regular free-fall patterns, identify the parameters influencing them, and figure out why plates don’t always fall in the same way each time.

    “Since we want to track the motion of a falling object, the flow field around the object also needed to be explored,” said Yan Wang, a research scientist working in the College of Engineering at the National University of Singapore. “To do this, we used computational fluid dynamics — a branch of fluid mechanics that uses numerical analysis and algorithms to solve and analyze flow problems on supercomputers.”

    In particular, the group tapped a “novel lattice Boltzmann flux solver with immersed boundaries” so they could carry out numerical simulations of the freely falling objects within an infinitely large domain. Other key concepts involved in their work are rigid-body motion and unsteady aerodynamics.

    “This allowed us to study the key parameters that govern the patterns of falling plates and to construct a phase diagram to classify them,” Wang said. “Our most important finding is that the plates’ fluttering frequencies — unstable oscillations — are linearly related to the Froude number (a dimensionless ratio of inertial to gravitational forces, often used to compare scale models with real systems). And the lift forces on the fluttering plates are linearly dependent on the angle of attack at the cusp-like turning point.” These findings about the force characteristics may help improve wing designs for unmanned aerial vehicles and aid in controlling their motion.
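
    As a rough illustration only (the paper’s exact nondimensionalization and fitted coefficients are not reproduced here), the standard form of the Froude number and the kind of linear trend described above can be written as

        \mathrm{Fr} = \frac{U}{\sqrt{g\,L}}, \qquad f^{*} \approx a\,\mathrm{Fr} + b,

    where U is a characteristic descent speed, L the plate length, g the gravitational acceleration, f* a dimensionless fluttering frequency, and a and b constants that would be fitted from a phase diagram. The symbols U, L, f*, a and b are illustrative notation, not the paper’s.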

    The kinematics and mechanics involved in this phenomenon are relevant to many academic and engineering applications such as unsteady aerodynamics, biomechanics, sedimentology and chemical engineering.

    In terms of applications, the group’s work can be used “to build unsteady aerodynamic force models for falling objects or to predict the falling styles of objects,” Wang said. “Our work may also aid in reducing the search areas for belongings lost within bodies of water.”

    At this time, “the unsteady aerodynamics of freely falling objects remains far from understood,” he added. “And, although the falling patterns of an object can be predicted, controlling their motion actively or passively will require further studies.”

    Four different modes for the way a plate free falls within a fluid.
  • Childhood family environment linked with relationship quality 60 years later

    {Growing up in a warm family environment in childhood is associated with feeling more secure in romantic relationships in one’s 80s, according to new research published in Psychological Science, a journal of the Association for Psychological Science. The findings show that men who grew up in caring homes were more adept at managing stressful emotions when assessed as middle-aged adults, which helps to explain why they had more secure marriages late in life.}

    “Our study shows that the influences of childhood experiences can be demonstrated even when people reach their 80s, predicting how happy and secure they are in their marriages as octogenarians,” says researcher Robert Waldinger of Harvard Medical School. “We found that this link occurs in part because warmer childhoods promote better emotion management and interpersonal skills at midlife, and these skills predict more secure marriages in late life.”

    The unique longitudinal study, which followed the same individuals for over six decades beginning in adolescence, provides evidence for the life-long effects of childhood experiences.

    “With all the things that happen to human beings and influence them between adolescence and the ninth decade of life, it’s remarkable that the influence of childhood on late-life marriage can still be seen,” notes Marc Schulz, study co-author and professor at Bryn Mawr College.

    Waldinger and Schulz examined data collected from 81 men who participated in a 78-year study of adult development, 51 of whom were part of a Harvard College cohort and 30 of whom were part of an inner-city Boston cohort. All of the men completed regular interviews and questionnaires throughout the course of the study.

    To gauge the participants’ early home environment, the researchers examined data collected when the participants were adolescents, including participants’ reports about their home life, interviews with the participants’ parents, and developmental histories recorded by a social worker. The researchers used these data to create a composite measure of family environment.

    When the participants were 45 to 50 years old, they completed interviews in which they discussed the challenges they encountered in various aspects of their lives, including their relationships, their physical health, and their work. The research team used the original interview notes to rate participants’ ability to manage their emotions in response to these challenges.

    Finally, when participants were in their late 70s or early 80s, they completed a semistructured interview that focused on their attachment bond with their current partner. In these interviews, they were asked to talk about their marriages, including how comfortable they were depending on their partner and providing support to their partner. The researchers used data from these interviews to establish an overall rating of participants’ security of attachment to their partner.

    Waldinger and Schulz found that participants who had a nurturing family environment early in life were more likely to have secure attachments to their romantic partners late in life. Further analyses indicated that this association could be explained, in part, by better emotion regulation skills in midlife.

    These results add to previous research showing that the quality of people’s early home environments can have “far-reaching effects on wellbeing, life achievement, and relationship functioning throughout the lifespan,” says Waldinger.

    Taken together, these findings highlight the life-long effects of childhood experience, emphasize the importance of prioritizing the wellbeing of children, and suggest that supporting adaptive emotion management skills may help to lessen the impact of early childhood adversity, Waldinger and Schulz conclude.

  • Study finds differences in obesity rates between children/teens with and without autism

    {Children and teens with autism spectrum disorder (ASD) may be more likely to be obese and stay obese during adolescence than their peers without ASD, according to a new epidemiological study led by researchers from Tufts University School of Medicine and published online in Childhood Obesity in advance of print.}

    Previous studies have found that children with developmental disabilities, including ASD, have a higher risk of obesity than children without ASD. Using data from the 2011-2012 National Survey of Children’s Health, the team of researchers found that, among children ages 10 to 17, the rate of obesity remained fairly steady in children with ASD whereas the rate of obesity decreased in children without ASD.

    “We expected to see an increased prevalence of obesity with age in children with ASD compared to those without ASD, which would increase the obesity disparity. What we found was that the disparity did increase with age over adolescence, but the underlying patterns were not as expected. The prevalence of obesity in the ASD group was high and remained so, while the prevalence in children without ASD declined over adolescence,” said lead and corresponding study author Aviva Must, Ph.D., Morton A. Madoff Professor and chair of public health and community medicine at Tufts University School of Medicine in Boston.

    Using the nationally representative 2011-2012 National Survey of Children’s Health, the team of researchers in the Healthy Weight Research Network (HWRN) analyzed data from a total of 43,777 children ages 10 to 17 with available information on weight, height, gender and ASD status. BMI-for-age was calculated using parent-reported height, weight and age. Race and socioeconomic status were also collected from the data set. Must is also co-director of the HWRN.
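
    The underlying arithmetic is standard. Below is a minimal sketch, assuming the conventional definition of childhood obesity (BMI-for-age at or above the 95th percentile for age and sex) and a hypothetical growth-reference lookup supplied by the caller; none of this is the study’s own code.

        def bmi(weight_kg, height_m):
            # Body mass index: weight in kilograms divided by height in meters squared.
            return weight_kg / (height_m ** 2)

        def is_obese(weight_kg, height_m, age_months, sex, bmi_for_age_percentile):
            # Childhood obesity is conventionally defined as a BMI-for-age at or above the
            # 95th percentile of a growth reference for the child's age and sex.
            # `bmi_for_age_percentile` is a hypothetical lookup function supplied by the
            # caller, standing in for the growth-reference tables a study would use.
            percentile = bmi_for_age_percentile(bmi(weight_kg, height_m), age_months, sex)
            return percentile >= 95.0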

    As earlier research had indicated they would, the researchers found a higher prevalence of obesity in children ages 10 to 17 with ASD than in children without ASD (23.1 percent versus 14.1 percent). However, the prevalence of obesity held steady across ages 10 to 17 among children with ASD while it decreased with age among non-ASD children. Between ages 10 and 17, there was no significant increase in obesity prevalence among children with ASD (from 20.0 percent to 22.1 percent); among non-ASD children, however, obesity prevalence fell by more than half (from 19.1 percent to 8.3 percent).

    In exploratory work, the researchers observed obesity disparities by gender and by race. Obesity prevalence for youth with ASD increased in boys and decreased in girls over the teen years. With respect to race and ethnicity, obesity prevalence increased for white, non-Hispanic youth with ASD and decreased in non-white, non-Hispanic youth with ASD over the same age range. These preliminary findings need confirmation in larger samples and studies that follow children over time.

    Obesity in childhood could have long-term health effects for people with ASD. The researchers believe many mechanisms should be explored for their potential role in the maintenance of obesity rates they observed in children with ASD.

    “Factors to consider with obesity in children with ASD are sensory sensitivity, the need for routine or sameness, behavioral rigidity, use of food as a reward, mealtime stress and parental stress; any or all of these could contribute to obesity,” said last author Linda Bandini, Ph.D., associate professor at UMass Medical School Shriver Center and department of health sciences, Boston University and director of the HWRN. “When it comes to energy expenditure, exercise for many teens comes in the form of competitive sports, in which children with developmental disabilities are less likely to take part. And another reward and calming technique parents of children with ASD have reported using is television, which may contribute to higher levels of sedentary behavior,” she continued.

    The researchers expect that further research and qualitative approaches, such as interviews with adolescents and caregivers, could be used to better understand the influence of behavioral and sociodemographic factors on the prevalence of obesity among children with ASD. This information will help facilitate obesity prevention and interventions for adolescents with ASD and their caregivers.

    “Children with developmental disabilities face unique challenges and are not always served by health interventions aimed at those without disorders such as ASD. The complexity of their medical needs is both why particular attention should be paid to their circumstances and why it is difficult to do so. Identifying the factors that support healthy weight in children without ASD, as well as any factors children with ASD are more or uniquely vulnerable to, could inform approaches for parents, teachers and others who work with youth with ASD,” said Must.

    The researchers also note that their findings should be considered in the context of some limitations, including the parent-reported nature of the data and insufficient detailed information to explore the role of medication use, which is relatively common in children with developmental disabilities and can contribute to weight gain.

  • Mice sing like a jet engine

    {Mice court one another with ultrasonic love songs that are inaudible to the human ear. New research shows they make these unique high frequency sounds using a mechanism that has only previously been observed in supersonic jet engines.}

    Mice, rats and many other rodents produce ultrasonic songs that they use for attracting mates and territorial defense. These ‘singing’ mice are often used to study communication disorders in humans, such as stuttering. Until now, however, it was not understood how mice make these ultrasonic sounds; understanding the mechanism may aid the development of more effective animal models for studying human speech disorders.

    Now, new research co-authored at the University of Cambridge and published in the journal Current Biology has found that when mice ‘sing’, they use a mechanism similar to that seen in the engines of supersonic jets.

    “Mice make ultrasound in a way never found before in any animal,” said the study’s lead author Elena Mahrt, from Washington State University.

    Previously, it had been thought that these ‘Clangers’-style songs were either the result of a mechanism similar to that of a tea kettle, or of the resonance caused by the vibration of the vocal cords. In fact, neither hypothesis turned out to be correct. Instead, mice point a small air jet coming from the windpipe against the inner wall of the larynx, causing a resonance and producing an ultrasonic whistle.

    Using ultra-high-speed video at 100,000 frames per second, the researchers showed that the vocal folds remained completely still while ultrasound was coming from the mouse’s larynx.

    “This mechanism is known only to produce sound in supersonic flow applications, such as vertical takeoff and landing with jet engines, or high-speed subsonic flows, such as jets for rapid cooling of electrical components and turbines,” said Dr Anurag Agarwal, study co-author and head of the Aero-acoustics laboratories at Cambridge’s Department of Engineering. “Mice seem to be doing something very complicated and clever to make ultrasound.”

    “It seems likely that many rodents use ultrasound to communicate, but very little is known about this — it is even possible that bats use this cool mechanism to echolocate,” said the study’s senior author Dr Coen Elemans from the University of Southern Denmark. “Even though mice have been studied so intensely, they still have some cool tricks up their sleeves.”

  • Group work can harm memory

    {A new study by psychologists from the University of Liverpool and the University of Ontario Institute of Technology (UOIT) reveals that collaborating in a group to remember information can harm recall.}

    The research, conducted by Dr Craig Thorley, from the University’s Department of Psychological Sciences, and Dr Stéphanie Marion, from UOIT’s Faculty of Social Science and Humanities, statistically analysed 64 earlier collaborative remembering studies and found that groups recall less than their individual members would if working alone.

    The same study also found that collaborative remembering boosts later individual learning: people who previously recalled in a group later remember more than those who did not.

    The research provides the first systematic investigation into the costs and benefits of collaborative remembering.

    {{Collaborative inhibition}}

    Collaborative remembering is important as it is used in a number of different everyday settings. In the workplace, interview panels jointly recall candidates’ answers before deciding who to employ. In the courtroom, jurors work together to recall trial evidence prior to reaching a verdict. In schools and universities, students work together to revise course content prior to exams.

    The study, published in Psychological Bulletin this week, first compared the recall of collaborative groups to the pooled recall of an equivalent number of individuals. For example, if a collaborative group consisted of four people, their recall was compared to that of four individuals who worked alone but whose recall was combined. Collaborative group recall was consistently lower than pooled individual recall. This effect is known as collaborative inhibition.
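
    A minimal sketch of that comparison, using made-up word lists rather than data from any of the analysed studies, looks like this:

        def pooled_recall(individual_recalls):
            # Nominal-group score: the union of items recalled by people working alone,
            # with each item counted once no matter how many people produced it.
            pooled = set()
            for recall in individual_recalls:
                pooled |= recall
            return pooled

        # Illustrative item sets only.
        alone = [{"apple", "boat", "candle"}, {"boat", "drum"},
                 {"apple", "engine"}, {"candle", "fox"}]      # four people recalling alone
        together = {"apple", "boat", "candle", "drum"}        # four people recalling as one group

        print(len(pooled_recall(alone)))  # 6 items pooled across the individuals
        print(len(together))              # 4 items from the collaborative group: collaborative inhibition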

    The study suggests collaborative inhibition occurs as group members disrupt each other’s retrieval strategies when recalling together.

    {{Retrieval strategies}}

    Dr Craig Thorley, said: “Collaborative group members develop their own preferred retrieval strategies for recalling information. For example, Person A may prefer to recall information in the order it was learned but Person B may prefer to recall it in the reverse order. Importantly, recall is greatest when people can use their own preferred retrieval strategies.

    “During collaboration, members hear each other recall information using competing retrieval strategies and their preferred strategies become disrupted. This results in each group member underperforming and the group as a whole suffers. Individuals who work alone can use their preferred retrieval strategies without this disruption so recall more.”

    Several factors were also found to influence the extent to which collaborative inhibition occurs. One of these findings was that collaboration is more harmful in larger groups than in smaller groups. Another was that friends and family members are more effective at working together than strangers.

    Dr Thorley adds: “Smaller groups perform better than larger groups as they contain fewer competing (disruptive) retrieval strategies. Friends and family members perform better than strangers as they tend to develop complementary (and not competing) retrieval strategies.”

    {{Collaboration boosts later memory}}

    The study also compared the recall of people who had previously collaborated in a group to the recall of people who had previously worked alone. It was found that collaborating in a group boosted later individual recall.

    Dr Stéphanie Marion states: “We believe that this occurs as working in a group means people are re-exposed to things they may have forgotten and this boosts their memory later on. One of the important consequences of this is that it suggests getting people to work together to remember something (e.g., students revising together) is beneficial for individual learning.”

  • Not really a matter of choice?

    {What happens in the brain when it comes to decision making?}

    Choices, it is commonly understood, lead to action – but how does this happen in the brain? Intuitively, we first make a choice between the options. For example, when approaching a yellow traffic light, we need to decide either to hit the brakes or to accelerate the car. Next, the appropriate motor response is selected and carried out, in this case moving the foot to the left or to the right. Traditionally, it is assumed that separate brain regions are responsible for these stages. Specifically, it is assumed that the motor cortex carries out this final response selection without influencing the choice itself.

    Two Tübingen neuroscientists, Anna-Antonia Pape and research group leader Markus Siegel of the Werner Reichardt Centre for Integrative Neuroscience (CIN) and the MEG Center, have found evidence that challenges this intuitive division between a ‘deciding’ and a ‘responding’ stage in decision making. The results of their study have been published in Nature Communications.

    While recording brain activity using magnetoencephalography (MEG) to monitor activity in motor areas, Pape and Siegel set 20 human subjects the simple task of deciding whether or not a field of dots on a screen was slowly moving together. The subjects could respond “yes” or “no” by pressing a button with either their left or their right hand. The mapping from choice (yes/no) to response (left/right button) changed randomly on each trial, with a short cue telling subjects the current configuration. This ensured the participants’ brains could not plan a motor response, i.e. the correct button press, during choice formation. Astonishingly, although the test subjects were able to press the ‘correct’ button most of the time, they still showed a strong tendency towards motor response alternation. In other words, they often simply pressed the button they had not pressed in the previous trial. This tendency was pronounced enough to detract from the subjects’ overall performance on the decision task.

    In their MEG data, Pape and Siegel found a neural correlate of this tendency in the motor cortex itself. They showed that the upcoming motor decision can be predicted from the status of motor areas even before decision formation has begun. This pre-decisional motor activity originates to a large extent from the neural residue of the previous motor response. How often the subjects alternated between response alternatives was predicted by how pronounced the vestiges of the previous response still were in the motor cortex. Together, these results suggest that the status of the motor cortex even before decision making can influence the formation of a given choice.

    These results challenge the traditional view of decision making. According to this view, decisions are formed in the prefrontal cortex and fronto-parietal cortex, brain regions that are associated with ‘higher’ brain functions that are essential for memory and problem solving. The motor cortex is seen as the structure merely executing the behaviour that those ‘higher’ brain regions have determined. Contrary to this view, Pape and Siegel’s findings suggest that the motor cortex also plays a role in informing decision-based behaviour.

    Does that mean the way we respond to our environment is not a matter of choice after all? Do we just randomly ‘decide’ what to do based on the state our motor cortex happens to be in? Anna-Antonia Pape, who recorded and analysed the data, does not think so: “The effect is there, yes, but I wouldn’t link it to the question of free will by any means! Higher brain areas are still very important for the decision making process, but now we know that motor areas can tip the scales.”

    Results of a new study challenge the traditional view of decision making.
  • High number of pesticides within colonies linked to honey bee deaths

    {Some compounds commonly regarded as ‘bee-safe’ could be a major contributor to honey bee colony losses in North America.}

    Honey bee colonies in the United States have been dying at high rates for over a decade, and agricultural pesticides — including fungicides, herbicides and insecticides — are often implicated as major culprits. Until now, most scientific studies have looked at pesticides one at a time, rather than investigating the effects of multiple real-world pesticide exposures within a colony.

    A new study is the first to systematically assess multiple pesticides that accumulate within bee colonies. The researchers found that the number of different pesticides within a colony — regardless of dose — closely correlates with colony death. The results also suggest that some fungicides, often regarded as safe for bees, correlate with high rates of colony deaths. The study appeared online September 15, 2016, in the journal Scientific Reports.

    “Our results fly in the face of one of the basic tenets of toxicology: that the dose makes the poison,” said Dennis vanEngelsdorp, an assistant professor of entomology at UMD and senior author on the study. “We found that the number of different compounds was highly predictive of colony death, which suggests that the addition of more compounds somehow overwhelms the bees’ ability to detoxify themselves.”

    The researchers followed 91 honey bee colonies, owned by three different migratory commercial beekeepers, for an entire agricultural season. The colonies began their journey in Florida and moved up the East Coast, providing pollination services for different crops along the way. They also spent time in locations meant for honey production, as well as “holding areas” where beekeepers prepare large numbers of colonies for upcoming pollination contracts.

    A total of 93 different pesticide compounds found their way into the colonies over the course of the season, accumulating in the wax, in processed pollen known as bee bread and in the bodies of nurse bees. At every stop along the beekeepers’ itinerary, the researchers assessed three different parameters within each colony: the total number of pesticides; the total number of “relevant” pesticides (defined as those above a minimum threshold of toxicity); and each colony’s “hazard quotient,” a measure devised by other researchers to integrate the total hazard posed to each colony by the cumulative toxicity of all pesticides present.
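
    As a hedged sketch of how such a quotient is commonly computed (the study’s exact scaling and thresholds are not reproduced here, and the numbers below are placeholders rather than real residue or toxicity data), each residue’s concentration is divided by that compound’s toxicity and the ratios are summed:

        def hazard_quotient(residues_ppb, ld50_ug_per_bee):
            # Sum of concentration-to-toxicity ratios over every compound detected in the
            # sample; a higher total indicates a more hazardous pesticide mixture.
            return sum(conc / ld50_ug_per_bee[name] for name, conc in residues_ppb.items())

        # Placeholder values for illustration only.
        sample_bee_bread = {"compound_A": 120.0, "compound_B": 15.0}  # residues in ppb
        ld50 = {"compound_A": 50.0, "compound_B": 0.5}                # acute LD50, micrograms per bee
        print(round(hazard_quotient(sample_bee_bread, ld50), 1))      # 32.4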

    All three measures correlated with a higher probability of colony death or queen failure. In addition, the researchers found between five and 20 different pesticide residues in every sample of bee bread that exceeded a hazard quotient’s safety threshold. The highest number of pesticides accumulated in the colonies early on, shortly after beekeepers placed colonies into early season flowering crops like apples and blueberries. Honey production stopovers and holding areas offered the bees some respite from further contamination.

    The study results also suggest that some fungicides, which have led to the mortality of honey bee larvae in lab studies, could have toxic effects on colony survival in the field. In the current study, pesticides with a particular mode of action also corresponded to higher colony mortality. For example, the fungicides most closely linked to queen deaths and colony mortality disrupted sterols — compounds that are essential for fungal development and survival.

    “We were surprised to find such an abundance of fungicides inside the hives, but it was even more surprising to find that fungicides are linked to imminent colony mortality,” said Kirsten Traynor, a postdoctoral researcher in entomology at UMD and lead author on the study. “These compounds have long been thought to be safe for bees. We’re seeing them at higher doses than the chemicals beekeepers apply directly to the colonies to control varroa mites. So that is particularly concerning.”

    The current study borrows a concept from human cancer research: the “exposome,” or the sum total of chemicals an organism is exposed to over its lifetime. But instead of looking at individual bees, the researchers assessed each colony as a “superorganism” that functions as a single, cohesive unit.

    Within this framework, the researchers tracked the death of queen bees, which is a life-threatening event for the colony as a whole. In some cases, a colony is able to create a new queen, but if those efforts fail the entire colony will die. In the current study, colonies with very low pesticide contamination in the wax experienced no queen failures, while all colonies with high pesticide contamination in the wax lost a queen during the beekeeping season.

    “This is a huge problem for beekeepers currently. Not long ago, a queen would typically last up to two years. But now many commercial beekeepers replace the queens in at least half of their colonies every spring in the hopes that this will prevent premature queen deaths,” Traynor explained. “Even with such measures, many queens still don’t make it through one season.”

    The research team did not find a significant contribution from neonicotinoid pesticides. These compounds, derived from nicotine, are currently some of the most common pesticides in use globally. Because of their ubiquitous use, neonicotinoids have received significant media attention for their potential role in honey bee declines.

    “We just did not find neonicotinoids in the colonies,” vanEngelsdorp explained. “There were some trace residues of neonicotinoids in a few samples, but not nearly on par with other compounds. However, it’s possible we did not test the right matrix — we did not test nectar, for example — or that the product breaks down faster than others in the collection process or that neonicotinoids are simply not very prevalent when crops are flowering.”

    Because industrial practices have changed since the researchers collected the data for this study, Traynor and vanEngelsdorp acknowledge that further research could reveal new patterns in the relationship between pesticides and honey bee health. But the current study nonetheless offers some important insights for beekeepers and farmers alike.

    “We have to figure out ways to reduce the amount of products that bees are exposed to while still helping farmers produce their crops,” vanEngelsdorp said. “This will require careful examination of spray plans, to make sure we only use the products we need, when we need them, in order to reduce the number of products bees are exposed to while pollinating different crops.”

    A honey bee forager collecting nectar from a cleome flower. Bees make honey from nectar. They also collect pollen, which they convert into bee bread.
  • Distracted much? New research may help explain why

    {American professional golfer Tom Kite said two things about distraction that sum up the findings of a new study on the subject: First, “You can always find a distraction if you’re looking for one.” And, second, “Discipline and concentration are a matter of being interested.”}

    The new research offers evidence that one’s motivation is just as important for sustained attention to a task as is the ease with which the task is done. It also challenges the hypothesis, proposed by some cognitive neuroscientists, that people become more distractible as they tackle increasingly difficult tasks.

    A report of the new study appears in the Journal of Experimental Psychology: General.

    “People must almost continuously balance their need for inner focus (reflection, mental effort) with their need for attending to the world,” write the authors of the study, University of Illinois psychology professors Simona Buetti and Alejandro Lleras. “But, when the need for inner focus is high, we may have the impression that we momentarily disengage from the world entirely in order to achieve a heightened degree of mental focus.”

    Buetti and Lleras designed several experiments to test whether people are more easily distracted when the mental effort required to complete a task goes up, as is generally assumed in their field.

    The researchers first asked participants to solve math problems of varying difficulty while photographs of neutral scenes — for example, cows in a pasture, a portrait of a man, a cup on a table — flashed on a computer display for three seconds, enticing the subjects to look at them. An eye-tracking device measured the frequency, speed and focus of participants’ eyes as they completed the math problems.

    The results showed that participants who were engaged in an easy version of the task were more likely to look at the distractors than those engaged in an extremely challenging version. These results run counter to current theories, the researchers said.

    “This suggests that focus on complex mental tasks reduces a person’s sensitivity to events in the world that are not related to those tasks,” Buetti said. This finding corroborates research on a phenomenon called “inattentional blindness,” in which people involved in an engaging task often fail to notice strange and unexpected events.

    “Between the inner world of solving a problem and the outer world — what’s going on around you — there seems to be a need to disengage from one when heightened attention to the other is required,” Lleras said.

    “Interestingly, when participants completed a mix of easy and hard tasks, the difficulty of the task did not seem to affect their distractibility,” Buetti said. This finding led the researchers to hypothesize that the ability to avoid being distracted is not driven primarily by the difficulty of the task, but is likely the result of an individual’s level of engagement with the endeavor. They call this concept the “engagement theory of distractibility.”

    The team did further studies to test this idea, manipulating subjects’ enthusiasm for the task with financial incentives. To the researchers’ surprise, this manipulation had little effect on participants’ distractibility. However, there were large differences between people in terms of their distractibility.

    “The more participants struggled with a task, the more they reflexively avoided distraction, irrespective of financial incentive,” Buetti said. “So, the take-home message is: Characteristics of the task itself, like its difficulty, do not alone predict distractibility. Other factors also play a role, like the ease with which we can perform a task, as well as a decision that is internal to each of us: how much we decide to cognitively engage in a task.”

  • Exhaling Earth: Scientists closer to forecasting volcanic eruptions

    {On average, 40 volcanoes on land erupt into the atmosphere each month, while scores of others on the seafloor erupt into the ocean. A new time-lapse animation uniting volcanoes, earthquakes, and gaseous emissions reveals unforgettably the large, rigid plates that make up the outermost shell of Earth and suggests the immense heat and energy beneath them seeking to escape.}

    With one click, visitors can see the last 50 years of “Eruptions, Earthquakes, and Emissions.” Called E3, the app allows the viewer to select and learn about individual eruptions, emissions, and earthquakes as well as their collective impact. Visualizing these huge global datasets together for the first time, users can speed or slow or stop the passage of time. They can observe flat maps or globes, and watch gas clouds circle the planet. Data from the Smithsonian’s Global Volcanism Program and the United States Geological Survey (USGS) feed into the app, and the datasets are available for free download. The app will update continuously, accumulating new events and additional historical information as it becomes available.

    “Have you had a ‘eureka!’ moment where you suddenly see order in what appeared chaotic? This app abounds in such moments,” said Elizabeth Cottrell, head of the Global Volcanism Program of the Smithsonian Institution in Washington, DC. “As geologic events accumulate over time, Earth’s tectonic plates appear before your eyes. What took geologists more than 200 years to learn, a viewer learns in seconds. We wanted to share the excitement with as big an audience as possible. This is the first time we’re able to present these datasets together for the public.”

    She added, “This app is interesting not only for educators and the public, but also will help scientists understand global eruption patterns and linkages between Earth’s inner workings and the air we breathe.”

    A team of experts developed the app with support from the Smithsonian Institution and the Deep Carbon Observatory, an international multidisciplinary research program exploring the quantities, movements, forms, and origins of carbon deep inside Earth. Deep Carbon Observatory scientists are studying volcanic emissions as part of this mission, and will more than triple the number of permanent volcano gas monitoring stations between 2012 and 2019.

    {{Tracking volcanic emissions to avoid disaster}}

    Hundreds of millions of people around the world live on the flanks of active volcanoes, and eruptions can cause massive economic damage even when few people live nearby. In 2010, Eyjafjallajökull erupted in Iceland, spewing massive ash clouds, disrupting air travel for millions of people and costing the airline industry nearly USD 2 billion. Better anticipation of eruptions could lower the human and economic toll of these natural phenomena.

    Recent discoveries by Deep Carbon Observatory (DCO) scientists in the Deep Earth Carbon Degassing (DECADE) initiative are laying the foundation for improved volcanic eruption forecasts. These hard-won advances required expensive, dangerous expeditions to sniff gas emissions for clues.

    “We are deploying automated monitoring stations at volcanoes around the world to measure the gases they emit,” said Tobias Fischer, a volcanologist at the University of New Mexico, USA, and leader of DECADE. “We measure carbon dioxide, sulfur dioxide, and water vapor (steam), the major gases emitted by all volcanoes on the planet. In the hours before an eruption, we see consistent changes in the amount of carbon dioxide emitted relative to sulfur dioxide. Keeping an eye on the ratios globally via satellites and on-site monitoring helps us learn the precursors of volcanic eruptions. Monitoring these volcanic gas variations also helps us come up with a more accurate estimate of total volcanic carbon dioxide emissions on Earth — a major goal of DCO.”

    “Our goal of tripling the number of volcanoes monitored around the world by 2019 is no small task,” added Fischer. “Installing instruments on top of volcanoes is dangerous work in extremely hard-to-reach places.”

    “Sometimes our monitoring stations become victims of eruptions they are trying to measure, as happened recently on Villarrica volcano in Chile. At least our instruments recorded gas composition changes right up until the eruption destroyed them,” Fischer noted.

    By 2019, DECADE scientists hope to have gas monitoring stations on 15 of the world’s 150 most active volcanoes. This will add to the eight stations currently operated by other entities such as the USGS and the University of Palermo (Italy). Data collected at these monitoring stations are feeding a new database of volcanic carbon emissions, making potentially life-saving information available to many more scientists around the world.

    {{Advancing knowledge and forecasting potential from land}}

    DCO volcanologists are also advancing basic knowledge about how different volcanoes work, which is further advancing eruption forecasting.

    Maarten de Moor and his team at the National University in Costa Rica, for example, have used DECADE monitoring stations to measure gas emissions at Poás and Turrialba volcanoes in Costa Rica over several years. De Moor and colleagues have observed remarkable changes in gas compositions before eruptions at these volcanoes, both of which have a huge impact on local society. Turrialba, for example, deposited ash on the capital city of San José over the last few weeks, affecting about 3 million people and closing the international airport.

    “We’re getting more and more confident that changes in the carbon to sulfur ratio precede eruptions,” said de Moor. “Potentially, we can now see an eruption coming just by looking at gas emissions. What is truly fascinating is how dynamic these volcanoes are in their degassing and eruptive behavior. To understand the big picture of Earth degassing, we also need to understand the processes driving temporal variations in volcanic emissions.”

    Historically, volcanologists have measured emissions of smelly sulfur dioxide much more easily than colorless, odorless carbon dioxide emissions. But DCO scientists at Centre National de la Recherche Scientifique (CNRS) and Université de Lorraine in France are designing new geochemical tools to detect and monitor large-scale emissions of volcanic carbon dioxide. Tools include a new high-precision method for measuring excess airborne amounts of a rare form of helium found in magma, high-temperature fluids from below Earth’s crust that come out of volcanoes in the form of lava and gases.

    “Our helium data suggest that even when they are not erupting, volcanoes constantly release carbon dioxide and other gases through the crust, from magma chambers deep underground,” said Bernard Marty, leader of the CNRS group. “We see low level release of carbon dioxide over large areas surrounding Mt. Etna volcano in Sicily and Erta Ale volcano in Afar, Ethiopia, which tells us this might be happening at sites around the world.”

    {{Eyes in space add to the toolkit}}

    To assess volcanic activity and gas release on a global scale, DCO researchers at the University of Cambridge, UK, are taking yet another approach: measuring volcanic gases from space using satellites.

    “While water vapor and carbon dioxide are much more abundant volcanic gases, sulfur dioxide is easier to measure because Earth’s atmosphere contains very little sulfur dioxide,” said Marie Edmonds, co-Chair of DCO’s Reservoirs and Fluxes Science Community. “With satellites, we have been able to measure sulfur dioxide emissions for years and the technology keeps getting better. An exciting new aspect of DCO’s research combines the satellite data with ground-based measurements of carbon to sulfur ratios provided by DECADE. This powerful combination allows us to better define global volcanic emissions, or degassing, of carbon dioxide.”
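
    The combination Edmonds describes reduces to a simple scaling. Below is a minimal sketch, assuming illustrative numbers rather than measurements from DECADE or any satellite mission:

        def co2_flux_estimate(so2_flux_tonnes_per_day, co2_to_so2_mass_ratio):
            # Estimated CO2 flux = satellite-derived SO2 flux multiplied by the
            # ground-measured CO2/SO2 mass ratio for the same volcano.
            return so2_flux_tonnes_per_day * co2_to_so2_mass_ratio

        print(co2_flux_estimate(so2_flux_tonnes_per_day=1000.0,
                                co2_to_so2_mass_ratio=2.5))  # 2500.0 tonnes of CO2 per day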

    “DECADE’s volcano-based instruments make it possible for us to ground-truth our satellite observations and obtain much more frequent measurements,” added Edmonds. “Eventually we hope we’ll get as accurate measurements from space as we do from the ground. When this happens, we can monitor volcanoes in remote parts of the world for a fraction of the cost and without risking scientists’ lives.” As the data accumulate, they too will stream into and through the E3 app.

    Volcanologist Tobias Fischer (University of New Mexico, USA) samples gases emitted from a sulfur-caked fumarole on Poás volcano in Costa Rica, one of the 15 volcanoes in DCO's DECADE gas monitoring network.