Category: Science News

  • Ancient Chaco Canyon population likely relied on imported food

    {Corn may have come from the Chuska Slope settlement some 50 miles away.}

    The ancient inhabitants of New Mexico’s Chaco Canyon, the zenith of Pueblo culture in the Southwest a thousand years ago, likely had to import corn to feed the multitudes residing there, says a new University of Colorado Boulder study.

    CU Boulder scientist Larry Benson said the new study shows that Chaco Canyon — believed by some archaeologists to have been populated by several thousand people around A.D. 1100 and to have held political sway over an area twice the size of Ohio — had soils that were too salty for the effective growth of corn and beans.

    “The important thing about this study is that it demonstrates you can’t grow great quantities of corn in the Chaco valley floor,” said Benson, an adjunct curator of anthropology at the University of Colorado Museum of Natural History. “And you couldn’t grow sufficient corn in the side canyon tributaries of Chaco that would have been necessary to feed several thousand people.

    “Either there were very few people living in Chaco Canyon, or corn was imported there.”

    A paper by Benson was published online in the Journal of Archaeological Science: Reports.

    Between the ninth and 12th centuries, Chaco Canyon (officially Chaco Culture National Historical Park), located in the San Juan Basin in northwestern New Mexico, was the focus of an unprecedented construction effort, said Benson. At the height of the culture, 12 stone masonry “great houses” and other structures were built there, along with a network of ceremonial roads linking Chaco with other Pueblo sites in the Southwest.

    As part of the study, Benson used a tree ring data set created by University of Arizona Professor Emeritus Jeff Dean that showed annual Chaco Canyon precipitation spanning 1,100 years. The tree rings indicate the minimum amount of annual precipitation necessary to grow corn was exceeded only 2.5 percent of the time during that time period.
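
    The 2.5 percent figure is an exceedance fraction: the share of years in which reconstructed precipitation met a corn-growing threshold. A minimal sketch of that calculation, using synthetic data and an assumed threshold in place of Dean's actual reconstruction, might look like this:

```python
# Illustrative sketch only: the published analysis used Jeff Dean's actual
# 1,100-year tree-ring precipitation reconstruction. Here a synthetic
# series and an assumed corn threshold stand in for those data.
import random

random.seed(0)
YEARS = 1100
CORN_THRESHOLD_MM = 300  # assumed minimum annual precipitation for corn (illustrative)

# placeholder annual precipitation values, in millimetres
precip = [random.gauss(200, 60) for _ in range(YEARS)]

wet_enough = sum(1 for p in precip if p >= CORN_THRESHOLD_MM)
print(f"{100 * wet_enough / YEARS:.1f}% of years exceeded the threshold")
```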

    Benson suggests that much of the corn consumed by the ancient people of Chaco may have come from the Chuska Slope, the eastern flank of the Chuska Mountains some 50 miles west of Chaco Canyon that also was the source of some 200,000 timbers used to shore up Chaco Canyon masonry structures. Between 11,000 and 17,000 Pueblo people are thought to have resided on the Chuska Slope prior to A.D. 1130, he said.

    Winter snows in the Chuska Mountains would have produced a significant amount of spring snowmelt feeding surface water features like natural “wash systems,” said Benson. Water concentrated and conveyed by washes would have allowed for the diversion of surface water to irrigate large corn fields on the Chuska Slope, he said.

    Benson said the Chaco Canyon inhabitants traded regularly with the Chuska Slope residents, as evidenced by stone tool material (chert), pottery and wooden beams.

    “There were timbers, pottery and chert coming from the Chuska region to Chaco Canyon, so why not surplus corn?” asks Benson, a former U.S. Geological Survey scientist.

    Many archaeologists are still puzzled as to why Chaco Canyon was built in an area that has long winters, marginal rainfall and short growing seasons. “I don’t think anyone understands why it existed,” Benson said. “There was no time in the past when Chaco Canyon was a Garden of Eden.”

  • No excuses: Real reason you’re late may vary with age

    {A song is just a song, but as time goes by, something as random as a song’s length could be the difference in whether you miss an important deadline or arrive late for an appointment, suggests time-management research from Washington University in St. Louis.}

    The study, published in the Journal of Experimental Psychology: General, shows that people rely heavily on time estimates of past experiences to plan for future tasks and that outside influences, such as background music, can skew our perception of time, causing even the best-laid plans to go awry.

    “Our results suggest time estimates of tasks that we need to incorporate into our later plans, like a drive to an appointment, are often based on our memory of how long it took us to perform that same drive previously,” said Emily Waldum, principal author of the paper and a postdoctoral researcher in psychological and brain sciences in Arts & Sciences.

    “Even if you think you estimated the duration of events accurately, external factors unrelated to that event can bias time estimates,” she said. “Something as simple as the number of songs you heard play on your phone during a run can influence whether you over- or under-estimate the duration of the run.”

    In a complicated modern world where multitasking is the norm, it’s easy for our game plans to fall apart due to breakdowns in “prospective memory,” a term psychologists use to describe the process of remembering to do something in the future.

    Waldum and co-author Mark McDaniel, a professor of psychological and brain sciences, designed this study to tease out differences in how people young and old approach a challenge that requires them to plan ahead and complete a series of time-based tasks by a specific deadline.

    The research included 36 college undergraduates and 34 healthy older adults in their 60s, 70s and 80s. It aimed to simulate the complicated time-based prospective memory (TBPM) challenges that people old and young experience in everyday life.

    In the first part of the study, participants were asked to keep track of how long it took to complete a trivia quiz. The quiz always ran 11 minutes, but participants had to make their own time estimates without access to a clock. Some completed the quiz with no background noise, while others heard either two long songs or four short songs.

    Later, the participants were challenged to put together as many pieces of a puzzle as possible while leaving enough time to complete the same quiz before a 20-minute deadline.

    Contrary to previous research, this study found that seniors managed to complete future tasks on time at about the same rate as college undergraduates, although each age group used surprisingly different strategies to estimate how much time they would need to repeat the quiz and finish the next phase of the experiment on deadline.

    Older adults reported ignoring songs heard in the background, relying instead on an internal clock to estimate how long it took them to complete the first quiz. Consistent with other research on internal clocks and time perception, seniors in this experiment tended to underestimate time taken on the first quiz. This led them to spend a little too much time on the puzzle and to finish the second quiz a bit beyond deadline.

    “When younger adults heard two long songs during the first quiz, they performed a lot like older adults, underestimating the quiz duration and winding up a bit late,” Waldum said. “When they heard four short songs, younger adults overestimated how much time they would need to repeat the quiz, leading them to finish it too early.”

    Thus, older adults performed about the same regardless of whether they heard songs or not. For young people, though, background music played a big role in whether they were too early or too late, Waldum said.

    While the challenges of being on time may remain largely the same throughout a lifetime, this study suggests that the tricks we use to stay on schedule may evolve as we age.

    For college students with young, agile minds and no fear of multitasking, using songs to estimate the passage of time may be a plausible approach when no clock is available.

    “In a scenario where the duration of a background event is set, such as a 30-minute television show, this is a very good strategy because it provides useful duration information whether you’re paying attention to the show or not,” Waldum said. “However, when background events are less predictable, as in the case with songs and many other events, basing a time estimate on them can be risky.”
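
    The song effect can be illustrated with a simple counting heuristic. In the sketch below, the 11-minute quiz length comes from the study, but the assumed average song length is an invented number used only to show how the bias arises:

```python
# Hypothetical illustration: the 11-minute quiz comes from the study, but the
# assumed average song length is an invented number, not a reported one.
TRUE_QUIZ_MIN = 11

def estimate_from_songs(n_songs, assumed_song_len_min):
    """Duration guess made by counting songs and multiplying by a typical length."""
    return n_songs * assumed_song_len_min

# Two long songs: assuming songs average ~4 minutes leads to an underestimate.
print(estimate_from_songs(2, 4))   # 8 minutes < 11 -> likely to run late
# Four short songs: the same assumption now produces an overestimate.
print(estimate_from_songs(4, 4))   # 16 minutes > 11 -> likely to finish early
```

    With two long songs the count-based guess falls short of the true duration, matching the younger adults' tendency to run late; with four short songs it overshoots, matching their tendency to finish early.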

    Older adults, who generally see declines in memory and the speed at which they process information, tended to avoid multitasking throughout the study.

    During the first quiz, they ignored songs and relied more on an internal clock to make time estimates. In the second phase of the study when a clock was made available, they were less likely to pause working on the puzzle and quiz to check the clock.

    These findings suggest that older adults may actually over-rely on the internal clock that gives a feeling of elapsed time. Checking a clock when one is available is a much better strategy than relying on that feeling, and indeed increased clock-checking predicted better time-based prospective memory performance in this and many previous studies.

    Therefore, even if checking the clock requires some multitasking, it is worth your time, Waldum said.

    No matter what challenges the future brings — getting out the door and to work, finishing walking the dog before the cookies are done or purchasing popcorn before a movie starts — the fundamentals of being on time still apply.

    You must remember this: Success requires making accurate estimates of the time needed to complete prerequisite tasks, remembering to carry out these tasks at the appropriate time and avoiding distractions that could prevent you from staying on schedule.

    “Our study provides some good news for older adults,” Waldum said. “Our results, while preliminary, suggest that time-management ability and the ability to perform some types of complex time-based tasks in real life may largely be preserved with age.”

  • How the brain processes faces from sight to recognition

    {At a glance, you can recognize a friend’s face whether they are happy or sad or even if you haven’t seen them in a decade. How does the brain do this — recognize familiar faces with efficiency and ease despite extensive variation in how they appear?}

    Researchers at Carnegie Mellon University are closer than ever before to understanding the neural basis of facial identification. In a study published in the Dec. 26, 2016 issue of the Proceedings of the National Academy of Sciences (PNAS), they used highly sophisticated brain imaging tools and computational methods to measure the real-time brain processes that convert the appearance of a face into the recognition of an individual. The research team is hopeful that the findings might be used in the near future to locate the exact point at which the visual perception system breaks down in different disorders and injuries, ranging from developmental dyslexia to prosopagnosia, or face blindness.

    “Our results provide a step toward understanding the stages of information processing that begin when an image of a face first enters a person’s eye and unfold over the next few hundred milliseconds, until the person is able to recognize the identity of the face,” said Mark D. Vida, a postdoctoral research fellow in the Dietrich College of Humanities and Social Sciences’ Department of Psychology and Center for the Neural Basis of Cognition (CNBC).

    To determine how the brain rapidly distinguishes faces, the researchers scanned the brains of four people using magnetoencephalography (MEG). MEG allowed them to measure ongoing brain activity throughout the brain on a millisecond-by-millisecond basis while the participants viewed images of 91 different individuals with two facial expressions each: happy and neutral. The participants indicated when they recognized that the same individual’s face was repeated, regardless of expression.

    The MEG scans allowed the researchers to map out, for each of many points in time, which parts of the brain encode appearance-based information and which encode identity-based information. The team also compared the neural data to behavioral judgments of the face images from humans, whose judgments were based mainly on identity-based information. Then, they validated the results by comparing the neural data to the information present in different parts of a computational simulation of an artificial neural network that was trained to recognize individuals from the same face images.

    “Combining the detailed timing information from MEG imaging with computational models of how the visual system works has the potential to provide insight into the real-time brain processes underlying many other abilities beyond face recognition,” said David C. Plaut, professor of psychology and a member of the CNBC.

  • First movie of energy transfer in photosynthesis solves decades-old debate

    {Using ultrafast imaging of moving energy in photosynthesis, scientists have determined the speed of crucial processes for the first time.}

    This should help scientists understand how nature has perfected the process of photosynthesis, and how this might be copied to produce fuels by artificial photosynthesis.

    During photosynthesis, plants harvest light and, through a chemical process involving water and carbon dioxide, convert this into fuel for life.

    A vital part of this process is using the light energy to split water into oxygen and hydrogen. This is done by an enzyme called Photosystem II. Light energy is harvested by ‘antennae’, and transferred to the reaction centre of Photosystem II, which strips electrons from water. This conversion of excitation energy into chemical energy, known as ‘charge separation’, is the first step in splitting water.

    It was previously thought that the process of charge separation in the reaction centre was a ‘bottleneck’ in photosynthesis — the slowest step in the process — rather than the transfer of energy along the antennae.

    Since the structure of Photosystem II was first determined in 2001, there had been suggestions that the energy transfer step might in fact be the slowest, but this could not be proven experimentally.

    Now, using ultrafast imaging of electronic excitations in small crystals of Photosystem II, scientists from Imperial College London and Johannes Kepler University (JKU) in Austria have shown that the slowest step is in fact the process through which the plants harvest light and transfer its energy through the antennae to the reaction centre.

    The new insights into the precise mechanics of photosynthesis should help researchers hoping to copy the efficiency of natural photosynthesis to produce green fuels. Study author Dr Jasper van Thor, from the Department of Life Sciences at Imperial, said: “We can now see how nature has optimised the physics of converting light energy to fuel, and can probe this process using our new technique of ultrafast crystal measurements.

    “For example, is it important that the bottleneck occurs at this stage, in order to preserve overall efficiency? Can we mimic it or tune it to make artificial photosynthesis more efficient? These questions, and many others, can now be explored.”

    Co-author Dr Thomas Renger from the Department of Theoretical Biophysics at JKU added: “When we predicted the present model of energy transfer eight years ago, this prediction was based on a structure-based calculation. Since such calculations are far from trivial for a system as complex as this, some doubts remained. The technique invented by Jasper’s group at Imperial has allowed us to remove these doubts and has fully confirmed our predictions.”

    Although the researchers could determine which step is faster, both steps occur incredibly quickly — the whole process takes a matter of nanoseconds (billionths of a second), with the individual steps of energy transfer and charge separation taking only picoseconds (trillionths of a second).

    The team used a sophisticated system of lasers to cause reactions in crystals of Photosystem II, and then to measure in space and time the movement of excitations of electrons — and hence the transfer of energy — across the antennae and reaction centre.

    The resulting movie of the movement of excited electrons across minute sections of the system revealed where energy is held and when it is passed along. This proved that the initial step of separating charges for the water-splitting reaction takes place relatively quickly, but that the light harvesting and transfer process is slower.

    Dr van Thor added: “There had been clues that the earlier models of the bottleneck of photosynthesis were incorrect, but until now we had no direct experimental proof. We can now show that what I was lectured as an undergraduate in the 1990s is no longer supported.”

  • How brain begins repairs after ‘silent strokes’

    {UCLA researchers have shown that the brain can be repaired — and brain function can be recovered — after a stroke in animals. The discovery could have important implications for treating a mind-robbing condition known as a white matter stroke, a major cause of dementia.}

    White matter stroke is a type of ischemic stroke, in which a blood vessel carrying oxygen to the brain is blocked. Unlike large artery blockages or transient ischemic attacks, individual white matter strokes, which occur in tiny blood vessels deep within the brain, typically go unnoticed but accumulate over time. They accelerate Alzheimer’s disease due to damage done to areas of the brain involved in memory, planning, walking and problem-solving.

    “Despite how common and devastating white matter stroke is, there has been little understanding of how the brain responds and if it can recover,” said Dr. Thomas Carmichael, senior author of the study and a professor of neurology at the David Geffen School of Medicine at UCLA. “By studying the mechanisms and limitations of brain repair in this type of stroke, we will be able to identify new therapies to prevent disease progression and enhance recovery.”

    In a five-year study, Carmichael’s team looked at white matter strokes in animals and found that the brain initiated repair by sending replacement cells to the site, but then the process stalled. The team had a short list of molecular suspects from previous research that they thought might be responsible. Researchers identified a molecular receptor as the likely culprit in stalling the repair; when they blocked the receptor, the animals began to recover from the stroke.

    “White matter stroke is an important clinical target for the development of new therapies,” Carmichael said.

    Annually in the United States, about 795,000 people suffer a stroke, resulting in nearly 130,000 deaths. Multiply the number of strokes by six, and you’ll have an estimate of the number of strokes that are “silent,” in that they do not produce symptoms that lead to hospitalization. Most of these silent strokes are white matter strokes.
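
    That multiplication is simple enough to spell out; the sketch below just restates the quoted figures.

```python
# The back-of-the-envelope arithmetic described above, using the quoted figures.
symptomatic_strokes = 795_000  # annual U.S. strokes producing noticeable symptoms
silent_multiplier = 6          # estimated silent strokes per symptomatic stroke
silent_strokes = symptomatic_strokes * silent_multiplier
print(silent_strokes)  # 4770000, i.e. roughly 4.8 million silent strokes per year
```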

  • Heart-related deaths spike at Christmas

    {Heart-related deaths spike during Christmas, but the effect may have nothing to do with the cold winter season, according to new research in Journal of the American Heart Association, the Open Access Journal of the American Heart Association/American Stroke Association.}

    “Spikes in deaths from natural causes during Christmas and New Year’s Day have been previously established in the United States. However, the Christmas holiday period (December 25th to January 7th) in the U.S. falls within the coldest period of the year when death rates are already seasonally high due to low temperatures and influenza,” said Josh Knight, B.Sc., study author and research fellow at the University of Melbourne in Australia.

    In this study, researchers analyzed trends in deaths in New Zealand, where Christmas occurs during the summer season when death rates are usually at a seasonal low — allowing researchers to separate any winter effect from a holiday effect.

    During a 25-year period (1988-2013), there were a total of 738,409 deaths (197,109 were noted as cardiac deaths).

    Researchers found:

    • A 4.2 percent increase in heart-related deaths occurring away from a hospital between December 25 and January 7.
    • An average age of cardiac death of 76.2 years during the Christmas period, compared with 77.1 years during other times of the year.

    A range of theories may explain the spike in deaths during the holiday season, including the emotional stress associated with the holidays, changes in diet and alcohol consumption, reduced staffing at medical facilities, and changes in the physical environment (for example, visiting relatives). However, there have been few attempts to replicate prior studies.
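
    For readers who want to see where the headline proportions sit, the study's quoted totals can be combined with simple arithmetic; nothing below goes beyond the published numbers.

```python
# Simple arithmetic on the study's published totals quoted above; this is a
# restatement of the descriptive figures, not a re-analysis of the data.
total_deaths = 738_409    # all deaths, New Zealand, 1988-2013
cardiac_deaths = 197_109  # deaths noted as cardiac

cardiac_share = cardiac_deaths / total_deaths
print(f"{100 * cardiac_share:.1f}% of deaths were cardiac")  # 26.7%

# Mean age at cardiac death was 0.9 years younger over the holiday period.
print(round(77.1 - 76.2, 1))  # 0.9
```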

    Although more research is needed to explain the spike in deaths, researchers suggest one possibility may be that patients hold back in seeking medical care during the holiday season.

    “The Christmas holiday period is a common time for travel within New Zealand, with people frequently holidaying away from their main medical facilities. This could contribute to delays both in seeking treatment, due to a lack of familiarity with nearby medical facilities, and in receiving it, due to geographic isolation from appropriate medical care in emergency situations,” Knight said.

    Another explanation may have to do with a terminally ill patient’s will to live and hold off death until a day that is important to them.

    “The ability of individuals to modify their date of death based on dates of significance has been both confirmed and refuted in other studies; however, it remains a possible explanation for this holiday effect,” Knight said.

    However, researchers note that the study did not track daily temperatures and New Zealand has an island climate, which almost eliminates the extremes of temperature that have been associated with heart-related death rates in previous studies.

  • Study identifies key indicators linking violence and mental illness

    {New research from North Carolina State University, RTI International, Arizona State University and Duke University Medical Center finds a host of factors that are associated with subsequent risk of adults with mental illness becoming victims or perpetrators of violence. The work highlights the importance of interventions to treat mental-health problems in order to reduce community violence and instances of mental-health crises.}

    “This work builds on an earlier study that found almost one-third of adults with mental illness are likely to be victims of violence within a six-month period,” says Richard Van Dorn, a researcher at RTI and lead author of a paper describing the work. “In this study, we addressed two fundamental questions: If someone is victimized, is he or she more likely to become violent? And if someone is violent, is he or she more likely to be victimized? The answer is yes, to both questions.”

    The researchers analyzed data from a database of 3,473 adults with mental illnesses who had answered questions about both committing violence and being victims of violence. The database drew from four earlier studies that focused on issues ranging from antipsychotic medications to treatment approaches. Those studies had different research goals, but all asked identical questions related to violence and victimization. For this study, the researchers used a baseline assessment of each study participant’s mental health and violence history as a starting point, and then tracked the data on each participant for up to 36 months.

    Specifically, the researchers assessed each individual’s homelessness, inpatient mental-health treatment, psychological symptoms of mental illness, substance use, and experiences as a victim or perpetrator of violence. The researchers evaluated all of these items as both indicators and outcomes — i.e., as both causes and effects.

    “We found that all of these indicators mattered, but often in different ways,” says Sarah Desmarais, an associate professor of psychology at NC State and co-author of the paper. “For example, drug use was a leading indicator of committing violence, while alcohol use was a leading indicator of being a victim of violence.”

    However, the researchers also found that one particular category of psychological symptoms was also closely associated with violence: affective symptoms.

    “By affect, we mean symptoms including anxiety, depressive symptoms and poor impulse control,” Desmarais says. “The more pronounced affective symptoms were, the more likely someone was to both commit violence and be a victim of violence.

    “This is particularly important because good practices already exist for how to help people, such as therapeutic interventions or medication,” she adds. “And by treating people who are exhibiting these symptoms, we could reduce violence. Just treating drug or alcohol use — which is what happens in many cases — isn’t enough. We need to treat the underlying mental illness that is associated with these affective symptoms.”

    The research also highlighted how one violent event could cascade over time.

    For example, the researchers found that, on average, one event in which a person was a victim of violence triggered seven other effects, such as psychological symptoms, homelessness and becoming a perpetrator of violence. Those seven effects, on average, triggered 39 additional effects.

    “It’s a complex series of interactions that spirals over time, exacerbating substance use, mental-health problems and violent behavior,” Van Dorn says.

    “These results tell us that we need to evaluate how we treat adults with severe mental illness,” he adds.

    “Investing in community-based mental health treatment programs would significantly reduce violent events in this population,” says Desmarais. “That would be more effective and efficient than waiting for people to either show up at emergency rooms in the midst of a mental-health crisis or become involved in the legal system as either victims or perpetrators of violence.

    “We have treatments for all of these problems, we just need to make them available to the people that need them,” Desmarais says.

  • Close friendships between children can influence responses to fear

    {New research led by the University of East Anglia (UEA) shows that close friends may influence how school-aged children think about danger.}

    The study investigated whether close friends affect each other’s fear responses, both in terms of beliefs and what they would do to avoid potential danger.

    The findings, published in the December issue of the journal Behaviour Research and Therapy, show for the first time that children in close friendships exhibit shared patterns of fear-related thoughts, and that they influence each other’s fears when discussing these issues together.

    It is well known that fears are common in children, and although these usually diminish over time, some children go on to develop significant fears that can interfere with their daily lives. Specific phobias are the most common form of childhood anxiety, and if left untreated they can continue into adulthood.

    While some childhood fears can be explained by a child’s genetic inheritance, there is considerable evidence that children’s fears are affected by direct learning and the information they are given from others, for example their parents. This study suggests that the transmission of fears, as well as ideas about how to behave in fear-provoking situations, might also occur in other close relationships, such as those with friends.

    Lead author Dr Jinnie Ooi, who conducted the research as part of her PhD at UEA’s School of Psychology, said the findings could have practical implications for professionals working with children, for example those being treated for anxiety disorders.

    “Our findings indicate that close friends may share negative thoughts and to some extent may maintain these thoughts,” said Dr Ooi, a senior research associate. “Hopefully with this knowledge, we may be able to design interventions whereby close friends can help change their friends’ thoughts during therapy.

    “It may also be beneficial to ask children being treated for anxiety disorders to identify whether they have friends who may be influencing or maintaining their negative thoughts, and it may subsequently be useful for them to be given strategies for how to discuss these thoughts with peers in an adaptive way.”

    An important finding is that children’s fear-related thoughts do not necessarily become more negative when children discuss their fears with close friends who are more anxious. The authors say this supports the use of group therapy and may be useful information for parents concerned that exposure to more anxious children within group-based therapy may increase their child’s anxiety.

    They also suggest that school-based interventions aiming to reduce anxiety in primary school-aged children could instruct pairs of close friends to discuss and resolve their worries in a positive manner with each other.

    The study involved 242 British school children (106 boys, 136 girls), aged seven to 10 years old, who completed questionnaires to measure anxiety and fear beliefs. They were then shown pictures of two Australian marsupials — the Cuscus and the Quoll — that would be unfamiliar to them. They were read two versions of information about the animals, one ambiguous and one which described them as threatening, after which their fear responses towards each animal were assessed. Next, pairs of close friends (40 pairs of boys, 55 pairs of girls, and 26 boy-girl pairs) discussed their feelings about the animals, and their fear responses were measured again.

    The study also explored whether the children’s avoidance behaviours were affected by the discussion. They were given a map showing an enclosure, with one of the animals situated at one end of a path and the opening of the enclosure at the other. The children were asked to draw a cross on the path to show where they would like to be in the enclosure, with avoidance behaviour measured as the distance from the cross to the animal.

    After completing all the tasks the children were presented with real information about the Cuscus and the Quoll and shown a short video about each of them.

    Results showed that children influenced each other’s thoughts following the discussion: from the point at which they were given information about the animals to the discussion itself, their fear responses became more similar, and close friends’ fear responses in the information task significantly predicted children’s fear responses in the discussion task.

    Gender pair type predicted change in children’s fear responses over time. Children in boy-boy pairs showed a significant increase in fear responses following the discussion, bringing their fear level more in line with that of the other gender pairs, while those in girl-girl pairs showed a significant decrease in their fear beliefs, at least when threatening information was given. Differences in anxiety level between close friends did not affect change in fear responses over time.

  • Thinking with our hands can help find new ways of solving problems, research reveals

    {New research by two cognitive psychology experts from Kingston University London is demonstrating how our decision making is heavily influenced by the world around us. Gaëlle Vallée-Tourangeau, Professor of Organisational Behaviour, and Frédéric Vallée-Tourangeau, Professor of Psychology, are challenging the traditional idea that thinking takes place strictly in the head.}

    Have you ever tried to solve a complicated maths problem by using your hands, or shaped a piece of clay without planning it out in your head first? Understanding how we think and make decisions by interacting with the world around us could help businesses find new ways of improving productivity — and even improve people’s chances of getting a job, according to experts from Kingston University London.

    New research by Gaëlle Vallée-Tourangeau, Professor of Organisational Behaviour, and Frédéric Vallée-Tourangeau, Professor of Psychology, is challenging the traditional idea that thinking takes place strictly in the head. Instead, they are seeking to demonstrate how our decision making is heavily influenced by the world around us — and that using tools or objects when problem solving can spark new ways of finding solutions.

    The idea that thinking is done only in the head is a convenient illusion that doesn’t reflect how problems are solved in reality, Professor Gaëlle Vallée-Tourangeau explained. “When you write or draw, the action itself makes you think differently,” she said. “In cognitive psychology you are trained to see the mind as a computer, but we’ve found that people don’t think that way in the real world. If you give them something to interact with they think in a different way.”

    In a recently published study in the journal Acta Psychologica, the two academics from the British institution invited 50 participants to try to solve the problem of how to put 17 animals into four pens in such a way that there was an odd number of animals in each one.

    The participants were split into two groups, with the first group able to build physical models with their hands and the second group tasked with using an electronic tablet and stylus to sketch out an answer. They found that the model-building participants were much more likely to find the solution — which requires designing an overlapping pen configuration — than those with the tablet.
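    The reason an overlapping configuration is required comes down to parity: four odd numbers always sum to an even number, so 17 animals can never fill four disjoint pens. A few lines of Python (purely illustrative, not part of the study) confirm this by brute force:

    ```python
    from itertools import product

    # With four disjoint pens, each pen's count must be an odd number.
    # Four odd numbers always sum to an even number, so they can never
    # total 17 -- checking every combination of odd counts confirms it.
    odd_counts = range(1, 18, 2)  # 1, 3, 5, ..., 17
    disjoint_solutions = [
        combo for combo in product(odd_counts, repeat=4) if sum(combo) == 17
    ]
    print(disjoint_solutions)  # [] -- no disjoint layout works; pens must overlap
    ```

    One well-known family of answers nests one pen inside another, so the animals in the inner pen count toward both pens’ totals, letting every pen report an odd number.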

    “We showed with this study that for some types of problem — regardless of an individual’s cognitive ability — being able to physically interact with tools gave people a fighting chance of solving it,” said Professor Frédéric Vallée-Tourangeau. “By contrast, a pen-and-paper-type method almost guaranteed they wouldn’t be able to. It demonstrates how interacting with the world can really benefit people’s performance.”

    The cognitive psychology experts have also been working on a new piece of research exploring how maths anxiety — a debilitating emotional reaction to mental arithmetic that can lead sufferers to avoid even simple tasks like splitting a restaurant bill — could potentially be managed through interactivity.

    The study, now published in the Cognitive Research: Principles and Implications journal, involved asking people to speak a word repeatedly while doing long sums at the same time. It found that the mathematical ability of those asked to do the sums in their heads was more affected than those given number tokens that they could move with their hands.

    However, the really interesting finding was how a person’s maths anxiety affected the results, Professor Frédéric Vallée-Tourangeau said. “We found that for those adding the sums in their head, their maths anxiety score predicted the magnitude of errors made while speaking a word repeatedly. If they’re really maths anxious, the impact will be huge,” he explained. “But in a high interactivity context — when they were moving number tokens — they behaved as if they were not anxious about numbers.

    “The horrible thing about maths anxiety is that some people cope by completely avoiding maths, which only worsens the problem. That’s what makes these findings really interesting. Trying to understand why the fear factor is eliminated or controlled to a manageable level when using your hands rather than just your head is the question we’re trying to get to the bottom of now.”

    As well as potentially being of benefit when it comes to teaching, re-examining old ideas of how we think could have numerous practical applications, Professor Gaëlle Vallée-Tourangeau added. “If you look at recruitment, for example, a lot of assessment centres use classical intelligence tests when interviewing candidates. But depending on the type of work they are recruiting for, they may be missing out on the best people for the job.

    “In business and management, all the models are using the old metaphor of decision making as information processing, which is something I think we need to overcome. We need to redefine how thinking occurs.”

  • Multi-social millennials more likely depressed than social(media)ly conservative peers

    {Compared with the total time spent on social media, use of multiple platforms is more strongly associated with depression and anxiety among young adults, the University of Pittsburgh Center for Research on Media, Technology and Health (CRMTH) found in a national survey.}

    The analysis, published online and scheduled for the April print issue of the journal Computers in Human Behavior, showed that people who reported using seven to 11 social media platforms had more than three times the odds of depression and anxiety compared with their peers who used zero to two platforms, even after adjusting for the total time spent on social media overall.

    “This association is strong enough that clinicians could consider asking their patients with depression and anxiety about multiple platform use and counseling them that this use may be related to their symptoms,” said lead author and physician Brian A. Primack, M.D., Ph.D., director of CRMTH and assistant vice chancellor for health and society in Pitt’s Schools of the Health Sciences. “While we can’t tell from this study whether depressed and anxious people seek out multiple platforms or whether something about using multiple platforms can lead to depression and anxiety, in either case the results are potentially valuable.”

    In 2014, Primack and his colleagues sampled 1,787 U.S. adults ages 19 through 32, using an established depression assessment tool and questionnaires to determine social media use.

    The questionnaires asked about the 11 most popular social media platforms at the time: Facebook, YouTube, Twitter, Google Plus, Instagram, Snapchat, Reddit, Tumblr, Pinterest, Vine and LinkedIn.

    Participants who used seven to 11 platforms had 3.1 times the odds of reporting higher levels of depressive symptoms compared with their counterparts who used zero to two platforms. Those who used the most platforms had 3.3 times the odds of reporting high levels of anxiety symptoms compared with their peers who used the fewest. The researchers controlled for other factors that may contribute to depression and anxiety, including race, gender, relationship status, household income, education and total time spent on social media.
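    An odds ratio of roughly 3 can be made concrete with a small sketch; the counts below are invented for illustration only and are not the study’s data:

    ```python
    # Hypothetical 2x2 table (invented numbers, NOT the study's data):
    #                      depressive symptoms | no symptoms
    # used 7-11 platforms:        60           |     40
    # used 0-2 platforms:         33           |     67
    odds_high = 60 / 40   # odds of symptoms among heavy multi-platform users
    odds_low = 33 / 67    # odds among those using zero to two platforms
    odds_ratio = odds_high / odds_low
    print(round(odds_ratio, 1))  # ~3.0: about three times the odds
    ```

    The study’s actual figures came from regression models that also adjusted for race, gender, income and the other covariates listed above, so the raw-table arithmetic here is only a simplified picture of what “3.1 times the odds” means.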

    Primack, who also is a professor of medicine at Pitt, emphasized that the directionality of the association is unclear.

    “It may be that people who suffer from symptoms of depression or anxiety, or both, tend to subsequently use a broader range of social media outlets. For example, they may be searching out multiple avenues for a setting that feels comfortable and accepting,” said Primack. “However, it could also be that trying to maintain a presence on multiple platforms may actually lead to depression and anxiety. More research will be needed to tease that apart.”

    Primack and his team propose several hypotheses as to why multi-platform social media use may drive depression and anxiety:

    • Multitasking, as would happen when switching between platforms, is known to be related to poor cognitive and mental health outcomes.

    • The distinct set of unwritten rules, cultural assumptions and idiosyncrasies of each platform becomes increasingly difficult to navigate as the number of platforms used rises, which could lead to negative mood and emotions.

    • There is more opportunity to commit a social media faux pas when using multiple platforms, which can lead to repeated embarrassments.

    “Understanding the way people are using multiple social media platforms and their experiences within those platforms — as well as the specific type of depression and anxiety that social media users experience — are critical next steps,” said co-author and psychiatrist César G. Escobar-Viera, M.D., Ph.D., a postdoctoral research associate at Pitt’s Health Policy Institute and at CRMTH. “Ultimately, we want this research to help in designing and implementing educational public health interventions that are as personalized as possible.”
