Category: Science News

  • Storing a memory involves distant parts of the brain

    {New research from scientists at the Howard Hughes Medical Institute’s Janelia Research Campus shows that distant parts of the brain are called into action to store a single memory. In studies with mice, the researchers discovered that to maintain certain short-term memories, the brain’s cortex — the outer layer of tissue thought to be responsible for generating most thoughts and actions — relies on connections with a small region in the center of the brain called the thalamus.}

    The thalamus is best known as a relay center that passes incoming sensory information to other parts of the brain for processing. But clinical findings suggested that certain parts of the thalamus might also play a critical role in consciousness and cognitive function. The discovery that the thalamus is needed to briefly store information so that animals can act on a past experience demonstrates that the region has a powerful influence on the function of the cortex, says Janelia group leader Karel Svoboda, who led the study. “It really suggests that cortex by itself cannot maintain these memories,” he says. “Instead the thalamus is an important participant.”

    Svoboda, Janelia group leader Shaul Druckmann, and their colleagues reported their findings in the May 11, 2017, issue of the journal Nature.

    When a memory is formed in the brain, activity in the cells that store the information changes for the duration of the memory. Since individual neurons cannot remain active for more than a few milliseconds on their own, groups of cells work together to store the information. Neurons signaling back and forth can sustain one another’s activity for the seconds that it takes to store a short-term memory.

    Svoboda wants to understand exactly how such memories are formed and maintained, including where in the brain they are stored. In prior work, his team determined that in mice, a region of the cortex called the anterior lateral motor cortex (ALM) is critical for short-term memory. Activity in this area is necessary for mice to perform a memory-related task in which they experience a sensory cue that they must remember for several seconds before they are given an opportunity to act on the cue and earn a reward.

    Svoboda and his colleagues wanted to understand if ALM stores these memories by itself, or if other parts of the brain work in concert with the ALM to store memories. ALM connects to several other brain regions via long-range connections. The next step was to investigate whether any of the region’s long-range communications were important for memory storage.

    Zengcai Guo and Hidehiko Inagaki, postdoctoral researchers in Svoboda’s lab, tested those connections one by one, evaluating whether switching off neurons in various brain regions interfered with memory-associated activity in the ALM and impacted animals’ ability to remember their cues.

    The results were clear. “The only player that perturbed the memory was the thalamus,” Svoboda says. “And it was an incredibly dramatic effect. If you turn off these thalamic neurons, activity and short-term memories completely disappear in the cortex. The cortex effectively becomes comatose.”

    In further experiments, the team discovered that information flows both ways between the thalamus and the ALM portion of the cortex. “It’s like a game of ping-pong,” Svoboda says. “One excites the other, and the other then excites the first, and so on and so forth. This back and forth maintains these activity patterns that correspond to the memory.”

    The finding highlights the functional importance of connections between distant parts of the brain, which Svoboda says are often neglected as neuroscientists focus their attention on activity within particular regions. “It was unexpected that these short-term memories are maintained in a thalamocortical loop,” he says. “This tells us that these memories are widely distributed across the brain.”

    Source: Science Daily

  • Region in brain found to be associated with fear of uncertain future

    {Findings could lead to new ways to identify, treat individuals at risk for anxiety disorders}

    People who struggle to cope with uncertainty or the ambiguity of potential future threats may have an unusually large striatum, an area of the brain already associated with generalized anxiety disorder, according to research published by the American Psychological Association.

    “Uncertainty and ambiguity of potential future threats are central to understanding the generation of anxiety and anxiety disorders,” said lead author Justin Kim, PhD, of Dartmouth College. “Our research suggests a relationship between an individual’s ability to deal with this uncertainty and the volume of gray matter within a specific area of the brain.”

    The research was published in the APA journal Emotion.

    In the study, 61 students had MRI scans taken of their brains after filling out a survey designed to measure their ability to tolerate the uncertainty of future negative events. Kim and his colleagues analyzed the MRIs and compared them with the intolerance of uncertainty scores. They found the volume of the striatum was significantly associated with intolerance of uncertainty.

    “People who had difficulty tolerating an uncertain future had a relatively enlarged striatum,” said Kim. “What surprised us was that it was only the striatum and not other parts of the brain we examined.”

    Previous studies focusing specifically on patients with obsessive-compulsive disorder and generalized anxiety disorder have also found increased gray matter volumes in the striatum, but this is the first time such an increase has been found in association with intolerance of uncertainty in the absence of a confirmed diagnosis, according to Kim.

    “Our findings demonstrate that the relationship between increased striatal volumes and intolerance of uncertainty can be observed in healthy individuals,” he said. “Having a relatively enlarged volume of the striatum may be associated with how intolerant you are when facing an uncertain future, but it does not mean you have OCD or generalized anxiety disorder.”

    While the striatum has been primarily known for its role in motor function, animal studies have also suggested that it plays a role in how we predict whether or not we will receive a reward for a particular behavior while learning new tasks, according to Kim. “To put it another way, the striatum encodes how predictable and expected a reward is — a higher form of reward processing compared to simply responding to a reward. Given that an important component of intolerance of uncertainty is a desire for predictability, our findings offer a biological marker related to our need for predictability,” he said.

    Since the findings came from psychologically healthy individuals, Kim suggested that the volume of the striatum in young adults could predict those at risk for developing generalized anxiety disorder or OCD later in life, but that remains to be seen. More important, he said, the findings could serve as a starting point for treating symptoms specific to these disorders by monitoring the striatum and tracking its volume over the course of treatment.

    Source: Science Daily

  • Newly discovered brain network offers clues to social cognition

    {Scientists call our ability to understand another person’s thoughts — to intuit their desires, read their intentions, and predict their behavior — theory of mind. It’s an essential human trait, one that is crucial to effective social interaction. But where did it come from?}

    Working with rhesus macaque monkeys, researchers in Winrich Freiwald’s Laboratory of Neural Systems at The Rockefeller University have discovered tantalizing clues about the origins of our ability to understand what other people are thinking. As reported in Science on May 18, Freiwald and postdoc Julia Sliwa have identified areas in the brains of these primates that are exclusively dedicated to analyzing social interactions. And they may have evolved into the neural circuitry that supports theory of mind in the human brain.

    The team used functional magnetic resonance imaging (fMRI) to identify those parts of the monkeys’ brains that become active when the animals watched different kinds of videos.

    Some of those videos showed inanimate objects (i.e., monkey toys) colliding or otherwise interacting physically. Others showed macaques interacting with the same objects by playing with them. And others still showed macaques interacting socially with other macaques: grooming, playing, fighting, etc.

    By analyzing the fMRI data, the researchers were able to determine precisely which portions of the monkeys’ brains responded to physical or social interactions. And much of what they found came as a surprise.

    Monkey see, monkey analyze

    For example, the team expected that areas containing specialized brain cells called mirror neurons, which fire when an animal performs an action such as grasping a stick or hitting a ball, or sees another animal performing the same action, would light up when the macaques watched other macaques playing with toys.

    But the macaques’ mirror neuron regions also showed activity when the animals watched their fellow monkeys interacting socially — and even when they watched objects colliding with other objects.

    That, says Sliwa, suggests that the mirror neuron system, which also exists in the human brain, could be more involved than previously thought in understanding a variety of both social and non-social interactions.

    The scientists also expected those areas of the brain that respond selectively to specific visual shapes — namely, faces, bodies, or objects — would be activated when the monkeys watched videos featuring those shapes. And that did indeed happen.

    Surprisingly, though, the body-selective areas of the macaques’ brains got an extra boost when the animals watched videos of monkeys interacting with objects. And their face-selective areas perked up even more in response to videos of monkey-on-monkey social interactions. This suggests that the same parts of the brain that are responsible for analyzing visual shapes might also be partly responsible for analyzing both physical and social interactions.

    An exclusive social network

    Most intriguingly, the team discovered that additional areas of the brain, far removed from those face- and body-selective areas, also lit up in response to social interactions. Digging deeper, the researchers even identified a portion of the network that responded exclusively to social interactions, remaining nearly silent in their absence.

    “That was both unexpected and mind-boggling,” says Freiwald, who explains that no other study has shown evidence of a network in the brain going dark when denied its preferred input.

    This socially sensitive network is located in the same areas of the brain that are associated with theory of mind in humans — areas that are similarly activated only when we reflect on the thoughts of others.

    As a result, says Sliwa, it could represent an “evolutionary precursor” to the neural network that produces theory of mind in our own brains. And we humans, in turn, might not be quite as unique — or as far removed from our primate cousins — as we like to think.

    While showing monkeys videos of social interaction, scientists scanned their brains and tracked their gaze (red dot).

    Source: Science Daily

  • Blind people have brain map for ‘visual’ observations too

    {Is what you’re looking at an object, a face, or a tree? When processing visual input, our brain uses different areas to recognize faces, body parts, scenes, and objects. Scientists at KU Leuven (University of Leuven), Belgium, have now shown that people who were born blind use a ‘brain map’ with a very similar layout to distinguish between these same categories.}

    Our brain only needs a split second to determine what we’re seeing. The area in our brain that can categorize these visual observations so quickly is the so-called ventral-temporal cortex, the visual brain. Like a map, this region is divided into smaller regions, each of which recognizes a particular category of observations — faces, body parts, scenes, and objects.

    Scientists have long wondered whether we’re born with this map, or whether its development relies on the visual input that we receive.

    To answer this question, researchers from the KU Leuven Laboratory of Biological Psychology conducted an experiment with people who were born blind — some of them even without eyeballs — and have therefore never processed any visual information.

    They asked the blind participants to listen to sounds from four categories: laughing, kissing, and lip smacking for faces; hand clapping and footsteps for body parts; forest and beach sounds for scenes; and a clock, washing machine, and car for objects. Meanwhile, a scanner measured the activity in their visual brain.

    “We found that blind individuals also use the map in the visual brain,” Professor Hans Op de Beeck from the KU Leuven Laboratory of Biological Psychology explains. “Their visual brain responds in a different way to each category. This means that blind people, too, use this part of the brain to differentiate between categories, even though they’ve never had any visual input. And the layout of their map is largely the same as that of sighted people. This means that visual experience is not required to develop category selectivity in the visual brain.”

    But these findings also raise new questions. For one thing, sounds are very different from visual input such as images and videos, so what exactly is being processed in blind people’s visual brain? Further research will have to show.

    Different brain areas recognize different visual categories: faces, objects, scenes, or body parts.

    Source: Science Daily

  • Why did hunter-gatherers first begin farming?

    {The beginnings of agriculture changed human history and have fascinated scientists for centuries.}

    Researchers from the Grantham Centre for Sustainable Futures at the University of Sheffield have shed light on how hunter-gatherers first began farming and how crops were domesticated to depend on humans.

    Domesticated crops have been transformed almost beyond recognition in comparison with their wild relatives — a change that happened during the early stages of farming in the Stone Age.

    For grain crops like cereals, the hallmark of domestication is the loss of natural seed dispersal — seeds no longer fall off plants but have become dependent on humans or machines to spread them.

    Professor Colin Osborne, from the Grantham Centre for Sustainable Futures at the University of Sheffield, said: “We know very little about how agriculture began, because it happened 10,000 years ago — that’s why a number of mysteries are unresolved. For example, why hunter-gatherers first began farming, and how crops were domesticated to depend on people.

    “One controversy in this area is about the extent to which ancient peoples knew they were domesticating crops. Did they know they were breeding domestication characteristics into crops, or did these characteristics just evolve as the first farmers sowed wild plants into cultivated soil, and tended and harvested them?”

    The new research, published in the journal Evolution Letters, shows the impact of domestication on vegetable seed size.

    Any selective breeding of vegetables by early farmers would have acted on the leaves, stems or roots that were eaten as food, but should not have directly affected seed size.

    Instead, any changes in vegetable seed size must have arisen from natural selection acting on these crops in cultivated fields, or from genetic links to changes in another characteristic like plant or organ size. In the last instance, people might have bred crops to become bigger, and larger seeds would have come along unintentionally.

    The University of Sheffield researchers gathered seed size data from a range of crops and found strong evidence for a general enlargement of seeds due to domestication.

    They discovered that domesticated maize seeds are 15 times bigger than those of the wild form, and soybean seeds are seven times bigger. Wheat, barley and other grain crops showed more modest increases in size (60 per cent for barley and 15 per cent for emmer wheat), but these changes are important if they translate into yield.

    “We found strong evidence for a general enlargement of seeds due to domestication across seven vegetable species,” said Professor Osborne.

    “This is especially stunning in a crop like a sweet potato, where people don’t even plant seeds, let alone harvest them. The size of this domestication effect falls completely within the range seen in cereals and pulse grains like lentils and beans, raising the possibility that at least part of the seed enlargement in these crops also evolved during domestication without deliberate foresight from early farmers.”

    Professor Osborne added: “Our findings have important implications for understanding how crops evolved, because they mean that major changes in our staple crops could have arisen without deliberate foresight by early farmers.

    “This means that unconscious selection was probably more important in the genesis of our food plants than previously realised. Early increases in the yields of crops might well have evolved in farmers’ fields rather than being bred artificially.”

    Source: Science Daily

  • The human sense of smell: It’s stronger than we think

    {Researcher debunks 19th century myth that animals are better at sniffing out scents}

    When it comes to our sense of smell, we have been led to believe that animals win out over humans: No way can we compete with dogs and rodents, some of the best sniffers in the animal kingdom.

    But guess what? It’s a big myth. One that has survived for the last 150 years with no scientific proof, according to Rutgers University-New Brunswick neuroscientist John McGann, associate professor in the Department of Psychology, School of Arts and Sciences, in a paper published on May 12 in Science.

    McGann, who has been studying the olfactory system, or sense of smell, for the past 14 years, spent part of the last year reviewing existing research, examining data and delving into the historical writings that helped create the long-held misconception that the human sense of smell was inferior because of the size of the olfactory bulb.

    “For so long people failed to stop and question this claim, even people who study the sense of smell for a living,” says McGann, who studies how the brain understands sensory stimuli using information gleaned from prior experience.

    “The fact is the sense of smell is just as good in humans as in other mammals, like rodents and dogs.” Humans can discriminate maybe one trillion different odors, he says, far more than the roughly 10,000 different odors that “folk wisdom and poorly sourced introductory psychology textbooks” insist humans can detect.

    McGann points to Paul Broca, a 19th century brain surgeon and anthropologist, as the culprit for the falsehood that humans have an impoverished olfactory system — an assertion that, McGann says, even influenced Sigmund Freud to insist that this deficiency made humans susceptible to mental illness.

    “It has been a long cultural belief that in order to be a reasonable or rational person you could not be dominated by a sense of smell,” says McGann. “Smell was linked to earthly animalistic tendencies.” The truth about smell, McGann says, is that the human olfactory bulb, which sends signals to other areas of a very powerful human brain to help identify scents, is quite large and similar in the number of neurons to other mammals.

    The olfactory receptor neurons in the nose work by making physical contact with the molecules composing the odor, and they send this information back to that region of the brain.

    “We can detect and discriminate an extraordinary range of odors; we are more sensitive than rodents and dogs for some odors; we are capable of tracking odor trails; and our behavioral and affective states are influenced by our sense of smell,” McGann writes in Science.

    In Broca’s 1879 writings, he claimed that the smaller volume of the olfactory area compared to the rest of the brain meant that humans had free will and didn’t have to rely on smell to survive like dogs and other mammals.

    In reality, McGann says, there is no support for the notion that a larger olfactory bulb improves the sense of smell, and he insists that the human sense of smell is just as good as that of animals.

    “Dogs may be better than humans at discriminating the urines on a fire hydrant and humans may be better than dogs at discriminating the odors of fine wine, but few such comparisons have actual experimental support,” McGann writes in Science.

    The idea that humans don’t have the same sense of smell abilities as animals flourished over the years based on genetic studies which discovered that rats and mice have genes for about 1,000 different kinds of receptors that are activated by odors, compared with only about 400 in humans.

    “I think it has been too easy to get caught up in numbers,” says McGann. “We’ve created a confirmation bias by working off a held belief that humans have a poor sense of smell because of these lower numbers of receptors, which in reality is still an awful lot.”

    The problem with this continuing myth, McGann says, is that smell is much more important than we think. It strongly influences human behavior, elicits memories and emotions, and shapes perceptions.

    Our sense of smell plays a major, sometimes unconscious, role in how we perceive and interact with others and how we select a mate, and it helps us decide what we like to eat. And when it comes to handling traumatic experiences, smell can be a trigger in activating PTSD.

    While smell can begin to deteriorate as part of the aging process, McGann says, physicians should be more concerned when a patient begins to lose the ability to detect odors, rather than retreating to the misconception that humans’ sense of smell is inferior.

    “Some research suggests that losing the sense of smell may be the start of memory problems and diseases like Alzheimer’s and Parkinson’s,” says McGann. “One hope is that the medical world will begin to understand the importance of smell and that losing it is a big deal.”

    Humans can detect about one trillion different odors and have just as good a sense of smell as animals, suggests a new study.

    Source: Science Daily

  • Marijuana use tied to poorer school performance

    {When high school students started smoking marijuana regularly they were less likely to get good grades and want to pursue university, according to a new study from the University of Waterloo.}

    The study, published in the Journal of School Health, found that when students started using marijuana at least once a month they were about four times more likely to skip class, two to four times less likely to complete their homework and to value getting good grades, and about half as likely to achieve high grades, compared with when they had never used the drug.

    The study also asked students the highest level of education they would like and expect to achieve. Results indicated that when students started smoking marijuana daily, their likelihood of reporting ambitions to pursue university, as opposed to stopping at high school or before, was about 50 per cent lower than when they had never used the drug.

    “The findings support the importance of preventing and delaying the initiation of marijuana use among adolescents,” said Scott Leatherdale, a professor in the School of Public Health and Health Systems and head of COMPASS, the largest longitudinal study of substance use among youth. “More youth today use marijuana than cigarettes, yet public health prevention efforts lag behind those of alcohol and tobacco.”

    The human brain actively develops until a person reaches their early twenties. Studies suggest that adults who smoked the drug regularly during adolescence exhibit reduced neural connectivity in regions responsible for memory, learning and inhibitions.

    “We’ve seen reductions in the number of youth perceiving marijuana as harmful, yet they have greater vulnerability to adverse consequences,” said Karen Patte, a post-doctoral fellow and lead author of the paper. “We found that the more frequently students started using the drug, the greater their risk of poor school performance and engagement.”

    The study also looked at the effects of alcohol use on academic aspirations and expectations. Unlike marijuana, students initiating regular alcohol use tended to report goals to pursue post-secondary education.

    “Drinking has long been tied to university settings, which may make alcohol a more acceptable substance choice for students planning to attend university,” said Leatherdale. “All substances present risks to health and well-being. With marijuana legalization on the horizon, it’s critical we understand these risks in order to promote successful transitions into adulthood for our youth.”

    Source: Science Daily

  • Men and women show equal ability at recognizing faces

    {Despite conventional wisdom that suggests women are better than men at facial recognition, Penn State psychologists found no difference between men and women in their ability to recognize faces and categorize facial expressions.}

    In the study, the researchers used behavioral tests, as well as neuroimaging, to investigate whether there is an influence of biological sex on facial recognition, according to Suzy Scherf, assistant professor of psychology and neuroscience.

    “There has been common lore in the behavioral literature that women do better than men in many types of face-processing tasks, such as face recognition and detecting and categorizing facial expressions, although, when you look in the empirical literature, the findings are not so clear cut,” said Scherf. “I went into this work fully expecting to see an effect of biological sex on the part of the observer in facial recognition — and we did not find any. And we looked really hard.”

    Scherf said that facial recognition is one of the most important skills people use to navigate social interactions. It is also a key motivation for certain types of behavior, as well.

    “Within 30 milliseconds of looking at a face, you can figure out the age, the sex, whether you know the person or not, whether the person is trustworthy, whether they’re competent, attractive, warm, caring — we can make categorizations on faces that fast,” said Scherf. “And some of that is highly coordinated with our behavioral decisions of what we are going to do following those attributions and decisions. For example: Do I want to vote for this person? Do I want to have a conversation with this person? Where do I fit in the status hierarchy? A lot of what we do is dictated by the information we get from faces.”

    Scherf added that the importance of facial recognition for both sexes underlines the logic of why men and women should have equal facial recognition abilities.

    “Faces are just as important for men, you can argue, as they are for women,” said Scherf. “Men get all the same cues from faces that women do.”

    According to Scherf, the researchers also found no evidence for another commonly held belief: that women can recognize faces of their own biological sex more easily than faces of the other sex, a phenomenon referred to as “own gender bias.”

    The researchers, who report their findings in eNeuro (available online), used a common face recognition task called the Cambridge Face Memory Test, which measures whether a person can identify a male face out of a lineup of three faces. They also created their own female version of the memory test; because of previous concerns about an own gender bias in women, the standard Cambridge Face Memory Test features only male faces.

    “We couldn’t test the own gender bias without a female version of this test,” said Scherf, who worked with Daniel B. Elbich and Natalie V. Motta-Mena, both graduate students in psychology.

    In a second test, the researchers scanned the brains of participants in an MRI machine while the subjects watched a series of short video clips of unfamiliar faces, famous faces, common objects and navigational scenes, such as a clip of the Earth from outer space, and again, in a separate task, while they recognized specific faces.

    After the tests, the researchers found that patterns of neural activity in areas known for facial recognition — as well as other types of visual recognition — were statistically identical for men and women.

    Participants were carefully selected for the study because certain conditions can affect facial recognition.

    “In order to enroll someone in our study, we went through a careful screening procedure to make sure that people did not have a history of neurological or psychiatric disorders in themselves, or in their first-degree relatives,” said Scherf. “This is important because in nearly all the affective disorders — depression, anxiety, schizophrenia, bipolar — face processing is disrupted.”

    The researchers also screened out participants with concussions, which can disrupt patterns of brain activation and function, Scherf added.

    Scherf, who also studies adolescents and pubertal development, began to investigate biological sex differences to further her own understanding of what sex differences — if any — exist in sexually mature men and women, compared to adolescents.

    The Social Science Research Institute and the National Science Foundation supported this work.

    The study found no evidence of the commonly held belief that women can recognize faces more accurately than men.

    Source: Science Daily

  • The family dog could help boost physical activity for kids with disabilities

    {The family dog could serve as a partner and ally in efforts to help children with disabilities incorporate more physical activity into their daily lives, a new study from Oregon State University indicates.}

    In a case study of one 10-year-old boy with cerebral palsy and his family’s dog, researchers found the intervention program led to a wide range of improvements for the child, including in physical activity as well as motor skills, quality of life and human-animal interaction.

    “These initial findings indicate that we can improve the quality of life for children with disabilities, and we can get them to be more active,” said Megan MacDonald, an assistant professor in OSU’s College of Public Health and Human Sciences and corresponding author on the study. “And in this case, both are happening simultaneously, which is fantastic.”

    The researchers detailed the child’s experience in the adapted physical activity intervention program in a case study just published in the journal Animals. Co-authors are Monique Udell of the OSU College of Agricultural Sciences; Craig Ruaux of the OSU College of Veterinary Medicine; Samantha Ross of the OSU College of Public Health and Human Sciences; Amanda Tepfer of Norwich University and Wendy Baltzer of Massey University in New Zealand. The research was supported by the Division of Health Sciences at OSU.

    Children with physical disabilities such as cerebral palsy spend significantly less time participating in physical activity than their peers and are considered a health disparity group, meaning they generally face more health concerns.

    Researchers designed an adapted, animal-assisted physical activity intervention in which the family dog serves as the child’s partner in activities intended to improve overall physical activity, motor skills and quality of life. The family dog is a good choice for this type of intervention because the animal is already known to the child, an existing relationship is in place, and both the dog and the child benefit from the activities, MacDonald said.

    Researchers took initial assessments of the child’s daily physical activity, motor skills and quality of life before starting the eight-week intervention. A veterinarian examined the dog’s fitness for participation, and the human-animal interaction between the dog, a year-old Pomeranian, and the child was also assessed.

    Then the pair began the eight-week intervention, which included a supervised physical activity program once a week for 60 minutes and participation in activities such as brushing the dog with each hand; playing fetch and alternating hands; balancing on a wobble board; and marching on a balancing disc.

    “The dog would also balance on the wobble board, so it became a challenge for the child — if the dog can do it, I can, too,” MacDonald said. “It was so cool to see the relationship between the child and the dog evolve over time. They develop a partnership and the activities become more fun and challenging for the child. It becomes, in part, about the dog and the responsibility of taking care of it.”

    The dog and the child also had “homework,” which included brushing the dog, playing fetch and going on daily walks. The child wore an accelerometer to measure physical activity levels at home.

    At the conclusion of the intervention, researchers re-assessed and found that the child’s quality of life had increased significantly in several areas, including emotional, social and physical health, as assessed by the child as well as the parent. In addition, the child’s sedentary behavior decreased and time spent on moderate to vigorous activity increased dramatically.

    “The findings so far are very encouraging,” MacDonald said. “There’s a chance down the road we could be encouraging families to adopt a dog for the public health benefits. How cool would that be?”

    The researchers also found that the relationship between the dog and the child improved over the course of the therapy as they worked together on various tasks. The dog’s prosocial, or positive, behavior toward the child is a sign of wellbeing for both members of the team, said Udell, who is director of the Human-Animal Interaction Lab at OSU.

    “A closer child-dog bond increases the likelihood of lasting emotional benefits and may also facilitate long-term joint activity at home, such as taking walks, simply because it is enjoyable for all involved,” she said.

    This study is one of the first to evaluate how a dog’s behavior and wellbeing are affected by their participation in animal-assisted therapy, Udell noted. From an animal welfare standpoint, it is promising that the dog’s behavior and performance on cognitive and physical tasks improved alongside the child’s.

    Though the case study features only one child, the research team recruited several families with children with disabilities and their dogs to participate in the larger project, which was intended in part to test the study’s design and methodology and determine whether it could be implemented on a larger scale.

    Based on the initial results, researchers hope to pursue additional studies involving children with disabilities and their family dogs, if funding can be secured. They would like to examine other benefits such a pairing might have, including the sense of responsibility the child appears to gain during the course of the intervention.

    “We’re also learning a lot from our child participants,” MacDonald said. “They’re teaching us stuff about friendship with the animal and the responsibility of taking care of a pet, which allows us to ask more research questions about the influence of a pet on the child and their family.”

    Source: Science Daily

  • Oldest evidence of life on land found in 3.48 billion-year-old Australian rocks

    {Fossils discovered by UNSW scientists in 3.48 billion-year-old hot spring deposits in the Pilbara region of Western Australia have pushed back by 580 million years the earliest known existence of microbial life on land.}

    Previously, the world’s oldest evidence for microbial life on land came from 2.7-2.9 billion-year-old deposits in South Africa containing ancient soils rich in organic matter.

    “Our exciting findings don’t just extend back the record of life living in hot springs by 3 billion years, they indicate that life was inhabiting the land much earlier than previously thought, by up to about 580 million years,” says study first author, UNSW PhD candidate, Tara Djokic.

    “This may have implications for an origin of life in freshwater hot springs on land, rather than the more widely discussed idea that life developed in the ocean and adapted to land later.”

    Scientists are considering two hypotheses regarding the origin of life: either it began in deep-sea hydrothermal vents, or it began on land in a version of Charles Darwin’s “warm little pond.”

    “The discovery of potential biological signatures in these ancient hot springs in Western Australia provides a geological perspective that may lend weight to a land-based origin of life,” says Ms Djokic.

    “Our research also has major implications for the search for life on Mars, because the red planet has ancient hot spring deposits of a similar age to the Dresser Formation in the Pilbara.

    “Of the top three potential landing sites for the Mars 2020 rover, Columbia Hills is indicated as a hot spring environment. If life can be preserved in hot springs so far back in Earth’s history, then there is a good chance it could be preserved in Martian hot springs too.”

    The study, by Ms Djokic and Professors Martin Van Kranendonk, Malcolm Walter and Colin Ward of UNSW Sydney, and Professor Kathleen Campbell of the University of Auckland, is published in the journal Nature Communications.

    The researchers studied exceptionally well-preserved deposits which are approximately 3.5 billion years old in the ancient Dresser Formation in the Pilbara Craton of Western Australia.

    They interpreted the deposits as having formed on land, not in the ocean, by identifying the presence of geyserite – a mineral deposit formed from near-boiling, silica-rich fluids that is found only in terrestrial hot spring environments. Previously, the oldest known geyserite had been identified in rocks about 400 million years old.

    Within the Pilbara hot spring deposits, the researchers also discovered stromatolites – layered rock structures created by communities of ancient microbes. There were other signs of early life in the deposits as well, including fossilised micro-stromatolites, microbial palisade textures and well-preserved bubbles inferred to have been trapped in a sticky microbial substance that preserved their shape.

    “This shows a diverse variety of life existed in fresh water, on land, very early in Earth’s history,” says Professor Van Kranendonk, Director of the Australian Centre for Astrobiology and head of the UNSW School of Biological, Earth and Environmental Sciences.

    “The Pilbara deposits are the same age as much of the crust of Mars, which makes hot spring deposits on the red planet an exciting target for our quest to find fossilised life there.”

    Spherical bubbles preserved in 3.48 billion-year-old rocks in the Dresser Formation in the Pilbara Craton in Western Australia provide evidence for early life having lived in ancient hot springs on land.

    Source: Science Daily