Category: Science News

  • New appetite control mechanism found in brain

    {Study explains why food looks even better when dieting}

    Over recent decades, our understanding of hunger has greatly increased, but this new discovery turns things on their head. Up until now, scientists knew that leptin, a hormone released by fatty tissue, reduces appetite, while ghrelin, a hormone released by the stomach, makes us want to eat more. These hormones, in turn, activate a host of neurons in the brain’s hypothalamus, the body’s energy control center.

    The discovery of NPGL by Professor Kazuyoshi Ukena of Hiroshima University shows that hunger and energy consumption mechanisms are even more complex than we realized — and that NPGL plays a central role in what were thought to be well-understood processes.

    Professor Ukena first discovered NPGL in chickens after noticing that growing birds gained weight irrespective of diet, suggesting there was more to energy metabolism than meets the eye. Intrigued, the researchers at HU performed a DNA database search to see if mammals might also possess this elusive substance. They found that it exists in all vertebrates, including humans.

    In order to investigate its role, if any, in mammals, Professor Ukena’s team fed three groups of mice three distinct diets to see how NPGL levels changed. The first group was fed a low-calorie diet for 24 hours. The second group was fed a high-fat diet for 5 weeks, and the third lucky group was fed a high-fat diet for an extended period of 13 weeks.

    The mice fed the low-calorie diet showed a dramatic increase in NPGL expression, while the 5-week high-fat-diet group showed a large decrease in NPGL expression.

    Further analysis found that mice possess NPGL, and its associated neuron network, in exactly the same regions of the brain already known to control appetite suppression and energy use.

    Professor Ukena proposes that NPGL plays a vital role in these mechanisms — increasing appetite when energy levels fall and reducing appetite when an energy overload is detected — together, helping to keep us at a healthy and functioning weight, and more importantly alive!

    As NPGL levels greatly increased in mice exposed to a low-calorie diet, Professor Ukena believes it is an appetite promoter, working in opposition to appetite-suppressing hormones such as leptin. Backing this hypothesis up, mice directly injected with NPGL exhibited a voracious appetite.

    Interestingly, NPGL levels, which plummeted in the 5-week high-fat-diet mice, fell back to normal in the mice that gorged themselves for the longer period of 13 weeks.

    It is proposed that exposure to high-fat diets for long periods of time leads to insensitivity to leptin’s appetite-suppressing effects, and so NPGL, even at normal levels, leads to weight gain and obesity, showing that the body can only do so much to keep our weight in check.

    Professor Ukena says that further study is required to understand the interaction of previously known appetite mechanisms with this new kid on the homeostasis block. It does seem, however, that we still have a lot to learn about appetite, hunger, and energy consumption. It is hoped that this study into mammalian NPGL adds another piece to the puzzle.

    What is certain — but you knew this already — is that dieting is difficult. The discovery and study of mammalian NPGL helps explain why, and provides a plausible excuse for those whose good intentions fall short.

    Source: Science Daily

  • Scythian horse breeding unveiled: Lessons for animal domestication

    {Nomad Scythian herders roamed vast areas spanning the Central Asian steppes during the Iron Age, approximately from the 9th to the 1st century BCE (Before Common Era). These livestock pastoralists, who lived on wagons covered by tents, left their mark in the history of warfare for their exceptional equestrian skills. They were among the first to master mounted riding and to make use of composite bows while riding. A new study published in Science, led by Professor Ludovic Orlando and involving 33 international researchers from 16 universities, now reveals the suite of traits that Scythian breeders selected to engineer the type of horse that best fit their purpose.}

    The study took advantage of exceptionally preserved horse remains in royal Scythian burials, such as the site of Arzhan in the Tuva Republic, where over 200 horses have been excavated, and Berel’ in Kazakhstan, where no fewer than 13 horses were preserved in a single, permafrozen funerary chamber. Applying the latest methods in ancient DNA research, the researchers sequenced the genomes of 13 Scythian stallions. These were 2,300-2,700 years old and included 11 specimens from Berel’ and two from Arzhan. The researchers also sequenced the genome of one 4,100-year-old mare from Chelyabinsk, Russia, belonging to the earlier Sintashta culture, which developed the first two-wheeled chariots drawn by horses.

    The DNA variation observed at key genes revealed a large diversity of coat coloration patterns within Scythian horses, including bay, black, chestnut, cream and spotted animals. Scythian horses did not carry the mutation responsible for alternate gaits, and as a consequence, were not natural amblers. However, some but not all individuals carried variants associated with short-distance sprint performance in present-day racing horses. This indicates that Scythian breeders valued animals showing diverse endurance and speed potential.

    “With the exception of two horses, none of the animals were related. It fits with Herodotus’ depiction of Scythian funerary rituals, whereby sacrificed horses represented gifts from allied tribes spread across the steppes,” says Dr. Pablo Librado, post-doctoral researcher at the Centre for GeoGenetics, University of Copenhagen, Denmark, and co-leading author of the study.

    Importantly, none of the ancient horses analyzed in the study were inbred, which suggests that Scythian breeders succeeded in maintaining natural herd structures and did not perform selection through a limited number of valuable lineages. This contrasts with modern management practice where single stallions can be used to father hundreds of offspring. Patterns of genetic variation along the genome also revealed a total of 121 genes selected by Scythian breeders, most of which are involved in the development of forelimbs. This is reminiscent of the morphological indices measured on bones, and indicates that Scythian breeders selected horses showing more robust morphologies.

    “In this study we wanted to go beyond the myth of Scythians being aggressive warriors, drinking the blood of their enemies in skull mugs. We wanted to reveal the many facets of the exceptional relationship that these people developed with their horses,” says Ludovic Orlando, Professor of Molecular Archaeology at the Centre for GeoGenetics, University of Copenhagen and CNRS Research Director at the AMIS laboratory, University of Toulouse.

    The genome data set generated in the study also reveals important lessons for the history of horse management, which started some 5,500 years ago, and for animal domestication as a whole. By contrasting patterns of genetic variation in ancient and present-day horses, the authors found support for a significant demographic collapse during the last 2,300 years, which resulted in an important reduction of genetic diversity within horse domesticates. During the same time period, reproductive management has involved an increasingly reduced number of stallions, to the point that, today, virtually all domesticates carry the same, or highly similar, Y-chromosome haplotypes.

    “Many Y-chromosome haplotypes co-existed within Scythian horse populations. The first three millennia of horse domestication thus preserved a large diversity of male lineages. It only vanished within the last 2,000 years,” adds Dr. Cristina Gamba, post-doctoral researcher at the Centre for GeoGenetics at the time of the study, and co-leading author of the study.

    The authors also found that the demographic collapse and loss of Y-chromosomal diversity observed within the last 2,300 years was mirrored by a significant accumulation of deleterious mutations in the genome of the horse. As these mutations reduce the fitness of their carriers, this shows that the last two millennia of horse management have negatively impacted the horse. However, early domestication stages, as represented by the Sintashta and Scythian genomes, did not have such an impact. This contrasts with the cost-of-domestication hypothesis, which posits a negative impact starting from the earliest domestication stages. In the case of horse domestication, it is likely that the demographic collapse within the last 2,000 years reduced the efficacy of negative selection to purge deleterious mutations, which could then accumulate in the horse genome.

    Finally, the researchers developed a novel statistical method to investigate the genome data for signatures of positive selection in early domestication stages. They found that the genomic regions showing the most extreme signatures were involved in the development of the neural crest, and expressed within tissues derived from the neural crest.

    “The neural crest hypothesis proposes a unified model for the origin of similar traits commonly found in most domestic animals. As the neural crest represents a temporary group of cells during development which gives rise to many tissues and cell lineages, selection for genetic variants affecting the neural crest can almost in one go co-select for a range of traits. The over-representation detected in our study supports the neural crest as key to animal domestication and to the rise of common domestic traits in independent animal lineages,” concludes Professor Ludovic Orlando.

    These are Kazakh horses in North Central Kazakhstan.

    Source: Science Daily

  • What’s coming next? Scientists identify how the brain predicts speech

    {An international collaboration of neuroscientists has shed light on how the brain helps us to predict what is coming next in speech.}

    In the study, published on April 25 in the open access journal PLOS Biology, scientists from Newcastle University, UK, and a neurosurgery group at the University of Iowa, USA, report that they have discovered mechanisms in the brain’s auditory cortex involved in processing speech and predicting upcoming words, mechanisms that are essentially unchanged throughout evolution. Their research reveals how individual neurons coordinate with neural populations to anticipate events, a process that is impaired in many neurological and psychiatric disorders, such as dyslexia, schizophrenia and Attention Deficit Hyperactivity Disorder (ADHD).

    Using an approach first developed for studying infant language learning, the team of neuroscientists led by Dr Yuki Kikuchi and Prof Chris Petkov of Newcastle University had humans and monkeys listen to sequences of spoken words from a made-up language. Both species were able to learn the predictive relationships between the spoken sounds in the sequences.

    Neural responses from the auditory cortex in the two species revealed how populations of neurons responded to the speech sounds and to the learned predictive relationships between the sounds. The neural responses were found to be remarkably similar in both species, suggesting that the way the human auditory cortex responds to speech harnesses evolutionarily conserved mechanisms, rather than mechanisms uniquely specialized in humans for speech or language.

    “Being able to predict events is vital for so much of what we do every day,” Professor Petkov notes. “Now that we know humans and monkeys share the ability to predict speech we can apply this knowledge to take forward research to improve our understanding of the human brain.”

    Dr Kikuchi elaborates: “In effect, we have discovered the mechanisms for speech in your brain that work like predictive text on your mobile phone, anticipating what you are going to hear next. This could help us better understand what is happening when the brain fails to make fundamental predictions, such as in people with dementia or after a stroke.”
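
    To make the predictive-text analogy concrete, here is a minimal, purely illustrative Python sketch (not the method or stimuli used in the study) of how a learner can pick up predictive relationships in made-up word sequences simply by counting which item tends to follow which:

        # Toy illustration only: learn which word tends to follow which in an
        # invented "language", then predict the most likely next word.
        from collections import Counter, defaultdict

        def learn_transitions(sequences):
            """Count how often each word is followed by each other word."""
            counts = defaultdict(Counter)
            for seq in sequences:
                for current, nxt in zip(seq, seq[1:]):
                    counts[current][nxt] += 1
            return counts

        def predict_next(counts, word):
            """Return the most frequently observed successor of `word`, if any."""
            return counts[word].most_common(1)[0][0] if word in counts else None

        # Hypothetical made-up sequences, invented here for illustration.
        sequences = [["tep", "kuro", "lam"], ["tep", "kuro", "sidu"], ["bavi", "kuro", "lam"]]
        counts = learn_transitions(sequences)
        print(predict_next(counts, "kuro"))  # -> "lam" (it followed "kuro" most often)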

    Building on these results, the team are working on projects to harness insights on predictive signals in the brain to develop new models to study how these signals go wrong in patients with stroke or dementia. The long-term goal is to identify strategies that yield more accurate prognoses and treatments for these patients.

    Source: Science Daily

  • Smartphone-controlled cells help keep diabetes in check

    {Cells engineered to produce insulin under the command of a smartphone helped keep blood sugar levels within normal limits in diabetic mice, a new study reports.}

    More than 415 million people worldwide are living with diabetes, and frequently need to inject themselves with insulin to manage their blood sugars. Human cells can be genetically engineered into living factories that efficiently manufacture and deliver hormones and signaling molecules, but most synthetic biological circuits don’t offer the same degree of sensitivity and precision as digital sensors.

    Combining living tissues and technology, Jiawei Shao et al. created custom cells that produced insulin when illuminated by far-red light (the same wavelengths emitted by therapy bulbs and infrared saunas).

    The researchers added the cells to a soft, biocompatible sheath that also contained wirelessly powered red LED lights, creating “HydrogeLEDs” that could be turned on and off by an external electromagnetic field.

    Implanting the HydrogeLEDs into the skin of diabetic mice allowed Shao and colleagues to administer insulin doses remotely through a smartphone application. They not only custom-coded the smartphone control algorithms, but also designed the engineered cells to produce insulin without any “cross-talk” with normal cellular signaling processes.

    The scientists went on to pair the system with a Bluetooth-enabled blood glucose meter, creating instant feedback between the therapeutic cells and the diagnostic device that helped diabetic animals rapidly achieve and maintain stable blood glucose levels in a small pilot experiment over a period of several weeks.
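
    As a rough sketch of that closed-loop idea, the snippet below simply maps glucose readings to LED on/off commands. The threshold, function names and simulated readings are hypothetical illustrations; the actual smartphone app, hardware interfaces and dosing algorithm used by Shao and colleagues are not described here.

        # Conceptual sketch only: decide when to illuminate the far-red LEDs
        # (and hence trigger insulin production) from blood glucose readings.
        TARGET_MMOL_PER_L = 7.0  # assumed blood glucose target, illustrative value

        def decide_led_state(glucose_mmol_per_l):
            """Keep the implant's LEDs on only while glucose is above target."""
            return glucose_mmol_per_l > TARGET_MMOL_PER_L

        def control_loop(readings):
            """Map a series of meter readings to LED on/off commands."""
            return [decide_led_state(g) for g in readings]

        # Simulated readings standing in for a Bluetooth glucose meter.
        print(control_loop([11.2, 9.4, 7.8, 6.9, 6.5]))  # [True, True, True, False, False]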

    The authors say that successfully linking digital signals with engineered cells represents an important step toward translating similar cell-based therapies into the clinic. A related Focus by Mark Gomelsky highlights the findings further.

    Source: Science Daily

  • Primitive human ‘lived much more recently’

    {A primitive type of human, once thought to be up to three million years old, actually lived much more recently, a study suggests.}

    The remains of 15 partial skeletons belonging to the species Homo naledi were described in 2015.

    They were found deep in a cave system in South Africa by a team led by Lee Berger from Wits University.

    In an interview, he now says the remains are probably just 200,000 to 300,000 years old.

    Although Homo naledi shares some anatomical similarities with modern people, other features hark back to humans that lived in much earlier times, some two million years ago or more.

    “These look like a primitive form of our own genus – Homo. It looks like it might be connected to early Homo erectus, or Homo habilis, Homo rudolfensis,” said Prof Berger’s colleague, John Hawks, from the University of Wisconsin.

    Although some experts guessed that naledi could have lived relatively recently, in 2015 Prof Berger told BBC News that the remains could be up to three million years old.

    New dating evidence places the species in a time period where Homo naledi could have overlapped with early examples of our own kind, Homo sapiens.

    Prof Hawks told the BBC’s Inside Science radio programme: “They’re the age of Neanderthals in Europe, they’re the age of Denisovans in Asia, they’re the age of early modern humans in Africa. They’re part of this diversity in the world that’s there as our species was originating.”

    “We have no idea what else is out there in Africa for us to find – for me that’s the big message. If this lineage, which looks like it originated two million years ago, was still hanging around 200,000 years ago, then maybe that’s not the end of it. We haven’t found the last [Homo naledi], we’ve found one.”

    The naledi remains were uncovered in 2013 inside a difficult-to-access chamber within the Rising Star cave system. At the time, Prof Berger said he believed the remains had been deposited in the chamber deliberately, perhaps over generations.

    This idea, which would suggest that Homo naledi was capable of ritual behaviour, met with controversy because such practices are thought by some to be characteristic of human modernity.

    Prof Hawks says that the team has since started exploring a second chamber.

    “[The second] chamber has the remains of an additional three individuals, at least, including a really, really cool partial skeleton with a skull,” said Prof Hawks.

    Researchers have already attempted to extract DNA from the remains to gain more information about naledi’s place in the human evolutionary tree. However, they have not yet been successful.

    “[The remains] are obviously at an age where we have every reason to think there might be some chance. The cave is relatively warm compared to the cold caves in northern Europe and Asia where we have really good DNA preservation,” said Prof Hawks.

    A study outlining the dating evidence is due for publication in coming months.

    Homo naledi has much in common with early forms of the genus Homo.

    Source: BBC

  • The evolution of dog breeds now mapped

    {When people migrate, Canis familiaris travels with them. Piecing together the details of those migrations has proved difficult because the clues are scattered across the genomes of hundreds of dog breeds. However, in a study published April 25 in Cell Reports, researchers have used gene sequences from 161 modern breeds to assemble an evolutionary tree of dogs. The map of dog breeds, which is the largest to date, unearths new evidence that dogs traveled with humans across the Bering land bridge, and will likely help researchers identify disease-causing genes in both dogs and humans.}

    The study highlights how the oldest dog breeds evolved or were bred to fill certain roles. “First, there was selection for a type, like herders or pointers, and then there was admixture to get certain physical traits,” says study co-author and dog geneticist Heidi Parker of the National Institutes of Health (NIH). “I think that understanding that types go back a lot longer than breeds or just physical appearances do is something to really think about.”

    Most popular breeds in America are of European descent, but in the study, researchers found evidence that some breeds from Central and South America — such as the Peruvian Hairless Dog and the Xoloitzcuintle — are likely descended from the “New World Dog,” an ancient canine sub-species that migrated across the Bering Strait with the ancestors of Native Americans. Scientists have previously reported archaeological evidence that the New World Dog existed, but this study marks the first evidence that they live on in modern breeds.

    “What we noticed is that there are groups of American dogs that separated somewhat from the European breeds,” says Parker. “We’ve been looking for some kind of signature of the New World Dog, and these dogs have New World Dogs hidden in their genome.” It’s unclear precisely which genes in modern hairless dogs are from Europe and which are from their New World ancestors, but the researchers hope to explore that in future studies.

    Other results were more expected. For instance, many breeds of “gun dogs,” such as Golden Retrievers and Irish Setters, can trace their origins to Victorian England, when new technologies, such as guns, opened up new roles on hunting expeditions. Those dogs clustered closely together on the phylogenetic tree, as did the spaniel breeds. Breeds from the Middle East, such as the Saluki, and from Asia, such as Chow Chows and Akitas, seem to have diverged well before the “Victorian Explosion” in Europe and the United States.

    Herding breeds, though largely European in origin, proved to be surprisingly diverse. “When we were looking at herding breeds, we saw much more diversity, where there was a particular group of herding breeds that seemed to come out of the United Kingdom, a particular group that came out of northern Europe, and a different group that came out of southern Europe,” says Parker, “which shows herding is not a recent thing. People were using dogs as workers thousands of years ago, not just hundreds of years ago.”

    Different herding dogs use very different strategies to bring their flocks to heel, so in some ways, the phylogenetic data confirmed what many dog experts had previously suspected, the researchers noted. “What that also tells us is that herding dogs were developed not from a singular founder but in several different places and probably different times,” says the study’s senior co-author and dog geneticist Elaine Ostrander, also of the NIH.

    Ostrander and her colleagues have spent years sequencing dog genomes but can also frequently be found out in the field at dog shows, recruiting dog owners to participate in the study. “If we see a breed that we haven’t had a good sample of to sequence, we definitely make a beeline for that owner,” says Ostrander, “and say, ‘Gosh, we don’t have the sequence of the Otterhound yet, and your dog is a beautiful Otterhound. Wouldn’t you like it to represent your breed in the dog genome sequence database?’ And of course, people are always very flattered to say, ‘Yes. I want my dog to represent Otterhound-ness.’” All of the dog sequences in the study are from dogs whose owners volunteered, Ostrander says. Over half the dog breeds in the world today still have not been sequenced, and the researchers intend to keep collecting dog genomes to fill in the gaps.

    Understanding dogs’ genetic backstory also has practical applications. Our canine compatriots fall victim to many of the same diseases that humans do — including epilepsy, diabetes, kidney disease, and cancer — but disease prevalence varies widely and predictably between breeds, while it is more difficult to compartmentalize at-risk human populations. “Using all this data, you can follow the migration of disease alleles and predict where they are likely to pop up next, and that’s just so empowering for our field because a dog is such a great model for many human diseases,” says Ostrander. “Every time there’s a disease gene found in dogs it turns out to be important in people, too.”

    Source: Science Daily

  • The placebo effect can mend a broken heart too, study shows

    {Feeling heartbroken from a recent breakup? Just believing you’re doing something to help yourself get over your ex can influence brain regions associated with emotional regulation and lessen the perception of pain.}

    That’s the takeaway from a new University of Colorado Boulder study that measured the neurological and behavioral impacts the placebo effect had on a group of recently broken-hearted volunteers.

    “Breaking up with a partner is one of the most emotionally negative experiences a person can have, and it can be an important trigger for developing psychological problems,” said first author and postdoctoral research associate Leonie Koban, noting that such social pain is associated with a 20-fold higher risk of developing depression in the coming year. “In our study, we found a placebo can have quite strong effects on reducing the intensity of social pain.”

    For decades, research has shown that placebos — sham treatments with no active ingredients — can measurably ease pain, symptoms of Parkinson’s disease and other physical ailments.

    The new study, published in March in the Journal of Neuroscience, is the first to measure placebos’ impact on emotional pain from romantic rejection.

    Researchers recruited 40 volunteers who had experienced an “unwanted romantic breakup” in the past six months. They were asked to bring a photo of their ex and a photo of a same-gendered good friend to a brain-imaging lab.

    Inside a functional magnetic resonance imaging (fMRI) machine, the participants were shown images of their former partner and asked to recall the breakup. Then they were shown images of their friend. They were also subjected to physical pain (a hot stimulus on their left forearm).

    As these stimuli were alternately repeated, the subjects rated how they felt on a scale of 1 (very bad) to 5 (very good). Meanwhile, the fMRI machine tracked their brain activity.

    While not identical, the regions that lit up during physical and emotional pain were similar.

    This finding alone sends an important message to the heartbroken, said senior author Tor Wager, a professor of psychology and neuroscience at CU Boulder: “Know that your pain is real — neuro-chemically real.”

    The subjects were then taken out of the machine and given a nasal spray. Half were told it was a “powerful analgesic effective in reducing emotional pain.” Half were told it was a simple saline solution.

    Back inside the machine, the subjects were again shown images of their ex and subjected to pain. The placebo group not only felt less physical pain and felt better emotionally, but their brain responded differently when shown the ex.

    Activity in the brain’s dorsolateral prefrontal cortex — an area involved with modulating emotions — increased sharply. Across the brain, areas associated with rejection quieted. Notably, after the placebo, when participants felt the best they also showed increased activity in an area of the midbrain called the periaqueductal gray (PAG). The PAG plays a key role in modulating levels of painkilling brain chemicals, or opioids, and feel-good neurotransmitters like dopamine.

    While the study did not look specifically at whether the placebo prompted the release of such chemicals, the authors suspect this could be what’s happening.

    “The current view is that you have positive expectations and they influence activity in your prefrontal cortex, which in turn influences systems in your midbrain to generate neurochemical opioid or dopamine responses,” said Wager.

    Previous studies have shown that the placebo effect alone not only eases depression, but may actually make antidepressants work better.

    “Just the fact that you are doing something for yourself and engaging in something that gives you hope may have an impact,” said Wager. “In some cases, the actual chemical in the drug may matter less than we once thought.”

    The authors said the latest study not only helps them better understand how emotional pain plays out in the brain, but can also hint at ways people can use the power of expectation to their advantage.

    Said Koban: “What is becoming more and more clear is that expectations and predictions have a very strong influence on basic experiences, on how we feel and what we perceive.”

    Bottom line, if you’ve been dumped recently: “Doing anything that you believe will help you feel better will probably help you feel better,” she said.

    Source: Science Daily

  • Why children struggle to cross busy streets safely

    {New research shows perceptual judgment, motor skills not fully developed until age 14}

    For adults, crossing the street by foot seems easy. You take stock of the traffic and calculate the time it will take to get from one side to the other without being hit.

    Yet it’s anything but simple for a child.

    New research from the University of Iowa shows children under certain ages lack the perceptual judgment and motor skills to cross a busy road consistently without putting themselves in danger. The researchers placed children from 6 to 14 years old in a realistic simulated environment and asked them to cross one lane of a busy road multiple times.

    The results: Children up to their early teenage years had difficulty consistently crossing the street safely, with accident rates as high as 8 percent among 6-year-olds. Only by age 14 did children navigate street crossing without incident, while 12-year-olds mostly compensated for inferior road-crossing motor skills by choosing bigger gaps in traffic.

    “Some people think younger children may be able to perform like adults when crossing the street,” says Jodie Plumert, professor in the UI’s Department of Psychological and Brain Sciences. “Our study shows that’s not necessarily the case on busy roads where traffic doesn’t stop.”

    For parents, that means taking extra precautions. Be aware that your child may struggle to identify gaps in traffic large enough to cross safely. Young children also may not have developed the fine motor skills to step into the street the moment a car has passed, as adults have. And your child may allow eagerness to outweigh reason when judging the best time to cross a busy street.

    “They get the pressure of not wanting to wait combined with these less-mature abilities,” says Plumert, corresponding author on the study, which appears in the Journal of Experimental Psychology: Human Perception and Performance, published by the American Psychological Association. “And that’s what makes it a risky situation.”

    The National Center for Statistics and Analysis reported 8,000 injuries and 207 fatalities involving motor vehicles and pedestrians age 14 and younger in 2014.

    Plumert and her team wanted to understand the reasons behind the accident rates. For the study, they recruited children who were 6, 8, 10, 12, and 14 years old, as well as a control group of adults. Each participant faced a string of approaching virtual vehicles travelling 25 mph (considered a benchmark speed for a residential neighborhood) and then crossed a single lane of traffic (about nine feet wide). The time between vehicles ranged from two to five seconds. Each participant negotiated a road crossing 20 times, for about 2,000 total trips involving the age groups.

    The crossings took place in an immersive, 3-D interactive space at the Hank Virtual Environments Lab on the UI campus. The simulated environment is “very compelling,” says Elizabeth O’Neal, a graduate student in psychological and brain sciences and the study’s first author. “We often had kids reach out and try to touch the cars.”

    The researchers found 6-year-olds were struck by vehicles 8 percent of the time; 8-year-olds were struck 6 percent; 10-year-olds were struck 5 percent; and 12-year-olds were struck 2 percent. Those age 14 and older had no accidents.

    Children contend with two main variables when deciding whether it’s safe to cross a street, according to the research. The first involves their perceptual ability, or how they judge the gap between a passing car and an oncoming vehicle, taking into account the oncoming car’s speed and distance from the crossing. Younger children, the study found, had more difficulty making consistently accurate perceptual decisions.

    The second variable was their motor skills: How quickly do children time their step from the curb into the street after a car just passed? Younger children were incapable of timing that first step as precisely as adults, which in effect gave them less time to cross the street before the next car arrived.
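
    A rough back-of-the-envelope sketch of that timing problem, using the study’s lane width of about nine feet and gaps of two to five seconds; the walking speed and step-off delays below are illustrative assumptions, not values reported by the researchers:

        # Illustration only: is a traffic gap usable, given how long it takes to
        # start moving (step-off delay) and then to walk across a nine-foot lane?
        LANE_WIDTH_FT = 9.0       # approximate lane width in the simulation
        WALKING_SPEED_FT_S = 4.0  # assumed walking speed (hypothetical)

        def can_cross(gap_seconds, step_off_delay_s):
            """A gap works if the delay plus the crossing time fits inside it."""
            time_to_cross = LANE_WIDTH_FT / WALKING_SPEED_FT_S  # 2.25 s here
            return step_off_delay_s + time_to_cross <= gap_seconds

        print(can_cross(gap_seconds=3.0, step_off_delay_s=0.5))  # True: prompt, adult-like step-off
        print(can_cross(gap_seconds=3.0, step_off_delay_s=1.0))  # False: a later step-off needs 3.25 s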

    “Most kids choose similar size gaps (between the passing car and oncoming vehicle) as adults,” O’Neal says, “but they’re not able to time their movement into traffic as well as adults can.”

    The researchers found children as young as 6 crossed the street as quickly as adults, eliminating crossing speed as a possible cause for pedestrian-vehicle collisions.

    So what’s a child to do? One recommendation is for parents to teach their children to be patient and to encourage younger ones to choose gaps that are even larger than the gaps adults would choose for themselves, O’Neal says. Also, civic planners can help by identifying places where children are likely to cross streets and make sure those intersections have a pedestrian-crossing aid.

    “If there are places where kids are highly likely to cross the road, because it’s the most efficient route to school, for example, and traffic doesn’t stop there, it would be wise to have crosswalks,” Plumert says.

    Source: Science Daily

  • In young bilingual children, two languages develop simultaneously but independently

    {Study also shows Spanish is vulnerable to being taken over by English, but English is not vulnerable to being taken over by Spanish}

    A new study of Spanish-English bilingual children by researchers at Florida Atlantic University, published in the journal Developmental Science, finds that when children learn two languages from birth, each language proceeds on its own independent course, at a rate that reflects the quality of the children’s exposure to each language.

    In addition, the study finds that Spanish skills become vulnerable as children’s English skills develop, but English is not vulnerable to being taken over by Spanish. In their longitudinal data, the researchers found evidence that as the children developed stronger skills in English, their rates of Spanish growth declined. Spanish skills did not cause English growth to slow, so it’s not a matter of necessary trade-offs between two languages.

    “One well established fact about monolingual development is that the size of children’s vocabularies and the grammatical complexity of their speech are strongly related. It turns out that this is true for each language in bilingual children,” said Erika Hoff, Ph.D., lead author of the study, a psychology professor in FAU’s Charles E. Schmidt College of Science, and director of the Language Development Lab. “But vocabulary and grammar in one language are not related to vocabulary or grammar in the other language.”

    For the study, Hoff and her collaborators David Giguere, a graduate research assistant at FAU, and Jamie M. Quinn, a graduate research assistant at Florida State University, used longitudinal data on children who spoke English and Spanish as first languages and who were exposed to both languages from birth. They wanted to know whether the relationship between grammar and vocabulary was specific to a language or more language-general. They measured the vocabulary and level of grammatical development of these children at six-month intervals between the ages of 2½ and 4 years.

    The researchers explored a number of possibilities during the study. They thought it might be something internal to the child that causes vocabulary and grammar to develop on the same timetable or that there might be dependencies in the process of language development itself. They also considered that children might need certain vocabulary to start learning grammar and that vocabulary provides the foundation for grammar or that grammar helps children learn vocabulary. One final possibility they explored is that it may be an external factor that drives both vocabulary development and grammatical development.

    “If it’s something internal that paces language development, then it shouldn’t matter if it’s English or Spanish; everything should be related to everything,” said Hoff. “On the other hand, if it’s dependencies within a language between vocabulary and grammar, or vice versa, then the relations should be language specific and one should predict the other. That is, a child’s level of grammar should predict his or her future growth in vocabulary, or vice versa.”

    It turns out the data were consistent only with the final possibility: that the rates of vocabulary and grammar development are a function of something external to the child, something that exerts separate influences on growth in English and Spanish. Hoff and her collaborators suggest that the most cogent explanation lies in the properties of children’s input, that is, their language exposure.
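
    To illustrate that logic (this is not the study’s latent change score analysis, and the numbers are invented), a minimal simulation sketch: if a hypothetical “exposure quality” for each language separately drives that language’s vocabulary and grammar, within-language correlations come out high while cross-language correlations stay near zero.

        # Illustration only: an external driver per language produces the observed
        # pattern of strong within-language and weak cross-language correlations.
        import numpy as np

        rng = np.random.default_rng(0)
        n_children = 500
        english_exposure = rng.normal(size=n_children)  # hypothetical exposure quality
        spanish_exposure = rng.normal(size=n_children)

        # Each skill reflects its own language's exposure plus independent noise.
        eng_vocab   = english_exposure + 0.5 * rng.normal(size=n_children)
        eng_grammar = english_exposure + 0.5 * rng.normal(size=n_children)
        spa_grammar = spanish_exposure + 0.5 * rng.normal(size=n_children)

        print(np.corrcoef(eng_vocab, eng_grammar)[0, 1])  # high: same external driver
        print(np.corrcoef(eng_vocab, spa_grammar)[0, 1])  # near zero: different drivers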

    “Children may hear very rich language use in Spanish and less rich use in English, for example, if their parents are more proficient in Spanish than in English,” said Hoff. “If language growth were just a matter of some children being better at language learning than others, then growth in English and growth in Spanish would be more related than they are.”

    Detailed results of the study are described in the article, “What Explains the Correlation between Growth in Vocabulary and Grammar? New Evidence from Latent Change Score Analyses of Simultaneous Bilingual Development.”

    “There is something about differences among the children and the quality of English they hear that make some children acquire vocabulary and grammar more rapidly in English and other children develop more slowly,” said Hoff. “I think the key takeaway from our study is that it’s not the quantity of what the children are hearing; it’s the quality of their language exposure that matters. They need to experience a rich environment.”

    Source: Science Daily

  • More than recess: How playing on the swings helps kids learn to cooperate

    {A favorite childhood pastime — swinging on the playground swing set — also may be teaching kids how to get along.}

    The measured, synchronous movement of children on the swings can encourage preschoolers to cooperate on subsequent activities, University of Washington researchers have found.

    A study by the UW’s Institute for Learning & Brain Sciences (I-LABS) shows the potential of synchronized movement in helping young children develop collaborative skills. The study is published online in the Journal of Experimental Child Psychology.

    “Synchrony enhances cooperation, because your attention is directed at engaging with another person, at the same time,” explained Tal-Chen Rabinowitch, a postdoctoral researcher at I-LABS. “We think that being ‘in time’ together enhances social interaction in positive ways.”

    Previous studies, including others by Rabinowitch, have linked music and being in sync with other pro-social behaviors, such as helping, sharing and empathizing, among young children: Marching together to a song, for example, might prompt one child to share with another.

    In this study, Rabinowitch, along with I-LABS co-director and psychology professor Andrew Meltzoff, sought to focus on movement alone, without music, and examined how children cooperated with one another afterward. Cooperation — adapting to a situation, compromising with someone else, working toward a common goal — is considered a life skill, one that parents and teachers try to develop in a child’s early years.

    For the I-LABS study, researchers built a swing set that enabled two children to swing in unison, in controlled cycles of time. Pairs of 4-year-olds — who were unfamiliar to one another — were randomly assigned to groups that either swung together in precise time, swung out of sync with each other, or didn’t swing at all. The pairs in all three groups then participated in a series of tasks designed to evaluate their cooperation. In one activity, the children played a computer game that required them to push buttons at the same time in order to see a cartoon figure appear. Another, called the “give and take” activity, involved passing objects back and forth through a puzzle-like device.

    Researchers found that the children who swung in unison completed the tasks faster, indicating better cooperation than those who swung out of sync, or not at all. On the button-push task, for instance, the pairs who had been swinging together showed a greater tendency to strategically raise their hands before they pushed the button so as to signal their intent to the other child, which proved to be a successful tactic for the task.

    For 4-year-olds, moving in sync can create a feeling of “being like” another child that, consequently, may encourage them to communicate more and try to work together, Rabinowitch said.

    “Cooperation has both a social and cognitive side, because people can solve problems they couldn’t solve alone,” Meltzoff said. “We didn’t know before we started the study that cooperation between 4-year-olds could be enhanced through the simple experience of moving together. It’s provocative that kids’ cooperation can be profoundly changed by their experiences.”

    Rabinowitch believes the results of this study can have implications outside the lab. Teachers and parents can provide “in sync” opportunities for groups of children, whether through music, dance or play.

    Source: Science Daily