Category: Science News

  • Seeing structure that allows brain cells to communicate

    {For more than a century, neuroscientists have known that nerve cells talk to one another across the small gaps between them, a process known as synaptic transmission. But the details of how this crucial aspect of brain function occurs have remained elusive. Now, new research has for the first time elucidated details about the architecture that allows brain cells to communicate.}

    For more than a century, neuroscientists have known that nerve cells talk to one another across the small gaps between them, a process known as synaptic transmission (synapses are the connections between neurons). Information is carried from one cell to the other by neurotransmitters such as glutamate, dopamine, and serotonin, which activate receptors on the receiving neuron to convey excitatory or inhibitory messages.

    But beyond this basic outline, the details of how this crucial aspect of brain function occurs have remained elusive. Now, new research by scientists at the University of Maryland School of Medicine (UM SOM) has for the first time elucidated details about the architecture of this process. The paper was published today in the journal Nature.

    Synapses are very complicated molecular machines. They are also tiny: only a few millionths of an inch across. They have to be incredibly small because the brain needs so many of them; it contains around 100 trillion synapses, each individually and precisely tuned to convey stronger or weaker signals between cells.

    To visualize features on this sub-microscopic scale, the researchers turned to an innovative technology known as single-molecule imaging, which can locate and track the movement of individual protein molecules within the confines of a single synapse, even in living cells. Using this approach, the scientists identified an unexpected and precise pattern in the process of neurotransmission. The researchers looked at cultured rat synapses, which in terms of overall structure are very similar to human synapses.

    “We are seeing things that have never been seen before. This is a totally new area of investigation,” said Thomas Blanpied, PhD, Associate Professor in the Department of Physiology, and leader of the group that performed the work. “For many years, we’ve had a list of the many types of molecules that are found at synapses, but that didn’t get us very far in understanding how these molecules fit together, or how the process really works structurally. Now by using single-molecule imaging to map where many of the key proteins are, we have finally been able to reveal the core architectural structure of the synapse.”

    In the paper, Blanpied describes an unexpected aspect to this architecture that may explain why synapses are so efficient, but also susceptible to disruption during disease: at each synapse, key proteins are organized very precisely across the gap between cells. “The neurons do a better job than we ever imagined of positioning the release of neurotransmitter molecules near their receptors,” Blanpied says. “The proteins in the two different neurons are aligned with incredible precision, almost forming a column stretching between the two cells.” This proximity optimizes the power of the transmission, and also suggests new ways that this transmission can be modified.

    Blanpied’s lab has created a video representation of the process: https://youtu.be/PNhUqhwHDaQ

    Understanding this architecture will help clarify how communication within the brain works, or, in the case of psychiatric or neurological disease, how it fails to work. Blanpied is also focusing on the activity of “adhesion molecules,” which stretch from one cell to the other and may be important pieces of the “nano-column.” He suspects that if adhesion molecules are not placed correctly at the synapse, synapse architecture will be disrupted, and neurotransmitters won’t be able to do their jobs. Blanpied hypothesizes that in at least some disorders, the issue may be that even though the brain has the right amount of neurotransmitter, the synapses don’t transmit these molecules efficiently.

    Blanpied says that this improved comprehension of synaptic architecture could lead to a better understanding of brain diseases such as depression, schizophrenia and Alzheimer’s disease, and perhaps suggest new ideas for treatments.

    Blanpied and his colleagues will next explore whether the synaptic architecture changes in certain disorders: they will begin by looking at synapses in a mouse model of schizophrenia.

    Trans-synaptic nanocolumn. A new model of the molecular architecture at points of neuron-to-neuron contact in the brain, based on measuring the location of individual protein molecules at the sites where cell contact is made.
  • DNA analyses reveal genetic identities of world’s first farmers

    {Research reshapes understanding of genetic heritage of modern West Eurasians.}

    Conducting the first large-scale, genome-wide analyses of ancient human remains from the Near East, an international team of scientists has illuminated the genetic identities and population dynamics of the world’s first farmers.

    The study reveals three genetically distinct farming populations living in the Near East at the dawn of agriculture 12,000 to 8,000 years ago: two newly described groups in Iran and the Levant and a previously reported group in Anatolia, in what is now Turkey.

    The findings, published in Nature on July 25, also suggest that agriculture spread in the Near East at least in part because existing groups invented or adopted farming technologies, rather than because one population replaced another.

    “Some of the earliest farming was practiced in the Levant, including Israel and Jordan, and in the Zagros mountains of Iran, two edges of the Fertile Crescent,” said Ron Pinhasi, associate professor of archaeology at University College Dublin and co-senior author of the study. “We wanted to find out whether these early farmers were genetically similar to one another or to the hunter-gatherers who lived there before so we could learn more about how the world’s first agricultural transition occurred.”

    The team’s analyses alter what is known about the genetic heritage of present-day people in western Eurasia, who now appear to have descended from four major groups: hunter-gatherers in what is now western Europe, hunter-gatherers in eastern Europe and the Russian steppe, the Iran farming group and the Levant farming group.

    “We found that the relatively homogeneous population seen across western Eurasia today, including Europe and the Near East, used to be a highly substructured collection of people who were as different from one another as present-day Europeans are from East Asians,” said David Reich, professor of genetics at Harvard Medical School and co-senior author of the study.

    “Near East populations mixed with one another over time and migrated into surrounding regions to mix with the people living there until those initially quite diverse groups became genetically very similar,” added Iosif Lazaridis, HMS research fellow in genetics and first author of the study.

    Early adopters

    Even as advances in ancient-DNA technology have made it possible to probe population mixing and large-scale migrations that occurred thousands of years ago, researchers have had trouble studying the genetic history of the Near East because the region’s warm climate has degraded much of the DNA in unearthed bones.

    The team overcame the problem of poor-quality DNA in part by extracting genetic material from ear bones that can yield up to 100 times more DNA than other bones in the body. The team also used a technique called in-solution hybridization to enrich for human DNA and filter out contaminant DNA from microbes.

    The combined techniques allowed the researchers to gather high quality genomic information from 44 ancient Near Easterners who lived between 14,000 and 3,400 years ago: hunter-gatherers from before the invention of farming, the first farmers themselves and their successors.

    By comparing the genomes to one another as well as to those of nearly 240 previously studied ancient people from nearby regions and about 2,600 present-day people, the researchers learned that the first farming cultures in the Levant, Iran and Anatolia were all genetically distinct. Farmers in the Levant and Iran were genetically similar, however, to earlier hunter-gatherers who had lived in the same areas.

    “Maybe one group domesticated goats and another began growing wheat, and the practices were shared in some way,” said Lazaridis. “These different populations all invented or adopted some facets of the farming revolution, and they all flourished.”

    The findings tell a different story from what researchers believe happened later in Europe, when the first farmers moved in from Anatolia and largely replaced the hunter-gatherer populations who’d been living there.

    Mix and match

    Over the following 5,000 years, the Near East farming groups mixed with one another and with hunter-gatherers in Europe.

    “All this extraordinary diversity collapsed,” said Reich. “By the Bronze Age, populations had ancestry from many sources and broadly resembled present-day ones.”

    The researchers also learned how descendants of each early farming group, even as they began to intermingle, contributed to the genetic ancestry of people in different parts of the world: Farmers related to the Anatolian group spread west into Europe, people related to the Levant group moved south into East Africa, people related to those in Iran or the Caucasus went north into the Russian steppe, and people related to both the farmers in Iran and hunter-gatherers from the steppe spread into South Asia.

    “The Near East was the missing link to understanding many human migrations,” said Pinhasi.

    Finally, the study provides a few more clues about a hypothetical, even more ancient, population called the Basal Eurasians, an early diverging branch of the family tree of humans living outside Africa, whose existence Lazaridis has inferred from DNA analyses but whose physical remains have not yet been found.

    “Every single group from the ancient Near East appears to have Basal Eurasian ancestry, up to around fifty percent in the earliest groups,” said Lazaridis.

    To the researchers’ surprise, statistical analyses suggested that the Basal Eurasians may have had no Neanderthal DNA. Other non-African groups have at least 2 percent Neanderthal DNA.

    The team believes this finding could help explain why West Eurasians have less Neanderthal DNA than East Asians, even though Neanderthals are known to have lived in west Eurasia.

    “Admixture with Basal Eurasians may have diluted the Neanderthal ancestry in West Eurasians who have ancient Near Eastern farmer ancestry,” said Reich. “Basal Eurasians may have lived in parts of the Near East that did not come into contact with the Neanderthals.”

    Going forward, said Pinhasi, “We’re eager to study remains from the world’s first civilizations, who succeeded the samples analyzed in the study. The people everyone reads about in history books are now within the reach of our genetic technology.”

    This study reveals three genetically distinct farming populations living in the Near East at the dawn of agriculture 12,000 to 8,000 years ago: two newly described groups in Iran and the Levant and a previously reported group in Anatolia, in what is now Turkey.
  • From vision to hand action: Neuroscientists decipher how our brain controls grasping movements

    {Our hands are highly developed grasping organs that are in continuous use. Long before we stir our first cup of coffee in the morning, our hands have executed a multitude of grasps. Directing a pen between our thumb and index finger over a piece of paper with absolute precision appears as easy as catching a ball or operating a doorknob. Now neuroscientists have studied how the brain controls the different grasping movements. In their research with rhesus macaques, they found that the three brain areas responsible for planning and executing hand movements perform different tasks within their neural network.}

    In their research with rhesus macaques, the scientists found that the three brain areas AIP, F5 and M1, which are responsible for planning and executing hand movements, perform different tasks within their neural network. The AIP area is mainly responsible for processing visual features of objects, such as their size and shape. This optical information is translated into motor commands in the F5 area. The M1 area is ultimately responsible for turning these motor commands into actions. The results of the study contribute to the development of neuroprosthetics intended to help paralyzed patients regain their hand functions.

    The three brain areas AIP, F5 and M1 lie in the cerebral cortex and form a neural network responsible for translating visual properties of an object into a corresponding hand movement. Until now, the details of how this “visuomotor transformation” is performed have been unclear. During the course of his PhD thesis at the German Primate Center, neuroscientist Stefan Schaffelhofer intensively studied the neural mechanisms that control grasping movements. “We wanted to find out how and where visual information about grasped objects, for example their shape or size, and motor characteristics of the hand, like the strength and type of a grip, are processed in the different grasp-related areas of the brain,” says Schaffelhofer.

    For this, two rhesus macaques were trained to repeatedly grasp 50 different objects. At the same time, the activity of hundreds of nerve cells was measured with so-called microelectrode arrays. In order to compare the applied grip types with the neural signals, the monkeys wore an electromagnetic data glove that recorded all the finger and hand movements. The experimental setup was designed to observe each phase of the visuomotor transformation in the brain individually, namely the processing of visual object properties, movement planning and movement execution. For this, the scientists developed a delayed grasping task: so that the monkey could see the object, it was briefly illuminated before the start of the grasping movement. The subsequent movement took place in the dark after a short delay. In this way, the visual and motor signals of neurons could be examined separately.

    The results show that the AIP area is primarily responsible for the processing of visual object features. “The neurons mainly respond to the three-dimensional shape of different objects,” says Stefan Schaffelhofer. “Due to the different activity of the neurons, we could precisely distinguish as to whether the monkeys had seen a sphere, cube or cylinder. Even abstract object shapes could be differentiated based on the observed cell activity.”

    In contrast to AIP, areas F5 and M1 did not represent object geometries, but rather the corresponding hand configurations used to grasp the objects. The activity of F5 and M1 neurons strongly resembled the hand movements recorded with the data glove. “In our study we were able to show where and how visual properties of objects are converted into corresponding movement commands,” says Stefan Schaffelhofer. “In this process, the F5 area plays a central role in visuomotor transformation. Its neurons receive direct visual object information from AIP and can translate the signals into motor plans that are then executed in M1. Thus, area F5 is connected to both the visual and the motor parts of the brain.”

    Knowledge of how the brain controls grasping movements is essential for the development of neuronal hand prosthetics. “In paraplegic patients, the connection between the brain and limbs is no longer functional. Neural interfaces can replace this functionality,” says Hansjörg Scherberger, head of the Neurobiology Laboratory at the DPZ. “They can read the motor signals in the brain and use them for prosthetic control. In order to program these interfaces properly, it is crucial to know how and where our brain controls the grasping movements.” The findings of this study will pave the way for new neuroprosthetic applications that can selectively process each area’s individual information in order to improve usability and accuracy.

    All finger and hand movements of the monkeys were recorded with an electromagnetic data glove.
  • Hot news flash! Menopause, sleepless nights make women’s bodies age faster

    {Menopause, and the insomnia that often accompanies it, make women age faster, two new studies reveal. The work suggests these factors could increase women’s risk for aging-related diseases and earlier death.}

    Two UCLA studies reveal that menopause, and the insomnia that often accompanies it, make women age faster.

    The dual findings, respectively published July 25 in the Proceedings of the National Academy of Sciences and Biological Psychiatry, suggest these factors could increase women’s risk for aging-related diseases and earlier death.

    “For decades, scientists have disagreed over whether menopause causes aging or aging causes menopause,” said Steve Horvath, a professor of human genetics and biostatistics at the David Geffen School of Medicine at UCLA and UCLA Fielding School of Public Health, and a senior author on both studies. “It’s like the chicken or the egg: which came first? Our study is the first to demonstrate that menopause makes you age faster.”

    “Not getting restorative sleep may do more than just affect our functioning the next day; it might also influence the rate at which our biological clock ticks,” said Judith Carroll, an assistant professor of psychiatry at the UCLA Semel Institute for Neuroscience and Human Behavior and the Cousins Center for Psychoneuroimmunology, and first author of the sleep study. “In the women we studied, those reporting symptoms such as restless sleep, waking repeatedly at night, having difficulty falling asleep, and waking too early in the morning tended to be older biologically than women of similar chronological age who reported no symptoms.”

    For their findings, both studies used a “biological clock” developed by Horvath, which has become a widely used method for tracking the epigenetic shift in the genome. Epigenetics is the study of changes to DNA packaging that influence which genes are expressed but don’t affect the DNA sequence itself.
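
    In published work, an epigenetic clock of this kind is essentially a penalized linear regression over methylation levels at a few hundred CpG sites: estimated biological age is an intercept plus a weighted sum of methylation beta values, and "age acceleration" is the gap between that estimate and chronological age. A minimal sketch of the idea, with a made-up site count, weights and intercept rather than the published coefficients:

```python
import numpy as np

# Illustrative sketch of an epigenetic-clock estimate.  Real clocks such as
# Horvath's fit a penalized linear model over roughly 350 CpG sites (plus a
# calibration transform on age); the site count, weights and intercept below
# are placeholders, not the published coefficients.
rng = np.random.default_rng(0)
n_sites = 10
weights = rng.normal(0.0, 1.0, n_sites)   # placeholder regression weights
intercept = 40.0                          # placeholder intercept, in years

def epigenetic_age(methylation):
    """Estimate biological age from methylation beta values (each in 0..1)."""
    return intercept + weights @ methylation

# "Age acceleration" would be epigenetic_age(sample) minus chronological age.
sample = rng.uniform(0.0, 1.0, n_sites)   # one subject's methylation profile
print(round(float(epigenetic_age(sample)), 1))
```

    The linear form is what makes the clock cheap to apply: once the weights are fitted, estimating a subject's biological age is a single dot product over their methylation profile.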

    The Menopause Connection

    In the menopause study, Horvath and first author Morgan Levine tracked methylation, a chemical biomarker linked to aging, to analyze DNA samples from more than 3,100 women enrolled in four large studies, including the Women’s Health Initiative (WHI), a major 15-year research program that addressed the most common causes of death, disability and poor quality of life in postmenopausal women. They measured the biological age of cells from blood, saliva and the inside of the cheek to explore the relationship between each woman’s chronological age and her body’s biological age.

    “We discovered that menopause speeds up cellular aging by an average of 6 percent,” said Horvath. “That doesn’t sound like much, but it adds up over a woman’s lifespan.”

    Take, for example, a woman who enters early menopause at age 42. Eight years later, he said, her body would be a full year older biologically than another 50-year-old woman who entered menopause naturally at age 50.

    “On average, the younger a woman is when she enters menopause, the faster her blood ages,” explained Levine, a postdoctoral researcher in Horvath’s lab. “This is significant because a person’s blood may mirror what’s happening in other parts of the body, which could have implications for death and disease risk.”

    The Importance of Sleep

    In the sleep study, Carroll and her colleagues drew their data from more than 2,000 women in the WHI. Using the epigenetic clock, they found that postmenopausal women with five insomnia symptoms were nearly two years older biologically than women the same chronological age with no insomnia symptoms.

    “We can’t conclude definitively from our study that the insomnia leads to the increased epigenetic age, but these are powerful findings,” said Carroll. “In the future, we will need to carry out studies of the same individuals over an extended period of time to determine cause-and-effect relationships between biological age and sleep disorders.”

    While both studies are bad news for many women, Horvath suggests that scientists in the future may use the epigenetic clock as a diagnostic tool to evaluate the effects of therapies, like hormone therapy for menopause.

    “The big question is which menopausal hormone therapy offers the strongest anti-aging effect while limiting health risks,” said Horvath.

    “No longer will researchers need to follow patients for years to track their health and occurrence of diseases. Instead we can use the epigenetic clock to monitor their cells’ aging rate and to evaluate which therapies slow the biological aging process,” explained Horvath. “This could greatly reduce the length and costs of clinical trials and speed benefits to women.”

    Does a tough menopause accelerate aging?
  • How rope was made 40,000 years ago

    {Rope and twine are critical components in the technology of mobile hunters and gatherers. In exceptional cases impressions of string have been found in fired clay, and on rare occasions string was depicted in the contexts of Ice Age art, but on the whole almost nothing is known about string, rope and textiles from the Paleolithic. Researchers have now discovered a tool used to make early rope.}

    Prof. Nicholas Conard and members of his team present the discovery of a tool used to make rope in today‘s edition of the journal Archäologische Ausgrabungen Baden-Württemberg.

    Rope and twine are critical components in the technology of mobile hunters and gatherers. In exceptional cases impressions of string have been found in fired clay, and on rare occasions string was depicted in the contexts of Ice Age art, but on the whole almost nothing is known about string, rope and textiles from the Paleolithic.

    A key discovery by Conard’s team in Hohle Fels Cave in southwestern Germany, together with experimental research and testing by Dr. Veerle Rots and her team from the University of Liège, is rewriting the history of rope.

    The find is a carefully carved and beautifully preserved piece of mammoth ivory 20.4 cm in length with four holes between 7 and 9 mm in diameter. Each of the holes is lined with deep, precisely cut spiral incisions. The new find demonstrates that these elaborate carvings are technological features of rope-making equipment rather than mere decoration.

    Similar finds in the past have usually been interpreted as shaft-straighteners, decorated artworks or even musical instruments. Thanks to the exceptional preservation of the find and rigorous testing by the team in Liège, the researchers have demonstrated that the tool was used for making rope out of plant fibers available near Hohle Fels. “This tool answers the question of how rope was made in the Paleolithic”, says Veerle Rots, “a question that has puzzled scientists for decades.”

    Excavators found the rope-making tool in archaeological horizon Va near the base of the Aurignacian deposits of the site. Like the famous female figurines and the flutes recovered from the Hohle Fels, the rope-making tool dates to about 40,000 years ago, the time when modern humans arrived in Europe. The discovery underlines the importance of fiber technology and the importance of rope and string for mobile hunters and gatherers trying to cope with challenges of life in the Ice Age.

    Prof. Conard’s team has excavated at Hohle Fels every year for the last 20 years, and it is this long-term commitment that has repeatedly paid off, making Hohle Fels one of the best-known Paleolithic sites worldwide. Hohle Fels and neighboring sites from the Ach and Lone Valleys have been nominated for UNESCO World Cultural Heritage status. The excavations at Hohle Fels near Schelklingen in the Ach Valley are funded by HeidelbergCement AG, the Ministry of Science of Baden-Württemberg and the Heidelberg Academy of Sciences.

    The rope-making tool will be on exhibit at the Urgeschichtliches Museum in Blaubeuren starting Saturday, July 23rd (www.urmu.de).

    Rope making tool from mammoth ivory from Hohle Fels Cave in southwestern Germany, ca. 40,000 years old.
  • Background noise may hinder toddlers’ ability to learn words

    {A new study of toddlers has found that the presence of background noise in the home or at school can make learning new words more difficult for children.}

    The environments children are in, including how much and what kinds of stimulation they are exposed to, influence what and how they learn. One important task for children is zeroing in on the information that’s relevant to what they’re learning and ignoring what isn’t. A new study has found that the presence of background noise in the home or at school makes it more difficult for toddlers to learn new words. The study also found that providing additional language cues may help young children overcome the effects of noisy environments.

    Conducted at the University of Wisconsin-Madison, the research appears in the journal Child Development.

    “Learning words is an important skill that provides a foundation for children’s ability to achieve academically,” notes Brianna McMillan, doctoral student in psychology at the University of Wisconsin-Madison, who led the study.

    “Modern homes are filled with noisy distractions such as TV, radio, and people talking that could affect how children learn words at early ages. Our study suggests that adults should be aware of the amount of background speech in the environment when they’re interacting with young children.”

    Studies on the impact of environmental noise suggest that too much noise can affect children both cognitively and psychophysiologically, as reflected in poorer school performance and elevated cortisol levels and heart rates. However, most studies of word learning are conducted in quiet laboratory settings. This study focused on word learning but attempted to replicate the noisy environments children may inhabit at home and at school. In the study, 106 children ages 22 to 30 months took part in three experiments in which they were taught names for unfamiliar objects and then tested on their ability to recognize the objects when they were labeled. First, toddlers listened to sentences featuring two new words.

    Then they were taught which objects the new names corresponded to. Finally, the toddlers were tested on their ability to recall the words.

    In the first experiment, 40 toddlers (ages 22 to 24 months) heard either louder or quieter background speech when learning the new words. Only toddlers who were exposed to the quieter background speech successfully learned the words. In the second experiment, a different group of 40 toddlers (ages 28 to 30 months) was tested to determine whether somewhat older children could better overcome the effects of background noise. Again, only when background noise was quieter could the older toddlers successfully learn the new words.

    In the third experiment, 26 older toddlers were first exposed to two word labels in a quiet environment. Next, the toddlers were taught the meanings of four word labels — two they had just heard and two new ones. Toddlers were taught the meanings of all these labels in the same noisy environment that impaired learning in the second experiment. The children learned the new words and their meanings only when they had first heard the labels in a quiet environment, suggesting that experience with the sounds of the words without distracting background noise helps children subsequently map those sounds to meaning.

    In sum, the study shows that while louder background speech hindered toddlers’ ability to learn words, cues in the environment helped them overcome this difficulty. “Hearing new words in fluent speech without a lot of background noise before trying to learn what objects the new words corresponded to may help very young children master new vocabulary,” suggests Jenny Saffran, College of Letters & Science Professor of Psychology at the University of Wisconsin-Madison, who coauthored the study. “But when the environment is noisy, drawing young children’s attention to the sounds of the new word may help them compensate.”

    Children will rarely be in a completely quiet environment when learning. Parents and teachers may find that reducing background noise or highlighting important information can help children learn even when there is background noise. These suggestions may be especially important for low-income households because research shows that such homes on average have higher noise levels due to urban settings and crowding.

    Background speech hindered toddlers' ability to learn words, according to a new study.
  • Neural networks: Why larger brains are more susceptible to mental illnesses

    {In humans and other mammals, the cerebral cortex is responsible for sensory, motor, and cognitive functions. Understanding the organization of the neuronal networks in the cortex should provide insights into the computations that they carry out. A study publishing on July 21st in open access journal PLOS Biology shows that the global architecture of the cortical networks in primates (with large brains) and rodents (with small brains) is organized by common principles. Despite the overall network invariances, primate brains have much weaker long-distance connections, which could explain why large brains are more susceptible to certain mental illnesses such as schizophrenia and Alzheimer disease.}

    In earlier work, Zoltán Toroczkai, from the University of Notre Dame, USA, Mária Ercsey-Ravasz, from Babes-Bolyai University, Romania, and Henry Kennedy, from the University of Lyon, France, and colleagues combined tracing studies in macaques, which visualize connections in the brain, with network theory to show that the cortical network structure in this primate is governed by the so-called exponential distance rule (EDR).

    The EDR describes a consistent relationship between distances and connection strength. Consistent with the tracing results, the EDR predicts that there are many fewer long-range axons (nerve fibers that function as transmission lines of the nervous system) than short ones, and this can be quantified by a mathematical equation. At the level of cortical areas (such as visual cortex or auditory cortex) examined by the tracing studies, this means the closer two areas are to each other, the more connections exist between them.
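
    The EDR can be written as p(d) ∝ exp(−λd): the expected number of axonal projections between two areas decays exponentially with the distance d separating them. A short sketch of that relationship, using a hypothetical decay constant rather than the value fitted from the macaque tracing data:

```python
import numpy as np

# Sketch of the exponential distance rule (EDR): connection strength between
# two cortical areas falls off exponentially with their separation,
# p(d) ~ c * exp(-DECAY * d).  The decay constant here is a hypothetical
# placeholder, not the constant fitted from the tracing data.
DECAY = 0.2  # per millimetre (placeholder)

def connection_strength(d_mm, c=1.0):
    """Relative expected connection strength between areas d_mm apart."""
    return c * np.exp(-DECAY * d_mm)

# Long-range links are exponentially rarer than short ones:
for d in (5, 20, 50):
    print(f"{d:>3} mm -> {connection_strength(d):.4f}")
```

    Under a fixed decay constant, scaling up all distances, as in a larger brain, weakens every long-range connection exponentially, which is the intuition behind the susceptibility argument at the end of this article.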

    In this study, the researchers compare the features of the cortical networks in the macaque — a mammal with a large cortex — with those in the mouse, with its much smaller cortex. They used detailed tracing data to quantify connections between functional areas, and those formed the basis for the analysis. Despite the substantial differences in the cortex size between the species and other apparent differences in cortex organization, they found that the fundamental statistical features of all networks followed the EDR.

    Based on these results, the researchers hypothesize that the EDR describes an effective design principle that remains constant during the evolution of mammalian brains of different sizes. They present mathematical arguments that support the universal applicability of the EDR as a governing principle of cortical connectivity, as well as further experimental support from high-resolution tracer experiments in small brain areas from macaque, mouse, and mouse lemur (a primate with a very small brain).

    Their results, the researchers conclude, “suggest that the EDR plays a key role across the mammalian order to optimize the layout of the inter-areal cortical network allowing larger-brained animals to maintain communication efficiencies combined with increased neuron numbers.”

    As the EDR predicts and the tracing data here confirm, neuronal connections weaken exponentially with distance. Assuming the EDR can be applied to all mammalian brains, this suggests that long-distance connections could be quite weak in the human cortex, which is approximately five times larger than that of the macaque. If true, the researchers say, one could speculate that the low weight of human long-range connections may contribute to an increased susceptibility to disconnection syndromes, as has been proposed for Alzheimer disease and schizophrenia.

    Macaque brain showing the inter-areal network. Nodes roughly correspond to actual areas; line thickness is proportional to the logarithm of the number of axons.
  • What hunter-gatherers can tell us about fundamental human social networks

    {Long before the advent of social media, human social networks were built around sharing a much more essential commodity: food. Now, researchers reporting on the food sharing networks of two contemporary groups of hunter-gatherers in the Cell Press journal Current Biology on July 21 provide new insight into the fundamental nature of human social organization.}

    The new work reveals surprising similarities between the Agta of the Philippines and Mbendjele of the Republic of Congo. In both places, individuals maintain a three-tiered social network that appears to buffer them against day-to-day shortfalls in foraging returns.

    “Previous research has suggested that social networks across human cultures are structured in similar ways,” says Mark Dyble of University College London. “Across societies, there appear to be similar limits on the number of social relationships individuals are able to maintain, and many societies are said to have a ‘multilevel’ structure. Our work on contemporary hunter-gatherer groups sheds light on how this distinctive social structure may have benefited humans in our hunting-and-gathering past.”

    While previous studies have identified similarities in social structure across hunter-gatherer populations, the researchers say that the new work is the first to explore how hunter-gatherers’ distinctive, “multilevel” social organization structures social life and cooperation in important activities such as foraging and food sharing.

    “No other apes share food to the extent that humans do,” says Andrea Migliano, principal investigator of the Leverhulme Trust-funded Hunter-Gatherers Resilience Project. “Hunter-gatherers’ multi-level social structure exists in different groups to help regulate these cooperative systems. Furthermore, multi-level social structures regulate social rules, friendship and kinship ties, and the spread of social norms, promoting more efficient sharing and cooperation. Sharing is a crucial adaptation to hunter-gatherers’ lifestyles, central to their resilience — and central to the evolution of mankind.”

    The Agta live in northeast Luzon, Philippines. Their primary source of protein is fish, supplemented by inter-tidal foraging, hunting, honey collecting, and gathering of wild foods. The Mbendjele live in an area spanning northern Republic of Congo and southern Central African Republic, where they hunt for meat in the forest. Both groups also trade wild-caught meat or fish for cultivated foods, including rice and manioc.

    Dyble, Migliano, and their colleagues collected data on food sharing by living with the two communities for many months, making observations on how often households shared food with each other. From this they constructed social networks of food sharing.

    “Although we had an idea of how camps split into food sharing clusters ‘on the ground,’ we were able to test these using algorithms which are able to identify sub-communities within the nine camps we studied,” Dyble explains.
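The clustering step described here can be illustrated with a toy sketch; the paper's actual community-detection algorithm and field data are not reproduced, and the households and sharing counts below are invented. The idea: keep only household pairs that share food frequently, then group households connected through those frequent-sharing links.

```python
from collections import defaultdict

def sharing_clusters(shares, min_freq=3):
    """Group households into clusters connected by frequent food
    sharing. `shares` maps (household_a, household_b) -> number of
    observed sharing events; pairs below min_freq are ignored.
    A toy stand-in for proper community-detection algorithms."""
    graph = defaultdict(set)
    for (a, b), freq in shares.items():
        if freq >= min_freq:
            graph[a].add(b)
            graph[b].add(a)
    seen, clusters = set(), []
    for node in graph:
        if node in seen:
            continue
        stack, cluster = [node], set()
        while stack:               # flood-fill one connected cluster
            h = stack.pop()
            if h in cluster:
                continue
            cluster.add(h)
            stack.extend(graph[h] - cluster)
        seen |= cluster
        clusters.append(cluster)
    return clusters

# Hypothetical camp: households A-D share often and form one cluster;
# E and F form a second; the single A-E event is too rare to count.
camp = {("A", "B"): 9, ("B", "C"): 5, ("C", "D"): 4,
        ("E", "F"): 7, ("A", "E"): 1}
```

Running `sharing_clusters(camp)` splits this hypothetical camp into two sub-communities, mirroring the second tier of the three-tiered structure the study reports.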

    Their analysis showed that food sharing is closely related to social organization. In both communities, individuals maintain a three-tiered social network: first, their immediate household, most often consisting of five or six individuals; second, a cluster of three to four closely related households who share food frequently; and third, the wider camp.

    “Despite being from different continents and living in very different ecologies, both groups of hunter-gatherers had a strikingly similar social organization,” Dyble says.

    “Cooperation and especially food sharing are essential for survival in a hunting-and-gathering economy,” Dyble says. “The proverb that ‘it takes a village to raise a child’ is certainly true for hunter-gatherers, who, without food sharing to mitigate the day-to-day shortfalls in foraging, could simply not survive.”

    Dyble says that they now intend to explore the structure of other types of social networks in the hunter-gatherer communities, such as cooperation in childcare, and their overlap with food sharing.

    This image shows food sharing among the BaYaka.
  • In gauging and correcting errors, brain plays confidence game, new research shows

    {The confidence in our decision-making serves to both gauge errors and to revise our approach, neuroscientists have found. Their study offers insights into the hierarchical nature of how we make choices over extended periods of time, ranging from medical diagnoses and treatment to the strategies we use to invest our money.}

    “What is challenging about comprehending why we make certain choices over long periods is to determine the true causes of the outcomes of our decisions,” explains Braden Purcell, an NYU post-doctoral fellow and the lead author of the study, which appears in the journal Proceedings of the National Academy of Sciences. “When we make a mistake, it might mean we were simply unlucky or it could indicate a deeper flaw in our overall strategy. For instance, if a patient’s health gets worse, should the doctor just try another treatment or should she revise the original diagnosis altogether? Our findings map out a framework of how we make such evaluations.”

    The key element in this process, the study shows, is confidence.

    “Overall, we found that the brain uses confidence to gauge errors and revise decision strategy,” adds co-author Roozbeh Kiani, an assistant professor in NYU’s Center for Neural Science. “Specifically, the confidence in our initial assessments influences how we revisit them.”

    In general, the aim of the research was not only to understand simple decisions about information immediately available to us, but also to capture decisions about the strategies that guide multiple decisions over time.

    To do so, the researchers devised an experiment in which subjects judged the net motion direction of multiple dots on a computer screen; the subjects’ judgments were recorded by gauging their eye movement toward one of several targets on the screen. Notably, the correct target for a motion direction could change every few trials without any explicit cue—subjects had to infer these environment changes from error feedback. However, errors could arise from two sources: a mistake in perceiving the net motion of dots moving in multiple directions or a change in the correct targets.

    Researchers discovered that subjects disambiguated the source of error using their confidence. When confident about motion direction, subjects attributed negative feedback to a change in the environment and quickly explored new targets that indicated such a change. When they were less confident, they counted negative feedback as partial evidence for an environment change but withheld exploring a new target until the sum of evidence—i.e., confidence on error trials—reached a threshold.

    According to Purcell and Kiani, an optimal decision-maker should do exactly as participants in their experiment did: summing evidence to a threshold ensures that the environment change is detected as soon as possible. Further, optimal decision-makers should adjust the threshold for switching strategy based on the volatility of the environment—lower thresholds for environments that change more often. They tested this possibility and showed that subjects were quicker to explore new targets when changes in the environment happened more frequently.
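The accumulate-to-threshold rule described above can be sketched in a few lines. This is a simplified illustration, not the authors' model: the confidence scale and threshold values are made up, and each error trial simply contributes its confidence to a running sum.

```python
def should_switch(error_confidences, threshold=1.5):
    """Accumulate confidence-weighted evidence across error trials and
    switch strategy once the running sum crosses a threshold.
    High confidence on an error trial means the perceptual judgment was
    probably right, so the error likely signals an environment change;
    low-confidence errors count as weaker evidence. Returns the index
    of the trial that triggers the switch, or None if no switch occurs.
    Values and scale are illustrative, not the study's fitted model."""
    evidence = 0.0
    for trial, conf in enumerate(error_confidences):
        evidence += conf          # conf in [0, 1]
        if evidence >= threshold:
            return trial          # switch strategy after this trial
    return None                   # keep the current strategy

# Two highly confident errors trigger a switch quickly, while a run of
# low-confidence errors never accumulates enough evidence.
```

Lowering `threshold` makes the decision-maker switch sooner, matching the finding that subjects explored new targets more quickly in more volatile environments.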

    The confidence in our decision-making serves to both gauge errors and to revise our approach, New York University neuroscientists have found.
  • Thinking inside the box: How our brain puts the world in order

    {The world around us is complex and constantly changing. To put it in order, we devise categories into which we sort new concepts, and we apply different strategies to do this. A team of researchers wanted to find out which areas of the brain regulate these strategies.}

    The results of their study using magnetic resonance imaging (MRI) show that particular brain areas do indeed become active when a certain strategy of categorisation is applied.

    When we categorise objects by comparing them to a prototype, the left fusiform gyrus is activated, an area responsible for recognising abstract images. On the other hand, when we compare things to particular examples of a category, the left hippocampus is activated. This region plays an important role in the storage and retrieval of memories.

    Categories reduce information load

    Thinking in categories or pigeonholing helps our brain in bringing order into a constantly changing world and it reduces the information load. Cognitive scientists differentiate between two main strategies which achieve this: the exemplar strategy and the prototype strategy.

    When we want to find out whether a certain animal fits into the category “bird,” we would at first apply the prototype strategy and compare it to an abstract, general “bird.” This prototype has the defining features of the class, like a beak, feathers, or the ability to fly. But when we encounter outliers or exceptions like an emu or a penguin, this strategy may be of no use. Then we apply the exemplar strategy and compare the animal to many different known examples of the category. This helps us find the right category, even for “distant relations.”
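The two strategies can be contrasted in a small sketch. The feature vectors and category members below are invented for illustration: the prototype strategy compares a new item to the category average, while the exemplar strategy compares it to each stored example and takes the best match.

```python
def distance(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def prototype_match(item, examples):
    """Prototype strategy: compare the item to the category average
    (lower is a better fit)."""
    n = len(examples)
    proto = [sum(e[i] for e in examples) / n for i in range(len(item))]
    return distance(item, proto)

def exemplar_match(item, examples):
    """Exemplar strategy: compare the item to the closest stored
    example (lower is a better fit)."""
    return min(distance(item, e) for e in examples)

# Invented features: (has beak, has feathers, can fly).
birds = [(1, 1, 1), (1, 1, 1), (1, 1, 0)]   # sparrow, robin, penguin
newcomer = (1, 1, 0)                         # a flightless outlier
```

For the flightless newcomer, the exemplar strategy finds a perfect match (the stored penguin), while the prototype, an "average bird" that mostly flies, fits it less well, which is why outliers push us toward the exemplar strategy.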

    Complex interaction

    To find out where our brain is activated when it is ordering the world, the neuroscientists in Bochum performed MRI scans while volunteers completed a categorisation task. The functional imaging data showed that the two strategies are triggered by different areas of the brain.

    The scientists believe that there is a complex interaction between both learning patterns. “The results indicate that both strategies originate from distinct brain areas. We also observed that, during the learning process, the rhythm of activation in the two areas synchronised. This shows that the two cognitive processes cannot be neatly separated,” explains Boris Suchan. Further modelling and research must now clarify this interaction.

    Neuroscientists have found the sorting center in the brain.