Category: Science News

  • Why we walk on our heels instead of our toes: Longer virtual limbs

    {Walking heel-to-toe gives humans the mechanical advantage of longer ‘virtual limbs’.}

    James Webber took up barefoot running 12 years ago. He needed to find a new passion after deciding his planned career in computer-aided drafting wasn’t a good fit. Eventually, his shoeless feet led him to the University of Arizona, where he enrolled as a doctoral student in the School of Anthropology.

    Webber was interested in studying the mechanics of running, but as the saying goes, one must learn to walk before one can run, and that — so to speak — is what Webber has been doing in his research.

    His most recent study on walking, published in the Journal of Experimental Biology, specifically explores why humans walk with a heel-to-toe stride, while many other animals — such as dogs and cats — get around on the balls of their feet.

    It was an especially interesting question from Webber’s perspective, because those who do barefoot running, or “natural running,” land on the middle or balls of their feet instead of the heels when they run — a stride that would feel unnatural when walking.

    Indeed, humans are pretty set in our ways with how we walk, but our heel-first style could be considered somewhat curious.

    “Humans are very efficient walkers, and a key component of being an efficient walker in all kinds of mammals is having long legs,” Webber said. “Cats and dogs are up on the balls of their feet, with their heel elevated up in the air, so they’ve adapted to have a longer leg, but humans have done something different. We’ve dropped our heels down on the ground, which physically makes our legs shorter than they could be if we were up on our toes, and this was a conundrum to us (scientists).”

    Webber’s study, however, offers an explanation for why our heel-strike stride works so well, and it still comes down to limb length: Heel-first walking creates longer “virtual legs,” he says.

    We Move Like a Human Pendulum

    When humans walk, Webber says, they move like an inverted swinging pendulum, with the body essentially pivoting above the point where the foot meets the ground below. As we take a step, the center of pressure slides across the length of the foot, from heel to toe, with the true pivot point for the inverted pendulum occurring midfoot and effectively several centimeters below the ground. This, in essence, extends the length of our “virtual legs” below the ground, making them longer than our true physical legs.

    As Webber explains: “Humans land on their heel and push off on their toes. You land at one point, and then you push off from another point eight to 10 inches away from where you started. If you connect those points to make a pivot point, it happens underneath the ground, basically, and you end up with a new kind of limb length that you can understand. Mechanically, it’s like we have a much longer leg than you would expect.”
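    Webber’s below-ground pivot can be illustrated with simple geometry. The following toy calculation (all numbers are illustrative stand-ins, not measurements from the study) treats the leg at heel strike and the leg at toe-off as two straight lines, finds where their extensions cross below the ground, and measures the “virtual leg” from that pivot up to the hip:

```python
# Toy 2-D sketch of the "virtual leg" idea. All numbers are hypothetical,
# chosen only to show the qualitative effect: the leg line at heel strike
# and the leg line at toe-off intersect *below* the ground, so the
# effective pendulum is longer than the anatomical leg.

def line_through(p, q):
    """Coefficients (a, b, c) of the line a*x + b*y = c through points p and q."""
    (x1, y1), (x2, y2) = p, q
    a, b = y2 - y1, x1 - x2
    return a, b, a * x1 + b * y1

def intersect(l1, l2):
    """Intersection point of two lines given as (a, b, c) triples."""
    a1, b1, c1 = l1
    a2, b2, c2 = l2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

hip_height = 0.90   # metres; stand-in for anatomical leg length
foot_roll  = 0.22   # heel-strike point to toe-off point (the "eight to 10 inches")
reach      = 0.50   # horizontal hip-to-contact offset at each instant (hypothetical)

# Leg line at heel strike: contact at the origin, hip behind and above it.
heel_line = line_through((0.0, 0.0), (-reach, hip_height))
# Leg line at toe-off: contact a foot-length ahead, hip ahead and above it.
toe_line = line_through((foot_roll, 0.0), (foot_roll + reach, hip_height))

px, py = intersect(heel_line, toe_line)   # py < 0: the pivot is underground
virtual_leg = hip_height - py             # pivot-to-hip distance at midstance

print(f"pivot depth below ground: {-py:.2f} m")
print(f"virtual leg: {virtual_leg:.2f} m vs anatomical {hip_height:.2f} m")
```

    With these stand-in numbers the pivot lands about 20 centimetres below the ground, so the virtual leg comes out noticeably longer than the anatomical one — the qualitative effect the study describes.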

    Webber’s study, conducted with his adviser and co-author, UA anthropologist David Raichlen, came to this conclusion after monitoring study participants on a treadmill in the University’s Evolutionary Biomechanics Lab. They looked at the differences between those asked to walk normally and those asked to walk toe-first. They found that toe-first walkers moved slower and had to work 10 percent harder than those walking with a conventional stride, and that conventional walkers’ limbs were, in essence, 15 centimeters longer than those of toe-first walkers.

    “The extra ‘virtual limb’ length is longer than if we had just had them stand on their toes, so it seems humans have found a novel way of increasing our limb length and becoming more efficient walkers than just standing on our toes,” Webber said. “It still all comes down to limb length, but there’s more to it than how far our hip is from the ground. Our feet play an important role, and that’s often something that’s been overlooked.”

    When the researchers sped up the treadmill to look at the transition from walking to running, they also found that toe-first walkers switched to running at lower speeds than regular walkers, further showing that toe-first walking is less efficient for humans.

    Ancient Human Ancestors Had Extra-Long Feet

    It’s no wonder humans are so set in our ways when it comes to walking heel-first — we’ve been doing it for a long time. Scientists know from footprints found preserved in volcanic ash in Laetoli, Tanzania, that ancient hominins practiced heel-to-toe walking as early as 3.6 million years ago.

    Our feet have changed over the years, however. Early bipeds (animals that walk on two feet) appear to have had rigid feet that were proportionally much longer than ours today — about 70 percent the length of their femur, compared to 54 percent in modern humans. This likely helped them to be very fast and efficient walkers. While modern humans held on to the heel-first style of walking, Webber suggests our toes and feet may have gotten shorter, proportionally, as we became better runners in order to pursue prey.

    “When you’re running, if you have a really long foot and you need to push off really hard way out at the end of your foot, that adds a lot of torque and bending,” Webber said. “So the idea is that as we shifted into running activities, our feet started to shrink because maybe it wasn’t as important to be super-fast walkers. Maybe it became important to be really good runners.”

  • Researchers uncover how hippocampus influences future thinking

    {Over the past decade, researchers have learned that the hippocampus — historically known for its role in forming memories — is involved in much more than just remembering the past; it plays an important role in imagining events in the future. Yet, until now, scientists did not know precisely how the hippocampus contributes to episodic imagining. Researchers from Boston University School of Medicine (BUSM) have determined that the role of the hippocampus in future imagining lies in the process of constructing a scene in one’s mind.}

    The findings, which appear in the journal Cerebral Cortex, shed important light on how the brain supports the capacity to imagine the future and pinpoint the brain regions that provide the critical ingredients for performing this feat.

    The hippocampus is affected by many neurological conditions and diseases, and it can also be compromised during normal aging. Future thinking is a cognitive ability that is relevant to all humans: it is needed to plan for what lies ahead, whether to navigate daily life or to make decisions about major milestones further in the future.

    Using functional magnetic resonance imaging, BUSM researchers performed brain scans on healthy adults while they were imagining events. They then compared brain activity in the hippocampus when participants answered questions pertaining to the present or the future. After that, they compared brain activity when participants answered questions about the future that did or did not require imagining a scene. “We observed no differences in hippocampal activity when we compared present versus future imagining, but we did observe stronger activity in the hippocampus when participants imagined a scene compared to when they did not, suggesting a role for the hippocampus in scene construction but not mental time travel,” explained corresponding author Daniela Palombo, PhD, postdoctoral fellow in the Memory Disorders Research Center at BUSM and at the VA Boston Healthcare System.

    According to the researchers, the importance of studying how the hippocampus contributes to cognitive abilities is bolstered by the ubiquity of hippocampal involvement in many conditions. “These findings help provide better understanding of the role of the hippocampus in future thinking in the normal brain, and may eventually help us better understand the nature of cognitive loss in individuals with compromised hippocampal function,” she added.

    Palombo believes that once it is known which aspects of future imagining are and are not dependent on the hippocampus, targeted rehabilitation strategies can be designed that exploit the functions that survive hippocampal dysfunction and may provide alternate routes to engage in future thinking.

  • Nearly one in five young Ontario adults shows problematic use of electronic devices

    {Survey also shows many Ontario adults report texting and driving, and increasing mental distress days.}

    As many as 19 per cent of Ontario adults aged 18 to 29 experience moderate to severe problematic use of electronic devices, which includes smartphones and tablets as well as computers and video game consoles, according to the latest CAMH Monitor survey. It’s the first time the ongoing survey has measured the impact of our increasing reliance on electronic devices.

    “Today’s young adults entered their adolescent or adult years with a wide range of social media, apps, videos and other information and entertainment available to them 24/7,” says Dr. Hayley Hamilton, Scientist in CAMH’s Institute for Mental Health Policy Research and co-principal investigator of the CAMH Monitor.

    Although problematic use was most prevalent among young adults, it affected all ages. Overall, seven per cent of all Ontario adults — representing an estimated 716,100 people — experienced moderate to severe problematic use, defined as experiencing three or more out of six symptoms related to problematic use.

    “It’s clear that, for most of us, our use of electronic devices has skyrocketed over the past five to 10 years, which is why it’s important to study if this use can be problematic,” says Dr. Nigel Turner, Scientist in CAMH’s Institute for Mental Health Policy Research and an expert in gambling and behavioural addictions. “While our understanding of problematic use is evolving, we know that some people do end up harming their career or educational opportunities by excessive use.”

    These results from the 2015 CAMH Monitor are based on responses from 3,007 adults aged 18 and older across the province. The survey asked about personal device use, other than for work or school. Questions about problematic use asked whether individuals or their family members believed they had a problem, if they tried to cut back on their use, if they experienced anxiety that could only be relieved by using electronic devices, or if they missed school, work or important social activities because of device use, for example.

    “Research has shown that high use of electronic devices, as well as social media, are linked to problems with mental health, including increased psychological distress and poorer self-rated mental health,” says Dr. Hamilton. “Our new findings underscore the need for each of us to define healthy limits, and to monitor our use of electronic devices before it becomes a problem.”

    {{Rising risks on the road}}

    A high number of drivers reported texting while driving, another new question on this year’s survey. More than one in three drivers — 37 per cent — confirmed texting while driving at least once in the past year, and 11 per cent of drivers had texted while driving 30 or more times in the past year.

    “An estimated 3.3 million adult drivers in Ontario are involved in this hazardous behaviour on the road,” says Dr. Robert Mann, Senior Scientist in CAMH’s Institute for Mental Health Policy Research and co-principal investigator of the survey. “The province’s stronger penalties for distracted driving came into effect in the fall of 2015, so we don’t yet know the effects of these penalties — we’ll be watching this closely in future years.”

    {{Mental health concerns increasing}}

    Over the past year, the percentage of people who experienced frequent mental distress days rose significantly, from six per cent in 2014 to nearly 10 per cent in 2015. Frequent mental distress days are defined as 14 or more days, out of the last 30 days, in which a person rated his or her mental health as not good, which included stress, depression and problems with emotions.

    “Generally, frequent mental distress days and self-rated fair or poor mental health have climbed over the last decade, which is concerning,” says Dr. Hamilton.

    {{Changes in smoking patterns}}

    During the past year, 11 per cent of Ontario adults used e-cigarettes, a significant increase from seven per cent in 2013, which was the first year that the CAMH Monitor asked about e-cigarette use. By comparison, 13 per cent of people said they currently smoke tobacco cigarettes, down from 17 per cent in 2013 and a sharp drop from nearly 27 per cent in 1996.

    {{Growing number of cannabis users aged 50 and older}}

    Cannabis use has increased over the past 20 years, from nine per cent in 1996 to more than 14 per cent in 2015. An important change in cannabis use over the past decade has been the aging of cannabis users. Among Ontarians aged 50 and older, past-year use of cannabis increased significantly from about three per cent in 2005 to seven per cent in 2015. “In 2015, 23 per cent of cannabis users were 50 years of age and older, a substantial increase from six per cent in 2005,” says Dr. Mann.

  • Researchers map neural circuitry of songbird learning

    {How do juvenile songbirds learn to sing in a way that preserves both the unique features of local song culture and their species-specific song “signature”? Researchers have begun to map the brain circuitry responsible for cultural transmission and species specificity in birdsong.}

    Two studies appearing in the December 9 issue of Science shed light upon the neuronal architecture of birdsong. In one experiment, Dr. Vikram Gadagkar, postdoctoral fellow and neurobiologist at Cornell University, and his colleagues found that dopaminergic neurons in the ventral tegmental area (VTA) of the brain encode errors in singing performance. This dopaminergic error signal may also help juvenile zebra finches learn to accurately imitate the song of their tutor.

    In the second study, investigators studied songbird hatchlings fostered by another species. Dr. Makoto Araki of the Neuronal Mechanism of Critical Period Unit and the Collective Interactions Unit at the Okinawa Institute of Science and Technology Graduate University, Okinawa, Japan, and colleagues determined that, while juvenile zebra finches imitated the song syllables of their adoptive Bengalese finch parents, they adjusted song cadence towards the rhythm typical of their own species, whose song they had never heard, suggesting that songbirds learn rhythm from an innate template rather than from other birds.

    In this same issue of Science, Drs. Ofer Tchernichovski and Dina Lipkind, psychology researchers at Hunter College, City University of New York (CUNY), offer a perspective on the above studies. Drs. Tchernichovski and Lipkind, who were not affiliated with either study, propose that the findings may shed light on how songbirds maintain a species-specific song signature despite the random changes that occur in local populations and accumulate over generations. According to the Hunter researchers, two types of neurons in the auditory cortex of songbirds may code independently for the sound of song syllables and for rhythm — with song notes likely more dependent on input from adult tutors and cadence on an innate template or “barcode.”

    While scientists are only beginning to understand the neural mechanisms that support vocal learning in songbirds, Drs. Tchernichovski and Lipkind point out that this research is relevant to many animal communication systems, including stable cultural transmission in humans.

    Dr. Tchernichovski heads the Laboratory of Vocal Learning at Hunter College, CUNY, and uses the songbird to study the mechanisms of vocal learning. Like early speech development in the human infant, the songbird learns to imitate complex sounds during a critical period of development; the adult bird can no longer imitate, and we do not know why. His lab studies the animal behavior and dynamics of vocal learning and sound production across different brain levels, and aims to uncover the specific physiological and molecular (gene expression) brain processes that underlie song learning.

    Bengalese Finch
  • Genetic memory of starvation may curtail lifespan of men and male descendants

    {Famine may have a lasting impact on male descendants of its victims.}

    New Tel Aviv University research suggests that periods of fasting or starvation may significantly shorten the lifespans of both children and their male descendants.

    The study focused on survivors of a mass famine that took place in the early 1920s in several rural regions of Russia. It was led by Prof. Eugene Kobyliansky of TAU’s Sackler School of Medicine and conducted by doctoral student Dmitry Torchinsky of TAU’s Raymond and Beverly Sackler Faculty of Exact Sciences, in collaboration with Dr. Leonid Kalichman of Ben-Gurion University’s Department of Physical Therapy and Prof. David Karasik of Bar Ilan University’s Faculty of Medicine in the Galilee. Its conclusions were published in The American Journal of Clinical Nutrition.

    “A variety of experimental and epidemiological studies have proposed that intermittent or periodic fasting, like caloric restriction, may slow the aging process and extend lifespans,” said Prof. Kobyliansky. “But there is also evidence demonstrating that even moderate caloric restriction may not extend but, on the contrary, can shorten the human lifespan.”

    A lesson from Russia

    Past research suggests a strong correlation between telomere dynamics and the processes that determine human aging and lifespan. Telomeres, compound structures that protect the end of each chromosome from deterioration, are the genetic key to longevity. They shorten with every chromosome replication cycle.

    The team evaluated telomere lengths in a population-based sample composed of survivors of the mass famine of the early 1920s and in the survivors’ descendants, who originated from Chuvashia, a rural area in the mid-Volga region of Russia. In Chuvashia, the proportion of starving inhabitants reached 90% in late March 1922, and mortality among starving peasants reached between 30% and 50%. The situation only began to improve in April 1923. By the end of that year, the mass famine in Chuvashia was considered over.

    {{The researchers arrived at three major discoveries:}}

    (1) Leukocyte telomeres were shorter in men born after the mass famine ended in 1923 than in men born before 1922;

    (2) there was a stable inheritance of shorter telomeres by men born in ensuing generations; and

    (3) there was no corresponding difference in telomere length between women born before and women born after the famine.

    “This study, while demonstrating that starvation has the potential to shorten telomere length, raises several questions,” said Prof. Kobyliansky. “Does starvation exert a stronger effect on telomere length in the reproductive cells of adults than in the leukocytes of children? Is starvation-induced telomere shortening a sex-dependent phenomenon? And would fasting regimens exerting beneficial effects be accompanied by telomere shortening in descendants?”

    The team is currently considering experimental in vivo studies to answer these and other questions.

  • Half of people believe fake facts, ‘remember’ events that never happened

    {Many people are prone to ‘remembering’ events that never happened, according to new research by the University of Warwick.}

    In a study on false memories, Dr Kimberley Wade in the Department of Psychology demonstrates that if we are told about a completely fictitious event from our lives, and repeatedly imagine that event occurring, almost half of us would accept that it did.

    Over 400 participants in ‘memory implantation’ studies had fictitious autobiographical events suggested to them — and it was found that around 50% of the participants believed, to some degree, that they had experienced those events.

    Participants in these studies came to remember a range of false events, such as taking a childhood hot air balloon ride, playing a prank on a teacher, or creating havoc at a family wedding.

    30% of participants appeared to ‘remember’ the event — they accepted the suggested event, elaborated on how the event occurred, and even described images of what the event was like. Another 23% showed signs that they accepted the suggested event to some degree and believed it really happened.

    Dr Wade and colleagues conclude that it can be very difficult to determine when a person is recollecting actual past events, as opposed to false memories — even in a controlled research environment; and more so in real life situations.

    These findings have significance in many areas — raising questions around the authenticity of memories used in forensic investigations, court rooms, and therapy treatments.

    Moreover, the collective memories of a large group of people or society could be incorrect — due to misinformation in the news, for example — having a striking effect on people’s perceptions and behaviour.

    Dr Wade comments on the importance of this study: “We know that many factors affect the creation of false beliefs and memories — such as asking a person to repeatedly imagine a fake event or to view photos to ‘jog’ their memory. But we don’t fully understand how all these factors interact. Large-scale studies like our mega-analysis move us a little bit closer.

    “The finding that a large portion of people are prone to developing false beliefs is important. We know from other research that distorted beliefs can influence people’s behaviours, intentions and attitudes.”

    Scientists have been using variations of this procedure for 20 years to study how people can come to remember wholly false experiences.

  • Brain blocks new memory formation on waking to safeguard consolidation of existing memories

    {Throughout our waking lives we are exposed to a continuous stream of stimuli and experiences. Some of these experiences trigger the strengthening of connections between neurons in the brain, and begin the process of forming memories. However, these initial memory traces are fragile and only a small number will become long-term memories with the potential to last a lifetime. For this transition to occur, the brain must stabilize the memory traces through a process called consolidation.}

    {{Let’s sleep on it}}

    During consolidation, the brain produces new proteins that strengthen the fragile memory traces. However, if a new experience occurs while an existing memory trace is being consolidated, the new stimuli could disrupt or even hijack the consolidation process.

    The brain partially solves this problem by postponing some of the memory consolidation to a period in which new experiences are minimized, that is, while we are asleep. But what happens if we wake up while consolidation is taking place? How does the brain prevent events that occur just after awakening from interrupting the consolidation process?

    A new study by Prof. Abraham Susswein of the Mina and Everard Goodman Faculty of Life Sciences and The Leslie and Susan Gonda (Goldschmied) Multidisciplinary Brain Research Center at Bar-Ilan University, published in eLife, has now answered this question. The article’s first author is Roi Levy, whose doctoral research, conducted in Prof. Susswein’s lab, is described in the present study, which also includes part of the doctoral research of David Levitan.

    Susswein and his colleagues have used a seemingly unlikely subject for their study, namely the sea hare Aplysia. These marine slugs are convenient for neuroscientific investigation because of their simple nervous systems and large neurons, and because they have been shown to be capable of basic forms of learning.

    Just after training during waking hours, proteins are synthesized to initiate the consolidation of new memory. Consolidation proteins are produced again in greater quantities during sleep for subsequent processes on the memory trace. The researchers found that blocking the production of consolidation proteins in sleeping sea slugs prevents these creatures from forming long-term memories, confirming that, like us, they do consolidate memories during sleep.

    {{Overcoming Memory Block}}

    Susswein, Levy and Levitan now show that exposing sea slugs to new stimuli immediately after they wake up does not trigger the formation of new memories. In a learning paradigm affecting sea slugs’ feeding activity, the animals were trained after being awakened from sleep. On awakening, interactions between new experiences and consolidation are prevented because the brain blocks long-term memory arising from the new stimuli. However, when the researchers treated the slugs just prior to the training with a drug that inhibits protein production, they found that the new stimuli could generate long-term memory. These findings show that proteins blocking the formation of new memories prevent an experience upon waking from being effective in producing memory. Removing this block — by inhibiting protein production — allows experiences just after waking to be encoded in memory. This even applies to experiences that are too brief to trigger memory formation in fully awake sea slugs.

    Susswein says, “The major insight from this research is that there is an active process in the brain which inhibits the ability to learn new things and protects the consolidation of memories.”

    {{Two Heads are Better than One}}

    The researchers also compared learning by fully awake sea slugs trained in isolation and those trained with companions. They discovered that training in social isolation appears to inhibit new learning, and identified similar molecular processes common to both training in isolation and to training on waking from sleep.

    {{For the Future}}

    “Our next step following on from this work,” says Susswein, “is to identify these memory blocking proteins and to fathom how they prevent the formation of new memories.” He adds: “We may also find that the blocking process accounts for why we cannot remember our dreams when we wake up.”

    An important future challenge is to investigate whether the same proteins could ultimately be used to block unwanted memories, for example, in cases of Post-Traumatic Stress Disorder.

    Neurons in the brain (stock image).
  • Increased UVB exposure associated with reduced risk of nearsightedness, particularly in teens, young adults

    {Higher ultraviolet B (UVB) radiation exposure, directly related to time outdoors and sunlight exposure, was associated with reduced odds of myopia (nearsightedness), and exposure to UVB between ages 14 and 29 years was associated with the highest reduction in odds of adult myopia, according to a study published online by JAMA Ophthalmology.}

    Myopia is a complex trait influenced by numerous environmental and genetic factors and is becoming more common worldwide, most dramatically in urban Asia, but rises in prevalence have also been identified in the United States and Europe. This has major implications, both visually and financially, for the global burden from this potentially sight-threatening condition.

    Astrid E. Fletcher, Ph.D., of the London School of Hygiene and Tropical Medicine, and colleagues examined the association of myopia with UVB radiation, serum vitamin D concentrations and vitamin D pathway genetic variants, adjusting for years in education. The study included a random sample of participants 65 years and older from six study centers in the European Eye Study. Of 4,187 participants, 4,166 attended an eye examination including refraction, gave a blood sample, and were interviewed by trained fieldworkers using a structured questionnaire. After exclusions for various factors, the final study group included 371 participants with myopia and 2,797 without.

    The researchers found that an increase in UVB exposure at age 14 to 19 years and 20 to 39 years was associated with reduced odds of myopia; those in the highest tertile (group) of years of education had twice the odds of myopia. No independent associations between myopia and serum vitamin D3 concentrations or variants in genes associated with vitamin D metabolism were found. An unexpected finding was that the highest quintile (group) of plasma lutein concentrations was associated with reduced odds of myopia.

    “The association between UVB, education, and myopia remained even after respective adjustment. This suggests that the high rate of myopia associated with educational attainment is not solely mediated by lack of time outdoors,” the authors write.

    “As the protective effect of time spent outdoors is increasingly used in clinical interventions, a greater understanding of the mechanisms and life stages at which benefit is conferred is warranted.”

    Teens at the beach.
  • Portions of the brain fall asleep and wake back up all the time

    {When we are in a deep slumber our brain’s activity ebbs and flows in big, obvious waves, like watching a tide of human bodies rise up and sit down around a sports stadium. It’s hard to miss.}

    Now, Stanford researchers have found, those same cycles exist in the waking brain as well as in sleep, but with only small sections sitting and standing in unison rather than the entire stadium. It’s as if tiny portions of the brain are independently falling asleep and waking back up all the time.

    What’s more, it appears that when the neurons have cycled into the more active, or “on,” state they are better at responding to the world. The neurons also spend more time in the on state when paying attention to a task. This finding suggests processes that regulate brain activity in sleep might also play a role in attention.

    “Selective attention is similar to making small parts of your brain a little bit more awake,” said Tatiana Engel, a postdoctoral fellow and co-lead author on the research, which is scheduled to publish Dec. 1 in Science. Former graduate student Nicholas Steinmetz, the other co-lead author, carried out the neurophysiology experiments in the lab of Tirin Moore, a professor of neurobiology and one of the senior authors.

    {{Cycling on and off}}

    Understanding these newly discovered cycles requires knowing a bit about how the brain is organized. Neurons are arranged in functional columns: if you were to poke a pin directly into the brain, all the brain cells you’d hit would respond to the same types of things. In one column they might all be responding to objects in a particular part of the visual field — the upper right, for example.

    The team used what amounts to sets of very sensitive pins that can record activity from a column of neurons in the brain. In the past, people had known that individual neurons go through phases of being more or less active, but with this probe they saw for the first time that all the neurons in a given column cycled together between firing very rapidly then firing at a much slower rate, similar to coordinated cycles in sleep.

    “During an on state the neurons all start firing rapidly,” said Kwabena Boahen, a professor of bioengineering and electrical engineering at Stanford and a senior author on the paper. “Then all of a sudden they just switch to a low firing rate. This on and off switching is happening all the time, as if the neurons are flipping a coin to decide if they are going to be on or off.”

    Those cycles, which occur on the order of seconds or fractions of a second, aren’t as visible during wakefulness because the wave doesn’t propagate much beyond a single column, unlike in sleep, when the wave spreads across almost the entire brain and is easy to detect.

    {{Pay attention}}

    The team found that the higher and lower activity states relate to the ability to respond to the world. The group had placed their probe in a brain region in monkeys that specifically detects one part of the visual field. The monkeys had been trained to pay attention to a cue indicating that something in a particular part of the visual field — the upper right, say, or the lower left — was about to change slightly. The monkeys then got a treat if they correctly reported that change.

    When the team gave a cue to where a change might occur, the neurons within the column that senses that part of the world all began spending more time in the active state. In essence, they all continued flipping between states in unison, but they spent more time in the active state if they were paying attention. If the stimulus change came when the cells were in a more active state, the monkey was also more likely to correctly identify the change.
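    The behavior described above — neurons flipping between states in unison, but spending more time in the active state when attending — can be sketched as a toy two-state Markov chain. This is an illustrative assumption for intuition only, not the authors’ actual model or analysis; the transition probabilities and function names are invented:

    ```python
    import random

    def simulate_on_off(n_steps, p_on_to_off, p_off_to_on, seed=0):
        """Toy two-state Markov chain: at each step the column may flip
        between an active ("on", 1) and a quiet ("off", 0) state."""
        rng = random.Random(seed)
        state = 1  # start in the active state
        states = []
        for _ in range(n_steps):
            if state == 1 and rng.random() < p_on_to_off:
                state = 0
            elif state == 0 and rng.random() < p_off_to_on:
                state = 1
            states.append(state)
        return states

    # "Paying attention" is modeled here as a higher chance of switching
    # back on, so the column spends a larger fraction of time active.
    baseline  = simulate_on_off(10_000, p_on_to_off=0.1, p_off_to_on=0.1)
    attending = simulate_on_off(10_000, p_on_to_off=0.1, p_off_to_on=0.3)
    frac_base   = sum(baseline) / len(baseline)
    frac_attend = sum(attending) / len(attending)
    ```

    In this sketch the column still cycles on and off while "attending"; only the balance shifts toward the active state, matching the paper’s description of attention as more time spent on rather than a permanently active column.
    
    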

    “The monkey is very good at detecting stimulus changes when neurons in that column are in the on state but not in the off state,” Engel said. Even when the monkey knew to pay attention to a particular area, if the neurons cycled to a lower activity state the monkey frequently missed the stimulus change.

    Engel said this finding is something that might be familiar to many people. Sometimes you think you are paying attention, she pointed out, but you will still miss things.

    The scientists said the findings also relate to previous work, which found that more alert animals and humans tend to have pupils that are more dilated. In the current work, when the brain cells were spending more time in an active state the monkey’s pupils were also more dilated. The findings demonstrate an interaction between synchronous oscillations in the brain, attention to a task and external signs of alertness.

    “It seems that the mechanisms underlying attention and arousal are quite interdependent,” Moore said.

    {{Low energy states}}

    A question that comes out of this work is why the neurons cycle into a lower activity state when we’re awake. Why not just stay in the more active state all the time, in case that’s when the saber-toothed tiger attacks?

    One answer could relate to energy. “There is a metabolic cost associated with neurons firing all the time,” Boahen said. The brain consumes a lot of energy, and giving the cells a chance to do the energetic equivalent of sitting down may allow the brain to conserve it.

    Also, when neurons are very active they generate cellular byproducts that can damage the cells. Engel pointed out that the low-activity states could allow time to clear out this neuronal waste.

    “This paper suggests places to look for these answers,” Engel said.

  • A watershed moment in understanding how water conducts electricity

    {Scientists have taken spectroscopic snapshots of nature’s most mysterious relay race: the passage of extra protons from one water molecule to another during conductivity.}

    The finding represents a major benchmark in our knowledge of how water conducts a positive electrical charge, which is a fundamental mechanism found in biology and chemistry. The researchers, led by Yale chemistry professor Mark Johnson, report their discovery in the Dec. 1 edition of the journal Science.

    For more than 200 years, scientists have speculated about the specific forces at work when electricity passes through water — a process known as the Grotthuss mechanism. It occurs in vision, for example, when light hits the eye’s retina. It also turns up in the way fuel cells operate.

    But the details have remained murky. In particular, scientists have sought an experimental way to follow the structural changes in the web of interconnected water molecules when an extra proton is transferred from one oxygen atom to another.

    “The oxygen atoms don’t need to move much at all,” Johnson said. “It is kind of like Newton’s cradle, the child’s toy with a line of steel balls, each one suspended by a string. If you lift one ball so that it strikes the line, only the end ball moves away, leaving the others unperturbed.”

    Johnson’s lab has spent years exploring the chemistry of water at the molecular level. Often, this is done with specially designed instruments built at Yale. Among the lab’s many discoveries are innovative uses of electrospray ionization, which was developed by the late Yale Nobel laureate John Fenn.

    Johnson and his team have developed ways to fast-freeze the chemical process so that transient structures can be isolated, revealing the contorted arrangements of atoms during a reaction. The practical uses for these methods range from the optimization of alternative energy technologies to the development of pharmaceuticals.

    In the case of the proton relay race, previous attempts to capture the process hinged on using infrared color changes to see it. But the result always came out looking like a blurry photograph.

    “In fact, it appeared that this blurring would be too severe to ever allow a compelling connection between color and structure,” Johnson said.

    The answer, he found, was to work with only a few molecules of “heavy water” — water made of the deuterium isotope of hydrogen — and chill them to almost absolute zero. Suddenly, the images of the proton in motion were dramatically sharper.

    “In essence, we uncovered a kind of Rosetta Stone that reveals the structural information encoded in color,” Johnson said. “We were able to reveal a sequence of concerted deformations, like the frames of a movie.” Johnson’s lab was assisted by the experimental group of Knut Asmis at the University of Leipzig and the theory groups of Ken Jordan of the University of Pittsburgh and Anne McCoy of the University of Washington.

    One area where this information will be useful is in understanding chemical processes that occur at the surface of water, Johnson noted. There is active debate among scientists regarding whether the surface of water is more or less acidic than the bulk of water. At present, there is no way to measure the surface pH of water.

    The paper’s first author is Conrad Wolke, a former Yale doctoral student in Johnson’s lab. Co-authors of the paper are from the University of Chicago, Ohio State University, the University of Pittsburgh, the University of Washington, the University of Leipzig, and the Fritz Haber Institute of the Max Planck Society.

    Financial support for the research came from the U.S. Department of Energy, the National Science Foundation, the Ohio Supercomputing Center, and the Collaborative Research Center of the German Research Foundation DFG.
