Feeling sluggish? Chew gum for a brain boost

Mona Lisa boosting brain cells while chomping some gum.

Monday mornings. They drag. Getting the ol’ noodle back into work mode, especially after a fun summer weekend, can be a tall order. Many of us head straight for the classic boost – a cup of Joe – to help combat a case of the Mondays, but some new studies suggest that chewing gum could also provide relief by enhancing our brain’s arousal, alertness, and attention.

Om, nom, nom. Yes, we feel more alert.

In a recent study published in the British Journal of Psychology, Morgan and colleagues assessed the performance of 40 psychology undergraduate students on an auditory vigilance task, with and without a wad of gum to chomp on.

Study participants were split into two groups: (1) no-gum and (2) gum-chewing. Through a pair of headphones, they listened to a computerized voice reading a series of random numbers and were asked to press the spacebar when they identified the target sequence: an odd number followed by an even number and another odd number (e.g. 7-2-1). Reaction time and accuracy for each target response were recorded over the 30-minute task. Following the task, participants were asked to rate how alert they felt.
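The target rule itself is simple enough to sketch in a few lines of code. This is just an illustration of the odd-even-odd pattern described above; everything beyond the article's description (overlapping triplets counted, digits as integers) is my assumption.

```python
# Sketch of the vigilance task's target rule: an odd number, then an
# even number, then another odd number (e.g. 7-2-1).
def is_target(triplet):
    """True if the three digits follow the odd-even-odd pattern."""
    a, b, c = triplet
    return a % 2 == 1 and b % 2 == 0 and c % 2 == 1

def count_targets(stream):
    """Count odd-even-odd sequences in a stream of spoken digits."""
    return sum(is_target(stream[i:i + 3]) for i in range(len(stream) - 2))

print(count_targets([7, 2, 1, 6, 8, 5, 8, 3]))  # 7-2-1 and 5-8-3 are targets
```

Sustaining this kind of check in your head for 30 minutes straight is exactly the sort of monotonous vigilance the task is designed to tax.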

Researchers found that, as the task went on, the reaction time and accuracy of identifying the target sequence declined in non-gum chewers. That makes sense. Think of doing a monotonous task, like signing your name on 300 letters, or stuffing 1,000 envelopes. You’re probably not as efficient towards the end of the task as when you started.


Mean self-rated alertness pre- and post-task. Gum-chewers (dark gray); no-gum (light gray). F(1,32) = 14.25, p = .001

Interestingly, in contrast to the no-gum group, gum-chewers had a smaller decrease in performance during the later stages of the task, meaning they performed better overall. Additionally, gum chewers rated themselves as more alert compared to non-gum chewers following the test.

So working out your jaw results in better cognitive performance and a greater feeling of alertness, but how is the brain affected? Well, as it turns out, gum chewing increases blood flow to the brain, providing it with more oxygen and ultimately improving brain power. In another new study, Hirano et al. assessed which brain regions receive more blood flow while chewing gum during an attention task.

Seventeen participants underwent a 30-minute functional magnetic resonance imaging (fMRI) brain scan. fMRI is a brain imaging technique that assesses changes in cerebral blood flow, which is thought to correlate with neural activity. To assess the effect of gum chewing on alertness, subjects were put through two 10-minute periods of a visual attention task, once while chewing gum, and once without. The task required participants to press a button with their right or left thumb corresponding to the direction of an arrow that was presented to them.

Hirano and colleagues identified eight brain regions whose activity increased during performance of the task while chewing. Several of these regions correlated with alertness (premotor cortex), arousal (reticular activating system via the thalamus), and attention (anterior cingulate cortex, left frontal cortex).

Regions highlighted in yellow indicate areas of increased blood flow during the attention task while chewing gum. Abbreviations: pm (premotor cortex), aci (anterior cingulate cortex), th (thalamus).

Chewing stimulates the trigeminal nerve, the fifth cranial nerve, which in turn sends information to the brain regions responsible for alertness. Additionally, the trigeminal nerve is known to increase heart rate, which increases blood flow to the brain.

As far as Monday mornings go, it looks like you’ll still need something to get yourself going, but chewing a piece of gum could help keep you trucking throughout the work day. Personally, I’m patiently waiting for the launch of Wrigley caffeinated gum – it could be the ultimate one-two punch for the Monday blues!

Morgan K., Johnson A.J. & Miles C. (2013). Chewing gum moderates the vigilance decrement. British Journal of Psychology.

Hirano Y., Obata T., Takahashi H., Tachibana A., Kuroiwa D., Takahashi T., Ikehira H. & Onozuka M. (2013). Effects of chewing on cognitive processing speed. Brain and Cognition.

When brains venture into outer space (Part I): Bone density partially at inner ear’s beckoning

“Looking outward to the blackness of space, sprinkled with the glory of a universe of lights, I saw majesty – but no welcome. Below was a welcoming planet… That’s where life is; that’s where all the good stuff is.” – Loren Acton

Loren Acton, an American physicist who flew into space with NASA’s Space Shuttle program in 1985, makes an excellent point that is relevant to human physiology. Earth “welcomes”, or is ideal for, our bodies because this is where we evolved to live. The human brain was designed for life on Earth, not space travel. That’s not to say we should abandon exploration of this final frontier, but strange things start to happen to the body above the Earth’s atmosphere. One such anomaly is highlighted in a new study by Vignaux et al. suggesting that the workings of the inner ear have a say in changing the body’s bone density during space flight, and potentially for aging individuals here at home on Earth.

A common misconception is that gravity does not exist in outer space, but for typical human space flight altitudes (120 – 360 miles above the Earth’s surface) this is simply not true. It does exist! In fact, Earth’s gravitational pull at those altitudes is still most of what it is at the surface. What is almost zero (~0 g) is the effective gravity astronauts experience aboard an orbiting spacecraft – the condition known as microgravity.


So let’s suppose that you’re up in the space station in near zero gravity and you drop an apple out of your lunch bag. What happens? The apple appears to float, right? In actuality, the apple is falling. As a matter of fact, the apple, the rest of your lunch, you, and the entire space station are falling together, not towards Earth, but around it. UPDATE (7/9/13): The condition of microgravity that is achieved through this active state of free fall creates the experience of weightlessness. And weightlessness has been documented to wreak havoc on the human body – particularly the bones.
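You can check the "gravity still exists up there" point with a back-of-the-envelope calculation from Newton's law of gravitation (my illustration, not from the article; the 400 km altitude is a typical space-station figure):

```python
# Gravitational acceleration at altitude: g = GM / (R + h)^2
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_EARTH = 5.972e24   # mass of Earth, kg
R_EARTH = 6.371e6    # mean radius of Earth, m

def g_at_altitude(h_m: float) -> float:
    """Gravitational acceleration (m/s^2) at height h_m above the surface."""
    return G * M_EARTH / (R_EARTH + h_m) ** 2

surface = g_at_altitude(0)
orbit = g_at_altitude(400e3)  # ~250 miles up
print(f"surface: {surface:.2f} m/s^2")
print(f"400 km up: {orbit:.2f} m/s^2 ({orbit / surface:.0%} of surface gravity)")
```

Roughly 89% of surface gravity is still acting on the station; the weightlessness comes entirely from the free fall, not from gravity switching off.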

What happens to an astronaut’s bones in space?


Bone loss.

Astronauts lose bone mass in space at an alarming rate. After several months on the international space station, a study found that astronauts lose 1-2% of bone mass on average per month in space, mainly in the lumbar spine and legs.
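Compounded month over month, the reported 1-2% rate adds up quickly. A quick calculation (mine, not the study's) shows what a six-month mission could mean:

```python
# Compounding the reported 1-2% monthly bone loss over a mission.
# The rates come from the article; the compounding math is my sketch.
def remaining_bone_mass(monthly_loss: float, months: int) -> float:
    """Fraction of original bone mass left after repeated monthly loss."""
    return (1 - monthly_loss) ** months

for rate in (0.01, 0.02):
    left = remaining_bone_mass(rate, 6)
    print(f"{rate:.0%}/month for 6 months -> {left:.1%} remaining "
          f"({1 - left:.1%} lost)")
```

That works out to roughly 6-11% of bone mass lost over a six-month stay, concentrated in exactly the weight-bearing bones that matter most back on Earth.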

Why do astronauts lose bone density? On Earth, gravity exerts external forces that challenge your bones. In space, astronauts experience prolonged weightlessness, their bones no longer bear weight, and their bodies adapt to this new environment by reducing bone mass. Basically, if you “float” everywhere and don’t need your legs to walk around, you don’t require dense bones.

The changes that occur in space only become apparent upon returning to Earth’s gravitational forces, manifesting as an increased risk of bone fracture. Exercise programs have been established to counteract the musculoskeletal changes astronauts experience, but they do not fully preserve bone mass in microgravity. Even after returning from space, astronauts must go through several weeks of rehabilitation.

So how do changes in bone density occur in space, or on Earth for that matter?

Electron microscope image of an osteoclast resorbing bone. Photo credit: PathologyOutlines.com, Inc.

Bone in the body is constantly renewing itself through a maintenance procedure known as bone remodeling. Bone remodeling primarily involves two specialized types of cells: osteoclasts, bone-dissolving cells, and osteoblasts, bone-forming cells. In a normal healthy person, these cells strike a balance and bone is consistently being torn down and remade. In space, it is thought that the working rate of the osteoclasts increases and exceeds that of the osteoblasts, which fare less well in a microgravity environment, resulting in a bone mineral density deficit.

Bone remodeling is quite complex, and several factors impact the process, such as gravity, hormones, and sympathetic nervous system signaling. The sympathetic nervous system controls several involuntary bodily functions (e.g. pupil diameter, heart rate, gut motility) and is also involved in the body’s stress response.

One particular component of the inner ear, the vestibular system, has been shown to alter sympathetic nerve outputs to the body. Cats with damaged vestibular nuclei, the groups of brain cells in the brainstem that receive reports from the inner ear, showed altered sympathetic nerve outputs with drops in blood pressure and increased heart rate when the animals were tilted up on a table at different angles.

So what exactly does the vestibular system do? The vestibular system acts as a sensor collecting data about the body’s position/motion and informs the brainstem, which then sends signals to various brain regions that coordinate body motion and balance. It is one of the body’s primary systems for integrating information about the force of gravity on the body. Unfortunately for the vestibular system, it was designed to function under Earth’s gravitational forces and when those are taken away in space, the system reports incorrect and disorienting information to the brain.

The big picture

Okay, so let’s break down this whole scenario: The vestibular system organizes information about gravity and body position and can dictate sympathetic nerve output. In turn, sympathetic nerve output can alter the workings of the osteoclast-osteoblast system of bone remodeling. Vignaux et al. recognized that the vestibular system is altered in the microgravity environment of space and postulated that if the vestibular system provides the brain with incorrect information, as happens to astronauts in space, then incorrect sympathetic output could partially account for the decreases in bone mineral density observed after extended space travel.

To test this hypothesis, Vignaux and colleagues injected rats with a chemical that damaged the inner ear, rendering it unable to sense information and mimicking how the vestibular system behaves in space. Then, after a period of time, they evaluated the rats’ bone mass.

A month after creating the vestibular lesions, they found lower bone mineral density in the femur, the body’s largest leg bone. Numbers of osteoblasts were also reduced on the surface of the bone, while osteoclast numbers were unaffected. Notably, body weight as well as food and water intake were no different from controls, suggesting that bone loss was not a result of metabolic changes but rather due specifically to altered bone remodeling.

Bone mineral density (BMD) in varying regions. The whole femur, as well as particular portions of the femur (distal metaphysis and diaphysis regions), shows significantly lower BMD in vestibular-damaged rats (VBX, gray bars) compared to controls (Sham, white bars).

Additionally, rats were fitted with telemetric recording devices that provided information about how much or how little they moved throughout the experiment. Rats with damaged inner ears moved around twice as much as control rats with intact vestibular systems. The reason for this increase in locomotion is not well understood but it does dispel concerns that bone mineral density loss could be due to a decrease in movement and bone strain following changes to vestibular input.

To assess whether the changes in bone mass originated from sympathetic nervous system output to the bones, researchers used propranolol, a beta-blocker, to block sympathetic output. After one month of propranolol treatment following damage to the vestibular system, they found an increased number of osteoblast cells on the surface of the bone, and bone loss was blunted.

What does this all mean for astronauts in space and individuals with vestibular dysfunction on Earth?

Improper vestibular signaling is an issue for astronauts living in a microgravity environment, but also for people here on Earth. Individuals with vestibular dysfunction, which can gradually develop with age, may have lower bone mineral densities, which could lead to the bone loss disease known as osteoporosis. Vignaux’s results suggest that future clinical work could target the vestibular system and its sympathetic nervous system outputs from the brain to improve bone health. Milk! Step aside. You just aren’t getting the job done, and we’re looking to the inner ear for answers.

Vignaux G., Besnard S., Ndong J., Philoxène B., Denise P. & Elefteriou F. (2013). Bone remodeling is regulated by inner ear vestibular signals. Journal of Bone and Mineral Research.

Tingling palms and knocking knees: Why do we fear heights?

Kennywood – Pittsburgh’s premier amusement park – has filled my childhood with magical memories. Riding the stunning carousel horses around-and-around to the accompaniment of big band music. The scent of Potato Patch fries and funnel cake wafting through the air. But what is the one memory I’ll never forget? The 251-foot Pitfall scaring the living daylights out of me.

The Pitfall at Kennywood Park. Photo credit: Scott Jones

The Pitfall, now a retired ride, would leisurely take you 251 feet up into the sky and pause at the top so you could take in the view. Then, with nothing more than a subtle *click*, the brakes would release and, in one brief terrifying moment, riders would scream as they plummeted towards earth.

One year during my 7th grade Kennywood school picnic, in a heroic effort I convinced some nervous friends that going on the Pitfall was the best idea ever. A few were crying actual tears in protest but ultimately we decided to ride. You’re only in 7th grade once, right?

We got in line and when our turn arrived a park employee locked us down in our seats, legs dangling below us. I had built the Pitfall up so much in my mind and I was confident about our decision to ride until we made it halfway up the ascent.

Halfway up. That’s where the whole idea turned sour. Turns out I have acrophobia – I’m deathly afraid of heights!

 Just looking at this picture makes my hands start to tingle!

The neurobiology of fear

Where does fear originate in our brain? Scientists believe that the brain region known as the amygdala plays an important role in triggering fear. Greek for ‘almond’, describing its shape, the amygdala sits in the medial temporal lobe of the brain.

The amygdala highlighted in orange.

A fear stimulus, such as heights, activates a cascade of events in the brain. The sensory cortex acknowledges something as frightening and signals the amygdala. In turn, the amygdala notifies the hypothalamus and the brainstem and you feel fear.

In humans, a very rare hereditary illness, Urbach-Wiethe disease, can cause bilateral symmetrical loss of all the cells in the amygdala. Without the amygdala the fear signaling cascade is broken and individuals are unable to experience fear. One famous patient, a 44-year-old woman known only as SM, exemplifies this fearlessness.

One night while walking home past a park, SM was grabbed by a man who put a knife to her throat, and exclaimed, “I’m going to cut you!”. Without feeling afraid she replied, “If you’re going to kill me, you’re gonna have to go through my God’s angels first.” Ballsy and effective. He responded by releasing her and she calmly walked back home. Extraordinarily, the following night she walked past the very same park again!

MRI brain scan of SM compared to a healthy control subject. The regions circled in red represent the amygdala, which in SM is void of tissue.

Like others with Urbach-Wiethe disease, SM has characteristic bilateral amygdala lesions. SM’s IQ, memory, and language are normal. Although she experiences a wide range of other emotions, in her adult life there has never been an instance in which she has felt fear. She appears to understand the concept of fear, having personally felt it once as a child when she was cornered by a friend’s Doberman Pinscher, an incident that presumably occurred prior to the loss of her amygdala. As an adult, SM is able to recognize fear from body language and the prosody of an individual’s voice, but interestingly she is unable to discern fear in static facial expressions.

In a study by Feinstein et al., SM was asked to participate in a number of frightening tasks. She was shown terrifying horror films, asked to hold large snakes and spiders, and taken to a haunted house. She never once showed signs of fear and when prompted described her experiences as feeling overwhelmingly “curious”.

(A) SM holding a snake; (B) a spider SM tried to touch; (C) the Waverly Hills Sanatorium haunted house SM toured.

Self-described levels of fear following carbon dioxide inhalation.

In a new study, Feinstein and colleagues tried a different approach to induce fear and panic in SM and two other subjects with Urbach-Wiethe disease by exposing them to carbon dioxide (CO2). Breathing in CO2 creates the sensation of suffocation, and upon inhalation the subjects described feelings of being “overwhelmed by the panic and fear of dying”. It worked! SM felt fear for the first time in her adult life. But how?

The brainstem controls basic bodily functions such as breathing and heart rate. SM’s sensation of fear suggests that ultimately the brainstem, the endpoint for the fear cascade, holds the key to the conscious experience of fear. Specific threats, such as CO2, may bypass the circuit and impact the brainstem directly, eliciting fear without receiving a signal that has been processed through higher brain regions and the amygdala.

While scientists have not yet performed a field experiment with SM riding a Kennywood-esque Pitfall, Daniel Tranel, a professor of neurology and psychology at the University of Iowa, has been studying SM for years and tells NPR that SM reports being unafraid of heights. So unless she is sucking in CO2 on the ride, a simple roller coaster (a fear stimulus processed through the amygdala rather than one acting directly on the brainstem) is unlikely to faze her.

Do we have an innate fear of heights?

Assuming an intact amygdala, are we programmed to fear heights from birth? Several studies have addressed this question using a visual cliff.

Visual Cliff

A visual cliff is a trick-of-the-eye testing apparatus where an opaque patterned surface is connected to a transparent glass surface. Below the transparent side is a lower level that has the same pattern as the opaque surface. The visual cliff creates the illusion that you could fall over the edge.

Researchers noticed that if an infant was able to crawl and move of their own accord, they were also more wary of the “cliff”. However, if the child was “prelocomotor”, or younger than crawling age, they did not fear the edge. This is likely because as babies learn to move around they also become aware of distances, develop depth perception, and begin coordinating their visual system with movement through their environment.

This experiment has been reproduced in a number of animal species, including kittens, which also use visual cues to move. The visual cliff did not deter animals such as rats, which rely predominantly on tactile cues, whisking surfaces with their whiskers rather than using vision to navigate their environment.

Thus it would seem that fear of heights is a learned response to experiences, such as falling or near-falling incidents, rather than something we are born with.

So what happened with the Pitfall after I realized that being more than 15 feet up in the air was too high for me? Well, I survived. I’ve actually ridden the ride several more times since, rationalizing that it was engineered well and I was probably statistically safe. If you ever get the chance to ride something like the Pitfall, try placing a dime on your knee to watch it levitate in front of you as you drop. It might just keep your mind off the heights.

Feinstein J.S., Adolphs R., Damasio A. & Tranel D. (2010). The human amygdala and the induction and experience of fear. Current Biology.

Feinstein J.S., Buzza C., Hurlemann R., Follmer R.L., Dahdaleh N.S., Coryell W.H., Welsh M.J., Tranel D. & Wemmie J.A. (2013). Fear and panic in humans with bilateral amygdala damage. Nature Neuroscience.

Campos J.J., Bertenthal B.I. & Kermoian R. (1992). Early experience and emotional development: The emergence of wariness of heights. Psychological Science, 3(1), 61-64.

Bombs and Brains

Great minds met to conceive the first atomic bomb. Now the atomic bomb has helped researchers confirm some long-held suspicions about the human brain.

Up until the 1960s, it was widely accepted that we are born with a finite number of neurons that last our entire life. In 1965, researchers presented the first evidence against this popular theory by showing that neurogenesis, the production of new neurons, occurs in the rat brain. Fast-forward to 1998: Eriksson et al. demonstrated the birth of new neurons in the adult human brain for the first time by labeling dividing cells in patients and analyzing their brains for new cells after death. As groundbreaking as this study was, the results left a number of questions about the quantity of cells produced and whether that number was great enough to actually impact brain function. Were these new neurons just duds?

Human post-mortem tissue studies are descriptive by nature. The brains have been preserved either through freezing or fixation, a chemical process used to preserve tissue from decay, and researchers are unable to look at dynamic functions of the brain, such as the creation of new cells, in the same capacity as in experimental animal models and culture systems. Brilliantly, Spalding et al. circumvented these issues with a novel strategy using radiocarbon (14C) dating and knowledge of the atomic bomb tests to evaluate neurogenesis in the adult human brain.

Kirsty L. Spalding et al. (2013). Dynamics of Hippocampal Neurogenesis in Adult Humans. Cell 153(6): 1219-1227.

What is 14C? 14C is a radioactive carbon isotope, meaning it has an unstable nucleus due to an increased number of neutrons – ordinary carbon has 6, radiocarbon has 8. 14C occurs naturally on Earth, but only in trace amounts, formed through the interaction of nitrogen and cosmic rays in the atmosphere. Radiocarbon in the atmosphere is integrated into carbon dioxide molecules. Plants absorb carbon dioxide from the environment and incorporate the carbon into their fibers. Our food chain begins with plants. People eat plants. People also eat animals that consume plants. And the result of all this eating? The more 14C there is in the atmosphere, the more of it is integrated into our own bodies’ DNA when new cells are created.

Radiocarbon dating using 14C measurements was pioneered in the fields of geology and archeology, where it was used to determine the age of really old rocks and ancient artifacts. Spalding and colleagues put a spin on this and developed a retrospective dating system to identify the birth date of neurons.

Changes in radiocarbon over time in New Zealand and Switzerland. Source: U.S. Department of Commerce, NOAA

How was retrospectively dating neurons possible? In the mid-1950s, the world was in the heart of the Cold War. Atmospheric detonation of atomic bombs from 1955-1963 shot worldwide 14C levels through the roof. In 1963, the United States, Soviet Union, and United Kingdom signed the Partial Nuclear Test Ban Treaty, prohibiting all nuclear detonations unless they were performed underground. Upon establishment of this treaty, atmospheric levels of 14C began to decline, and it is through this documented radiocarbon timetable that researchers were able to determine when each new neuron clocked in at birth by looking at the 14C concentration signature in each cell’s DNA.

Great! Now we know that each neuron’s DNA should carry a particular amount of 14C, because it correlates with atmospheric 14C at the time the cell was born, and there was a huge spike during atomic bomb testing, so a birth date can be determined. But how the heck can we measure 14C in neuronal DNA? Enter accelerator mass spectrometry (AMS). This technique allows scientists to scan infinitesimal DNA samples and accurately report concentrations of 14C.
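The lookup step can be sketched as matching a measured 14C level against the bomb curve. The curve values below are rough illustrative numbers, not the calibrated atmospheric dataset the authors used, and the nearest-match approach is a toy version of their procedure:

```python
# Toy retrospective birth-dating: match a cell's genomic 14C level to the
# atmospheric "bomb curve". Values are illustrative (roughly Delta-14C,
# per mil), not real calibration data.
BOMB_CURVE = {
    1950: 0, 1955: 20, 1960: 250, 1963: 700, 1970: 550,
    1980: 280, 1990: 150, 2000: 90, 2010: 40,
}

def estimate_birth_year(delta14c: float, after_peak: bool = True) -> int:
    """Return the curve year whose 14C level best matches the measurement.

    Because the curve rises and then falls, one level can match two
    years; `after_peak` selects the declining (post-1963) branch.
    """
    years = [y for y in BOMB_CURVE if (y >= 1963) == after_peak]
    return min(years, key=lambda y: abs(BOMB_CURVE[y] - delta14c))

print(estimate_birth_year(160))  # matches the declining branch around 1990
```

The rising-then-falling shape of the curve is why the authors needed extra information (like the subject's own birth year) to place each measurement on the correct branch.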

Spalding et al. isolated cell nuclei from human post-mortem hippocampus, the brain region that plays an important role in memory and has been shown to undergo neurogenesis in animal models. Then, using AMS, they measured the amount of 14C in the neurons and developed a sophisticated biological transport equation to look at cell turnover dynamics.


Ah yes, this equation takes me back to my undergraduate days of biological transport class with Dr. Patzer. The man made learning the transport dynamics of cooking a Thanksgiving turkey fun. Basically the equation contains elements that track with the age of the person and the age of each cell and makes sure to account for cell death processes as well. When you apply particular settings and solve the equation you can obtain neuron density and importantly evaluate how many new neurons are born in the hippocampus throughout life.
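The paper's actual equation isn't reproduced here, but the intuition behind it can be sketched with a deliberately minimal two-pool model (my simplification, not the authors' model): a renewing pool where cells are replaced at a constant rate, plus a non-renewing pool whose cells are as old as the person.

```python
# Minimal two-compartment sketch of cell-turnover dynamics. In a pool
# renewing at constant rate k, the steady-state mean cell age is roughly
# 1/k (capped at the person's age); non-renewing cells keep their
# birth-date 14C signature for life.
def mean_cell_age(renewing_fraction: float, turnover_per_year: float,
                  person_age: float) -> float:
    """Average cell age across a renewing and a non-renewing pool (years)."""
    renewing_age = min(1.0 / turnover_per_year, person_age)
    return (renewing_fraction * renewing_age
            + (1 - renewing_fraction) * person_age)

# e.g. one-third of cells renewing at an assumed ~2%/year in a 60-year-old:
print(f"{mean_cell_age(1/3, 0.02, 60):.1f} years")
```

Fitting a (far more careful) model of this kind to the measured 14C signatures is what lets the authors back out how many neurons are renewing and how fast.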

Having the model down pat, researchers then looked at the data and reconfirmed that neurogenesis occurs after birth by observing four phenomena:

Hippocampal neurogenesis in the adult human brain.

1. 14C concentrations in hippocampal neurons correspond to atmospheric concentrations with dates after subject birth. New neurons are being made!
2. Some of the oldest subjects in the study had higher amounts of 14C integrated into their DNA than were present preceding bomb testing, when atmospheric 14C levels were low. Thus 14C must have been incorporated into hippocampal neuronal DNA later in life.
3. There doesn’t appear to be any dramatic decline in hippocampal neurogenesis with aging, because even individuals born years prior to 1955 incorporated high levels of 14C into their DNA.
4. Subjects born before 1955 have lower levels of 14C in their DNA than anyone born after 1955, suggesting that although the hippocampus does create new neurons, a large number of neurons are not new.

Combined, these facts once again help dispel the myth that we are born with all the neurons we will ever have.

Knowing that new neurons are generated in the hippocampus, the next questions became: how many neurons are born, and how quickly does this renewal happen? Modeling the cells with the biological transport equation described above, the data suggest that a subpopulation of hippocampal neurons renews constantly whereas other neurons are non-renewing. Spalding and colleagues estimate the renewing population could be as much as one-third of all hippocampal neurons, with around 700 new neurons added per day – a great deal more than originally suspected. Additionally, non-renewing neurons do not seem to be replaced following death.

Critically, Spalding et al. addressed whether or not these adult born neurons could impact brain function. While this is based in conjecture, they suggest that the large number of neurons being born in the hippocampus is sufficient to contribute to cognitive function because young hippocampal neurons have enhanced synaptic plasticity, which impacts learning and memory.

History recorded the dark side of atomic bomb testing and the looming danger of nuclear war. Today research is showing us a bright side… many, many new neuron birthdays throughout your life!

Spalding K., Bergmann O., Alkass K., Bernard S., Salehpour M., Huttner H., Boström E., Westerlund I., Vial C. & Buchholz B. (2013). Dynamics of Hippocampal Neurogenesis in Adult Humans. Cell, 153(6), 1219-1227.

Memory problems? Brain rhythm therapy while you sleep could help


 “Sleep is… a seizure of the primary sense-organ, rendering it unable to actualize its powers; arising of necessity… for the sake of its conservation.” – Aristotle, 350 B.C.

Statue of Sleeping Eros. Photo credit: The Metropolitan Museum of Art

Ahhh sleep, beautiful sleep. It’s the blissful state of inactivity that renews the body and mind. Greek philosopher Aristotle described the restorative value of suspending consciousness in slumber as far back as 350 BC. Today researchers have taken Aristotle’s understanding of sleep as a physical state of renewal to a deeper level and tapped into mechanisms that may enhance your memory by improving the efficiency of your sleep.

So how does sleep work? Sleep consists of natural cycles of brain activity involving both non-rapid eye movement (NREM) and rapid eye movement (REM) periods. During NREM sleep, brain activity begins to oscillate in a low-frequency (< 1 Hz) rhythm, easily recognized on an electroencephalogram (EEG), a graphical recording of brain waves, as slow-wave sleep. This slow-wave activity represents networks of neuronal ensembles performing in unison, with waves of active to inactive states. The orchestration of these neurons creates the particular form of slow-wave sleep brain activity and is theorized to play a critical role in memory performance.

As a sophomore in college, I would study long hours into the night, fall asleep, and then wake up to a biology quiz first thing in the morning. During the quiz, quite often and much to my surprise, I would recall some minute detail from my notes that I surely did not have a firm grasp on the night before. Recent studies suggest my biology recollections, as well as the consolidation of everyday memories, likely occur during slow-wave sleep. Consolidation is a process that converts fluid memories into more concrete representations that become part of the bigger system of existing long-term memories encoded by a neuronal network. The possibility of improving your brain’s ability to solidify and strengthen memories through slow-wave sleep has important implications for people with reduced slow-wave amplitudes, such as those who suffer from sleep and psychiatric disorders. Reductions in slow-wave amplitude also occur naturally with aging, affecting a large portion of the population.

In previous studies, scientists have attempted to artificially enhance slow-wave sleep using rhythmic electrical, magnetic, or sound pulses. These studies have shown limited memory improvement, likely because they did not synchronize stimulations with the natural ebb and flow of activity in the brain. However, in a new study by Ngo et al., researchers identified innate brain activity and created a feedback loop used to strategically enhance slow-wave oscillations using sound stimulation to maximize memory benefits.

Hong-Viet V. Ngo et al. (2013). Auditory closed-loop stimulation of the sleep slow oscillation enhances memory. Neuron 78: 545–553.

Eleven subjects were hooked up to EEG and their brain activity during NREM sleep was monitored for two nights. On one of the nights, two auditory tones were played each time a computer reading the natural rhythm of the brain predicted an up-state of the slow wave. On the other night, brain activity was monitored but no tones were played.

Slow-wave train following auditory stimulation (red) compared to control (black)

Ngo and colleagues found that presenting the tones in phase with the up-state of the waveform increased slow-wave activity, not only by increasing the amplitude of the waveform itself but also by creating a “train” of slow waves, one wave right after the other. Knowing that auditory stimulation changes the waveform, the researchers next asked whether this “improved” waveform actually alters memory performance.
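To make the closed-loop idea concrete, here is a toy sketch of how such a detect-and-stimulate loop could work. This is not the authors’ actual algorithm (they predicted up-states from the ongoing EEG in real time); the simple threshold, fixed delay, refractory period, and simulated trace below are all illustrative assumptions.

```python
import math

def closed_loop_stim_times(signal, fs, threshold, delay_s, refractory_s=1.0):
    """Toy closed-loop detector: when the signal drops below `threshold`
    (approximating a slow-oscillation down-state), schedule a tone
    `delay_s` later to land on the following up-state. The refractory
    period prevents re-triggering within the same slow wave."""
    stim_times = []
    refractory_until = -1.0
    for i, x in enumerate(signal):
        t = i / fs
        if x < threshold and t >= refractory_until:
            stim_times.append(t + delay_s)   # aim for the coming up-state
            refractory_until = t + refractory_s
    return stim_times

# Simulated 0.8 Hz slow oscillation (one wave every 1.25 s), 10 s at 100 Hz
fs = 100
eeg = [1.5 * math.sin(2 * math.pi * 0.8 * i / fs) for i in range(10 * fs)]

# A delay of about half a period after the trough puts each tone near the peak
stims = closed_loop_stim_times(eeg, fs, threshold=-1.4, delay_s=0.625)
```

On this clean simulated trace the loop schedules one tone per slow wave, each landing close to the positive (up-state) peak; a real system must make the same prediction from noisy EEG as the sleep recording streams in.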

Prior to sleep, subject performance on a declarative memory task was assessed. Declarative memory is a type of long-term memory that involves the conscious recall of information, such as extracting biology facts from your brain during a quiz. The test began with subjects learning 120 associated word pairs (e.g. brain–consciousness). After learning, declarative memory was tested to see how many associations subjects remembered: they were shown the first word of each pair, one at a time, and asked to recall as many of the matching second words as possible. Declarative memory was then tested again in the morning.
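For illustration, scoring a cued-recall task of this kind takes only a few lines. The word pairs and function name below are made up for the example and are not taken from the study (which used 120 pairs).

```python
def score_cued_recall(pairs, responses):
    """Count correct recalls. `pairs` maps each cue word to its learned
    partner; `responses` maps cues to a subject's answers (a missing key
    or None means no answer was given)."""
    correct = 0
    for cue, target in pairs.items():
        answer = (responses.get(cue) or "").strip().lower()
        if answer == target.lower():
            correct += 1
    return correct

# Hypothetical three-pair session, tested in the evening and again at waking
pairs = {"brain": "consciousness", "clock": "church", "night": "lamp"}
evening = score_cued_recall(pairs, {"brain": "consciousness",
                                    "clock": "church", "night": "lamp"})
morning = score_cued_recall(pairs, {"brain": "consciousness",
                                    "night": "Lamp"})
# Overnight retention can then be expressed relative to evening recall,
# e.g. 100 * morning / evening.
```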

Researchers found that exposure to auditory stimulation during sleep improved subjects’ memory retention almost two-fold. Interestingly, memory retention was also positively correlated with the percentage of slow-wave sleep a subject experienced, further suggesting that enhancing slow-wave oscillations benefits memory.

Memory retention following stimulation during the up-phase of slow-wave oscillation (p < 0.001)

Memory retention following stimulation during the down-phase of the slow-wave oscillation (p = 0.93)

This memory-enhancing phenomenon also appears to be specific to up-state stimulation. When the researchers tested stimulation during the down-phase of the waveform, slow-wave trains did not form and, consequently, memory did not improve.

Summing it all up, what’s the best way to get the most out of your NREM slow-wave sleep and improve your memory? This study would suggest the trick is to monitor your brain rhythm during sleep and insert an auditory stimulus at the appropriate moment to see results. Let’s hope the entrepreneurs at Necomimi jump on this opportunity to figure out how to tweak their brain wave cat ears by adding a set of headphones for some blissful memory enhancing sleep. I know I’ll be placing my order as soon as they’re available.

Ngo H.V., Martinetz T., Born J. & Mölle M. (2013). Auditory Closed-Loop Stimulation of the Sleep Slow Oscillation Enhances Memory. Neuron, 78(3): 545–553.

Researchers seek CLARITY for scientific questions from a transparent brain

Photo credit: Deisseroth Lab (source)

Imagine if Miss Frizzle, science teacher extraordinaire, and the Magic School Bus really existed. We could all shrink down to the size of microscopic particles, fly up someone’s nose and enter their brain to investigate the inner workings of the mind – all in IRB-approved fashion of course. It certainly would help us in our attempt to achieve the primary goal of the biological sciences: understanding organisms in their most natural state of function.

Since there are no magical shrinking automobiles, in order to see inside the brain neuroscientists have to label structures AND let light reach them to capture an image. One major branch of existing techniques for understanding the brain’s structure involves cutting the brain into incredibly thin sections (200 nm – 40 µm thick), which are labeled with molecular markers and then reconstructed into 3D images. These studies are often limited to very small tissue volumes because the brain contains opaque lipids.

Lipids are a group of molecules that includes fats, waxes, fat-soluble vitamins, and phospholipids. Phospholipids form a continuous barrier around every cell in the body: with a hydrophilic (water-loving) head and a hydrophobic (water-fearing) tail, they band together in a ‘tails in – heads out’ formation to create a lipid bilayer. Within tissue, lipids are important for shaping cells and holding molecules in place, but their presence has been a major barrier in fluorescence microscopy that, until now, has been difficult to address. Lipids limit the access of fluorescent molecular labels to their targets and scatter light photons, resulting in blurred images. Karl Deisseroth and his team at Stanford realized that removing lipids could yield higher-quality images of brain tissue, and so pioneered a new technique they named CLARITY.

Kwanghun Chung et al. (2013). Structural and molecular interrogation of intact biological systems. Nature 497: 332–337.

Don’t mind me, just reading through this invisible brain! Photo credit: Deisseroth Lab (source)

CLARITY allows light and molecular markers to reach targets within large tissue volumes, all while maintaining the tissue’s natural structural landscape. How is this accomplished? Essentially, the process involves infusing the tissue with hydrogel monomers and formaldehyde, a fixative that links proteins in the tissue together. The formaldehyde also binds the hydrogel molecules to biomolecules present in the tissue, importantly keeping everything in place. Next, the hydrogel monomers are actively polymerized: the small molecules bind together into larger chains that form a mesh. Once the mesh is in place, an electric field is applied across the tissue, forcing the lipids out of the sample. The result is a see-through brain that can be easily labeled with molecular markers.

For an entire mouse brain, this process takes only about 8 days. But the technique is not limited to mice or even just brains. CLARITY will work on any organ.

One important question for anyone involved in translational neuroscience is whether CLARITY will work on human tissue that has been stored in fixative for years on end. Deisseroth and colleagues investigated a thick block of frontal lobe tissue from an autistic patient that had been in storage for over six years. Not only were they able to successfully label and image the tissue, but they also made some fascinating observations.

Within the tissue, a particular class of neurons, parvalbumin interneurons, thought to regulate the synchronicity of network neuronal firing, showed structures wrapping around and connecting back onto themselves rather than onto other neurons, a phenomenon not normally observed in healthy brains. This is just one example of how CLARITY is blazing a new path for acquiring previously unobtainable information about structural and molecular systems in the human brain.

What’s the most exciting news about CLARITY for this graduate student? It means I might regain feeling in my fingers because tissue sectioning on the cryostat may have just become obsolete!

Check out this incredible video about CLARITY:

Chung K., Wallace J., Kim S.Y., Kalyanasundaram S., Andalman A.S., Davidson T.J., Mirzabekov J.J., Zalocusky K.A., Mattis J. & Denisin A.K. (2013). Structural and molecular interrogation of intact biological systems. Nature, 497(7449): 332–337.

Future dads of the world, relax! Stressing now could affect the health of your future kids

Parents pass on a unique genetic makeup to their children. Little Johnny is a blond-haired, blue-eyed boy with a one-of-a-kind thumbprint, all because it is encoded in the DNA provided by his parents. But did you know that the quality of the DNA you pass on to your children can be impacted by stressful events in your life, such as disease, malnutrition, and old age, ultimately influencing the way your children’s bodies respond to stress? That is what a new study looking at the impact of paternal stress on sperm genetics suggests.

First, let’s take a look at how the body reacts to a stressful situation. The hypothalamic-pituitary-adrenal axis, or HPA-axis, is a major component of the neuroendocrine system, the system that allows our brain to communicate with hormone-producing glands such as the adrenal gland. During a stressful event, for example happening upon a giant, hungry-looking bear during a hike in the woods, the HPA-axis is activated and the brain tells the adrenal gland to release cortisol, a major stress hormone that tells your body to fight the bear (poorer option) or get the heck out of there as fast as humanly possible (better option). Actually, trying to outrun a bear is probably a terrible idea because they’re wicked fast. Check back for future posts on bear encounter etiquette. Anyway, back to the HPA-axis…

The HPA-axis is remarkably susceptible to environmental stressors, and improper regulation of the HPA-axis has been linked to a number of psychiatric disorders, including autism, schizophrenia, and depression.

Ali B. Rodgers et al. (2013). Paternal Stress Exposure Alters Sperm MicroRNA Content and Reprograms Offspring HPA Stress Axis Regulation. The Journal of Neuroscience, 33(21): 9003–9012.

So can the stresses of life, occurring years prior to conception, really impact how your future kid’s HPA-axis handles stress? Rodgers and colleagues exposed both adolescent and adult male mice to prolonged stressors before breeding, such as leaving the lights on all night, introducing strange odors, or placing novel objects in their home cage. The researchers stressed male rather than female mice to limit the number of variables that could influence their findings: male mice contribute only their DNA to the pups, whereas female mice can also affect pups through life events during pregnancy or rearing.

Pups of stressed fathers showed a significantly reduced HPA-axis response compared to pups of unstressed fathers. Decreased HPA-axis responsiveness is bad news for pups because it means they’re unable to respond appropriately to changes in their environment. Consistent with HPA-axis dysregulation, transcription, the first step of gene expression, was altered in the brain regions responsible for regulating the body’s response to stress.

But how exactly were these changes passed on? Did the father’s DNA change? Not exactly.

Epigenetics is the study of changes in gene expression produced by modifying DNA without changing the actual nucleic acid sequence, often by adding small molecules to DNA nucleotides or altering the accessibility of the sequence. Several studies suggest that stress can lead to epigenetic modifications in sperm germ cells. Rodgers et al. found considerable changes in nine sperm microRNAs, molecules responsible for epigenetic alterations, within the group of fathers exposed to environmental stressors. Thus, when the fathers were stressed over a prolonged period, it changed how the DNA they would eventually pass on to their offspring would be expressed.

What’s the bottom line? Relax, eat well, and stay healthy to keep your sperm and subsequently all your future children happy and healthy! Oh, and stay away from bears.

Rodgers A.B., Morgan C.P., Bronson S.L., Revello S. & Bale T.L. (2013). Paternal Stress Exposure Alters Sperm MicroRNA Content and Reprograms Offspring HPA Stress Axis Regulation. Journal of Neuroscience, 33(21): 9003–9012.

In the quest to slow aging, have researchers found the Holy Grail?

What if I told you that it may be possible to live longer than ever while remaining sharp as a tack? Now what if I told you that you’re NOT going to like the key to achieving this immortality?

Ready? To live longer and prosper, all you have to do is restrict your calorie intake by at least 30%. That’s it. So kiss that dessert and all future indulgences good-bye! According to several studies conducted in rodents, caloric restriction can increase lifespan by slowing the aging process, preserving brain structure and cognition, and even reducing the occurrence of Alzheimer’s disease pathology. But just as you sink into a deep, dark depression over never sampling another Ben & Jerry’s flavor, there may be a light at the end of the tunnel.

Johannes Gräff et al. (2013). A Dietary Regimen of Caloric Restriction or Pharmacological Activation of SIRT1 to Delay the Onset of Neurodegeneration. The Journal of Neuroscience, 33(21): 8951-8960

Adding to the mounting pile of evidence in support of caloric restriction, a recent study conducted by Gräff et al. demonstrated that expression of SIRT1, a protein critically involved in cellular metabolism, stress, and survival mechanisms, reduces the breakdown of brain cells (neurodegeneration) caused by aging.

Using a transgenic mouse model that experiences neurodegeneration, brain atrophy, and memory loss, the authors of the study reduced the mice’s food intake by 30% and saw protection against synapse loss in the hippocampus, the brain region responsible for memory.

This neurodegeneration is visualized in the image below. Hippocampal neurons are shown in green and their synapses in red; synapses are significantly reduced in the transgenic animals without caloric restriction (left) compared to animals undergoing caloric restriction (right).


The prognosis for the caloric restriction group looks good, with a significantly greater number of synapses, but how are their brains functioning? Do the extra synapses even work properly? Memory task performance of the calorie-restricted mice was also unimpaired compared to mice that were free to eat as much as they liked. Importantly, SIRT1 activation was also increased in the calorically restricted animals.

Great! So calorie restriction seems to be saving the minds of these otherwise doomed mice from dementia through SIRT1 activation, but how does this impact your Rita’s Italian Ice gelato habit?

Well, if SIRT1 is directly involved in keeping our neurons in tip-top shape following caloric restriction, then maybe it’s possible to take caloric restriction out of the equation and deal with SIRT1 directly. And that’s exactly what Gräff and colleagues did.

Instead of undergoing caloric restriction, the neurodegeneration-prone mice ingested a SIRT1-activating compound (STAC) and still reaped the anti-aging benefits of caloric restriction without the severe reduction in calorie consumption. Both synapse density and memory scores were improved in the STAC group compared to the control group that did not receive STAC. Bring on the feasts!

While other factors besides caloric restriction, such as diet composition and exercise, likely impact longevity, I’ll play the odds and ask for some STAC-laced sprinkles on my next ice cream sundae!

Gräff J., Kahn M., Samiei A., Gao J., Ota K.T., Rei D. & Tsai L.H. (2013). A Dietary Regimen of Caloric Restriction or Pharmacological Activation of SIRT1 to Delay the Onset of Neurodegeneration. Journal of Neuroscience, 33(21): 8951–8960.