On March 8th, 2018 we learned about

Handwriting’s complicated connections to how well students perform in school

“Daddy, look, I can write my name!”

It was a triumphant moment for my third grader, not because she finally figured out how to spell her name, but because she’d learned to write it in cursive. It clearly meant a lot to her, and it helped me rationalize why kids are still taught to write in a way they’d probably be barred from using later on, at least if they get teachers like the ones I had in high school. With the added need to learn to type in today’s world, worrying about handwriting seems a bit outdated, but there’s some evidence that it’s more useful than I’d imagined. Even hand-written printing may help kids develop both motor and academic skills in an era of buttons and touch screens.

How penmanship helps expose problems

Older studies have linked handwriting to kids’ academic success, although it wasn’t entirely clear if snappy penmanship actually promoted good grades, or just correlated with them. While kindergartners with better handwriting were found to have higher math and reading scores in second grade, it was possible that both skills were the result of some other factor that promoted academic success.

Other studies have since tried to pick apart how better fine motor skills might improve one’s math and writing ability. Researchers have started to find that handwriting requires more than just motor skills, as language comprehension plays a role in how we use a pen as well. For instance, children with developmental coordination disorder (“dyspraxia”) write fewer words per minute than their peers, even though there’s no sign of motor impairment in the way they wield a pencil. When it comes to writing words, though, these children often pause in the middle of a word, leading to words that are poorly formed and poorly spelled. This doesn’t show a definitive cause and effect, but it does reveal another aspect of how the motor control needed to write can influence the expression of language.

Screen time seems safe

While most schools still expect kids to become comfortable with a pencil, there is growing concern that kids’ use of touch screens will stunt their handwriting skills, hurting their academic skills in the process. More data needs to be collected on this topic, but preliminary studies are finding no cause for alarm. If anything, young kids who spent time poking, scrolling and swiping at screens actually hit fine motor milestones earlier than expected. This isn’t to say that more screen time will make for increasingly superior students, but poking at on-screen targets might be a way for children to practice their coordination before they’ve learned their ABCs.

Source: We can't say if touchscreens are impacting children's handwriting—in fact, it may be quite the opposite by Melissa Prunty & Emma Sumner, The Conversation

On March 1st, 2018 we learned about

Investigating how our brains determine when our bodies should have a drink

Your skin may seem dry, your mouth may be parched, but the experience of thirst is all in your brain. To be more precise, it’s in your ‘thirst center,’ a bundle of brain cells found in your hypothalamus. These neurons are tasked with telling an animal when to get a drink and, just as importantly, when to stop sipping. Weirdly, that second command arrives well before the rest of the body can benefit from any new fluids, which has made untangling the exact functionality of this system a surprisingly tricky problem.

When the body wants water

The thirst center isn’t making us go for a drink at random. The hypothalamus takes in information about a lot of our physiology, including blood pressure, sodium concentration and more. When these systems are out of balance and more water is needed, one response is to release more of the hormone vasopressin from the pituitary gland, which then makes the kidneys reclaim water that would otherwise be lost as urine. The second, and probably preferable, response is for the thirst center to send us looking for a drink.

As obvious as that response may seem, it’s taken years for researchers to isolate which cells are actually in charge of feeling thirsty. To identify exactly which parts of the brain were tied to thirst, researchers turned to optogenetics, a technique in which neurons are engineered to be activated by light delivered through implanted fiber optics. In this case, mice had their brain cells monitored and stimulated until they either became thirsty, or had their thirst artificially quenched without needing a drink.

In this process, it became clear that the amount of water an animal drinks is entirely up to the brain. Water won’t do a body much good until 10 to 15 minutes after it’s been ingested, whereas the mice’s thirst centers stopped them from drinking after only a minute each time. Somehow the brain felt it had had enough liquid, and that it was time to stop drinking.

Drinking disorders

This may sound pointlessly academic, but that’s only because it works so well for most of us. Whatever mechanism the thirst center may be using to measure our water intake, it’s generally helping us obtain a safe amount of water. If we ignore the thirst center, as some marathon runners have done, or the mechanism is damaged, as with people with psychogenic polydipsia, the results can be dangerous or even fatal. Cells swell and burst like balloons as they try to absorb the incoming water, a fate particularly damaging when it occurs in the brain. On the flip side, as people age they often have a weaker thirst response, and may forget to hydrate on a regular basis.

At this point, the investigation into how we feel thirst is actually moving out of the brain. Researchers are interested in muscle cells in the throat that help with swallowing, as there’s a chance that our sip-stopping thirst center may be gauging our water intake based on how quickly we swallow.

Source: Still Thirsty? It's Up To Your Brain, Not Your Body by Jon Hamilton, The Salt

On February 25th, 2018 we learned about

Younger kids are more likely to need help managing their prospective memory

When my third-grader gets home from school each day, she nearly always bursts in the door, dropping her bag and jacket on the floor. She leaves the front door open, which then reliably causes my blood pressure to tick up as I gripe at her about this little afternoon routine. For all the requests I’ve made that she close the door, hang up her jacket, and maybe even empty her lunch box, she still somehow claims to have forgotten those tasks, which seems absurd in my mind. How could she forget after all my nagging? Apparently quite easily, according to research from The University of Queensland.

The mechanism that, to this annoyed parent, seems deficient in these scenarios is known as ‘prospective memory.’ This is the memory that helps us keep track of mundane, recurring tasks, and it’s actually quite unreliable in just about everyone. Adults forget to take out the trash or turn off the lights too, but we’ve simply developed strategies to deal with those lapses in memory. Sometimes that means not having anyone around to chastise us for our mistakes; more often it means expecting that we’ll forget something and setting up reminders in advance. To put it another way, kids’ prospective memory is no worse than adults’; they just don’t have the to-do lists and calendar reminders that adults do.

Learning their limitations

To test how kids manage their prospective memory, children from age seven to 13 were asked to play a video game that required them to remember one to three things in order to be successful. They were also given the option to set reminders for themselves while playing, theoretically relieving some of their burden. While kids of all ages understood that remembering three things would be trickier than one, the younger kids didn’t seem to appreciate how a reminder would help. Only children over nine years old ever set the reminders, meaning all the younger kids were asking a lot more of their memory.

These results fit with other studies of children’s memories. While even a three-year-old can understand that remembering one thing is easier than remembering ten, younger kids don’t seem to have a sense of how to work within their cognitive boundaries. Only after age nine are kids more likely to pay extra attention to more difficult sets of information, or take advantage of tools that might reduce the effort needed in the first place.

No more nagging

As many adults have figured out, the easiest way to keep track of things is to work around the limitations of your prospective memory. Being repeatedly told to remember something probably won’t help as much as simply writing a note for yourself. The same goes for, say, a parent repeatedly nagging a child to remember some particular chore: all that nagging isn’t going to make the kid’s prospective memory any more effective.

Basically, kids can benefit from the same tools and tricks adults use. If a child isn’t quite ready for a phone beeping away with reminders, building other memory aids into their day, like a chore checklist on their bedroom door or a lunchbox placed near their bag, is a good way to put tasks where kids will catch them. Over time, these behaviors will hopefully become habitual enough that they don’t slip through the cracks. Or at least the kid will grow up enough to start writing their own reminders on their hands like the rest of us.

Source: Parents, stop nagging kids not to forget – set visual cues instead by Adam Bulley, Jonathan Redshaw & Sam Gilbert, The Conversation

On February 13th, 2018 we learned about

The way we sense sound is strangely tied to our sight

If you’re having a hard time hearing something, try looking more carefully. Despite what every kid learns by kindergarten, there’s mounting evidence that our hearing isn’t just up to our ears. Multiple studies are finding aspects of hearing that are shaped at least partially by our eyes, and that the two sensory systems are actually intertwined in our brains. Of course, that doesn’t mean it’s all perfectly clear, as some of the dynamics in this relationship remain a bit confusing, starting with our eardrums.

Ears move with eyes

The tiniest bones in your body are in your ear. While you probably can’t wiggle your ears as well as some other animals, you do still have some control over this anatomy. Usually, the three bones, or ossicles, in the middle ear move to help or hinder sounds passing through your ear, modulating their volume, and their movement is seen as a reaction to outside sounds. However, researchers using very sensitive microphones in a quiet space were able to listen to the vibrations created by a person’s ossicles, and found that they move in tandem with the eyes. In fact, they seem to actually start vibrating in the ear before eye movement can be detected, suggesting that both bits of anatomy are being triggered by the same motor functions in the brain.

The ossicles’ movement doesn’t appear to be completely arbitrary either. The bones will move inward or outward according to the direction the eyes are looking. They continue this vibration until the eyes stop. It’s unclear at this point why this occurs, although it’s thought that our brains may be merging the visual and auditory information gathered during this activity to build a more cohesive sense of the space around us.

Looking to listen

This kind of cross-sensory augmentation was a bit clearer in a second study that linked visual cues to sound clarity. Previous work had found that around 25 percent of the auditory cortex in the brain is actually responsive to light, despite the fact that light is… generally silent. It now appears that this sensory overlap is a way for our brains to pick which sounds are most deserving of our attention.

One experiment monitored the brain activity of ferrets while they were presented with visual and auditory stimuli. To simulate a noisy environment, the ferrets were made to listen to multiple sounds layered on top of each other. While the ferrets listened, a light would flash at different rhythms, sometimes syncing up with one particular set of sounds or another. This synchronization was apparently picked up by the ferrets’ brains, which then started filtering the competing noises to prioritize that particular sound. When the visuals matched a sound’s rhythm, that sound was processed in a way that made it easier to perceive.

While you’ve probably never noticed your eardrums wiggling, you may have noticed this second phenomenon at a party. By watching someone’s lips as they spoke, you were seeing visual information that was synced to the timing of their voice. Your brain could then prioritize sounds on that same rhythm, helping you make sense of what was said. However, the ferrets demonstrate that this wasn’t necessarily because you were lip-reading, as the furry critters experienced a similar effect without anything resembling human speech. Instead, it seems that many of the relationships between our hearing and our sight likely evolved in a distant ancestor, helping them make sense of the clattering sounds to their left, their right, or right in front of them.

Source: Visual cues amplify sound by University College London, Medical Xpress

On January 30th, 2018 we learned about

Thinking with our body and getting hungry with our brain

Cognition occurs in the brain. Millions of specialized neurons send signals to each other, processing stimuli and sending out new commands to our bodies that help us understand and interact with the world. Of course, this system seems to be overridden when we’re feeling particularly hungry, in which case a lot of rational thinking goes out the window until we satisfy our tummy again. While the neurons in our digestive tract aren’t doing our reasoning for us, researchers studying the relationship between thought and physiology are finding some interesting dynamics that may help explain how we might sometimes find ourselves ‘thinking with our stomach.’

Figuring things out with physiology

As one of the larger-brained animals on the planet, humans generally deride the idea of being guided by hunger or other biological needs. However, researchers from the University of Exeter argue that a complex, calorie-hungry brain isn’t necessarily every species’ best option. Many animals do quite well using things like hunger as a sort of stand-in for memory in the brain. If an animal feels especially hungry, it doesn’t need to do a lot of complex analysis to know that its needs aren’t being met in its current environment, and so either its location or its behavior needs to change. In this model, physiology can step in to nudge animals toward seemingly smart choices, reducing the amount of calorie-hungry gray matter an animal needs to survive.

Stressed cells seek sugar

This isn’t to say that our brain plays no role in making choices in our lives. Indeed, tests with mice have shown that brain activity can effectively override physiological needs under the right conditions. Researchers artificially stimulated cells in the mice’s paraventricular hypothalamus, a region associated with social stress. When offered foods rich in either fat or sugar, the mice overwhelmingly binged on the carbohydrates, well beyond any dietary need for those extra calories. Given the similarities between human and mouse brains, this is likely tied to the concept of ‘stress eating,’ where we load up on foods even though we don’t necessarily need them. It’s a good reminder that neither our stomach nor our brain operates in isolation, and that what may feel like a choice or craving is probably the result of interactions between multiple systems in our body.

Source: Gut instinct makes animals appear clever by University of Exeter, Phys.org

On January 17th, 2018 we learned about

Identifying the brain cells that mammals use to make mental maps of where their peers are moving

You can, presumably, walk through your house without needing to concentrate on where you are. You’ll probably make a note of a chair out of place, or shoes left on the floor, but you won’t have to think much about your relationship to the space around you thanks to your hippocampus. This small structure in the center of your brain has been found to use neurons known as “place cells” to essentially build a map of wherever you are. In addition to keeping track of your position in your immediate surroundings, researchers are now finding that the hippocampus is also tracking where social peers are moving, as well as the trajectories of other objects.

To see these varying layers of functionality in action, researchers monitored the brain activity of both rats and Egyptian fruit bats while they moved through a closed environment. They compared which neurons in the hippocampus were active when the primary animal was moving to those active when it watched a peer moving through the same space. For rats, this was easier, as the primary rat could watch from an enclosed space while its peer explored a simple T-shaped maze. The bats initially wanted to fly right alongside their peers, though, so researchers had to rely on social hierarchies a bit. They allowed an alpha male in the group to fly between designated waypoints first as a “teacher,” while monitoring the brain activity in a subservient “student” bat that had to wait its turn.

Brain cells for spatial awareness

With lightweight “neural loggers” in place, researchers were able to precisely observe which neurons in the hippocampus were needed for each phase of each experiment. With both rats and bats, the place cells seemed to be sorted into a few general groupings. Some were active only when the primary animal was traveling through the maze, or flying to the designated perches. Other place cells seemed to be designated for tracking the peer animals’ travel, although there was some overlap in these groups, possibly for when the primary animal arrived in a location previously occupied by its peer.

Finally, there were cells that were activated by watching non-animals in the space. The bats, for instance, were shown bat-sized balls that were moved through the same course they’d flown through themselves. This movement was still mapped to place cells in the bat’s hippocampus, but they didn’t show the same overlap as before. There seemed to be a bigger distinction being made for the ball, although at this point it’s hard to be sure if this was because the bat recognized that the ball wasn’t alive, or if it was simply not a bat.

More than a simple map

While the bundles of place cells weren’t described identically in both studies, they both point to similar categorization in rat and bat hippocampuses. The place cells for the primary animal partially overlapped with the cells activated by watching a peer, and both were separate from those used for inanimate objects on the same map. Researchers suspect that the overlap with living peers may be informed by their social relationship to the primary bat or rat. This would mean that the hippocampus isn’t only charting the physical world, but the animal’s social relationships as well. If this is true, a bat watching a rat would map the rat’s location more like it mapped the ball: as a discrete group, instead of overlapping with the bat’s own location.

The functionality of these animals’ hippocampuses isn’t only relevant to how a rat finds its way through its burrow. The similarity between these two species’ brains very likely originates from a common mammal ancestor, which means that our own hippocampus is mapping the world in the same way. Even if we don’t normally need to think consciously about these relationships when walking through our house, knowing how our brains deal with the world will help us cope when these functions break down due to aging, disease or other damage.

Source: ‘Bat-nav’ reveals how the brain tracks other animals by Alison Abbott, Nature

On January 11th, 2018 we learned about

Mammals keep their offspring on their left side so they can more easily identify their emotions

By the time my son was two years old, my left tricep was in great shape. Neither of my kids are petite, and they both loved to be carried around, giving my arms and shoulders plenty of exercise. Being right-handed, I tried to keep the kids perched on my left arm most of the time, leaving my “good” hand to interact with the rest of the world. There may have been a more tender side to this preference though, as researchers are building the case that the left-arm bias may have actually been about accommodating the right side of my brain, not my hand.

Looking closely from the left

According to this study, the key reason to hold a baby in your left arm is so their face will be more clearly visible to your left eye. That eye is connected to the right hemisphere of your brain, which is more directly associated with processing spatial and emotional information. You probably don’t notice that gap in your day-to-day activity, but then most of the stimuli you look at aren’t being clutched right next to your face (and pulling on your glasses). So by picking up small changes in your baby’s emotional state more easily, a parent can either head off trouble or just provide emotional feedback a bit faster.

Now even if parents have been proven to more accurately assess their babies’ moods when relying on their left eye, that doesn’t prove that emotional communication is the root cause behind this pattern. Maybe our right hemispheres became attuned to facial expressions because so many of us were already right-handed, and needed to access the ancient equivalent of a cell-phone all day? To test this, researchers looked at other mammalian parents to see if they had a similar preference in how they viewed their kids. Non-primates in particular were of interest, as their forelimbs should make less of a difference in how they interact with their offspring.

Gauging the maternal gaze of bats and walruses

The two other mammal species didn’t have much in common, except for the fact that they were known to spend some time looking at the faces of their offspring. Walruses and flying fox fruit bats were both observed interacting with their young, and as expected, they were more likely to orient themselves to keep their kin in their left visual field. In the case of the walruses, researchers noted that interactions from the animals’ left side weren’t only more common, but they lasted longer than engagement from the right.

In all of these cases, this pattern turned up in the babies as well. Offspring tended to approach their parents from the left, and when facing a parent they could watch with their left eyes, just like their mothers.

Looking and leaning when kissing

Looking back to humans, a separate study may have inadvertently found a similar connection between adult humans. Adult couples were asked to kiss their spouse, and then record their descriptions of how the kiss occurred. Among other trends, researchers found that most people prefer to avoid mirroring their partner’s head position, and will tilt their head in the opposite direction, usually to each person’s right.

This was attributed to the handedness of each participant, and to who initiated the kiss (usually men in heterosexual couples). However, the interplay between handedness and emotional observation seems relevant here, because tilting your head to the right means you’re giving your left eye the best view of your partner’s face. It seems fair to say that reading the emotional state of the person you’re about to kiss would be important, giving people all the more reason to lean right as they smooch. Unfortunately, this study didn’t specifically mention how often folks were kissing with their eyes open or closed.

Source: Mammals prefer to cradle babies on the left, study demonstrates by Nicola Davis, The Guardian

On December 18th, 2017 we learned about

Liquefied brains let researchers count and compare the number of neurons in mammals’ cerebral cortices

It’s hard to put a measure on intelligence. Humans compare ourselves with things like IQ tests, but they’re known to be an imperfect metric at best. Comparisons between species get even harder, since what’s clever for one animal may be useless for another. Nonetheless, scientists have been looking for ways to understand and compare brain functionality for years, considering brain sizes, body sizes and more. Researchers from the University of Wyoming have taken a new approach to this question, putting a concrete count on the number of neurons, or brain cells, an animal has in its head. Since connections between neurons are critical to cognition and perception, this may be a fair reference point for assessing what an animal’s brain can handle.

Between a brain’s complexity and the size of individual cells, counting every neuron wasn’t really possible. Every fold and crevice in a brain increases its effective surface area, making more room to pack in more cells. To simplify the counting procedure, researchers essentially eliminated that barrier, liquefying brains before counting the neurons. They seeded the brain slurry with a marker molecule that attached only to neurons, then counted those markers to get a tally. There was also a comparison between the total number of brain cells, and the number of cells in the animal’s cerebral cortex, which is the outer layer of tissue that handles most complex thinking and problem solving.

How mammal brains measure up

To keep the scope of the study manageable, researchers only looked at mammal brains. They did compare the brain cell counts between predator and prey species though, as they expected predators to have more neurons thanks to the cognitive demands of capturing their meals, rather than simply grazing or browsing. This prediction didn’t really hold true, as species had surprising variations in brain cell density, raising a number of questions about what those extra neurons do for each species.

For the most part, bigger brains had more neurons, but not always. A greater kudu (Tragelaphus strepsiceros), for instance, has around 307 grams of gray matter and nearly 5 billion brain cells. A ferret, on the other hand, has only 5.4 grams of brain tissue containing 404 million cells. Aside from the predator/prey relationship, this all seems fairly intuitive. The weirder findings concerned bears, dogs and raccoons. Brown bears (Ursus arctos) have hefty brains, with over nine billion cells overall, but their cerebral cortex held a paltry 251 million neurons. That’s fewer neurons than the cortices of raccoons, hyenas, lions and, amazingly, golden retriever dogs. Playing fetch may be more complex than we appreciate, as golden retrievers somehow had over twice as many cortical neurons as a bear. Fortunately, the two biggest outliers in this study matched expectations, as elephants and humans both eclipsed the other creatures measured, with over 5 billion and 16 billion neurons in their cerebral cortices, respectively.

Bigger, better, or just specialized?

So is your dog likely to outsmart a lion? Not necessarily. Researchers stress that these numbers don’t represent IQ points for each species. The raccoon’s high neuronal density may be due to its sensitive tactile abilities, which allow it to feel and manipulate its environment with its hands. Brown bear brains might be explained by a need for efficiency: since the bears hibernate for six months each year, they save energy by not feeding as many calorie-hungry neurons year-round. At the very least, this technique will give future studies a new metric to compare against, seemingly providing a more detailed measurement than brain weight and volume alone.

Source:

On December 12th, 2017 we learned about

Even as an adult, language is easier to listen to in your right ear

The next time you have a bad connection for a phone call, make sure to listen to the call with your right ear. Your phone won’t know the difference, but the sound of the caller’s voice will be easier to understand because of the way our ears are connected to our brains. Most language processing takes place in our brain’s left hemisphere, and that side of the brain is more directly connected to your right ear (along with everything else on the right side of your body). While both ears can theoretically capture sounds equally well, your brain will do a better job of parsing speech if you give it this small assist, especially when there are other competing sounds.

You probably don’t notice this right-ear bias too often in your daily life, because most adults can basically compensate for it. Children’s brains, on the other hand, are still learning the complicated task of sorting and filtering all the different sounds that may or may not be relevant to speech, so the right-ear bias is more pronounced. By age 13, most kids seem to have this auditory puzzle sorted out, but researchers wanted to check whether the bias is truly eliminated in adult hearing.

Finding the limits of parsing language

To find out how well a person can juggle sounds in their right and left ears, researchers employed what’s known as a dichotic listening test. Wearing headphones, test participants would hear different verbal statements in each ear simultaneously, such as “I have a red car” on the right and “You eat ice cream” on the left. They then had to report which side heard what, and ideally repeat both phrases back entirely. As expected, most adults had little trouble with this task, at least up to a point.

Hearing these sounds wasn’t any special challenge for people’s ears, since one word is as tough to hear as another. Since the point of failure would likely be in people’s brains, researchers tried increasing the number of words or list items that people needed to keep track of at a time, putting pressure on their working memory. Once each ear had more than six items to keep track of, the right-ear bias reemerged. As people’s memory demanded more attention, the efficient connection between the right ear and language processing boosted people’s performance by an average of eight percent.

So the efficiency of listening to language with your right ear isn’t really lost after age 13; most of the time it just isn’t needed. Earlier tests likely missed this because they hadn’t demanded very much of people’s brains. When you’re in a challenging listening environment, or feel like you’re juggling a lot of stimuli at once, save your brain some trouble and try to listen in with your more reliable right ear.

Source: Want to listen better? Lend a right ear, Science Daily

On December 11th, 2017 we learned about

Mental complexity makes changing your actions slower than changing your mind

A sprinter can begin moving just 150 milliseconds after hearing a starting pistol. If they’ve made an error though and started early, a new cascade of commands is needed to stop their feet from going further. There’s very likely a moment where the runner has come to the decision to stop running, although their legs don’t seem to have gotten the message yet, carrying them further forward for a few steps. Some of that is due to balance and inertia, but researchers from Johns Hopkins University have found that there’s plenty of mental inertia to overcome as well.

Steps needed to stop

To put it briefly, changing your mind is complicated. A decision isn’t a single thing in your brain; scans of human and monkey brains have identified at least 11 brain areas that are involved in the process of changing your mind. This sort of mental bureaucracy isn’t just needed for complex tasks that require coordination of the whole body, like sprinting or dancing. Even if you’re simply trying to rein in your gaze while looking at a computer screen, at least three brain areas need to issue new commands while communicating with eight others.

The latter scenario was obviously a little easier to test in a laboratory setting. Test participants were asked to look at the central part of a screen while waiting for a target to appear. The target never appeared in the center, and sometimes people were allowed to follow the instinct to glance at that new stimulus. Other times, the appearance of the target came with a reminder to keep their eyes in the center of the screen, requiring a quick mental change of course. By measuring brain activity, researchers could track which brain areas were then required to realize the “mistake” and issue new commands to the eyes.

The timing of ‘too late’

The sequence that emerged revealed a critical moment about one-tenth of a second after participants were reminded to look at the center of the screen. By that time, the command to look at the distracting target was being sent to the eyes, and could no longer be aborted by new thoughts forming in the brain. This would be the brief window where participants would be aware of their error but unable to do anything about it. It’s brief in the grand scheme of things, but just long enough to reveal an incongruity in how our brains interact with our bodies.

Knowing this may be useful for more than just informing us about our next moment of regret. Researchers suspect that disruptions in the process to change one’s course of action may have serious consequences, as it likely plays a role in self-preservation, risk-taking, and possibly drug addiction. A separate study of methamphetamine users found that their cravings extended the amount of time needed to change their behavior once an action had been started. This isn’t to say that this mental process explains drug dependencies, but it may help explain some of the tertiary effects of drug use that can make a bad situation even worse.

Source: Why Your Brain Has Trouble Bailing Out Of A Bad Plan by Jon Hamilton, NPR Shots