On February 13th, 2018 we learned about

The way we sense sound is strangely tied to our sight

If you’re having a hard time hearing something, try looking more carefully. Despite what every kid learns by kindergarten, there’s mounting evidence that our hearing isn’t just up to our ears. Multiple studies are finding aspects of hearing that are shaped at least partially by our eyes, and that the two sensory systems are actually intertwined in our brains. Of course, that doesn’t mean it’s all perfectly clear, as some of the dynamics in this relationship remain a bit confusing, starting with our eardrums.

Ears move with eyes

The tiniest bones in your body are in your ear. While you probably can’t wiggle your ears as well as some other animals, you do still have some control over this anatomy. Usually, the three bones, or ossicles, in the middle ear move to help or hinder sounds passing through your ear, modulating their volume, and their movement is seen as a reaction to outside sounds. However, researchers using very sensitive microphones in a quiet space were able to listen to the vibrations created by a person’s ossicles, and found that they move in tandem with the eyes. In fact, they seem to actually start vibrating in the ear before eye movement can be detected, suggesting that both bits of anatomy are being triggered by the same motor functions in the brain.

The ossicles’ movement doesn’t appear to be completely arbitrary either. The bones will move inward or outward according to the direction the eyes are looking. They continue this vibration until the eyes stop. It’s unclear at this point why this occurs, although it’s thought that our brains may be merging the visual and auditory information gathered during this activity to build a more cohesive sense of the space around us.

Looking to listen

This kind of cross-sensory augmentation was a bit clearer in a second study that linked visual cues to sound clarity. Previous work had found that around 25 percent of the auditory cortex in the brain is actually responsive to light, despite the fact that light is… generally silent. It now appears that this sensory overlap is a way for our brains to pick which sounds are most deserving of our attention.

An experiment watched the brain activity of ferrets while they were presented with visual and auditory stimuli. Simulating a noisy environment, the ferrets were made to listen to multiple sounds layered on top of each other. While the ferrets listened, a light would flash at different rhythms, sometimes syncing up with one particular set of sounds or another. This synchronization was apparently picked up by the ferrets’ brains, which then started filtering the competing noises to prioritize that particular sound. When the visuals somehow matched the sound, that sound was processed to be easier to perceive.

While you’ve probably never noticed your eardrums wiggling, you may have noticed this second phenomenon at a party. By watching someone’s lips as they spoke, you were seeing visual information that was synced to the timing of their voice. Your brain could then prioritize sounds on that same rhythm, helping you make sense of what was said. However, the ferrets demonstrate that this wasn’t necessarily because you were lip-reading, as the furry critters experience a similar effect without anything resembling human speech. Instead, it seems that many of the relationships between our hearing and our sight likely evolved in a distant ancestor, helping them make sense of the clattering sounds to their left, right, or right in front of them.

Source: Visual cues amplify sound by University College London, Medical Xpress

On January 30th, 2018 we learned about

Thinking with our body and getting hungry with our brain

Cognition occurs in the brain. Millions of specialized neurons send signals to each other, processing stimuli and sending out new commands to our bodies that help us understand and interact with the world. Of course, this system seems to be overridden when we’re feeling particularly hungry, in which case a lot of rational thinking goes out the window until we satisfy our tummy again. While our digestive tract obviously isn’t doing our reasoning for us, researchers studying the relationship between thought and physiology are finding some interesting dynamics that may help explain how we sometimes find ourselves ‘thinking with our stomach.’

Figuring things out with physiology

As one of the larger-brained animals on the planet, humans generally deride the idea of being guided by hunger or other biological needs. However, researchers from the University of Exeter argue that a complex, calorie-hungry brain isn’t necessarily every species’ best option. Many animals do quite well using things like hunger as a sort of analog for memory in the brain. If an animal feels especially hungry, it doesn’t need to do a lot of complex analysis to know that its needs aren’t being met in its current environment, and so either its location or behavior needs to change. In this model, physiology can step in to motivate animals to make seemingly smart choices, reducing the amount of calorie-hungry gray matter an animal needs to survive.
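The hunger-as-memory idea boils down to a remarkably simple decision rule: instead of remembering and evaluating past foraging success, the animal consults only its current energy state. Here is a deliberately minimal toy sketch of that rule; the threshold and names are invented for illustration, not taken from the Exeter study:

```python
def next_move(hunger_level, threshold=0.7):
    """Decide behavior from physiology alone, with no stored memories.

    A high hunger level implies the current patch isn't meeting the
    animal's needs, so it should move on; otherwise, stay put.
    """
    return "relocate" if hunger_level > threshold else "stay"

print(next_move(0.9))  # "relocate": needs unmet, so change location
print(next_move(0.3))  # "stay": the current environment is working
```

The point of the sketch is how little machinery it needs: no map, no history, just one physiological signal standing in for a whole record of past outcomes.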

Stressed cells seek sugar

This isn’t to say that our brain plays no role in making choices in our lives. Indeed, tests with mice have shown that brain activity may effectively override physiological needs under the right conditions. Researchers artificially stimulated cells in the mice’s paraventricular hypothalamus that are associated with social stress. When offered foods rich in either fat or sugar, the mice overwhelmingly binged on carbohydrates, beyond any dietary need for that much starch. Given the similarities between human and mouse brains, this is likely tied to the concept of ‘stress eating,’ where we load up on foods even though we don’t necessarily need them. It’s a good reminder that neither our stomach nor our brain operates in isolation, and that what may feel like a choice or craving is probably the result of interactions between multiple systems in our body.

Source: Gut instinct makes animals appear clever by University of Exeter, Phys.org

On January 17th, 2018 we learned about

Identifying the brain cells that mammals use to make mental maps of where their peers are moving

You can, presumably, walk through your house without needing to concentrate on where you are. You’ll probably make a note of a chair out of place, or shoes left on the floor, but you won’t have to think much about your relationship to the space around you, thanks to your hippocampus. This small structure in the center of your brain has been found to use neurons known as “place cells” to essentially build a map of wherever you are. Beyond keeping track of your position in your immediate surroundings, researchers are now finding that the hippocampus also tracks where social peers are moving, as well as the trajectories of other objects.

To see these varying layers of functionality in action, researchers monitored the brain activity of both rats and Egyptian fruit bats while they moved through a closed environment. They compared which neurons in the hippocampus were active when the primary animal was moving to when they watched a peer moving through the same space. For rats, this was easier, as the primary rat could watch from a closed space while their peer explored a simple T-shaped maze. The bats initially wanted to fly right alongside their peers though, so researchers had to rely on social hierarchies a bit. They allowed an alpha male in the group to fly between designated waypoints first as a “teacher,” while monitoring the brain activity in a subservient “student” bat who had to wait its turn.

Brain cells for spatial awareness

With lightweight “neural loggers” in place, researchers were able to precisely observe which neurons in the hippocampus were needed for each phase of each experiment. With both rats and bats, the place cells seemed to be sorted into a few general groupings. Some were active only when the primary animal was traveling through the maze, or flying to the designated perches. Other place cells seemed to be designated for tracking the peer animals’ travel, although there was some overlap in these groups, possibly for when the primary animal arrived in a location previously occupied by its peer.

Finally, there were cells that were activated by watching non-animals in the space. The bats, for instance, were shown bat-sized balls that were moved through the same course they’d flown through themselves. This movement was still mapped to place cells in the bat’s hippocampus, but they didn’t show the same overlap as before. There seemed to be a bigger distinction being made for the ball, although at this point it’s hard to be sure if this was because the bat recognized that the ball wasn’t alive, or if it was simply not a bat.

More than a simple map

While the bundles of place cells weren’t described identically in both studies, they both point to similar categorization in rat and bat hippocampuses. The place cells for the primary animal partially overlapped with the cells activated by watching a peer, and these were kept separate from inanimate objects on the same map. Researchers suspect that the overlap with living peers may be informed by their social relationship to the primary bat or rat. This would mean that the hippocampus isn’t only charting the physical world, but the animal’s social relationships as well. If this is true, a bat watching a rat would map the rat’s location more like it mapped the ball: as a discrete group, instead of overlapping with the bat’s own location.

The functionality of these animals’ hippocampuses isn’t only relevant to how a rat finds its way through its burrow. The similarity between these two species’ brains is very likely to originate from a common mammal ancestor, which means that our own hippocampus is mapping the world in the same way. Even if we don’t normally need to consciously think about these relationships when walking through our house, knowing how our brains deal with the world will help us cope when these functions break down due to aging, disease or other damage.

Source: ‘Bat-nav’ reveals how the brain tracks other animals by Alison Abbott, Nature

On January 11th, 2018 we learned about

Mammals keep our offspring on our left side so we can more easily identify their emotions

By the time my son was two years old, my left tricep was in great shape. Neither of my kids is petite, and they both loved to be carried around, giving my arms and shoulders plenty of exercise. Being right-handed, I tried to keep the kids perched on my left arm most of the time, leaving my “good” hand free to interact with the rest of the world. There may have been a more tender side to this preference though, as researchers are building the case that the left-arm bias may have actually been about accommodating the right side of my brain, not my hand.

Looking closely from the left

According to this study, the key reason to hold a baby in your left arm is so their face will be more clearly visible to your left eye. That eye feeds the right hemisphere of your brain, which is more directly associated with processing spatial and emotional information. You probably don’t notice that gap in your day-to-day activity, but most of the stimuli you’re looking at isn’t being clutched right next to your face (and pulling on your glasses). By picking up small changes in a baby’s emotional state more easily, a parent can either head off trouble or just provide emotional feedback a bit faster.

Even if parents demonstrably assess their babies’ moods more accurately when relying on their left eye, that doesn’t prove that emotional communication is the root cause behind this pattern. Maybe our right hemispheres became attuned to facial expressions because so many of us were already right-handed, and needed to access the ancient equivalent of a cell phone all day? To test this, researchers looked at other mammalian parents to see if they had a similar preference in how they viewed their kids. Non-primates in particular were of interest, as their forelimbs should make less of a difference in how they interact with their offspring.

Gauging the maternal gaze of bats and walruses

The two other mammal species didn’t have much in common, except for the fact that they were known to spend some time looking at the faces of their offspring. Walruses and flying fox fruit bats were both observed interacting with their young, and as expected, they were more likely to orient themselves to keep their kin in their left visual field. In the case of the walruses, researchers noted that interactions from the animals’ left side weren’t only more common, but they lasted longer than engagement from the right.

In all of these cases, the pattern turned up in the babies as well. Offspring tended to approach their parents from the left, and when facing a parent, they could watch with their left eyes, just like their mothers.

Looking and leaning when kissing

Looking back to humans, a separate study may have inadvertently found a similar connection between adult humans. Adult couples were asked to kiss their spouse, and then record their descriptions of how the kiss occurred. Among other trends, researchers found that most people prefer to avoid mirroring their partner’s head position, and will tilt their head in the opposite direction, usually to each person’s right.

This was attributed to the handedness of each participant, and to who initiated the kiss (usually men in heterosexual couples). However, the tension between handedness and emotional observation seems relevant here, because tilting your head to the right gives your left eye the best view of your partner’s face. It seems fair to say that reading the emotional state of the person you’re about to kiss would be important, giving people all the more reason to lean right as they smooch. Unfortunately, this study didn’t specifically mention how often folks were kissing with their eyes open or closed.

Source: Mammals prefer to cradle babies on the left, study demonstrates by Nicola Davis, The Guardian

On December 18th, 2017 we learned about

Liquefied brains let researchers count and compare the number of neurons in mammals’ cerebral cortices

It’s hard to put a measure on intelligence. Humans compare ourselves with things like IQ tests, but those are known to be an imperfect metric at best. Comparisons between species get even harder, since what’s clever for one animal may be useless for another. Nonetheless, scientists have been looking for ways to understand and compare brain functionality for years, considering brain sizes, body sizes and more. Researchers from the University of Wyoming have taken a new approach to this question, tallying the number of neurons, or brain cells, an animal has in its head. Since connections between neurons are critical to cognition and perception, this count may be a fair reference point for assessing what an animal’s brain can handle.

Between a brain’s complexity and the size of individual cells, counting every neuron wasn’t really possible. Every fold and crevice in a brain increases its effective surface area, making more room to pack in more cells. To simplify the counting procedure, researchers essentially eliminated that barrier, liquefying brains before counting the neurons. They seeded the brain slurry with a marker molecule that attached only to neurons, then counted those markers to get a tally. There was also a comparison between the total number of brain cells, and the number of cells in the animal’s cerebral cortex, which is the outer layer of tissue that handles most complex thinking and problem solving.
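Once the suspension is evenly mixed, the tally itself is just scaling arithmetic: count the marked neurons in a small, well-mixed sample, then multiply up by the total volume. A rough sketch of that calculation, using hypothetical numbers rather than figures from the study:

```python
def estimate_neurons(total_volume_ml, sample_volume_ml, neurons_in_sample):
    """Scale a neuron count from a small sample up to the whole suspension.

    Assumes the liquefied brain is mixed evenly, so every sample
    holds the same density of marked neurons.
    """
    density = neurons_in_sample / sample_volume_ml  # neurons per milliliter
    return density * total_volume_ml

# Hypothetical: 40 ml of suspension, 1,600 marked neurons counted in 0.01 ml
print(estimate_neurons(40.0, 0.01, 1600))  # roughly 6,400,000 neurons
```

The even mixing is what makes the method work: it removes the folds and crevices that otherwise make direct counting impractical, so one small sample stands in for the whole brain.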

How mammal brains measure up

To keep the scope of the study manageable, researchers only looked at mammal brains. They did compare the brain cell counts between predator and prey species though, as they expected predators to have more neurons thanks to the cognitive demands of capturing their meals, rather than simply grazing or browsing. This prediction didn’t really hold true, as species had surprising variations in brain cell density, raising a number of questions about what those extra neurons do for each species.

For the most part, bigger brains had more neurons, but not always. A greater kudu (Tragelaphus strepsiceros), for instance, has around 307 grams of gray matter, and nearly 5 billion brain cells. A ferret, on the other hand, has only 5.4 grams of brain tissue, containing 404 million cells. Aside from the predator/prey relationship, this all seems fairly intuitive. The weirder findings concerned bears, dogs and raccoons. Brown bears (Ursus arctos) have hefty brains, with over nine billion cells overall, but their cerebral cortex had a paltry 251 million neurons. That’s fewer neurons than the cortices of raccoons, hyenas, lions and, amazingly, golden retriever dogs. Playing fetch may be more complex than we appreciate, as golden retrievers somehow had over twice the cortical neurons of a bear. Fortunately, the two biggest outliers in this study matched expectations, as elephants and humans both eclipsed the other creatures measured, with over five billion and 16 billion neurons in their cerebral cortices respectively.
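Those density differences are easy to check by dividing the figures quoted above, giving neurons per gram of brain tissue for the kudu versus the ferret:

```python
# Neurons per gram of brain tissue, computed from the figures quoted above
kudu_density = 5_000_000_000 / 307    # greater kudu: ~5 billion cells, 307 g
ferret_density = 404_000_000 / 5.4    # ferret: 404 million cells, 5.4 g

print(f"kudu:   {kudu_density / 1e6:.1f} million neurons per gram")    # 16.3
print(f"ferret: {ferret_density / 1e6:.1f} million neurons per gram")  # 74.8
```

Gram for gram, the tiny ferret brain packs in more than four times as many neurons as the kudu’s, which is exactly the kind of variation that makes raw brain weight a misleading yardstick.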

Bigger, better, or just specialized?

So is your dog likely to outsmart a lion? Not necessarily. Researchers stress that these numbers don’t represent IQ points for each species. The raccoon’s high neuronal density may be due to its sensitive tactile abilities, which allow it to feel and manipulate its environment with its hands. Brown bear brains might be explained by a need for efficiency: since the bears hibernate for six months each year, they save energy by not feeding as many calorie-hungry neurons year-round. At the very least, this technique will give future studies a new metric to compare against, seemingly providing a more detailed measurement than brain weight and volume alone.


On December 12th, 2017 we learned about

Even as an adult, language is easier to listen to in your right ear

The next time you have a bad connection for a phone call, make sure to listen to the call with your right ear. Your phone won’t know the difference, but the sound of the caller’s voice will be easier to understand because of the way our ears are connected to our brains. Most language processing takes place in our brain’s left hemisphere, and that side of the brain is more directly connected to your right ear (along with everything else on the right side of your body). While both ears can theoretically capture sounds equally well, your brain will do a better job of parsing speech if you give it this small assist, especially when there are other competing sounds.

You probably don’t notice this right-ear bias too often in your daily life, because most adults can basically compensate for it. Children’s brains, on the other hand, are still learning the complicated task of sorting and filtering all the different sounds that may or may not be relevant to speech, and so the right-ear bias is more pronounced in kids. By age 13, most kids seem to have this auditory puzzle sorted out, but researchers wanted to test whether the bias is truly eliminated in adult hearing.

Finding the limits of parsing language

To find out how well a person can juggle sounds in their right and left ears, researchers employed what’s known as a dichotic listening test. Using headphones, test participants would hear different verbal statements in each ear simultaneously, such as “I have a red car” on the right and “You eat ice cream” on the left. They then had to report which side heard what, and hopefully repeat both phrases back entirely. As expected, most adults had little trouble with this task, at least up to a point.

Hearing these sounds wasn’t any special challenge for people’s ears, since one word is as tough to hear as another. Since the point of failure would likely be in people’s brains, researchers tried increasing the number of words or list items that people needed to keep track of at a time, putting pressure on people’s working memory. Once each ear had more than six items to keep track of, the right-ear bias reemerged. As people’s memory demanded more attention, the efficiency of the connection between right ears and language processing boosted people’s performance by an average of eight percent.

So the efficiency of listening to language with your right ear isn’t really lost after age 13; most of the time it just isn’t needed. Earlier testing likely missed this because it hadn’t demanded very much of people’s brains. When you’re in a challenging listening environment, or feel like you’re juggling a lot of stimuli at once, save your brain some trouble and try to listen in with your more reliable right ear.

Source: Want to listen better? Lend a right ear, Science Daily

On December 11th, 2017 we learned about

Mental complexity makes changing your actions slower than changing your mind

A sprinter can begin moving just 150 milliseconds after hearing a starting pistol. If they’ve made an error though and started early, a new cascade of commands is needed to stop their feet from going further. There’s very likely a moment where the runner has come to the decision to stop running, although their legs don’t seem to have gotten the message yet, carrying them further forward for a few steps. Some of that is due to balance and inertia, but researchers from Johns Hopkins University have found that there’s plenty of mental inertia to overcome as well.

Steps needed to stop

To put it briefly, changing your mind is complicated. A decision isn’t a single thing in your brain, as scans of human and monkey brains have identified at least 11 brain areas that are involved in the process of changing your mind. This sort of mental bureaucracy isn’t just needed for complex tasks that require coordination of the whole body, like sprinting or dancing. Even if you’re simply trying to rein in your gaze while looking at a computer screen, at least three brain areas need to issue new commands while communicating with eight others.

The latter scenario was obviously a little easier to test in a laboratory setting. Test participants were asked to look at the central part of a screen while waiting for a target to appear. The target never appeared in the center, and sometimes people were allowed to follow the instinct to glance at that new stimulus. Other times, the appearance of the target came with a reminder to keep their eyes in the center of the screen, requiring a quick mental change in course. By measuring brain activity, researchers could track which brain areas were then required to realize the “mistake” and issue new commands to the eyes.

The timing of ‘too late’

The sequence that emerged revealed a critical moment about one-tenth of a second after participants were reminded to look at the center of the screen. By that time, the command to look at the distracting target was being sent to the eyes, and could no longer be aborted by new thoughts forming in the brain. This would be the brief window where participants would be aware of their error but unable to do anything about it. It’s brief in the grand scheme of things, but just long enough to reveal an incongruity in how our brains interact with our bodies.

Knowing this may be useful for more than just informing us about our next moment of regret. Researchers suspect that disruptions in the process to change one’s course of action may have serious consequences, as it likely plays a role in self-preservation, risk-taking, and possibly drug addiction. A separate study of methamphetamine users found that their cravings extended the amount of time needed to change their behavior once an action had been started. This isn’t to say that this mental process explains drug dependencies, but it may help explain some of the tertiary effects of drug use that can make a bad situation even worse.

Source: Why Your Brain Has Trouble Bailing Out Of A Bad Plan by Jon Hamilton, NPR Shots

On December 5th, 2017 we learned about

Pigeons parsing time and space suggest that our brains might not be as special as we thought

Thanks to a test of pigeons’ sense of space and time, researchers may be casting doubts on the evolution of human brains. That’s not a knock on the people studying pigeons: these birds are capable of a lot, right down to helping diagnose cancer. The issue is that the birds seem to have a quirk in their perception that has previously only been seen in primates like us. It’s been understood to be tied to specific structures in our brain that assist with processing spatial and temporal information, but that can’t be the case with these pigeons, because their brains simply don’t have any of the structures in question.

How long is a line, and how long does it last?

The first phase of this test trained pigeons to watch a screen, and then poke a response on a touch screen in order to earn a snack. When looking at a two-inch line and a nine-inch line, they needed to select the longer option. When lines were flashed on the screen for either two or eight seconds, the birds needed to choose the line shown for the longer duration. This was nothing to sneeze at, but it was only the training for the real testing in phase two.

Once the pigeons seemed comfortable with looking for lengthy lines for longer time periods, researchers complicated their task by mixing in more intermediate choices. Instead of the seven-inch difference between the first sets of lines, the birds now had to consider lines that were only off by one inch. The attribute being tested was also less clear, with length and duration both tested at random. This forced the birds to really pay attention to both space and time, which led to some interesting blurred lines in their perception.

As the pigeons progressed, a pattern emerged that showed how their brains handled this spatial and temporal information. When the birds saw a longer line, they were also likely to react to it as if it were on screen longer. The reverse was true as well, with lines displayed for longer amounts of time apparently appearing lengthier to the pigeons. It may sound strange on paper, but as a primate you’re probably more familiar with this than you might think, because we do it too. The major difference is that we do it thanks to both types of information being processed in the same place in our brains: the parietal cortex in our cerebral cortex. Without such a structure to mash that information together, why does pigeon perception seem to work the same way?
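One way to picture the confusion is a shared “magnitude” signal: if the same internal variable accumulates for both length and duration, each judgment leaks a little of the other dimension. This toy sketch is purely illustrative; the crosstalk weight and function names are invented, not measured values from the study:

```python
def judged_magnitudes(line_inches, duration_secs, crosstalk=0.15):
    """Toy model: each judgment leaks a fraction of the other dimension.

    With crosstalk > 0, a longer line makes the duration feel longer,
    and a longer exposure makes the line look longer.
    """
    judged_length = line_inches + crosstalk * duration_secs
    judged_duration = duration_secs + crosstalk * line_inches
    return judged_length, judged_duration

# The same 4-inch line shown for 2 s versus 8 s: the longer exposure
# inflates how long the line itself appears to the observer
print(judged_magnitudes(4, 2))
print(judged_magnitudes(4, 8))
```

Under this kind of common-currency model, the pigeons’ "errors" aren’t random noise but the predictable signature of one circuit handling both kinds of information.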

Explaining the overlap in pigeon perception

There are two hypotheses at this point, and both of them reduce the prestige of a primate’s parietal cortex. The first hypothesis is that pigeons, and probably other birds as well, evolved similar cognitive abilities independently of mammals, essentially reproducing what our primate brains do in these tests, right down to the errors. This kind of functional overlap does occur in what’s called convergent evolution, but not usually to this degree of specificity. The striking overlap between pigeons’ spatial and temporal perception and that of primates seems unlikely to have occurred by chance, particularly without any clear evolutionary benefit to promote its growth in two family trees.

The second explanation is that this mental circuitry evolved once, long ago in a common ancestor, and bird and mammal brains have simply packed it into different structures. So instead of using a parietal cortex, those circuits, quirks and all, were packed into birds’ palliums instead. The catch here is that mammals haven’t shared an ancestor with birds for hundreds of millions of years, meaning this specialized perception has been passed down through a lot of different species, well beyond the chimpanzees and pigeons we’ve tested. Since the underlying structure that handles this thinking might not be exclusive to primates, it suggests that some of our amazing cognition just needs to be properly tested in other animals.

Source: Pigeons can discriminate both space and time, Iowa Now

On November 30th, 2017 we learned about

Overactive olfactory abilities provide clues about people who can’t understand their own emotions

Many nursery rhymes are meant to teach simple lessons, but “If you’re happy and you know it” may also be a step towards diagnosing a neurological disorder. While most toddlers are probably more concerned with learning the order of hand-clapping and foot-stomping, the simple song’s question about knowing when you’re happy is actually a significant challenge for people with alexithymia. There’s a range of symptoms in alexithymic people, but they mostly revolve around difficulty identifying one’s own emotional state, from understanding what is being felt to figuring out how to express those feelings to others. Weirdly, the other major symptom associated with alexithymia was recently discovered to be an altered sense of smell, which may prove to be useful in understanding the neurology behind the condition.

Connecting odor and emotion

The nexus of smell and emotional awareness seems to be tied to where some of this mental processing takes place in the brain. Previous studies found that there was overlap in some of the brain areas that handle emotion and olfactory perception, prompting researchers to look at how alexithymic people might experience smell differently than the general population. If a pattern could be detected, it would hopefully shed light on the relationship between emotional and olfactory understanding, and why that overlap might exist in the first place.

The experiments divided alexithymic participants into various sub-groups. For instance, some people were found to struggle with identifying emotions to themselves, while others only had problems describing those emotions to others, and everyone operated on a spectrum of how severe the symptoms were. Once sorted, volunteers were asked to sniff and rate various smells from Sniffin’ Sticks, which are standardized odor samples for these kinds of experiments. Since reactions to many odors are specific to a person’s cultural background, researchers tried to avoid getting people’s opinions on smells. Instead, they asked them about how strong smells were while measuring physiological responses, such as heart rate and breathing, to each scent.

Doing less with more

The pattern that emerged was about as intuitive as the overlap between odor and emotional perception as a whole. People who had more severe alexithymia symptoms also had more acute senses of smell, detecting smaller traces of scents than people who had an easier time parsing their emotions. Instead of facing confusion because of muted odor and emotional responses, researchers now believe that these people may have a hard time making sense of emotions because of an overwhelming amount of activity. If the olfactory and emotional centers in the brain are activated constantly, separating the signal from all that noise likely becomes difficult, leaving people with fewer cues about when exactly to clap their hands or stomp their feet, even if they can smell them from across the room.

Source: The nose reveals our relationship with our emotions, EurekAlert!

On November 12th, 2017 we learned about

The brain activity that can lead to seemingly irrational cost comparisons

In the book Predictably Irrational, behavioral economist Dan Ariely points out numerous scenarios where people reliably make illogical choices related to money or other measurements of value. For example, most of us would stop to consider spending an additional 75 cents to get a pen with a one-dollar notebook, but wouldn’t blink before spending an extra ten dollars for an unnecessary drawstring bag to go with a five-hundred-dollar phone. On the surface, closely scrutinizing the cheaper pen while not thinking about the more expensive bag makes no sense, apparently stemming from a quirk of the human brain that leads to strange allocations of resources and exploitation by marketers. Neuroscientists have dug a bit deeper though, revealing the benefit of this seemingly illogical thought pattern.

From an economics standpoint, this truly is illogical behavior. In theory, we’d be better served if we scrutinized our purchases based on their individual costs, and how much value they might provide. The cost of other purchases shouldn’t matter, unless we’re trying to build a budget. Our brains obviously don’t work this way though, and seem to decide if something is affordable or costly based on the costs of related purchases, which can of course throw off our sense of economic perspective.

Monkeys judging juices

For better or for worse, this kind of thinking has been built into primates for a long time. Monkeys picking between apple and grape juice had their brain activity monitored to figure out exactly what happens in our brains when we’re making a decision like this. When something desirable is being considered, sets of neurons in the orbitofrontal cortex start firing faster and faster. So when a monkey was choosing between apple and grape juice, its neurons worked faster when concentrating on the apple juice. When the portions of each juice were shifted, such as a smaller cup of apple juice or a larger cup of grape juice, the neurons adjusted their firing rates, seemingly weighing the pros and cons of either choice.

The catch is that these neurons can only fire so many times a second. If a cup of apple juice is more exciting than a cup of grape, you might see one group firing 500 times per second. When weighing 10 cups of apple juice against a single cup of grape juice, the offer is significantly more attractive to the monkey, but those cells can still only fire 500 times a second. Since brains can’t scale their response to every value proposition, the scale has to be adjusted constantly. For choices that are close in value, this 0-to-500 scale can be fairly fine-grained and precise. On the other hand, when comparing choices with a big gap in values, like a bag and a phone, the scale is essentially maxed out, and the comparisons become crude estimations rather than careful evaluations.
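This “sliding scale” resembles what’s often called range normalization: map whatever values are currently on offer onto a fixed output range, so the ceiling is never exceeded but resolution shrinks as the gap between options grows. A toy sketch of the idea, where only the 0-to-500 ceiling comes from the example above and everything else is illustrative:

```python
def firing_rate(value, min_value, max_value, max_rate=500):
    """Map a subjective value onto a bounded firing rate (range normalization).

    The scale stretches or compresses to fit whatever options are
    currently under consideration, so the ceiling is never exceeded.
    """
    if max_value == min_value:
        return max_rate / 2  # no contrast between options to encode
    return max_rate * (value - min_value) / (max_value - min_value)

# Close options: each unit of value gets fine-grained resolution
print(firing_rate(6, min_value=5, max_value=10))   # 100.0
# A huge gap: the same option is now crudely squashed near the bottom
print(firing_rate(6, min_value=5, max_value=500))  # about 1.0
```

The same option (a value of 6) lands at very different points on the scale depending on what it’s compared against, which is exactly why a ten-dollar bag barely registers next to a five-hundred-dollar phone.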

Sliding scales help preserve our preferences

The upside of all this is that quantity doesn’t always drown out quality. In the case of the monkeys, huge portions of less desirable grape juice never completely eclipsed the monkey’s preference for apple juice. The neurons couldn’t make the most nuanced decisions in those circumstances, but the apple juice always held some value in the monkey’s mind. Assuming monkeys like the apple juice for a healthy reason, this system can ensure that they, and by extension we, don’t end up ignoring options that really matter to us, even if they’re less abundant. It may not be the calculation an economist would come up with on paper, but this neurological circuit can truly work in our favor, allowing for some unnecessary accessories along the way.

Source: Penny-Wise, Pound-Foolish Decisions Explained By Neurons’ Firing, Scienmag