On April 6th, 2017 we learned about

At six months old, human babies admire those who help others

Our favorite fictional characters may be dressed up in magic cloaks, move things with their minds or even run at nearly the speed of light, but there’s a very strong chance they wouldn’t be nearly as attractive if they always acted selfishly. A few flaws may add some intrigue and texture to a story, but if the hero or heroine doesn’t help those less powerful than they are, there’s a good chance we’re not going to be as enchanted by the character. Researchers are finding that this preference for pro-social behavior is likely built into our brains, as interest in sharing has turned up in everything from rats to monkeys. To rule out the possibility that those animals were simply taught these ideas, researchers have also been working with human babies, finding that even before a person can talk, the idea of helping out is already attractive.

Saving less-fortunate shapes

While babies are always doing their best to soak up information about the world around them, at six months old they usually can’t speak yet, and thus are less influenced by cultural norms. Such an audience was shown a short movie about abstract, geometric shapes interacting with each other, avoiding other emotional influences like facial expressions or size differences. In the movie, one of the three shapes bumps into another while a third shape “observes” from a distance. Sometimes the third shape would intervene to help the bumped shape, while other times the observer would just exit the scene without interacting.

When later shown replicas of these abstract actors, most babies understood enough of their blocky drama to have a favorite character, usually selecting the intervening shape over all others. Even without words, the six-month-olds seemed to value a character that would help others. When ten-month-olds were tested, they added a new layer of discrimination in their judgement. If the intervening “hero” shape seemed to help by accident, the kids were not as impressed. Making an effort to help was more appreciated, a pattern that holds true in nearly all adult storytelling as well.

Giving is good, even if it’s hard

This may make things more straightforward for writers, but researchers were more interested in just how innate human concepts of justice may be. Other research with toddlers found them to be surprised when shown other children receiving unequal amounts of food, even if their own snack supply was unaffected. Those same kids were also very likely to share their own favorite toys when asked later on. This isn’t to say that sharing and helping others is necessarily easy, or that being a good Samaritan is some kind of biological default— we know that sharing even unneeded resources can be difficult. But perhaps that difficulty is part of why we appreciate moments of sharing and giving so much. We’re impressed by anyone who can do something to benefit the group at large, even if that action doesn’t require flying, shooting lasers or fighting off bad guys.

Source: Born to love superheroes, Scienmag

On April 4th, 2017 we learned about

British babies weep more each week than other infants in cross-cultural survey

Science has finally found the world’s biggest crybabies, and they’re in Britain. This isn’t a veiled comment about Brexit supporters, but the results of a worldwide study of how much infants cry per week. Since parents often feel completely powerless to control their babies’ outbursts, it might not be obvious how we can help humanity by studying the whimpering of babes around the globe. The answer is that it’s a first step, and the fact that trends were found at all suggests that there may be ways to a happier, or at least quieter, baby that could eventually be accessible to everyone, especially those living in the United Kingdom.

Summing up sobs and sniffles

Drawing on data from nearly 8,700 babies, researchers from the University of Warwick took steps to quantify baby crying. Breaking crying into something countable is important, since when a parent hears a baby’s cry it’s hard to focus on anything else— every minute of crying feels like too much, especially if you’re a woman. But “too much” doesn’t lend itself to measurement, so the compiled data allowed for real comparisons across countries to see where babies spend more time in tears.

Obviously, every child is an individual, interacting with the individual sensitivities of their own particular caregivers, so the results had to allow for ranges of crying each week. On the extremes of this spectrum, some babies cried for as little as 30 minutes a day while others spent over five hours each day wailing. On a macro scale, the United Kingdom, Italy, Canada and the Netherlands had more crying babies overall. The more stoic children were found in Denmark, Germany and Japan. Trends were also found with the babies’ ages, with six-week-olds being the biggest bawlers, but tear-time dropping by 50% only six weeks later (so hang in there, mom and dad!).

Nature, nurture, or both?

The fact that there are patterns is actually a reason for hope. It suggests that the time babies spend crying isn’t hardwired into our brains, and thus is subject to other variables that we might be able to control. So from these rankings, researchers are digging into the causes to see if there are economic, genetic, nutritional or cultural differences that make some babies louder than others. Past research into the minds of moms (ok, mouse moms) has already found a mix of causes. A crying mouse pup will trigger oxytocin release in its mother’s brain to spur a care response, but only if the mother has had some practice and learned that feedback loop to begin with.

In the meantime, remember that babies won’t be crying forever, although maybe think twice about booking your flights out of Heathrow.

Source: Babies cry most in UK, Canada, Italy and Netherlands, Scienmag

On February 27th, 2017 we learned about

We shape and style ourselves to meet cultural expectations for our names

It’s not obvious, but my daughter is named after a fruit. My son, thanks to a bit of uncertainty on his parents’ part, goes by his middle name. As such, both kids were very interested in how they might fit into a study on how well people can guess a stranger’s name based on a photo of their face. Even if their unusual names were edge cases, the study found that as long as you share your local culture’s expectations about your name, other people will be able to read those ideas in your appearance. Basically, it’s easy to spot a ‘Bob’ as long as we all agree on what a ‘Bob’ should look like, including Bobs themselves.

Putting names to portraits

The study had a number of variations, but the core idea was to show people photos of strangers and have them pick each stranger’s name from a list of five options. Even with faces that had some features hidden, sometimes revealing as little as a hairstyle, people could generally pick the correct name around 35 percent of the time, significantly better than the 20 percent expected from randomly guessing among five choices. Some conditions could push that number higher, but overall, researchers were impressed at how well people could assign the correct name to the photo.

By testing different combinations of features in different rounds of testing, researchers were able to isolate a few influences on how well we can pick a name. The biggest factor was the test subject’s native culture. French people could do quite well assigning French names and Israelis could assign Israeli names, but the effect fell apart across cultures. This strongly suggests that people were reacting to attributes culturally assigned to the names, rather than anything innate in the phonemes of a name or the anatomical features of a face. In the end, the French test subjects had the easiest time spotting Veroniques, and the Israelis could easily name Toms, probably thanks to expectations about those names more than anything in the photos.

Fitting your face to societal standards

This isn’t to say that names are entirely in the eye of the beholder. There were cues in the faces, probably unconsciously developed as those people grew up thinking about their own names. Analysis found that people with matching names had the most physical similarities around the eyes and mouth, areas shaped by muscles we can consciously control. If you grew up being taught your name represented something light and playful, you’d probably hold your face accordingly, eventually growing smile lines and wrinkles associated with happier emotions. The fact that hairstyles were found to be strong indicators for names fits this model, as hair is another feature people can control, and likely unintentionally style to fit their self-image as a Veronique or Tom, for instance.

This isn’t totally definitive, as name popularity may have allowed people to eliminate some options from each multiple-choice list, making guessing correctly a little easier. However, computer algorithms were also taught to successfully guess names (although they only had to pick between two names per photo), which suggests that there are some wider patterns to be recognized. This may help program facial recognition software at some point, but right now it’s a weird reminder of how we shape our own faces, sometimes to match expectations or ideas that were selected before we were even born. Hopefully my kids don’t mind too much.
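For the curious, here is a rough sketch of what that kind of two-name guessing task looks like to a computer. The study’s actual algorithm and facial measurements aren’t detailed here, so the “face features” below are random stand-in numbers with a small built-in difference between the two name groups; the point is only that accuracy reliably above 50 percent signals a learnable pattern.

```python
# Toy two-name classifier sketch (not the study's actual method).
# Stand-in feature vectors play the role of measurements taken from
# face photos; a small shift between the groups plays the role of the
# "cultural stereotype" a classifier could pick up on.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
veroniques = rng.normal(loc=0.2, scale=1.0, size=(200, 16))
toms = rng.normal(loc=-0.2, scale=1.0, size=(200, 16))

X = np.vstack([veroniques, toms])
y = np.array([0] * 200 + [1] * 200)  # 0 = "Veronique", 1 = "Tom"

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)

# Anything reliably above 50% means faces and names share some pattern
print(f"accuracy: {model.score(X_test, y_test):.0%}")
```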

Source: Your Name Might Shape Your Face, Researchers Say by Angus Chen, NPR Shots

On January 19th, 2017 we learned about

Making a political or personal point by pitching your poop

Poop Week!

The first, and most important, thing my kids need to know on this topic is that we do not throw poop. It’s yucky, you shouldn’t be touching it, much less throwing it, etc. That said, humans actually have a decent track record of throwing poop, particularly as an act of angry protest. The reasoning is probably obvious— if someone is going to go against the very important warnings their parents gave them about handling pathogen-carrying feces, then presumably whatever they’re upset about is important enough to risk exposure to, say… cholera, maybe? There’s also the ick factor, the shock value, and the fact that some of this behavior may be sort of built into our brains.

Forcing an issue with feces

Effective protest usually requires that the aggrieved party knows what they want, and that they can get someone else to listen to them. Poop probably can’t help with the first point, but if other outreach methods don’t work, anecdotal evidence suggests that properly employed poop certainly captures people’s attention. Sometimes this has meant covering symbolic statues in feces, such as at the University of Cape Town. In India, a rally of 30,000 people was somehow overlooked, but when protesters publicly defecated on copies of the land bill that had caused the uproar, more attention was given to the matter.

Unfortunately, not every poop gets thrown with such clear purpose. In some cases people have used feces mainly for the sake of offending others, with very little justice in mind beyond their immediate gratification. Documented cases of smearing abandoned dog poo in its negligent owner’s hair, or throwing turds at a neighbor’s house, usually lead to little more than criminal charges. Even with something as viscerally arresting as an unexpected piece of poo, there are still nuances protesters need to appreciate in order to wield it effectively.

Scat and the beginnings of self-expression

The weirdest details about tossed turds may be what they say about the evolution of our brains. We’re not the only animal to fling our feces, but we might want to feel proud to be counted as members of this unhygienic club. A study of chimpanzees found that the chimps who threw poo more often had more developed connections between their motor cortex and a speech-oriented structure called Broca’s area. Chimps that didn’t throw poop didn’t show the same degree of connectivity between those locations, or in their left brain hemispheres in general.

Researchers aren’t suggesting that throwing poop boosts language ability, or that human evolution was set on its present course when an ancestor threw their feces. Instead, they suggest that throwing any object, even cleaner ones, may have arisen at least partly thanks to primates’ drive for self-expression and communication with their peers. It wasn’t so much that the poop-tossers were better athletes, but that they had something to say, and tossing a poop was sometimes the best way to say it.

Source: A Brief History of People Protesting Stuff with Poop by Mark Hay, Vice

On October 30th, 2016 we learned about

The melange of monsters that make up modern zombie mythology

What my kids think of as zombies are actually modern amalgams of at least three other horrifying concepts, glued together in the 1968 film, Night of the Living Dead. Ok, what they actually think of as zombies are probably the green, block-headed, groaning foes you find staggering around in Minecraft, although even those follow in the zombie meta-tradition of attaching various horror tropes to each other. In the case of Minecraft, the zombies burst into flames in daylight, a trait usually reserved for vampire stories. Of course, since the template for contagious zombies already came from the vampire story I Am Legend, this really doesn’t seem out of bounds. At this point, in a world with fiction about fast zombies, smart zombies and even zombies in love, it’s no longer obvious where the core concepts of zombie lore really come from.

Slaves as mindless minions

The word “zombie” is a good place to start though, because it leads us to the origins of a zombie’s mindless, relentless nature. American English picked up “zombie” after hearing the Haitian creole term zonbi, which itself derives from the West African words nzumbi, for corpse, and nzambi, for “spirit of a dead person.” Thanks to the American occupation of Haiti in the early 20th century, Americans learned of the Voodoo concept of a zombie, and how it tied in with the island’s history of colonial slavery. Zombies weren’t necessarily reanimated corpses looking to eat flesh, but were people whose will and autonomy had been stolen from them by a bokor, or priest, then forced into servitude. These people were sometimes thought of as dead, but more often would be described as having dead, lifeless eyes as they worked in fields or waited for more nefarious instructions from the bokor. The connections to slavery are more than just symbolic, with local mythology stating that a slave who committed suicide would be forced to labor forever as a zombie, rather than ascend to heaven when they died.

This idea of mindless minions caught the attention of both scientists and popular culture. As the American occupation of Haiti was winding down, the 1932 movie White Zombie brought some of these concepts to America, presenting zombies as essentially drug-induced henchmen. Some research attempted to pin down the scientific underpinnings of Haitian zombies, with some reports of meeting “real zombies,” and even some supposed recipes for the coupe poudre, or zombie powder, that a bokor would use to zombify someone’s brain. The key ingredient was tetrodotoxin, a dangerous and frequently fatal neurotoxin found in pufferfish. However, since none of the behaviors of supposed zombies really match the known symptoms of tetrodotoxin, there’s been a lot of doubt about this pharmaceutical explanation. A more plausible hypothesis is that victims of zombification were most likely people living on the fringes of society due to being homeless and/or suffering from mental illness. Sufferers of epilepsy, catatonic schizophrenia and even fetal alcohol syndrome, coupled with a culture primed to “recognize” symptoms of zombification, provided anecdotal evidence to back up stories everyone was familiar with.

Ghouls feasting on flesh

So while slavery and exploitation are obviously horrible, they don’t cover all of the horrors we associate with a modern zombie. It’s hard to find a zombie in modern media that isn’t interested in chomping down on the flesh of the living (or recently living), although this trait was basically adopted from a host of other myths and monsters. A Norse creature called the draugr was said to be a spirit or body that walked the Earth, lumbering along with the single goal of eating people. If that weren’t enough, they were also thought to have superhuman strength and be able to grow larger at will, which bafflingly hasn’t been widely reproduced in many zombie stories. More widely known are ghouls (aka ghūls), originally from Arabian myths, which are shape-shifting demons known to lurk around grave sites while looking for humans to eat. Early ghoul stories often focused on the threat of unknown, attractive women, but grave-robbing was eventually added, which fits into our zombie concepts quite nicely.

Vampires carrying contagion

Finally, even the zombies in Minecraft are capable of creating new zombies through a bite, which is thanks to I Am Legend‘s influence on Night of the Living Dead. Vampire myths often included elements of the living dead, and in many cases may have been based on people’s misunderstanding of disease and decay. Contagious diseases may have led to surprising deaths, followed by frightening and confusing patterns in the corpses’ decay, which may have left more flesh on the bones than untrained observers would have expected. While zombies didn’t end up appropriating the blood-sucking, adding the threat of contagion to ghoulish behavior now feels central to the zombie myth.

When all these elements were combined in Night of the Living Dead, it wasn’t intended as an overhaul of what a zombie was meant to be. George Romero, the film’s director, still thought of zombies as the mindless slaves of Haitian lore, and saw his creations as perhaps a blend of ghouls with the contagious vampirism of I Am Legend. The impact of these new, hybrid monsters has been huge though, and much has been written about how these creatures have flooded popular culture since their debut in 1968. With that in mind, perhaps we should look into new hybrid monsters, like… were-witches? sea-mummies? Maybe giant-ghost-kraken? They’d be worth adding to Minecraft, anyway.

Source: Where Do Zombies Come From? by Roger Luckhurst, BBC Culture

On October 25th, 2016 we learned about

Humanity’s confused history of branding bats as baneful or beneficial

Bats have had a strange reputation in human imaginations. We love Batman for fighting crime, but Bram Stoker taught us to mistakenly worry about all bats being vampires. Some cultures have seen bats as harbingers of death, while others marked them as a sign of good luck. These days they often pop up in Halloween motifs, but that’s only one of the many roles they’ve held in human mythology from around the world. This conflicted reputation seems appropriate, because one of the more common themes about bats is fretting over just what kind of animal they even are in the first place.

Disturbing in the dark

One of the big sources of concern with bats is their nocturnal lifestyle. Things that only came out at dusk were harder to see, study and understand, and therefore easier to mistrust. In ancient Greece this led to ideas like keeping a bat’s head in a bag near your left arm as a way to avoid sleep, but by the Middle Ages Europeans had come to associate bats with more sinister themes, like witchcraft. Bats were referenced by Shakespeare as part of a witches’ brew, and were thought of as familiars to witches since they both worked the night shift. These fears were taken seriously enough to condemn accused witches to death, such as Lady Jacaume of Bayonne, who was burned in 1332 for having “crowds of bats” living near her home.

Not every culture saw bats’ nocturnal feeding habits in such stark terms though. Some stories said that bats were active at night because they were avoiding creditors during the day. Some African myths gave bats a bit more credit, regarding them as intelligent, and as possible guides during dark times thanks to their prowess as nighttime aviators.

Freaky in flight

If being nocturnal weren’t enough, people have traditionally had a hard time figuring out where bats even came from. Were they birds missing their feathers? Some kind of mouse or squirrel? And why did they have to fly in such an unpredictable (but really just highly maneuverable) way? Many more myths center around the apparent conflict inherent in being a flying mammal, even though bats have been at this for longer than humans have even existed.

The Cherokee people have a fable tying bats’ flight to a game of stickball between birds and other animals. In it, wingless, mouse-like creatures are rejected by the other mammals but given wings by the birds, at which point they join the birds’ team and win the game. The Creek people flip this, saying bats try to play on the bird team, but after being rejected are given teeth by the mammals, allowing them to better carry the ball. Aesop’s take on this kind of conflict is a bit more judgmental, saying that bats repeatedly switch allegiance in a conflict between birds and beasts so that they can always be on the winning team. In the end, they’re rejected by both factions, leaving them in exile.

Trinkets and talismans

While many of the above stories aren’t exactly ringing endorsements for bats, in practice people aren’t always so down on the flying mammals. Chinese culture thought highly enough of these winged mammals to make their name a homophone for happiness, and considered them a sign of good luck if they turned up at your house. Charms have been made from bat anatomy reputed to increase your luck at cards, turn you invisible, or help you see in the dark. A whole bat nailed to your door was obviously bad for the bat, but promised to protect your house from demons. When you factor in the critical role these animals play in ecosystems around the world, it seems like bats should actually be seen as one of the happiest parts of any Halloween party.

Source: The Symbolism of a Bat in Cultures of the World by Paweł Ciołkiewicz, Gotham in the Rain

On October 25th, 2016 we learned about

Costly cameras helped create the concept of green-skinned witches

If you ask an elementary school student today how to spot a witch, there’s a good chance they’ll rattle off a few obvious giveaways— broomsticks, black cats, pointy hats and of course, the unmistakable green skin. Now, looking through historical depictions of fictional and accused witches, it quickly becomes clear that the green skin hasn’t been part of witch mythology for very long. Various witch trials wouldn’t have really worked if the accused could have quickly pointed out how they lacked a deep, emerald hue. Unfortunately for the hundreds of men, women and children put on trial for witchcraft throughout history, witches only turned green in 1939.

This change in skin tone was brought about by technological and economic forces, rather than anything supernatural. The first color movies were made in 1917, but they were expensive to produce. When MGM started producing 1939’s The Wizard of Oz, however, they decided to go all in, and they wanted to make sure the public appreciated their investment. Filming in color required elaborate, unwieldy Technicolor cameras that simultaneously recorded a different color component on each of three strips of film, so that the three records could later be combined into a full-color image. This required specialists, noise-dampening and more, so to make the most of it, producers pumped as much color into the art direction as possible.
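To picture how three single-color records become one image, here is a minimal sketch in Python, assuming the three exposures have been scanned into grayscale files (the filenames are hypothetical placeholders):

```python
# Each Technicolor strip is effectively a grayscale record of how much
# red, green, or blue light reached it; stacking the three records
# rebuilds the full-color frame. Filenames are made-up placeholders.
import numpy as np
from PIL import Image

red = np.array(Image.open("red_record.png").convert("L"))
green = np.array(Image.open("green_record.png").convert("L"))
blue = np.array(Image.open("blue_record.png").convert("L"))

# Combine into one (height, width, 3) RGB array and save the result
color_frame = np.dstack([red, green, blue])
Image.fromarray(color_frame, mode="RGB").save("full_color_frame.png")
```

The actual Technicolor process did this recombination photochemically when making prints, not digitally, but the principle of rebuilding color from three separate records is the same.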

Going green for the silver screen

While scenes of Kansas were shot in a desaturated sepia tone that was closer to the black and white audiences were accustomed to, Oz was an amazingly saturated world, complete with yellow bricks and emerald buildings. Dorothy’s magical slippers were switched from the silver described in the original novel to ruby red, and of course, the wicked witch’s originally nondescript skin was painted a deep green. Unfortunately for actress Margaret Hamilton, the make-up used to accomplish this color was copper-based, and thus poisonous if ingested. Beyond that, it was also flammable, and an on-set accident left Hamilton with second-degree burns on her face and hands. So was this suffering, which was more direct than anything any historical “witch” was ever charged with causing, worth it?

The film was well received by critics, but its initial release didn’t recoup the production expenses. It was finally in the black when it was rereleased in 1949, but the biggest impact may have come when it was shown on television in 1956. The Wizard of Oz has since been released in a variety of formats, and has obviously become a cultural touchstone over the last 70 years. Even if you haven’t seen the movie, it’s likely that you can now spot a villainous witch better than anyone in Salem ever could.


PS: Brooms and pointy hats actually predate green skin on witches, but their origins were a bit much for my second-grade and younger audience. Feel free to read about brooms here, and a possible explanation for pointy hats here.

Source: Why Are Witches Green? by Linda Rodríguez McRobbie, Boing Boing

On August 3rd, 2015 we learned about

It’s totally a great idea to always be sarcastic

There’s an interesting risk vs. reward calculation to be made when using sarcasm. If your audience gets it, sarcasm can express more, with more specificity, than speaking literally ever could. If you miss though, you can end up with bewildered or frustrated listeners, eroding the flow of communication significantly, and probably ruining the timing of what would have obviously been the best joke ever. New research says that it’s still worth the risk though, as both the speaker and listener get some good mental exercise in the process.

People were asked to engage in one-sided sarcastic, sincere or neutral conversations. Test subjects then took a creativity assessment, revealing higher scores among people who had just been in a sarcastic conversation. As both speakers and listeners showed an uptick in creativity scores, it’s thought that the extra mental work of decoding a sarcastic statement to understand the underlying intent of the speaker stimulates the imagination and abstract thought centers in the brain. This then leaves people primed for creative thinking in a way that sincere (literal) communication does not.

A big caveat to all this is that creative thinkers might naturally use sarcasm more often than others. However, the fact that listeners showed a benefit too seems to help defuse that concern. It’s unlikely that the study managed to repeatedly pair creative individuals on both sides of the conversation.

Consider your audience

The potential downside to sarcastic communication is that when it causes confusion or misunderstandings, it can create social rifts between people. The trick to handling these risks is appreciating the level of familiarity and trust between the speaker and listener. If mutual trust is already established, even unsuccessful sarcasm doesn’t seem to harm the relationship. On top of that, a trusting but confused listener can still show gains in creativity, meaning there’s little risk but nearly guaranteed reward for such a conversation. This goes against popular notions that sarcasm is corrosive in any setting, and instead points to using it as a communication tool in appropriate contexts.


My first grader said: My wife and I have been trying to define sarcasm for a while now, and occasionally our daughter will catch on and ask “are you being sarcastic?!” in a pleased tone of voice, which seems to agree with this study’s conclusions. She’s also been practicing her own delivery, usually with obviously ridiculous statements like “this ice cream is the worst!” The accompanying smile usually makes it pretty fun, but this makes me realize we should stress that she not try this out on her schoolmates until she’s more sure they’ll get it.

Source: Go ahead, be sarcastic by Christina Pazzanese, Harvard Gazette

On February 22nd, 2015 we learned about

Why Williams and Wilmas Will Like this Story

You may build your identity around your name more than you realize. Your name actually makes a measurable difference in your decision-making, which has been studied for the last 30 years as the “Name Letter Effect.”

Basically, people of all languages, ages, etc. show a preference for things that start with the same letter as their name. The effect has been seen in purchasing decisions, aesthetic preferences, and even humor; cartoons signed by someone whose initials matched the reader’s were judged to be funnier than other cartoonists’ work.

If you change your name, it takes a little bit of time, but eventually your ego, er, preferences catch up. Women who changed their name when they married shifted to preferring their new last initial after about two years, on average. They didn’t forget their maiden initial either, but their current name seemed to hold the most influence.

Some studies have tried to make even more extreme connections than these, even trying to match personal performance to letters used in ratings (e.g., do people named Doug prefer to get Ds in school?), but these studies don’t hold up as well when reanalyzed.

Source: Sam Sells Seashells By the Seashore by Jessa Gamble, The Last Word on Nothing