On August 29th, 2017 we learned about

Twisted and knotted cords found to tell tales beyond basic numerical tallies

234 years ago, Felipe Tupa Inka Yupanki arrived in the small Peruvian town of Collata to organize an uprising of native Incans against Spanish colonizers. Yupanki issued decrees and tried organizing an army, although the revolt came to a halt once his plans were discovered, ending in the rebel’s execution. This story isn’t as well known as the larger Incan rebellions of the eighteenth century, partially because it’s largely documented in collections of yarn and string in a writing medium called a khipu.

Recording words in twisted cords

Khipus are an Incan form of logosyllabic writing built out of twisted fibers and cords made from cotton, cloth or animal fibers like alpaca and llama hair. Shorter cords of variable color, patterning and textures are individually tied to one larger cord that acts like a sort of spine for the whole message. Each khipu is capped by a cayte, which was generally a textile like a ribbon or bandanna tied to the piece’s creator, asserting its authenticity like a signature. If laid out flat, the whole khipu might resemble a short, uneven fringe, but that variability is key to their utility as a communication medium.

Many of the khipus now in museums have been found to use knots to record numbers, and were simple, portable accounting tools used by herders into the twentieth century. Unlike the beads of an abacus, knots were not directly representational of quantities, instead relying on distances and size to record different numbers. By and large, these kinds of accounting khipus were made of cotton, requiring less sophistication in their craftsmanship. However, two khipus recently studied in Collata display a new degree of depth and sophistication, showing that these objects could encode entire stories, as long as you know what to look for.

Narratives beyond numbers

The two khipus from Collata are known to reference the Incan rebellions, but they’re not really being read word for word at this point. Village elders have been passing the khipus, along with other documents tied to the town’s history, down through the generations, telling the story of the khipus in the process. The khipus are thought to be based on Quechua, a language no longer spoken in Collata. Still, the various twists, colors, and choices in fiber support the notion that these cords represent words, with 95 individual symbols having been identified.

Since my two sentence summary of the rebellion used 51 unique words, it’s easy to see that the khipu isn’t offering any one-to-one transcription. The 95 symbols that have been identified aren’t phonetic words, instead functioning more like rebuses or pictograms. So rather than directly represent a word, symbols that can remind someone of a word thanks to their similar sounds might be combined to express an idea. This system wouldn’t have been as flexible as a fully phonetic alphabet, but the 95 symbols seen across 487 cords on the Collata khipus are enough to have hundreds of unique combinations.

At this point, researchers speculate that these khipus may be a taste of a significant and unexplored communication system. The success of the Incan empire is hard to imagine without robust means of communication, and if more examples of narrative writing can be found, it may help unlock a lot of recorded knowledge about how that society functioned.

Source: Writing with Twisted Cords: The Inscriptive Capacity of Andean Khipus by Sabine Hyland, Current Anthropology

On August 28th, 2017 we learned about

Like humans, rhesus monkeys fixate on finding false faces in photos

In a time when people have a hard time seeing eye-to-eye, it may be comforting to know that we basically see the same world monkeys do. Studies of various monkeys’ perception have found that our fellow primates react to visual stimuli similarly to humans, even making the same mistakes we do. For instance, both humans and rhesus monkeys might confuse images of large, gray animals like elephants and rhinos, even though the monkeys lack contextual information to relate those creatures to each other. The visuals were perceived along the same underlying neural pathways, making an individual response from a monkey nearly indistinguishable from that of a human.

Now researchers have found a new shared “mistake” between rhesus monkeys and humans, which is called pareidolia, or seeing patterns like faces that aren’t really there. Monkeys were shown photos of inanimate objects that they’d have no real understanding of, such as coffee cups, purses and appliances. Photos that happened to have that critical two-dots-over-a-line pattern of a simple face held monkeys’ attention much longer than versions that didn’t. Based on previous data showing monkeys prefer to look at faces over non-faces, plus eye-tracking data that indicates the monkeys were examining objects’ “eyes” and “mouths” most of all, researchers are confident the monkeys are noting faces the same way we do. In fact, when given the choice between a photo of a monkey’s face and an object’s faux-face, the object actually seemed to be the more interesting option for most test subjects.

Inherited interests

This fascination with faces isn’t surprising. Separate studies of macaque monkeys have found that the old world primates have the same functionality in their brain’s fusiform face area that we do. In that study, the monkeys recognized human faces as faces, despite our being a different species. Now that we know rhesus monkeys extend this activity to pareidolia like humans do, it’s increasingly likely that these socially-oriented specializations evolved in a distant, shared ancestor. Our interest in finding faces may not be a uniquely human trait after all, instead being shared among many primates that make a point to look into each other’s eyes.

Source: Rhesus monkeys found to see faces in inanimate objects too by Bob Yirka, Phys.org

On August 24th, 2017 we learned about

Holding someone’s hand can convince your brain to relax your cognitive load

I like holding my kids’ hands when we go places. My third grader is starting to assert some independence, and won’t casually hold hands as much as she used to, a fact that reminds me to appreciate my four-year-old’s tiny grip all the more. Scientists are finding that holding hands isn’t just a small pleasantry though, as it seems to trigger changes in participants’ brains that affect stress levels, cognition and even pain perception. In fact, it’s possible that part of my enjoyment in holding my child’s hand is that I’m offloading cognitive duties to them, leaving me with an easier stroll down the street.

Dr. Jim Coan has been investigating the effects of holding hands on the brain for years. Experiments generally involved pairs of people, one of whom was in an fMRI machine to monitor brain activity. In each round of the experiment, the subjects would see either a red “X” or a blue “O” displayed on a screen, the former of which warned of a 20 percent chance of the person being scanned receiving a small electric shock 12 seconds later. During the 12 seconds after seeing a red “X,” most people’s brains showed a flurry of activity, from increases in stress to paying attention to the site of the possible pain. The big variable in all this was the partner’s touch.

Hands that help vs. those that hurt

Throughout these experiments, people would be asked to either hold hands or sit near each other. Holding hands with a trusted companion was found to make a huge difference in people’s reactions: there were fewer signs of stress, agitation and even pain all over the brain. In some cases, hypothalamus activity changed enough that it’s suspected to be part of the mechanism that makes people with social connections generally have better health than people who live alone. A variation on the study had children with anxiety disorders hold the hands of their mothers while reading scary words like “monster” instead of receiving shocks, and that small bit of physical contact was soothing enough for the kids to behave as if anxiety wasn’t an issue.

This isn’t to say that holding hands is a cure-all. The above effects were only seen in cases where a person’s partner was someone they trusted and were connected to, such as a spouse, friend or reliable roommate. In variations where people were shocked while holding a stranger’s hand, the positive effects were nearly absent. For people who lived in areas with higher crime rates, strangers actually made things worse, strongly indicating that the physical contact of hand-holding isn’t as important as the social relationship between the people involved.

Social support as a starting point

This may seem intuitive, but that doesn’t explain why any of this happens. When researchers looked at areas of the brain like the prefrontal cortex, they expected to see that holding hands inhibited activity, as if the comforting touch of a partner helped the brain tamp down worry and pain. They were wrong though, as no such “self control” could be detected. One clue to help reformulate their model for hand-holding was a variation on the shock experiments that found that threats to shock a partner triggered activity in the “safe” person’s brain as if they were in danger. In other words, the brain treated a trusted partner almost like an extension of itself.

The new model for all this activity is that as highly social animals, humans actually treat having social contacts as our baseline, rather than a modifier. It’s not that holding hands is better than normal, it’s that sharing experiences with other people is normal, and suffering alone is the more difficult alternative. (In fact, people who have stronger preferences to work alone have also been found to have higher resting glucose levels, meaning they’ve got more energy of their own to expend on daily tasks.)

Shared safety net

So having a spouse or friend with you basically helps you relax a bit, sharing responsibility for your well-being with that other person, a state accentuated when you’re physically connected. Why worry about every detail of a possible threat when your friend is there to assist you? In the case of my kids, I’m hopefully not offloading too much responsibility for our safety onto a four-year-old, although knowing that the kid is safe next to me certainly does help lower my stress levels in a parking lot. I’ll enjoy it while it lasts.

Source: Holding Hands is More Important Than You Think by Maximus Thaler, The Evolution Institute

On August 14th, 2017 we learned about

Humans are adept at appreciating alarm in the voices of animals

You’ve probably never conversed with a groundhog or tree frog, but it might not be as futile as you’d think. Sure, it’s sometimes hard to communicate with other humans who ostensibly speak the same language as you, but there’s a good chance that some of the underlying emotion you want to express gets through in just about any conversation. For all the layers of complexity that language can have, researchers are finding that humans are actually pretty decent at picking up the basic intent of a wide range of vertebrates’ vocalizations. We might not know exactly what details a particular prairie dog has to share, but we can at least tell how strongly that critter feels about what it’s saying.

Sensing species’ sentiment

The study was fairly straightforward, asking 75 humans to listen to recordings of different animals and identify the level of arousal, or emotional energy, of that animal. The humans were native speakers of English, German or Mandarin in order to try and eliminate any bias that might arise from a human language that somehow more closely followed the grammar rules of bush elephants or pigs. In the end, people’s assessments were definitely more accurate than random guesses, although the degree to which people understood each species was sometimes surprising.

Humans could identify higher or lower arousal in other humans quite well, followed by giant pandas, tree frogs, elephants and alligators. We actually did worse with some species, such as pigs, ravens and Barbary macaques, indicating that familiarity or genetic similarities weren’t the key component to communication. Overall, the more a species relied on shifting the frequency, or pitch, of their voice, the more it made sense to human ears. Samples from the study are available online if you want to try it yourself.

Shared origins for animals’ outbursts

We know that animal vocalizations can get very complex, and so nobody was expecting anyone to really parse specific messages in this test. The fact that prairie dogs seem to have specific vocabulary for details in their alarm calls is probably going to be beyond the ear of most human listeners. However, the fact that humans can detect the urgency of an alarm call in prairie dogs, birds and other animals may indicate that all these air-breathing vertebrates share a common foundation in our noise-making.

Even if you’re not about to chat with your local squirrels, this study helps establish aspects of how language may have evolved in the first place. Other work has found that specific vocalizations are often taught from one generation of animal to the next in a process that closely mirrors human language acquisition. For example, baby marmoset monkeys transition from babble to specific calls in a process that is nearly identical to human babies. A baby marmoset will get feedback from adults about specific phonemes it makes, and then learns to refine and rely on those specific sounds for communication as it matures.

Source: Humans identify emotions in voices of all air-breathing vertebrates

On July 20th, 2017 we learned about

Babies pick up second languages better when spoken to in “parentese”

Baby brains are language sponges. Among all the lessons the world offers a tiny child, from gravity’s effect on spoons to how expressive adults can be about picking those spoons up, babies are also doing their best to make sense of everything people around them are saying. This is true even when more than one language is spoken in the baby’s household, which is how most bilingual people pick up their first and second languages. Beyond the practical upside of being able to speak with a wider range of people in the world, being bilingual has been linked to a number of cognitive benefits. Since not everyone has a chance to grow up in a bilingual household, researchers have been looking at ways to help more children learn more languages.

Fortunately, while the ability to learn a language seems to be part of human genes, the specific language you learn is not. That bit of cultural information can be shared from anyone, and often babies will learn a second language from child care providers, schools, or the local population in general if parents speak a different language at home. There are also programs and classes, even for the very young, to help pick up a second language. As an experiment, a sort of minimal language course was established to help Spanish speakers in Spain learn English, and the results show that full immersion isn’t necessary for a child to make progress.

Learning languages in just one hour a week

The experimental class lasted 18 weeks, but it only met for one hour a week. During that hour, instructors catered to the type of interaction the participating 7- to 33-month-olds would receive at home. This meant that rather than more formal practice and instruction in English, the babies and toddlers were taught with something closer to the baby talk their parents were using at home to teach them Spanish. The idea was that those types of interactions aren’t just fun for the babies, but they’re actually something human brains have evolved to expect in order to learn to speak.

To evaluate the effectiveness of this “parentese” approach, children were outfitted with special vests that recorded their speech, both in the experimental class and a more traditional English class offered by the Madrid school system. Researchers then tallied up how many English words the kids used, and how often they used them. To see what stuck, their English was evaluated again 18 weeks after the instruction ended. The results showed that parentese worked better with these young kids, and their English was stronger at the end of 36 weeks, with those kids having retained around 74 words compared to the 18 words retained by more traditional students. Importantly, this held true even for kids from very different backgrounds: kids from both higher- and lower-income neighborhoods responded well to this type of instruction.

This doesn’t necessarily prove that these kids will all speak English fluently in a couple of years, but it does show that learning a second language doesn’t require intense investments of energy. Most likely, it means that the best way to teach a language is to use the tools evolution has created for a baby’s or toddler’s brain during this crucial developmental period when they’re primed to learn.

Source: New Study Shows How Exposure To A Foreign Language Ignites Infants’ Learning, Scienmag

On July 17th, 2017 we learned about

Grandparents’ earlier bedtimes may have evolved to ensure security for the family

Humans didn’t evolve in a world of deadbolts and heavy doors. Even a cave has an entrance that you might need to keep an eye on, but the average family or village probably didn’t have a sentry staying up to look for lions. Instead, security was likely assured thanks to restless grandparents and the fact that older people’s sleep schedules rarely sync up with those of younger adults.

Duke researchers built this “poorly sleeping grandparent hypothesis” around sleep data from the Hadza people of northern Tanzania. Hadza villagers will often work separately during the day, then come together to sleep at night in groups of 20 to 30 people. With no artificial lighting and simple bedding laid out on the ground near an outdoor hearth, the Hadza are thought to be a good proxy for early human societies, possibly providing insight into how our sleep needs evolved long ago. To get the details on this lifestyle, study participants were asked to wear activity monitors so that each individual’s schedule could be compared.

Sleeping in unscheduled-shifts

The pattern that emerged is likely very familiar to many of us, even if we’re not used to sleeping outdoors. Older people tended to go to bed earlier, around 8:00 pm, while younger adults stayed up later. Appropriately, older people also woke up earlier as well, but that doesn’t fully explain how this kind of sleep schedule could be a benefit to a group’s security. The other element is how restless people were during the night: at any given moment, a little over a third of the group was dozing or alert. There were only 18 minutes in a night when every adult was completely at rest.

This suggests that a group of mixed ages could sleep without sentries because everyone was “on duty” for part of the night. The fact that older people tend to sleep and wake early may have then evolved to reinforce this system. With sleep schedules staggered by a few hours, an extended family would have very little risk of being truly unguarded throughout the night. This is all without factoring in the further disruptions of young children, who would presumably add another layer of schedules to the mix (while creating new demands of their own that a grandparent might be able to help with).

Interestingly, these overlapping but different schedules not only avoided lion-surprises, but also provided sufficient rest. Nobody complained of sleep deprivation, suggesting that while we obviously need good amounts of sleep, the idea that a good night’s rest has to be a single, uninterrupted block of time may be a modern expectation.

Source: Live-in grandparents helped human ancestors get a safer night's sleep, Popular Archaeology

On June 7th, 2017 we learned about

Moroccan fossils may force a reevaluation of when and where Homo sapiens got our start

In 1961, miners in Morocco found what appeared to be human skulls, jaws, arm and hip bones. The bones were all fossilized, but pinning down their exact age was difficult since they’d been removed from their original location. Now a second round of excavations from the same site, known as Jebel Irhoud, has uncovered more fossils, plus stone tools and, crucially, charred flint from a campfire. Thanks to a dating technique known as thermoluminescence applied to that flint, the site is now thought to have been inhabited close to 300,000 years ago… just 100,000 years before anyone thought humans even existed.

(Im)perfect model for modernity

The idea that there were ancient hominids in Africa 300,000 years ago isn’t that shocking by itself. Other species, like Homo naledi, lived at that time too. The ancestors of Neanderthals had already left Africa altogether. However, the bones from Jebel Irhoud look strikingly like modern humans, and have been labeled as Homo sapiens. They’re not a perfect match though: they don’t have a modern chin, one specimen had a rather pronounced brow, and the shape of the brain case is tapered towards the back of the head. This has some anthropologists suggesting that these people were a transitional species, rather than truly modern humans, but even if that were the case, their overall similarity and age still merit some reexamination of humanity’s origins.

Overhauling our origin story

On one hand, 300,000-year-old humans may help make sense of a few things. The so-called Florisbad skull from South Africa was dated to be 260,000 years old, which made it seem like a weird outlier in the human family tree. However, if the Jebel Irhoud fossils do represent even older members of H. sapiens, then the Florisbad skull fits into the story more neatly. Similarly, the tools found in Morocco were generally lightweight, with spears that were appropriate for throwing instead of just stabbing. They’re not the only tools from this time period to have this degree of sophistication, and the thought is that if modern humans arose around 300,000 years ago, these tools might be more tightly bound to our evolution and success.

Of course, on the other hand, the location of Jebel Irhoud opens a whole host of new questions. Previously, the leading model was that H. sapiens started in Ethiopia around 200,000 years ago, with our oldest confirmed specimen dated as 195,000 years old. From that birthplace, it was thought that humans started spreading out to other parts of the world, a narrative that doesn’t have space for humans to somehow be on the opposite side of the continent 100,000 years earlier. It seems that humans, or our very close ancestors, were actually spread across Africa, with no clear point of origin standing out at this point. To fill in more gaps, more fossils are needed. For better or for worse, those fossils might be all over the continent.

Source: Scientists Have Found the Oldest Known Human Fossils by Ed Yong, The Atlantic

On April 16th, 2017 we learned about

Drilled and filled cavities put dentistry’s start date in the Stone Age

5000 years ago, someone in Sumer was worried they had “tooth worms.” It was probably just run-of-the-mill cavities, but people didn’t really know why their teeth could break down in their mouths. Lacking a real understanding of the root causes of toothaches didn’t stop people from looking for remedies, however, and people have been removing, wiring and medicating teeth for thousands of years. A discovery in Italy shows that dentistry may predate those Sumerian tooth worms though, with evidence of Stone Age fillings from nearly 13,000 years ago.

Carving for cavities

The two teeth show a lot of damage, but the assumption is that much of it was intentional. While paleolithic peoples likely demanded a lot of their teeth, using them as a third hand to hold or soften wood, hides and plants, these teeth seem to have been scraped on purpose. Instead of rough, random damage, an impressively smooth, regular pit was carved in the center of each tooth, much like your dentist would do with their drill to remove any infected material around a cavity and make a better seat for a filling.

Aside from the fact that this work was done without a modern dental drill, there obviously weren’t modern fillings available either. Instead, it appears that bitumen was used to fill the drilled holes. Bitumen is a thick, sticky substance that is usually derived from petroleum, and is most commonly used today in making asphalt cement. In the Paleolithic era, bitumen was more often used as a glue in tools, but apparently dentists at the time felt it was a good way to fill in damaged teeth as well.

No signs of Novocaine

None of this sounds terribly pleasant for the patient, but a persistent tooth ache is hard to ignore. Hopefully, the dentist in question could at least recommend some of the pain-killing techniques used by Neanderthals in what is now Spain, as their teeth have turned up with traces of salicylic acid, the active ingredient in aspirin. It wouldn’t have made this a painless procedure, but it seems better than nothing.

Source: Stone Age hunter-gatherers tackled their cavities with a sharp tool and tar by Bruce Bower, Science News

On April 10th, 2017 we learned about

Concerns and questions about the nutritional needs of ancient cannibals

From a modern human’s perspective, cannibalism seems like a particularly monstrous course of action. While my kids have heard about animals eating members of their own species, the idea that humans have been known to eat each other was still surprising enough that my four-year-old felt the need to fact-check this concept with his mom. Nonetheless, humans have been eating each other since the Stone Age, with just enough regularity that researchers believe there may have been a pattern or repeating motivation to do so. Such as… nutrition?

Calories worth the cost?

While the tragic events surrounding Uruguayan Air Force Flight 571 famously demonstrate that cannibalism can keep you alive, researchers wanted to know if it was actually well-suited for that job. Were repeated instances of cannibalism in the archaeological record, as indicated by butchered bones or opened brain cases, simply because human bodies were a good source of food, even if you hadn’t been stranded in the mountains? After a thorough survey of earlier measurements of the calorie content of human anatomy, the answer was that humans aren’t especially nutritious. A whole human body, from the brain to the bones, really only yielded around 126,000 calories, far less than the roughly 600,000 calories offered by the bears living in the same ecosystem.

This suggests that humans probably weren’t being eaten because of a sense of nutritional efficiency, and were thus on the menu for other, possibly more ritualistic, reasons. However, this assumes that, outside of obviously skinny compatriots, these early humans could really assess the potential nutrition of eating one creature over another. Even in a highly measured, labeled world, many people today might be surprised about the caloric gap between a gram of chicken breast versus thigh meat, much less the 100-calorie difference between lamb loin and pork loin.

Difficulty measuring meat

Part of the difficulty in measuring meat is how much cooking changes its nutritional value. Like starches, the sugars and proteins in meat open up when they’re cooked, allowing digestive enzymes a better chance to break them down into useful nutrients. Collagen also softens, making the meat literally easier to chew and ingest. Even mice, who don’t naturally encounter cooked meat, will favor it over raw options.

Our ancient ancestors likely had access to fire by the time they were eating each other, which raises a question about how the calorie count of human flesh was calculated. Most of the numbers came from older literature, from a time when calories were gauged simply by burning the food in a container to see how much it heated water. Modern food measurements are more sophisticated, doing a better job of accounting for how food will be digested. If the human body were remeasured to modern standards, we still might not match the caloric bounty of a one-ton short-faced bear, but properly roasted thighs might not seem like such a waste after all.

Source: Cannibals Weren't Calorie Counters, But Humans Aren't Very Nutritious by Shaena Montanari, Forbes

On March 23rd, 2017 we learned about

Capuchin monkeys inadvertently muddle the history of human tool making

Some of the first technologies created by humans were sharpened stone flakes. While they might appear to be simple rocks at first glance, archaeologists have come to note the single sharp edge and flat sides that transform these rocks into tools. The edge would have allowed early hominids to cut meat, remove skin, or work with wood and grass. To create such a tool, our ancestors are thought to have needed advanced hand-eye coordination, as well as the ability to plan the flakes’ design.

…or they would have needed to be weirdo capuchin monkeys making flakes without even noticing.

Capuchin craftsmanship

Capuchin monkeys have been observed experimenting with tool use for many years. They use rocks like shovels to dig up spiders, twigs to poke into crevices for caterpillars, and may have been cracking open cashews in a hammer-and-anvil arrangement for over 700 years. They’re one of an ever-growing list of animals that partake in at least casual tool usage, but their creativity, and perhaps dumb luck, has set them apart.

Bearded capuchins (Sapajus libidinosus) in Brazil’s Serra da Capivara National Park were observed banging rocks together, but not for the sake of getting at fruit or nuts. They were hitting pieces of quartz together with two-handed, overhead strikes, and then licking the broken stones. Researchers aren’t sure if they were trying to lick up silicon or possibly lichen to fill some sort of nutritional need, but they were sure about the stone byproducts of this activity. Many of the resulting pieces of stone were nearly identical to what has been assumed to be carefully crafted stone flakes of human ancestors. The monkeys took no notice of the fine edges or flat sides that would attract an archaeologist, but even the accidental creation of such an object is very significant.

Accident or innovation?

Sharp flakes of stone have always been assumed to be a marker of human-like intelligence and development. They were supposed to be the starting point for human artifacts, kicking off the chain of development that gives us the computer you’re reading this on. Even with other tool-using animals in the world, the sharp flakes of rock were supposed to require more deliberate creation, and were different from the broken rocks wielded by other primates like chimpanzees and macaques. If capuchins are making them accidentally, it means that stone flakes alone can’t be considered a reliable marker for hominid activity. Sometimes flakes have been found with other artifacts, like butchered bones or old fire pits, but on their own, we can’t be so sure we’re seeing signs of an innovative Australopithecus, or just a lucky one that wanted to lick some lichen.

In the meantime, we’ll have to watch the capuchins to see if they take notice of the utility of their flaked stones. Maybe this is a window into how specific tool creation got started in the first place?

Source: Capuchin monkeys produce sharp stone flakes similar to tools, Science Daily