On June 7th, 2018 we learned about

Identifying the factors that likely enabled humanity’s first attempts at farming

For as influential and beneficial as agriculture has been for the human race, we’re not actually sure why we started farming in the first place. Obviously people would have been happy to have better access to food for their communities, more time for skill specialization, and so on, but wanting more food doesn’t explain why people would start farming one year instead of another. Since people started raising crops before they could necessarily write about doing so, researchers have had to look at less direct data to figure out what influenced the rise of agriculture. If the resulting models are correct, they not only help eliminate some long-standing hypotheses about the birth of agriculture, but they may be applicable to other questions about early human activities as well.

Making an analytical model

The first step in this research was to look for patterns in the intersections of cultural traits, environmental conditions and population densities in modern or recent foraging societies. Peoples that largely relied on hunting, fishing or gathering to feed themselves were used as a verifiable reference point for what conditions would let a society carry on without agriculture. Of particular interest were specific factors like environmental stability and how often people traveled in comparison to population density. All these dynamics were assembled into a predictive model that could first be tested against the observed data to ensure that further predictions about ancient populations were grounded in reality.
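
To make that process concrete, here is a minimal sketch of what such a model could look like, assuming, purely for illustration, a logistic regression over invented features for environmental stability, population density and mobility. The feature names, values and model choice are placeholders of mine, not the researchers’ actual code or data.

```python
# Toy illustration of the modeling approach described above, NOT the study's model.
# All feature values and labels below are made up for demonstration purposes.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical observations of known foraging societies:
# columns = [environmental_stability, population_density, mobility]
X_observed = np.array([
    [0.2, 0.1, 0.9],   # unstable environment, sparse population, highly mobile
    [0.8, 0.6, 0.3],   # stable environment, denser population, more settled
    [0.4, 0.2, 0.7],
    [0.9, 0.8, 0.2],
])
y_observed = np.array([0, 1, 0, 1])  # 1 = society known to have taken up farming

# Step 1: fit the model on observed societies and check it against that data first.
model = LogisticRegression().fit(X_observed, y_observed)
print("fit accuracy:", model.score(X_observed, y_observed))

# Step 2: only then apply it to reconstructed conditions for an ancient population.
ancient_conditions = np.array([[0.7, 0.5, 0.4]])
print("estimated chance of adopting agriculture:",
      model.predict_proba(ancient_conditions)[0, 1])
```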

Researchers then used historical data from around the world to see how environmental conditions could influence a population’s food supply. Agriculture is known to have developed independently at least 12 times in human history, so conditions were analyzed for each of those instances. While many specifics differed, such as timing and cultural norms, people from New Guinea to Central America to the Middle East all followed at least one clear trend: their entry into agriculture followed improving environmental conditions.

Predicting more about the past

A possible origin story, then, starts with people enjoying enough food stability in their natural environment to allow for a bit of extra time and population growth. That, in turn, allowed for more exploration of new ideas or techniques, eventually leading to more revolutionary discoveries like crop cultivation. It seems simple enough, but this possible origin is very different from previously proposed explanations, such as agriculture being a response to near starvation, or simply arising at random in human history.

The models and analysis that revealed these patterns could likely use further refinement, but researchers are confident about their long-term utility. They believe the approach could be applied to events further back in time than the 21,000-year-old rise of agriculture, possibly looking at other major developments in human history. By analyzing how humanity changed in response to larger environmental trends, this kind of modeling may help us make sense of otherwise sparse and spotty archaeological evidence.

Source: On the origins of agriculture, researchers uncover new clues by Colorado State University, Science Daily

On April 12th, 2018 we learned about

The adaptations Homo sapiens adopted from northern Neanderthals

Homo sapiens is the last living hominid on Earth, but some of our extinct kin “live on” in our gene pool. Many of the most obvious examples of these genetic artifacts come from Homo neanderthalensis, a species of human from Europe that we know interbred with Homo sapiens coming out of Africa. Neanderthals were generally very similar to modern humans, but they had some special adaptations that helped them live in the colder, darker reaches of Europe. Some modern humans now carry these traits as well, although the exact list of what we did, or did not, inherit from our northern cousins hasn’t always been obvious.

DNA to deal with the dark

One common set of assumptions has to do with sunlight. Europe, particularly in the winter, receives a lot less sunshine than more equatorial regions, and so anyone living there would want lighter skin to better absorb light for the creation of vitamin D. While we do carry some Neanderthal genes associated with how our skin interacts with the Sun, they don’t overtly govern pigmentation. Red hair, often seen as tied to a very light complexion, is actually unique to Homo sapiens. On the other hand, Neanderthals did contribute genes to our genome that shape how quickly we might tan or get a sunburn, so there is some logic to the idea that they carried adaptations for living with less sunlight.

Sun exposure mattered to Neanderthal mental states as well. The gene ASB1 has been linked to people’s natural circadian rhythms, making them more likely to want naps or be “night owls.” It’s part of another set of genes Homo sapiens picked up from Neanderthals, some of which also seem to determine how exposure to sunlight affects one’s mood. It clearly wasn’t easy living up north, and it makes sense that these adaptations would be appropriated by Homo sapiens looking to move to darker latitudes.

Nifty noses for better breathing

One thing Homo sapiens didn’t seem to borrow was Neanderthals’ wide nasal cavities. While both humans and Neanderthals have more sophisticated sinuses than our common ancestor, Homo heidelbergensis, Neanderthals could move, moisten and warm considerably more air per breath than we can. This is again tied to living in a cold, dry climate, but modern humans didn’t adopt these noses when they ventured north, possibly because of our metabolisms.

Neanderthals probably needed better airflow to serve their high metabolic needs. Their thick, stocky bodies are estimated to have needed as many as 4,480 calories a day, nearly twice what’s recommended for a modern human male today. To process those calories, Neanderthals would have also needed more oxygen than a modern human, so they couldn’t afford to let a cold, dry climate slow them down. Perhaps our interbreeding ancestors didn’t end up with such sophisticated schnozzes simply because they didn’t need them: our lower metabolisms made us a little more flexible when it came to airflow.

Source: Neanderthals didn't give us red hair but they certainly changed the way we sleep by Darren Curnoe, Phys.org

On April 11th, 2018 we learned about

Homo sapiens got a social boost by giving up bony eyebrows

You may have your mom’s hair, or your dad’s eyes, but how about humanity’s eyebrows? As much as you may hear about big brains or opposable thumbs, anthropologists believe that our flat, fuzzy eyebrows may be a uniquely human trait. What’s more, they may have given us an edge over other hominids with bonier brows.

Benefits of a big brow

Older ancestors, like Homo heidelbergensis, generally had brows closer to what you find on other primates. Even as their heads became taller in relation to their faces, they still had a prominent ridge of bone protruding above the eye sockets. That may have provided a bit of protection and shade for their eyes, but researchers have now ruled out other structural benefits. Computer simulations have found that reducing these bits of bone had no ill effect on a virtual H. heidelbergensis’ bite strength, nor would it have made more space for a larger brain case in the skull.

Even if a smooth brow doesn’t impede an individual’s ability to bite down, that alone doesn’t explain how it benefited modern Homo sapiens enough to spread throughout our gene pool. It wasn’t worse than a bony brow, but how was it better? Some pitting along the brows of H. heidelbergensis provided clues, as it was remarkably similar to microscopic craters found on display features in other primates. Dominant male mandrills, for instance, have these pits on the sides of their muzzles, where they grow colorful tissue to express their social status to potential rivals and mates. Seeing these pits on the side of H. heidelbergensis’ brow ridge suggests that the feature may have served a similar function for our ancestors, signaling fitness and status as a display structure.

Making friends with fuzz

That kind of communication may have started human ancestors down the road to our modern eyebrows, built from hair and muscle. It’s easy to take them for granted, but the movement of eyebrows plays a big role in how we express emotion and intent to people around us. We can even communicate these feelings when drawing an abstract face with simple dots for eyes, as long as the brow lines slope up, down, or asymmetrically. Some expressions, like a quick lifting of both brows to express recognition or openness, are understood by humans around the world.

It’s thought that this kind of social communication became incredibly important as humans banded together to start farming and forming settlements thousands of years ago. Those developments would require more trust and cooperation with other individuals, and would have depended on clear communication. Being able to form social bonds, both with close kin and newcomers, would have allowed humans to help each other through difficult circumstances. It seems there was no better way to build a friendship than by moving articulated tufts of hair up and down over your eyes.

Source: Research to raise a few eyebrows: Why expressive brows might have mattered in human evolution by University of York, Phys.org

On February 25th, 2018 we learned about

Making sure we can measure how marmosets manage their speech

Your typical marmoset conversation has a lot of “tsiks,” “ekks,” and “phees.” From a human perspective, it’s a pretty limited set of sounds: plenty of birds and signing apes have bigger vocabularies at their disposal. However, since marmosets are primates like us, and these are sounds they naturally use to communicate, they’re a good model organism to work with; they offer a simpler version of speech that was likely inherited from a shared ancestor. The catch has been that, until very recently, it looked like marmosets were breaking a key rule of how we can talk.

Previous research has found that humans can form a syllable, the smallest unit of speech, seven times a second. This is due to both biomechanical limitations in our voice boxes and neurological limits in our brains. It makes sense that these two systems would have evolved to match each other, and researchers want to know exactly how they synced up in our species’ past. Presumably, a vocalizing primate like a marmoset would be a good reference point, but only after the confusion with their “phee” sounds was cleared up.

Finding the shortest sound

While a marmoset’s “tsik” and “ekk” sounds clearly functioned as syllables, its “phee” sounds were problematic because they were too long. When listening to marmosets ‘speaking’ to each other, it seemed like the “phee” sounds could be drawn out in a way that let them cram in an irregular number of syllables per second. If they could do this, it would mean that their brains were treating one sound differently than the others, raising questions about the definition of syllables as a basic unit of speech.

To test this, researchers started interrupting vocalizing marmosets. When an animal got chatty, it would get an earful of white noise, just annoying enough to make it stop vocalizing. The idea was that a marmoset wouldn’t be able to pronounce just half a syllable when interrupted: “tsik” and “ekk” wouldn’t be divided down to “ts” or “e,” as the animal would at least finish that sound before its mouth and brain stopped talking. Those sounds did prove to be indivisible, but “phee” wasn’t so durable. It seemed that a long “phee” sound could indeed be split, meaning those long sounds were actually two units of speech put together. “Phee” wasn’t breaking the rules about syllables; we just hadn’t realized that those vocalizations were really “phee” and “eee” being pronounced back-to-back.

Syllables hold steady

With this expansion of marmoset vocabulary, researchers have confirmed that syllables are a durable unit of speech in primates. This way of organizing vocalizations into 100-millisecond chunks likely evolved long ago, as humans haven’t shared a common ancestor with New World monkeys like marmosets for at least 35 million years. With this confirmation, researchers can move forward on figuring out how all primate speech ended up with our syllabic speed limits.

Source: Monkey Vocabulary Decoded by Universitaet Tübingen, Science Daily

On January 29th, 2018 we learned about

Everyone understands a lullaby: Human brains may have shaped universally-appreciated traits in music

Even if you don’t understand the lyrics, your brain may be wired to recognize the meaning of music from around the world. No matter what the finer nuances of a song may be, there’s evidence that some core element of a song’s genre may be universally understood. So just as you would never mistake a lullaby for a dance anthem in your native tongue, there’s a good chance you can also tell them apart when listening to songs by the Mentawai people of Indonesia. At least in broad strokes, it seems that humans everywhere have very similar concepts of what a dance song versus a love song should sound like.

That idea may sound simple enough, but figuring out how to test it isn’t easy. Untangling a person’s cultural background from the music they’re listening to is no small task, because that same background likely informs their perception of that music. To hunt for signs of universal musical concepts, researchers spent years assembling a database of music from around the world, then sorted that music by region and type of song according to four categories: dance songs, lullabies, healing songs and love songs. Clips of those songs were played for English speakers from 60 different countries, who were simply asked to pick a category for each song they heard. If they could correctly identify a love song in a language and style they weren’t familiar with, it would suggest that some aspect of that song was shared across cultures.

With 750 listeners reporting, researchers found that dance songs and lullabies were the most easily identified. Apparently, soothing a baby is a similar process no matter where one lives. Love songs and healing songs weren’t so clear, a fact probably compounded by a lack of healing songs from Scandinavia or the United Kingdom. Overall, this first experiment suggests that there are some universally understood aspects of music, but what those aspects are, and why they exist, remain open questions.

Possible sources of musical synchronicity

Hypotheses about how humans would end up with universal elements in our music range from ideas about our brain structure to a bit of convergent evolution. One explanation may be that some of these musical concepts, like a lullaby, are hard-wired into our brains to be created and appreciated, with each culture adding its own flourish on top of a core formula for a song.

Alternatively, our brains may provide an interest in sound and stimuli that musicians have learned to trigger, with certain tunes just being the most time-tested method for doing so. In that case, a lullaby works because it takes advantage of brain functions that originally evolved for other purposes, unrelated to music.

There’s also a chance that these shared traits between songs are a case of convergence. Thanks to other environmental pressures, like a desire for community building, displaying status and more, musicians everywhere have been guided toward similar-sounding songs. Each musical lineage started independently, rather than in any specific brain structure, but has nonetheless converged on a single way to perform a dance song because quicker tempos and more layered instrumentation just do that job better than anything else.

At this point, more people need to participate in the listening tests. While the initial participants hailed from many different countries, the second phase of testing will aim to include viewpoints of non-English speakers, particularly from more isolated cultures. If these songs are still universally understood at that point, we can really start looking for what defines each genre in our minds, and how that core evolved in the first place.

Source: Some Types Of Songs Are Universally Identifiable, Study Suggests by Rebecca Hersher, Goats and Soda

On December 10th, 2017 we learned about

Sorting through the spectrum of what chimpanzees regard as repulsive

You might not want to eat while reading this. According to a recent study that aimed to gross out chimpanzees, text probably isn’t enough to trigger the sense of disgust we’ve inherited from our ancestors. With that said, stories that bring up the issue of coprophagia, or eating feces, probably aren’t great for one’s appetite. At least not at first.

Digging into the details of disgust

Researchers were investigating where chimpanzees, as our closest living genetic relatives, draw the line on what they’ll put in their mouths. That line certainly wasn’t clear from the outset, as wild chimps will pick seeds out of poop to eat, and captive chimps will go a step further and snack on their poop outright. Researchers learned that there was some nuance to chimps’ consumption of crap, as the animals apparently evaluate feces based on genetic familiarity. A chimpanzee will eat its own poop, or that of closely related family members, but any other chimp’s waste elicits a clear display of disgust.

With that baseline established, researchers set up experiments to further probe chimpanzees’ criteria for when food is too revolting to eat. In one scenario, food was placed in an opaque box, either on top of soft but edible dough or on top of a piece of rope. Chimps reaching in for the food were obviously repulsed by the soft, moist dough, yanking their hands out of the box as if it had bitten them. Other tests involved food being placed on what looked like feces, or near the scent of blood, and while nothing seemed to be quite as repulsive as something soft, wet and squishy, the chimps appeared to follow guidelines for what counts as gross similar to our own. They didn’t necessarily have the same standards, though, as they would sometimes end up eating food from disgusting sources, but overall their criteria were pretty relatable.

Finding the value in what chimps find foul

The fact that these chimpanzees get grossed out like we do may seem obvious, but it wouldn’t have been safe to automatically assume they operated on criteria similar to our own. After all, any degree of coprophagia is probably too disgusting for humans to seriously consider, and researchers wanted to see exactly what our species had in common with our fellow primates. Avoiding substances, like poop or blood, that could easily harbor pathogens makes sense as a survival tactic, and identifying commonalities indicates that chimps and humans likely inherited some of these reference points from a shared ancestor. This work may also help zoos and conservationists manage the health of chimpanzees in their care. Dangerous substances can be presented in a more disgusting manner, and individual chimps that seem too casual about gross sources of food can be monitored more closely for exposure to pathogens.

Source: What grosses out a chimpanzee? The origins of disgust by Kyoto University, EurekAlert!

On September 20th, 2017 we learned about

Experiments demonstrate how to manipulate monkey (and human) metacognition

For as many times as we tell our kids to believe in themselves, it’s good to keep in mind that confidence can sometimes be misleading. This isn’t to say that doubting your every decision is helpful or healthy, but that sometimes we don’t realize why we’re confident in the first place, opening us up to manipulation, such as putting more trust in a statement because it’s written in larger letters. This susceptibility isn’t exactly our fault though, as researchers have found that our primate cousins fall for the same tricks. While falling for these influences may seem like a drawback for us, it’s also proving that monkeys have a more sophisticated sense of self than they’re usually given credit for.

Understanding exactly what you do and do not know is called metacognition. It’s very helpful to know where your gaps in knowledge are so that you can adjust your actions accordingly. For instance, if you know you’ve never eaten a particular berry before, that awareness will probably push you to investigate it more carefully before popping it into your mouth. This might seem obvious, but people, and apparently other primates, commonly make mistakes when evaluating their own knowledge, and that can obviously get them into trouble.

Confirming monkeys’ confidence

To test metacognition in monkeys, researchers had to train them on a multi-step game that would allow these non-verbal test subjects to demonstrate how much they thought they knew about something. Monkeys are presented with a touch screen showing a single image, like a cricket, which they have to poke to proceed. They then see that same cricket image again, along with three other images meant to distract them from their task, which is to poke the same cricket again.

After they make their selection, they’re shown a screen where they need to rate how confident they are about their previous poke. If they’re sure they’re right, they can pick an option that will net them three tokens for a correct answer, but cost them a token if they’re wrong. If they’re less sure about their answer, a low-confidence indicator will let them gain one token, even if they’re wrong. Monkeys are only rewarded once they earn a specific number of tokens, forcing them to play the long game to get a treat.
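
For anyone curious how those incentives add up, here is a minimal sketch of the payoff rule as described above, using the token values from the article; the function name and example trials are illustrative inventions, not part of the study’s materials.

```python
# Toy illustration of the confidence-wagering payoff described above.
def tokens_earned(correct: bool, high_confidence: bool) -> int:
    """Change in a monkey's token count for a single trial."""
    if high_confidence:
        return 3 if correct else -1  # big reward, but a wrong answer costs a token
    return 1  # a low-confidence pick earns one token regardless of accuracy

# Example run: well-calibrated confidence earns tokens fastest.
trials = [(True, True), (False, False), (True, True), (False, True)]
print(sum(tokens_earned(c, h) for c, h in trials))  # 3 + 1 + 3 - 1 = 6
```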

It’s a lot to throw at a monkey, but they seemed to get it well enough to play and reveal patterns in their decision making. Once a monkey seemed to understand the mechanics of the game, researchers started manipulating how information was presented in order to influence confidence levels. For instance, higher-contrast images made the monkeys wager with more confidence, while low-contrast images had the opposite effect. These sorts of attributes change confidence levels in humans too, along with shorter, easier-to-pronounce vocabulary and the aforementioned larger text size.

Indecision versus imprecision

This isn’t to say that metacognition is just a form of self-delusion. Knowing when to take a shortcut, or react quickly and decisively, can be very helpful in certain scenarios. These traits probably evolved in a distant primate ancestor, and have been helping humans and monkeys for millions of years. Of course, it’s probably also helpful to know what you know versus what you think you know, since sometimes that same confidence can get you into trouble.

Source: Monkey sees. . . monkey knows? by Lindsey Valich, Rochester Newscenter

On September 4th, 2017 we learned about

Fossilized footprints found near Crete may seriously complicate our ancestors’ origins

Your foot may ache, smell and maybe need a toenail trim, but it’s a pretty special bit of anatomy. Even if you’re used to your single row of clawless toes, it’s actually an arrangement that’s unique among just about all animals on Earth. Our closest living relatives don’t have their hallux, or big toe, facing forward like we do, giving them a much more hand-shaped foot. These toes, along with the ball of our foot and long instep, come together to make a very distinct footprint. What’s exciting scientists now is that some of these prints have turned up in a time and place where they supposedly had no business being.

A set of footprints was found hardened into sedimentary rock at a site called Trachilos, on the coast of Crete. Based on the foraminifera, or marine microfossils, found in the layers of rock above and below the slab of stone in question, researchers confidently dated the prints as being 5.6 million years old. The catch is that at that time, no human ancestors, much less humans, were thought to be anywhere but Africa. What’s more, the shape of the prints looks more like a modern foot than the foot of any known ancestor living at that time.

Figuring out the who, what and where

Even without an actual bone or tool, these footprints may be enough to upend our timeline for human evolution. One of the oldest confirmed hominids is Ardipithecus ramidus, which is thought to be a direct ancestor of modern humans. However, A. ramidus lived in Ethiopia around 4.4 million years ago, and at that point had a much more gorilla-shaped foot. We know that by 3.65 million years ago, our ancestor Australopithecus was leaving very modern-looking footprints in Tanzania, but neither of these dates syncs up with the stroll some primate took through Crete at least a million years earlier.

Putting the evolutionary questions about foot shapes aside, how a primate could have walked to Trachilos is actually one of the easier issues to understand. North Africa and the Mediterranean Sea were very different places 5.6 million years ago: the Sahara Desert was a savannah, and the Mediterranean was beginning to dry up thanks to the Strait of Gibraltar closing off the waters of the Atlantic Ocean. What’s more, Crete itself was still attached to the Greek mainland, making access to this particular plot of land all the more plausible.

Many questions to consider

Assuming this initial interpretation of these 50 footprints holds up, anthropologists have a lot of new questions to explore. Were the prints made by a hominid, or just a very similar ape living in Europe? Did this mystery primate walk out of Africa to Trachilos, having evolved our unique feet earlier than anyone thought? Were these prints left by a direct relative of modern Homo sapiens, or is this an offshoot of our family tree that didn’t end up succeeding with its early upright gait? Or, most radically, did our ancestors originate outside of Africa, strolling south instead of north? That last idea is a long shot, and would still leave the species Homo sapiens as originating in Africa, but this simple stroll in the soft soil now demands that we investigate a lot of new possibilities.

My third grader said: I hope it turns out to be one of the crazier explanations!

Source: What Made These Footprints 5.7 Million Years Ago? by Gemma Tarlach, Dead Things

On August 29th, 2017 we learned about

Twisted and knotted cords found to tell tales beyond basic numerical tallies

234 years ago, Felipe Tupa Inka Yupanki arrived in the small Peruvian town of Collata to organize an uprising of native Incans against Spanish colonizers. Yupanki issued decrees and tried to organize an army, but the planned revolt came to a halt once it was discovered, ending in the rebel’s execution. This story isn’t as well known as the larger Incan rebellions of the eighteenth century, partially because it’s largely documented in collections of yarn and string, in a writing medium called a khipu.

Recording words in twisted cords

Khipus are an Incan form of logosyllabic writing built out of twisted fibers and cords made from cotton, cloth or animal fibers like alpaca and llama hair. Shorter cords of variable color, patterning and texture are individually tied to one larger cord that acts as a sort of spine for the whole message. Each khipu is capped by a cayte, generally a textile like a ribbon or bandanna tied on by the piece’s creator, asserting its authenticity like a signature. If laid out flat, the whole khipu might resemble a short, uneven fringe, but that variability is key to its utility as a communication medium.

Many of the khipus now in museums have been found to use knots to record numbers, and were simple, portable accounting tools used by herders into the twentieth century. Unlike the beads of an abacus, knots were not directly representational of quantities, instead relying on distances and size to record different numbers. By and large, these kinds of accounting khipus were made of cotton, requiring less sophistication in their craftsmanship. However, two khipus recently studied in Collata display a new degree of depth and sophistication, showing that these objects could encode entire stories, as long as you know what to look for.

Narratives beyond numbers

The two khipus from Collata are known to reference the Incan rebellions, but they’re not really being read word for word at this point. Village elders have been passing the khipus, along with other documents tied to the town’s history, down from generation to generation, telling the story of the khipus in the process. The khipus are thought to be based on Quechua, a language no longer spoken in Collata. Still, the various twists, colors and choices in fiber support the notion that these cords represent words, with 95 individual symbols having been identified.

Since my two-sentence summary of the rebellion used 51 unique words, it’s easy to see that the khipu isn’t offering any one-to-one transcription. The 95 symbols that have been identified aren’t phonetic words, instead functioning more like rebuses or pictograms. So rather than directly representing a word, symbols that remind someone of a word thanks to their similar sounds might be combined to express an idea. This system wouldn’t have been as flexible as a fully phonetic alphabet, but the 95 symbols seen across 487 cords on the Collata khipus are enough to allow hundreds of unique combinations.
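
To put a rough number on that expressive capacity, assume, purely as a simplification, that symbols are combined in unordered pairs; 95 distinct symbols would then allow

$$\binom{95}{2} = \frac{95 \times 94}{2} = 4465$$

possible pairings, which comfortably covers the “hundreds of unique combinations” mentioned above, even before longer strings of symbols are considered.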

At this point, researchers speculate that these khipus may be a taste of a significant and unexplored communication system. The success of the Incan empire is hard to imagine without robust means of communication, and if more examples of narrative writing can be found, it may help unlock a lot of recorded knowledge about how that society functioned.

Source: Writing with Twisted Cords: The Inscriptive Capacity of Andean Khipus by Sabine Hyland, Current Anthropology

On August 28th, 2017 we learned about

Like humans, rhesus monkeys fixate on finding false faces in photos

In a time when people struggle to see eye to eye, it may be comforting to know that we basically see the same world monkeys do. Studies of various monkeys’ perception have found that our fellow primates react to visual stimuli much as humans do, even making the same mistakes we make. For instance, both humans and rhesus monkeys might confuse images of large, gray animals like elephants and rhinos, even though the monkeys lack the contextual information to relate those creatures to each other. The visuals are perceived along the same underlying neural pathways, making an individual response from a monkey nearly indistinguishable from that of a human.

Now researchers have found a new shared “mistake” between rhesus monkeys and humans: pareidolia, or seeing patterns like faces that aren’t really there. Monkeys were shown photos of inanimate objects that they’d have no real understanding of, such as coffee cups, purses and appliances. Photos that happened to have that critical two-dots-over-a-line pattern of a simple face held monkeys’ attention much longer than versions that didn’t. Based on previous data showing monkeys prefer to look at faces over non-faces, plus eye-tracking data that indicates the monkeys were examining objects’ “eyes” and “mouths” most of all, researchers are confident the monkeys are noting faces the same way we do. In fact, when given the choice between a photo of a monkey’s face and an object’s faux-face, the object actually seemed to be the more interesting option for most test subjects.

Inherited interests

This fascination with faces isn’t surprising. Separate studies of macaque monkeys have found that these Old World primates have the same functionality in their brains’ fusiform face area that we do. In that work, the monkeys recognized human faces as faces, despite our being a different species. Now that we know rhesus monkeys extend this activity to pareidolia like humans do, it’s increasingly likely that these socially-oriented specializations evolved in a distant, shared ancestor. Our interest in finding faces may not be a uniquely human trait after all, instead being shared among many primates that make a point of looking into each other’s eyes.

Source: Rhesus monkeys found to see faces in inanimate objects too by Bob Yirka, Phys.org