On October 7th, 2018 we learned about

Hadza camps’ shifting community standards outweigh individuals’ preferences about sharing resources

Anyone with a five-year-old can tell you that sharing is not a completely ingrained behavior in humans. Sure, we’re a very social species that owes a lot to getting along with each other, but that doesn’t seem to translate into immediately sharing our own resources with others. So if sharing can feel uncomfortable enough that a kid might turn down a free chocolate chip cookie just to avoid giving half to his sister, how did humans ever really get started teaching our children that this was a behavior worth practicing? It may seem like this question is now impossibly tangled up in economic, social and even marketing influences, but researchers believe a system used by the nomadic Hadza people of East Africa may offer some insight into how cultures developed standards about sharing.

The Hadza don’t have some single, perfect formula for dividing up resources, because they don’t live in permanent groups. Instead, individuals are likely to move between various camps on a regular basis, each with its own expectations for how people conduct themselves. So while one person may have a predisposition to share or withhold food they’ve gathered, this seems to be overruled by the expectations of whichever camp that individual is living in at the time. From the perspective of my uncooperative five-year-old, this would mean that even if he didn’t want to split a cookie, that preference wouldn’t matter as much as what his current camp expected of him.

Who holds their honey sticks?

Rather than relying on self-reported generosity, researchers tested these standards for sharing over the course of six years across 56 different camps. During that time, 383 people participated, and 137 of them ended up participating more than once because they had moved to a different camp that the researchers went on to test. The test itself simplified sharing by turning it into a game: each participant was issued four straws of honey and asked to hold or share them. Any straws shared with the group pool at the end of the game would be tripled, with the resulting windfall split equally among all participants. So in theory, everyone could end up with 12 straws of honey if they all pitched in, although knowing that didn’t mean that every participant really maximized their gains.
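If you want to see the arithmetic behind that 12-straw figure, here’s a minimal sketch of the game treated as a standard public goods game. The four-straw endowment, the tripling of the pool and the equal split come from the description above; the payoff function itself is just my framing of those rules, not the researchers’ own code.

# A rough Python model of the honey-straw game as a public goods game.
def payoffs(contributions, endowment=4, multiplier=3):
    """Return each player's final straw count, given what everyone contributed."""
    n = len(contributions)
    pool_share = multiplier * sum(contributions) / n  # pool is tripled, then split equally
    return [endowment - c + pool_share for c in contributions]

print(payoffs([4] * 10))       # everyone shares everything: each player ends with 12 straws
print(payoffs([0] + [4] * 9))  # a lone holdout ends with 14.8, more than the sharers' 10.8

That gap between 14.8 and 10.8 is exactly the temptation that camp-level expectations seem to keep in check.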

Some camps just tended to hold on to their own straws more than others. What’s more, participants who had played in a previous camp generally changed their sharing strategy to match how their new peers were sharing, rather than donating the amount they had in their previous camp. This strongly suggests that social expectations outweigh personal preferences, which has likely helped societies overcome individuals’ instincts to protect their own resources. If this holds true for all humans, I guess my household has been stingier around my five-year-old than I’d like to admit. If we’d really like to bring out his generosity, we need to make sure the whole family is willing to share our cookies before asking him to do so.

Source: The way hunter-gatherers share food shows how cooperation evolved by Bruce Bower, Science News

On September 26th, 2018 we learned about

Humans may be wired to live lazily, even if it contributed to the extinction of other hominids

It’s probably embarrassingly easy to come up with personal examples of being lazy. I admit that I’ve driven distances that were easily reachable on foot or on a bike. My kids routinely wear their shirts backwards, even after their mistake is pointed out to them. My wife seems to be only vaguely aware of the family’s laundry hamper. Each of these examples of sloth probably doesn’t amount to much in isolation, but over time they can certainly add up to rather extreme consequences. In the case of Homo erectus, researchers even think being lazy may have played a role in our fellow primates’ extinction. This isn’t great news, as modern Homo sapiens also struggle with laziness, and there’s a chance that we may not be able to ever really be rid of it.

No effort to avoid extinction

As far as we know, H. erectus didn’t go extinct because they wouldn’t deal with their laundry. There is evidence, however, that they did a very poor job managing the resources available to them, even when it surely reduced their quality of life. Stone tools excavated in the Arabian Peninsula were found to be made of very low-quality rock fragments, despite an abundance of higher-quality materials within walking distance of the dig site. Unlike ancient Homo sapiens or Neanderthals, who apparently transported high-quality materials for miles in order to craft better tools, the H. erectus craftsmen were confusingly content to just use whatever pieces of rock rolled down the hill into their camp.

Being slightly inefficient became a bigger problem for H. erectus as the climate started to shift around them. Researchers found that during periods when food sources were being shaken up by changes in the environment, H. erectus didn’t seem to respond in any meaningful way. Their tool design and other survival strategies were apparently conservative to the point of being static, leaving them at the complete mercy of the droughts that probably brought about their demise.

Just like our ancestors who did go that extra mile to make better tools, modern humans would surely be protected from this degree of laziness, right? As a species, we’d never sit on our hands while our world’s climate changed around us.

Right?

Internal inertia

While it’s too early to know how closely modern Homo sapiens are willing to follow H. erectus’ example, we do have evidence that humans are physically less active than they were in the past. Most of us don’t need to hunt or gather our own food anymore, giving us plenty of opportunities to lazily sit on the couch for hours on end (although even modern hunter-gatherers sit down for nearly half their day). Researchers following these trends don’t think we’ve simply lost the will to move; rather, being sedentary can, in the right context, be part of a successful survival strategy. Like a carnivore that sleeps nearly every hour it’s not hunting, humans seem to have inherited a predisposition to save our energy whenever we get the chance.

This model may seem intuitive, but researchers recently tested these instincts on a neurological level. Test participants were, appropriately enough, tasked with playing a video game that required them to identify images as depicting either physical activity or inactivity. Most people were actually faster to pick out the images of running or jumping, but brain scans of the participants painted a different picture of these responses. Electroencephalograms (EEGs) showed that more of the brain had to work to identify images of activity, whereas it was literally easier for the brain to identify images of lazing around on the couch. It seems that even imagining being lazy takes less effort than imagining being active.

So are we doomed to start making worse tools because we can’t be bothered to get off the couch? Researchers said that it’s hard to block out these automatic patterns entirely, but if we’re aware of these cognitive biases we can probably train ourselves to overcome them. It’ll just take work to do so, so maybe we should try tackling it tomorrow…

Source: Laziness helped lead to extinction of Homo erectus by Australian National University, Science Daily

On June 7th, 2018 we learned about

Identifying the factors that likely enabled humanity’s first attempts at farming

For as influential and beneficial as agriculture has been for the human race, we’re not actually sure why we started farming in the first place. Obviously people would have been happy to have better access to food for their communities, have time for skill specialization, etc., but wanting more food doesn’t explain why people would start farming one year instead of another. Since people started raising crops before they could necessarily write about doing so, researchers have had to look at less direct data to figure out what influenced the rise of agriculture. If the resulting models are correct, they not only help eliminate some long-standing hypotheses about the birth of agriculture, but they may be applicable to other questions about early human activities as well.

Making an analytical model

The first step in this research was to look for patterns in the intersections of cultural traits, environmental conditions and population densities in modern or recent foraging societies. Peoples that largely relied on hunting, fishing or gathering to feed themselves were used as a verifiable reference point to see what conditions would be expected for a society to carry on without agriculture. Of particular interest were specific factors like environmental stability and how often people traveled in comparison to population density. All these dynamics were assembled into a predictive model that could first be tested against the observed data to ensure that further predictions about ancient populations were grounded in reality.

Researchers then used historical data from around the world to see how environmental conditions could influence a population’s food supply. Agriculture is known to have developed independently at least 12 times in human history, so conditions were analyzed for each of those instances. While many specifics differed, such as timing and cultural norms, people from New Guinea to Central America to the Middle East all followed at least one clear trend: their entry into agriculture followed improving environmental conditions.
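To make that two-step approach a bit more concrete, here’s a heavily simplified, purely hypothetical Python sketch: train a simple classifier on documented foraging societies, check that it reproduces what we can actually observe, and only then ask it about reconstructed ancient conditions. The features and every number below are invented placeholders for illustration; this is not the researchers’ actual model or data.

from sklearn.linear_model import LogisticRegression

# Each row is a documented society: [environmental stability, population density],
# with made-up values purely for illustration.
X_modern = [[0.2, 0.1], [0.3, 0.2], [0.4, 0.3], [0.7, 0.6], [0.8, 0.7], [0.9, 0.8]]
y_modern = [0, 0, 0, 1, 1, 1]  # 1 = agriculture eventually took hold, 0 = continued foraging

model = LogisticRegression().fit(X_modern, y_modern)

# Step one: make sure the model reproduces the societies we can actually observe...
print(model.score(X_modern, y_modern))

# Step two: ...then ask it about reconstructed conditions at a hypothetical ancient site.
print(model.predict_proba([[0.75, 0.5]]))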

Predicting more about the past

A possible origin story, then, is that people enjoyed enough food stability in their natural environment to have a bit of extra time and population growth. That allowed for more exploration of new ideas and techniques, eventually leading to more revolutionary discoveries like crop cultivation. It seems simple enough, but this possible origin is very different from previously proposed explanations, such as agriculture being a response to near starvation, or simply arising at random in human history.

The models and analysis that revealed these patterns could likely use further refinement, but researchers are feeling quite confident about their long-term utility. They believe the approach could be applied to events further back in time than the 21,000-year-old rise of agriculture, possibly looking at other major developments in human history. By analyzing how humanity changed in response to larger environmental trends, this kind of modeling may help us make sense of otherwise sparse and spotty archaeological evidence.

Source: On the origins of agriculture, researchers uncover new clues by Colorado State University, Science Daily

On April 12th, 2018 we learned about

The adaptations Homo sapiens adopted from northern Neanderthals

Homo sapiens are the last living hominid on Earth, but some of our extinct kin “live on” in our gene pool. Many of the most obvious examples of these genetic artifacts come from Homo neanderthalensis, a species of human from Europe that we know interbred with Homo sapiens coming out of Africa. Neanderthals were generally very similar to modern humans, but they had some special adaptations that helped them live in the colder, darker reaches of Europe. Some modern humans now carry these traits as well, although the exact list of what we did, or did not, inherit from our Northern cousins hasn’t always been obvious.

DNA to deal with the dark

One common set of assumptions has to do with sunlight. Europe, particularly in the winter, receives a lot less sunshine than more equatorial regions, and so anyone living there would want lighter skin to better absorb light for the creation of vitamin D. While we do carry some Neanderthal genes associated with how our skin interacts with the Sun, they don’t overtly govern pigmentation. Red hair, often seen as tied to a very light complexion, is actually unique to Homo sapiens. On the other hand, Neanderthals did contribute genes to our genome that shape how quickly you might tan or get a sunburn, so there is some logic to the idea that they carried adaptations for living with less sunlight.

Sun exposure mattered to Neanderthal mental states as well. The gene ASB1 has been linked to people’s natural circadian rhythms, making them more likely to want naps or be a “night owl.” It’s part of another set of genes Homo sapiens picked up from Neanderthals, some of which also seem to determine how exposure to sunlight affects one’s mood. It clearly wasn’t easy living up North, and it makes sense that these adaptations would be appropriated by Homo sapiens looking to move to darker latitudes.

Nifty noses for better breathing

One thing Homo sapiens didn’t seem to borrow was Neanderthals’ wide nasal cavities. While both modern humans and Neanderthals have more sophisticated sinuses than our common ancestor, Homo heidelbergensis, Neanderthals could move, moisten and warm considerably more air per breath than we can. This is again tied to living in a cold, dry climate, but modern humans didn’t adopt these noses when they ventured north, possibly because of our metabolisms.

Neanderthals probably needed better airflow to serve their high metabolic needs. Their thick, stocky bodies are estimated to have needed as many as 4,480 calories a day, nearly twice what’s recommended for a modern human male today. To process those calories, Neanderthals would have also needed more oxygen than a modern human, so they couldn’t afford to let a cold, dry climate slow them down. So perhaps our interbreeding ancestors didn’t end up with such sophisticated schnozzes simply because they didn’t need them: our lower metabolisms made us a little more flexible when it came to airflow.

Source: Neanderthals didn't give us red hair but they certainly changed the way we sleep by Darren Curnoe, Phys.org

On April 11th, 2018 we learned about

Homo sapiens got a social boost by giving up bony eyebrows

You may have your mom’s hair, or your dad’s eyes, but how about humanity’s eyebrows? As much as you may hear about big brains or opposable thumbs, anthropologists believe that our flat, fuzzy eyebrows may be a uniquely human trait. What’s more, they may have given us an edge over other hominids with bonier brows.

Benefits of a big brow

Older ancestors, like Homo heidelbergensis, generally had brows closer to what you find on other primates. Even as their heads became taller in relation to their faces, they still had a prominent ridge of bone protruding above the eye sockets. That may have provided a bit of protection and shade for their eyes, but researchers have now ruled out other structural benefits. Computer simulations have found that reducing these bits of bone had no ill effect on a virtual H. heidelbergensis’ bite strength, nor would it make more space for a larger braincase in the skull.

Even if a smooth brow doesn’t impede an individual’s ability to bite down, that alone doesn’t explain how it benefited modern Homo sapiens enough to spread throughout our gene pool. It wasn’t worse than a bony brow, but how was it better? Some pitting along the brows of H. heidelbergensis provided clues, as it was remarkably similar to microscopic craters found on display features in other primates. Dominant male mandrills, for instance, have these pits on the sides of their muzzles, where they grow colorful tissue to express their social status to their potential rivals and mates. Seeing these pits on the side of H. heidelbergensis’ brow ridge suggests that the feature may have served a similar function for our ancestors, signaling fitness and status as a display structure.

Making friends with fuzz

That kind of communication may have started human ancestors down the road to our modern eyebrows built from hair and muscle. It’s easy to take them for granted, but the movement of eyebrows plays a big role in how we express emotion and intent to the people around us. We can even communicate these feelings when drawing an abstract face with simple dots for eyes, as long as the brow lines slope up, down, or asymmetrically. Some expressions, like a quick lifting of both brows to express recognition or openness, are understood by humans around the world.

It’s thought that this kind of social communication became incredibly important as humans banded together to start farming and forming settlements thousands of years ago. Those developments would require more trust and cooperation with other individuals, and would have depended on clear communication. Being able to form social bonds, both with close kin and newcomers, would have allowed humans to help each other through difficult circumstances. It seems there was no better way to build a friendship than by moving articulated tufts of hair up and down over your eyes.

Source: Research to raise a few eyebrows: Why expressive brows might have mattered in human evolution by University of York, Phys.org

On February 25th, 2018 we learned about

Making sure we can measure how marmosets manage their speech

Your typical marmoset conversation has a lot of “tskiks,” “ekks,” and “pheees.” From a human perspective, it’s a pretty limited set of sounds: plenty of birds and signing apes have bigger vocabularies at their disposal. However, since marmosets are primates like us, and these are sounds they naturally use to communicate, they’re a good model organism to work with; they offer a simpler version of speech that was likely inherited from a shared ancestor. The catch has been that, until very recently, it looked like marmosets were breaking a key rule of how we can talk.

Previous research has found that humans can form syllables, the smallest units of speech, seven times a second. This is due to both biomechanical limitations in our voice boxes and neurological limits in our brains. It makes sense that these two systems would have evolved to match each other, and researchers want to know exactly how they synced up in our species’ past. Presumably, a vocalizing primate like a marmoset would be a good reference point, but only after the confusion with their “pheee” sounds was cleared up.

Finding the shortest sound

While marmosets’ “tsik” and “ekk” sounds clearly functioned as syllables, their “phee” sounds were problematic because they were too long. When listening to marmosets ‘speaking’ to each other, it seemed like the “phee” sounds could be drawn out in a way that let the animals cram an irregular number of syllables into each second. If they could do this, it would mean that their brains were treating one sound differently than the others, raising questions about the definition of the syllable as a basic unit of speech.

To test this, researchers started interrupting vocalizing marmosets. When an animal got chatty, it would get an earful of white noise, just annoying enough to make it stop vocalizing. The idea was that a marmoset wouldn’t be able to pronounce just half a syllable when interrupted: “tsik” and “ekk” wouldn’t be divided down to “ts” or “e,” as the animal would at least finish that sound before its mouth and brain stopped talking. Those sounds did prove to be indivisible, but “phee” wasn’t so durable. It seemed that a long “phee” sound could indeed be split, meaning those long sounds were actually two units of speech put together. “Phee” wasn’t breaking the rules about syllables; we just hadn’t realized that those vocalizations were really “phee” and “eee” being pronounced back-to-back.

Syllables hold steady

With this expansion of marmoset vocabulary, researchers have confirmed that syllables are a durable unit of speech in primates. This way of organizing vocalizations into 100-millisecond chunks likely evolved long ago, as humans haven’t shared a common ancestor with New World monkeys like marmosets for at least 35 million years. With this confirmation, researchers can move forward on figuring out how all primate speech ended up with our syllabic speed limits.

Source: Monkey Vocabulary Decoded by Universitaet Tübingen, Science Daily

On January 29th, 2018 we learned about

Everyone understands a lullaby: Human brains may have shaped universally-appreciated traits in music

Even if you don’t understand the lyrics, your brain may be wired to recognize the meaning of music from around the world. No matter what the finer nuances of a song may be, there’s evidence that some core element of a song’s genre may be universally understood. So just as you would never mistake a lullaby for a dance anthem in your native tongue, there’s a good chance you can also tell them apart when listening to songs by the Mentawai people of Indonesia. At least in broad strokes, it seems that humans everywhere have very similar concepts of what a dance song versus a love song should sound like.

That idea may sound simple enough, but figuring out how to test it isn’t easy. Untangling a person’s cultural background from the music they’re listening to is no small task, because that same background likely informs their perception of that music. To hunt for signs of universal musical concepts, researchers spent years assembling a database of music from around the world, then sorted it by region and by type of song according to four categories: dance songs, lullabies, healing songs and love songs. Clips of those songs were played for English speakers from 60 different countries, who were simply asked to pick a category for each song they heard. If they could correctly identify a love song in a language and style they weren’t familiar with, it would suggest that some aspect of that song was shared across cultures.

With 750 listeners reporting, researchers found that dance songs and lullabies were the most easily identified. Apparently, soothing a baby is a similar process no matter where one lives. Love songs and healing songs weren’t so clear, a fact probably compounded by a lack of healing songs from Scandinavia or the United Kingdom. Overall, this first experiment suggests that there are some universally understood aspects of music, but figuring out what those are, and why they exist, remains an open question.

Possible sources of musical synchronicity

Hypotheses about how humans would end up with universal elements in our music range from ideas about our brain structure to a bit of convergent evolution. One explanation may be that some of these musical concepts, like a lullaby, are hard-wired into our brains to be created and appreciated, with each culture adding its own flourish on top of a core formula for a song.

Alternatively, our brains may simply provide an interest in sound and stimuli, and musicians may have learned to trigger that response, with certain tunes just being the most time-tested method for doing so. In that case, a lullaby works because it takes advantage of brain functions that originally evolved for other purposes, unrelated to music.

There’s also a chance that these shared traits between songs are a case of convergence. Through other environmental pressures, like a desire for community building, displaying status and more, musicians everywhere have been guided to similar-sounding songs. Each musical lineage started independently, rather than in any specific brain structure, but has nonetheless converged on a single way to perform a dance song because quicker tempos and more layered instrumentation just do that job better than anything else.

At this point, more people need to participate in the listening tests. While the initial participants hailed from many different countries, the second phase of testing will aim to include viewpoints of non-English speakers, particularly from more isolated cultures. If these songs are still universally understood at that point, we can really start looking for what defines each genre in our minds, and how that core evolved in the first place.

Source: Some Types Of Songs Are Universally Identifiable, Study Suggests by Rebecca Hersher, Goats and Soda

On December 10th, 2017 we learned about

Sorting through the spectrum of what chimpanzees regard as repulsive

You might not want to eat while reading this. According to a recent study that aimed to gross out chimpanzees, text probably isn’t enough to trigger the sense of disgust we’ve inherited from our ancestors. With that said, stories that bring up the issue of coprophagia, or eating feces, probably aren’t great for one’s appetite. At least not at first.

Digging into the details of disgust

Researchers were investigating where chimpanzees, our closest living genetic relatives, draw the line with what they’ll put in their mouths. That line certainly wasn’t clear from the outset, as wild chimps will pick seeds out of poop to eat, and captive chimps will go a step further and snack on their poop outright. Researchers learned that there was some nuance to chimps’ consumption of crap, as the animals apparently evaluate feces based on genetic familiarity. A chimpanzee will eat its own poop, or that of closely related family members, but any other chimp’s waste elicits a clear display of disgust.

With that baseline established, researchers set up experiments to further probe chimpanzees’ criteria for when food is too revolting to eat. In one scenario, food was placed in an opaque box, either on top of soft but edible dough or on top of a piece of rope. Chimps reaching in for the food were obviously repulsed by the soft, moist dough, yanking their hands out of the box as if it had bitten them. Other tests involved food being placed on what looked like feces, or near the scent of blood, and while nothing apparently provoked quite as strong a reaction as something soft, wet and squishy, the chimps seemed to share many of the same guidelines for what counts as gross that humans follow. They didn’t hold the same standards, though, as they would sometimes end up eating food from disgusting sources, but overall their criteria were pretty relatable.

Finding the value in what chimps find foul

The fact that these chimpanzees get grossed out like we do may seem obvious, but it wouldn’t have been safe to assume they operated on criteria similar to humans’. After all, any degree of coprophagia is probably too disgusting for humans to seriously consider, and researchers wanted to see exactly what our species had in common with our fellow primates. Avoiding substances, like poop or blood, that could easily harbor pathogens makes sense as a survival tactic, and identifying commonalities indicates that chimps and humans likely inherited some of these reference points from a shared ancestor. This work may also help zoos and conservationists manage the health of chimpanzees in their care. Dangerous substances can be presented in a more disgusting manner, and individual chimps that seem too casual about gross sources of food can be monitored more closely for exposure to pathogens.

Source: What grosses out a chimpanzee? The origins of disgust by Kyoto University, EurekAlert!

On September 20th, 2017 we learned about

Experiments demonstrate how to manipulate monkey (and human) metacognition

For as many times as we tell our kids to believe in themselves, it’s good to keep in mind that confidence can sometimes be misleading. This isn’t to say that doubting your every decision is helpful or healthy, but that sometimes we don’t realize why we’re confident in the first place, opening us up to manipulation, such as putting more trust in a statement because it’s written in larger letters. This susceptibility isn’t exactly our fault though, as researchers have found that our primate cousins fall for the same tricks. While falling for these influences may seem like a drawback for us, it’s also proving that monkeys have a more sophisticated sense of self than they’re usually given credit for.

Understanding exactly what you do and do not know is called metacognition. It’s very helpful to know where your gaps in knowledge are so that you can adjust your actions accordingly. For instance, if you’ve never eaten a particular berry before, knowing that you don’t know what it is will probably push you to investigate it more carefully before popping it into your mouth. This might seem obvious, but people, and apparently other primates, commonly make mistakes when evaluating their personal knowledge base, and that can obviously lead to trouble.

Confirming monkeys’ confidence

To test metacognition in monkeys, researchers had to train them on a multi-step game that would allow these non-verbal test subjects to demonstrate how much they thought they knew about something. Monkeys are presented with a touch screen showing a single image, like a cricket, which they have to poke to proceed. They then see that same cricket image again, along with three other images meant to distract them from their task, which is to poke the same cricket again.

After they make their selection, they’re shown a screen where they need to rate how confident they are about their previous poke. If they’re sure they’re right, they can pick an option that will net them three tokens for a correct answer, but cost them a token if they’re wrong. If they’re less sure about their answer, a low-confidence indicator will let them gain one token, even if they’re wrong. Monkeys are only rewarded once they earn a specific number of tokens, forcing them to play the long game to get a treat.
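If you’re wondering why that wager structure actually measures confidence, a little arithmetic helps. Here’s a minimal Python sketch using the token values described above; the break-even calculation is my own back-of-the-envelope addition rather than anything reported by the researchers.

# Expected token gain for one trial, given how likely the earlier poke was to be correct.
def expected_tokens(p_correct, high_confidence):
    if high_confidence:
        return 3 * p_correct - 1 * (1 - p_correct)  # win 3 if right, lose 1 if wrong
    return 1.0                                      # the safe option pays 1 either way

for p in (0.3, 0.5, 0.8):
    print(p, expected_tokens(p, True), expected_tokens(p, False))
# the risky bet is worth roughly 0.2, 1.0 and 2.2 tokens at those confidence levels

Since the risky bet only beats the guaranteed token when the monkey is more than 50 percent sure of its answer, a monkey that wagers sensibly is effectively reporting how confident it feels about its earlier poke.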

It’s a lot to throw at a monkey, but they seemed to get it well enough to play and reveal patterns in their decision making. Once a monkey seemed to understand the mechanics of the game, researchers started changing how information was presented in order to manipulate confidence levels. For instance, higher-contrast images made the monkeys wager with more confidence, while low-contrast images had the opposite effect. These sorts of attributes change confidence levels in humans too, along with shorter, easier-to-pronounce vocabulary and the aforementioned larger text size.

Indecision versus imprecision

This isn’t to say that metacognition is just a form of self-delusion. Knowing when to take a shortcut, or react quickly and decisively, can be very helpful in certain scenarios. These traits probably evolved in a distant primate ancestor, and have been helping humans and monkeys for millions of years. Of course, it’s probably also helpful to know what you know versus what you think you know, since sometimes that same confidence can get you into trouble.

Source: Monkey sees. . . monkey knows? by Lindsey Valich, Rochester Newscenter

On September 4th, 2017 we learned about

Fossilized footprints found on Crete may seriously complicate our ancestors’ origins

Your foot may ache, smell and maybe need a toenail trim, but it’s a pretty special bit of anatomy. Even if you’re used to your single row of clawless toes, it’s actually an arrangement found in almost no other animal on Earth. Our closest living relatives don’t have their hallux, or big toe, facing forward like we do, giving them a much more hand-shaped foot. These toes, along with the ball of our foot and long instep, come together to make a very distinct footprint. What’s exciting scientists now is that some of these prints have turned up in a time and place where they supposedly had no business being.

A set of footprints was found hardened into sedimentary rock at Trachilos, on the western coast of Crete. Based on the foraminifera, or marine microfossils, found in the layers of rock above and below the slab of stone in question, researchers confidently dated the prints as being 5.6 million years old. The catch is that at that time, no human ancestors, much less humans, were thought to be anywhere but Africa. What’s more, the shape of the prints looks more like modern feet than the foot of any known ancestor living at that time.

Figuring out the who, what and where

Even without an actual bone or tool, these footprints may be enough to upend our timeline for human evolution. The oldest confirmed hominid is Ardipithecus ramidus, which is thought to be a direct ancestor of modern humans. However, A. ramidus lived in Ethiopia around 4.4 million years ago, and at that point had a much more gorilla-shaped foot. We know that by 3.65 million years ago, our ancestor Australopithecus was leaving very modern-looking footprints in Tanzania, but neither of these dates syncs up with the stroll some primate took through Crete at least a million years earlier.

Putting the evolutionary questions about foot shapes aside, walking to Trachilos is actually one of the easier issues to understand. North Africa and the Mediterranean Sea were very different places 5.6 million years ago, as the Sahara Desert was a savannah and the Mediterranean was beginning to dry up, thanks to the Strait of Gibraltar closing off the waters of the Atlantic Ocean. What’s more, Crete itself was still attached to the Greek mainland, making access to this particular plot of land all the more plausible.

Many questions to consider

Assuming this initial interpretation of these 50 footprints holds up, anthropologists have a lot of new questions to explore. Were the prints made by a hominid, or just a very similar ape living in Europe? Did this mystery primate walk out of Africa to Trachilos, having evolved our unique feet earlier than anyone thought? Were these prints left by a direct relative of modern Homo sapiens, or an offshoot of our family tree that didn’t end up succeeding with their early upright gait? Or, most radically, did our ancestors originate outside of Africa, strolling south instead of north? That last idea is a long shot, and would still leave the species Homo sapiens as originating in Africa, but this simple stroll through soft soil now demands that we investigate a lot of new possibilities.


My third grader said: I hope it turns out to be one of the crazier explanations!

Source: What Made These Footprints 5.7 Million Years Ago? by Gemma Tarlach, Dead Things