On February 28th, 2018 we learned about

Revising the histories of domesticated rabbits and horses

Humans have apparently domesticated so many animals that we can barely keep track of our activity. A lot of work has gone into trying to uncover the origins of domesticated cats and dogs, and while there’s sometimes contention over new discoveries, we seem to be homing in on how these animals came to live with us. The histories of a few other animals have a bit more room for improvement, as some commonly accepted explanations for how they were domesticated have recently turned out to be wrong. In the case of rabbits, researchers are essentially abandoning the notion of a single date for domestication, while all horses alive today have recently been found to descend from domesticated stock, meaning that truly wild horses have been extinct for ages.

Rabbits’ gradual shift from meat and pelts to pets

Rabbits were supposedly domesticated in 600 AD, when Pope Gregory endorsed raising them so their fetuses could be eaten as a meat substitute during Lent. Not only do we lack evidence that such a decree was ever issued, but genetic analysis has found that people were probably raising rabbits well before that date. Once the original timing had been disproved, researchers realized that the notion of a single moment of domestication didn’t hold much water either. They said it’s more likely that rabbits were first rounded up as far back as ancient Rome, and then diverged genetically over a long time in their human-imposed isolation. The domestication was just a gradual byproduct of other needs being met.

Earliest domestication of Equus

Rounding up wild horses was possibly a more deliberate act. When the Botai people of Kazakhstan started rounding up and keeping horses 5,500 years ago, they initiated a change in the horses and their own lifestyles. Archaeological evidence of skulls, fencing, horse-skin leather and an unusual concentration of compounds found in manure suggests that the Botai people gave up being nomads around this time in order to raise horses in permanent settlements. The horses were probably ridden, but also used as a source of meat and milk, not unlike the dietary traditions of some Mongolian tribes today.

Tamed, but then untethered

The Botai were quite successful with their horses, and it’s now thought that some escaped and began spreading across Asia and into Europe. In fact, genetic analysis of living and extinct horses has found that all horses alive today stem from the domesticated Botai horses, including Przewalski’s horse, a sub-species previously thought to be the last wild horse on Earth. Like the mustangs of North America, Przewalski’s horses are actually domesticated horses that have been allowed to become feral. Since their genes have, at some point in history, been manipulated by human intervention, we’re basically left with no horses created strictly by natural selection.

It’s understandable that these mistakes were made. Przewalski’s horses in particular are known for their upright manes, dun-colored coats and other features that resemble extinct wild horses, as well as the wild horses recorded in cave paintings by ancient humans in France and Spain. Unfortunately, since humans weren’t really aware of how far domesticated horses had spread, it’s hard to say at this point when the last wild horse went extinct. There’s still a lot of biodiversity in domesticated horse populations, but it seems like it would be helpful for humans to fully understand our influences on the world around us.

Source: Surprising new study redraws family tree of domesticated and 'wild' horses by University of Kansas, Phys.org

On February 13th, 2018 we learned about

Pandas’ plant-eating triggered a transformation in their taste buds

In a move that may baffle veggie-fearing kids everywhere, pandas are trying to get better at tasting bitterness. While human diets today often include safe amounts of bitter greens, those flavors are generally the result of toxins a plant produces to avoid being eaten. The trick to being a plant-eater, then, is being able to taste which leaves are delicious and which might be deadly. Since pandas’ ancestors were carnivores, their taste buds are now playing catch-up with their modern, bamboo-centric diet.

Tracking genes for taste buds

Pandas’ taste transition was recorded in their DNA. Researchers analyzed genomes from both red pandas (Ailurus fulgens) and giant pandas (Ailuropoda melanoleuca), comparing them to carnivorous relatives like polar bears, wolves and tigers. There was special interest in genes known to encode umami taste receptors, which can detect the flavor of meat, and bitter receptors, which detect toxins in plants. Since pandas likely started giving up meat only seven million years ago, it was expected that many of the changes to their genome would stand out as recent modifications.

As predicted, there was a lot of activity around these genes for taste receptors. One of the earliest changes was dropping the ability to taste umami altogether. Without any kind of flesh, fungi or marine plants like nori in their diets, these receptors would be a waste of real estate on pandas’ tongues. Instead, pandas apparently started increasing the number of active copies of genes in the TAS2R family, which detect bitterness. Every creature wants some ability to sense bitterness in its mouth, but while pandas’ carnivorous cousins have only 10 to 14 copies of these genes, pandas are now carrying around 16. One gene in particular, TAS2R42, was found to be accumulating mutations at a notable clip, suggesting that individuals with working copies of this gene were considerably more likely to survive and spread it further through the gene pool. As one would expect, natural selection has been driving this increase to help pandas more safely navigate their bamboo-based diet.

Making progress on plant-eating

This transition from meat to potentially toxic plants probably isn’t complete. Pandas still sport sharp canine teeth from their meat-eating days, and their digestive tract is notoriously slow at extracting nutrients from bamboo. It’s unknown exactly how specific their new sensitivity to bitter flavors is at this point, although researchers are now working on matching pandas’ taste buds to the particular bitter compounds found in bamboo. At the very least, there’s probably further room for improvement, as pandas still have fewer bitter taste receptors than other herbivores. This may be part of the reason they’re so picky about which plants they eat, as they’re not really ready to safely taste flora other than bamboo.

Source: Panda tongues evolved to protect them from toxins, study suggests by Erica Tennenhouse, Science

On January 28th, 2018 we learned about

Human digestive tracts can handle eating insect exoskeletons

A mouth full of canines, bicuspids and molars is enough to prove that humans are omnivores. We’re not as specialized at slashing flesh as a tiger may be, but our teeth and jaws can handle a lot of different kinds of foods. However, chewing is only the beginning of the story, as we don’t necessarily have the means to digest everything we can swallow. Some fiber, for instance, can get broken down by bacteria, but without multiple stomachs like a cow’s, it won’t provide a lot of nutrition for us. One supposed gap has recently been closed though, as it turns out there’s nothing stopping us from eating, and benefiting from, insects.

Insects are covered in exoskeletons made from a substance called chitin. This gives their bodies a tough outer shell that was thought to be impervious to our digestive system. Nobody argued that our teeth couldn’t crack a beetle’s shell, but it was assumed that once swallowed, the shell would be a relatively inefficient source of nutrition that would basically need to be passed through us. Even insectivorous species of bats are known to pass a fair amount of chitin in their poop, suggesting that only a small portion of a bug can actually be used as food.

However, bats, mice and various primates obviously included insects in their diets for a reason. Researchers then identified a specific stomach enzyme, known as CHIA, that helped each of these mammal groups break down exoskeletons. They then looked at various primates’ genomes to see how many copies of the enzyme-producing genes each species carried. More copies of the gene would then lead to more enzyme production, presumably to help digest more bugs.

Genetic gut-check

It became clear that some of our ancient primate ancestors ate a lot more bugs than we do. Many older primates had three copies of the CHIA-producing gene, with the record going to modern tarsiers, which carry five copies to enable their insect-rich diet. It seems that insects’ role in primate diets has diminished over time though, probably after being displaced by plants and fruit. Still, as proper omnivores, bugs aren’t off our menu entirely: humans still have one copy of the gene needed to let us safely digest an insect’s outer shell.

This confirmation probably isn’t news to the two billion people around the world who already eat insects on a regular basis. However, it may make people who don’t eat bugs comfortable enough with the idea to give roasted grasshoppers, or at least pulverized cricket flour, a try. In many cases, the recipes that people use to prep their bugs add one more tool to our digestive toolbox: heat. Even if our stomachs are ready to handle a bit of chitin, cooking our creepy crawlies will make things that much easier.


My four-year-old said: I don’t want to eat bugs. That’s yucky, and bugs are cute!

Source: Study says humans can digest bugs, assuming they want to by Robin Lally, Phys.org

On November 19th, 2017 we learned about

Analysis of potatoes’ genetic past identifies opportunities for better breeding in the future

Don’t take this the wrong way, but you’re simpler than a potato, at least on a genetic level. A study of the lowly spud’s genetic history has revealed not only how complex the modern potato’s genome is, but also that it may be overdue for some innovation. This interest in potato evolution isn’t because potatoes are slacking off in meeting their mutation quota, but because potatoes are humanity’s third most important crop worldwide. If nudging the right gene could yield healthier french fries, we might all be better for it.

A modern, cultivated potato has a lot of genetic material to look through, with over 39,000 genes in its genome. That’s more than a human’s 20,000 genes, and even more than potatoes’ own ancestors. Wild potatoes, the ancestors of the potatoes we grow to eat, are much simpler in comparison. They reproduce sexually with what’s known as a diploid genome, meaning two sets of chromosomes per organism. That reproduction is accomplished with seeds and berries, a feature that has largely been bred out of their domesticated counterparts.

Taming the tuber

In the last eight- to ten-thousand years, human intervention changed a lot about these starchy members of the nightshade family. A domesticated potato can reproduce asexually, and now sports a tetraploid genome, with four sets of chromosomes per individual. Since taking potatoes out of the Andes mountains, we’ve altered everything from their pest resistance to the plants’ circadian rhythm, as growing outside high mountain ranges meant differing amounts of sunlight per day. These sorts of mutations are among the 2,622 genes that transformed the potato into the staple starch we now find at the grocery store.
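
To get a feel for what the jump from two chromosome sets to four means, consider a single gene that comes in two versions. A tetraploid plant can combine those versions in more ways than a diploid one, which is part of why polyploid genetics are generally trickier for breeders to work with. Here’s a minimal Python sketch of that counting; the “A” and “a” alleles are hypothetical placeholders, not real potato genes.

```python
# Count the possible genotypes at one gene with two variants ("A" and "a")
# for a diploid (two chromosome sets) versus a tetraploid (four sets).
# The alleles are hypothetical placeholders used only for illustration.
from itertools import combinations_with_replacement

for sets, label in [(2, "diploid (wild potato)"), (4, "tetraploid (cultivated potato)")]:
    genotypes = ["".join(g) for g in combinations_with_replacement("Aa", sets)]
    print(f"{label}: {len(genotypes)} possible genotypes -> {genotypes}")
```

The diploid has three possible genotypes at that gene, while the tetraploid has five. That extra combinatorial room is one general reason it’s harder to get every chromosome copy carrying the version of a gene a breeder wants.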

While researchers would like to see further change in the potato genome, they’re mostly looking to achieve it the old-fashioned way. In the relatively short time since potatoes were first domesticated, farmers have been able to make some significant changes to these plants. With that said, there is concern that more recent breeding efforts have hit a bit of a ceiling, with no major improvements to speak of in the last 100 years. With more specific information about the potato genome, we may be able to make more significant gains via carefully planned breeding programs. So as much as you may enjoy your mashed potatoes today, farmers may be able to offer an even better option in the not-so-distant future.

Source: Examining Potatoes’ Past Could Improve Spuds Of The Future by Layne Cameron and Robin Buell, MSU Today

On October 24th, 2017 we learned about

The strange but important reasons for your body’s Sonic Hedgehog gene

After a week of playing Acapella Science’s Evo-Devo in the car for my kids, I thought it would be worth walking through the song to try and explain what it’s talking about. The song is dense, packing in a ton of information about the relationship between individual animals’ growth and species’ evolutionary change. As far as I could tell, my four- and eight-year-old followed along as well as could be expected:

Genes provide instructions to build specific body parts. Got it!
Changing when those instructions are used can put anatomy in new locations. Ok!
That’s Sonic the Hedgehog, a video game character known for collecting gold rings and magic gems. What?

The glimpse of SEGA’s Sonic the Hedgehog is brief and unexplained in the song’s video, and my kids certainly weren’t the first people to be confused by the character’s inclusion in a discussion of genetics. As it turns out, Sonic is included as a reference to a specific gene pathway named after the video game character, much to the chagrin of scientists and doctors ever since its discovery. This is largely because the Sonic Hedgehog gene turned out to be an incredibly crucial ingredient in embryonic development, and mutations can lead to serious and often lethal health problems. Nobody wants to hear about a sassy video game mascot when facing matters of life and death, and as you’d imagine, the name was originally picked with a completely different context in mind.

Figuring out names for fruit fly genes

To try and make sense of this, we need to first back up to fruit flies. Drosophila melanogaster is a fruit fly species commonly used in laboratory experiments due to its relatively simple genome, manageable size, and quick life cycle. The flies reproduce and grow up quickly, allowing scientists to make changes to their genes and then see the effects of those alterations in action as the next batch of flies grows up. As the effect of a specific gene was isolated, researchers would give it a name that described its function. An example is antennapedia, a gene that regulates where flies grow legs and antennae, as both types of anatomy are grown from the same underlying genetic instructions. Exaggerating antennapedia’s activity leads to legs growing from the fly’s face where its antennae belong, while decreasing the gene’s functionality can cause antenna-like structures to grow in place of legs.

At some point, researchers added a bit of irreverence to their naming conventions. This was a fun way to liven up hours spent raising deformed flies, but it also helped these names stick in people’s memories. Strings of prefixes and quantities may be accurate and inoffensive, but they won’t help you remember that a certain gene mutation can cause a fly to develop without a heart the way the name “Tinman” can. Flies with mutations in the “Out Cold” gene lose coordination in low temperatures. “Groucho Marx” mutations cause an excess of facial bristles. The list of silly names is long, but a group that has risen to prominence is the “hedgehog” genes, a name picked because mutations cause fly larvae to sprout unusually dense coverings of hair-like denticles on their backs while growing shorter and squatter overall, looking a lot like a hedgehog.

From “hedgehog” genes to human health

The hedgehog genes proved to be complex and important to various lines of research. To differentiate between them, specific variations were named after types of hedgehogs, including Indian hedgehog and desert hedgehog. This seemed fine until Dr. Robert Riddle found a new variation in 1991, and opted to name his hedgehog after the video game character, which was debuting in the United States around that time. In that moment, it probably made sense, carrying on both the hedgehog group name and the tradition of adding humor to a gene’s name. The name certainly didn’t account for what this particular gene meant to an organism’s health, or for the fact that 75 percent of fruit fly disease genes also turn up in mammals like humans, meaning it was going to be relevant to a much wider audience.

Ideally, the Sonic Hedgehog gene pathway creates proteins that regulate what anatomy gets built where in a developing embryo. Higher concentrations of these proteins at one end of an embryo lead to different outcomes than lower concentrations elsewhere, allowing the pathway to serve multiple purposes in a developing body. Many of these functions are associated with making mirrored anatomy: you have two lungs, two hemispheres in your brain, and so on. Eyes, for example, initially grow as a single “eye field” in the middle of an animal’s face, but Sonic Hedgehog signaling gets the body to split that tissue into two distinct eyes. When these functions are disrupted, one result is holoprosencephaly (HPE). Mild cases of HPE may lead to a single, fused incisor instead of the usual two front teeth, but more severe cases often lead to cycloptic eyes, underdeveloped brains and stillborn offspring. This is one example of when a doctor doesn’t want to be discussing a fictional blue hedgehog, although bizarrely there is a small, unintentional connection between the character and HPE: Sonic is usually drawn with one giant sclera containing two pupils, rather than two distinct eyeballs. By some total fluke of character design, it appears that Sonic himself may be displaying symptoms of HPE.
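
The gradient idea is easier to see with numbers. Here’s a minimal Python sketch of the “French flag” model developmental biologists use to describe morphogens like Sonic Hedgehog: each cell reads its local protein concentration and commits to a different fate depending on which threshold that concentration clears. The decay rate and thresholds are made-up illustrative values, not measurements.

```python
# A toy morphogen gradient: concentration decays with distance from the
# source, and cells pick a developmental fate based on thresholds.
# Decay rate and thresholds are illustrative, not real Sonic Hedgehog values.
import math

def morphogen_level(distance, decay=0.5):
    """Exponential falloff of signaling protein with distance from the source."""
    return math.exp(-decay * distance)

def cell_fate(concentration):
    """Assign a fate based on which concentration threshold is cleared."""
    if concentration > 0.6:
        return "fate A (closest to the source)"
    if concentration > 0.2:
        return "fate B"
    return "fate C (farthest from the source)"

for distance in range(8):
    level = morphogen_level(distance)
    print(f"cell {distance}: concentration {level:.2f} -> {cell_fate(level)}")
```

Run it and the same smooth gradient yields three sharply divided zones of cells, which is the trick that lets one signaling protein organize several different pieces of anatomy.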

Calls to stop controversial gene names

With all the confusion and controversy surrounding the Sonic Hedgehog gene, it seems like it should have been the last “silly” name that could be tied to a medical condition. It certainly stands out, but it wasn’t the last time fictional characters were attached to a gene’s name. Beatrix Potter’s Mrs. Tiggy-Winkle is the namesake for another hedgehog gene, making studies like “Sonic Hedgehog and Tiggy-Winkle Hedgehog Cooperatively Induce Zebrafish Branchiomotor Neurons” a reality instead of just word salad. Not every silly name gets through though, as Pokemon USA threatened to sue researchers who wanted their gene discovery to be dubbed the Pokemon gene, as the company understandably didn’t want its characters becoming associated with cancer. Future discoveries are likely to receive duller names as well, thanks to nomenclature committees looking to crack down on joke names, which are more likely to reach the public’s ear than in the past. Theoretically, hearing about a disorder based on the “LFNG O-fucosylpeptide 3-beta-N-acetylglucosaminyltransferase” gene should somehow be more dignified than learning about problems with Sonic, especially if you’re more of a Mario fan.


My four-year-old asked: I don’t have a bad [Sonic Hedgehog] gene, do I?

There’s no sign of that. The Sonic Hedgehog gene is really only active in embryos, since it’s needed to assemble anatomy in the correct arrangement. Once you’re done growing, its primary job is done. That said, it may also play a role in some regenerative functions, having been associated with hair regrowth in rats, among other things.

Source: The sonic hedgehog gene by Anna Perman, The Guardian

On October 19th, 2017 we learned about

New evidence necessitates the reevaluation of species that survived past their supposed extinctions

It seems like it’d be hard to miss an animal the size of a lion named for its serrated, sword-like teeth. An animal like Homotherium latidens, or the European scimitar cat, was once one of Europe’s most formidable predators, at least until 300,000 years ago, when it seemed to have gone extinct. However, a jawbone pulled from the North Sea is rewriting that timeline by 270,000 years, as both carbon dating and genetic evidence suggest it was alive as recently as 28,000 years ago. This younger specimen is now raising a lot of questions, as new causes for extinction, new ecological niches and that giant gap in the fossil record all need to be reconsidered.

Homotherium were like slightly scaled-down versions of their more famous saber-toothed cousins, like Smilodon. Nonetheless, this cat still had two large canine teeth, and their knife-edge shapes suggest they were probably used for cutting and slashing rather than simply impaling prey. It’s hard to know for sure, because the humans that we now know lived as neighbors to these scimitar cats unfortunately didn’t leave any good field notes behind. Even cave paintings of predatory cats found in northern France somehow omit any solid portraits of H. latidens.

Missing, or migrating?

One of the possible explanations for Homotherium’s absence in the fossil and written record may be that they just weren’t around much. One hypothesis to explain how the cats could be alive without leaving behind more evidence is that they had gone on a very long migration, possibly even around the world. This idea is slightly bolstered by the fact that the cats’ closest relatives are known to have turned up in North America, although that relationship is also being reexamined.

Looking at the mitochondrial DNA recovered from the new jawbone, researchers were able to not only date the specimen, but also see where it fits in the larger cat family tree. They found that H. latidens is remarkably similar to its North American kin, Homotherium serum— so much so that it’s been suggested that they might be the same species in a new location. This similarity is in contrast to H. latidens’ relationship with other cat species, which forked away from each other 20 million years ago. While they do share a common ancestor, your house cat is more closely related to a modern tiger than Homotherium is to other saber-toothed cats like Smilodon.

Assuming that we’ve now found the most recent H. latidens bone on the planet, scientists now have to think about what caused its final extinction 28,000 years ago. As presumptuous as it sounds, there’s a fair chance that these cats really did go extinct at that point (really!) if only because so many other animals were being removed from the food chain around the same time. Europe was experiencing an ice age at that point and coming to grips with more efficient human hunters. The combined ecological stresses likely explain not only the extinction of these saber-toothed cats, but other megafauna like mammoths and cave bears as well.

Other exaggerated extinctions

Balbaroo fangaroo, whose name may never be topped

As big an upheaval as this new bone has caused, it’s worth remembering that this isn’t the first time paleontologists have had to rethink extinctions. You can’t predict what fossils will be found, and while most species seem to cluster geographically and chronologically, they can surprise us with their extended survival. Just this month, another extinct mammal with big teeth extended its timeline, but by five million years. Fanged kangaroos like Balbaroo fangaroo weren’t exactly impressive predators, as they were browsing herbivores that scurried around ancient Australia, but they apparently did better than we’d previously given them credit for. Like the European saber-toothed cats, the pressures that drove them to their (final!) extinction are now being rethought, since in their case they seem to have outlived their region’s major climate crisis known to have taken place 15 million years ago.

Source: This Saber-Toothed Cat Mingled With Modern Humans by Michelle Z. Donahue, National Geographic

On October 4th, 2017 we learned about

Pinning down the causes and effects of overly picky eaters

“Do I have to eat all of it?”

My daughter looked at me, trying her best to look sad and tortured over the possibility of eating three more forkfuls of salad. The effect was slightly diminished though by her hand, which was still pinching her nose to stop herself from actually tasting her food.

“Yes, eat all of it.”

For all the groaning and whining, she did finish the serving of vegetables. Like most kids her age, she’d greatly prefer a diet strictly composed of starches and sugars, so this melodrama wasn’t that surprising. However, it also wasn’t that bad: she’s been slowly expanding her range of palatable foods. I can’t really say that she’s a picky eater, because she will try new foods, occasionally even admitting to liking them. What may seem “picky” one night might not on another, or to another parent. Because having a limited diet can have an impact on one’s health, scientists are trying to figure out what metrics can be used to classify a truly picky eater.

Figuring out what makes kids finicky

There are a lot of factors involved in a kid’s attitude towards food. Environmental feedback from parents and caregivers counts for a lot, but there’s evidence that kids all have an underlying predisposition for certain foods over others. One distinction that’s being made is between kids who object to a meal because they don’t like the food and kids who object as a way to gain control over a situation. Telling the two apart is sometimes easier said than done, as some kids seem to swing back and forth in their reactions to anything that’s not their favorite macaroni and cheese.

One truly measurable criterion may turn out to be genetics. Kids identified as “picky” by the adults in their lives had their DNA tested, with particular attention given to five genes related to taste. Out of those five, two genes were more likely to have variations in kids that turned their noses up at everything. Kids with very limited palates were most likely to have an unusual nucleotide on the TAS2R38 gene, and kids that turned meals into power struggles showed differences on their CA6 gene. Incidentally, both genes are associated with bitter taste perception, and so these kids’ objections may be tied to being extra sensitive to bitter flavors. Since many plants have evolved bitter-tasting toxins as a defense mechanism, it’s not surprising that bitterness would be something kids fight over.

Minimal menu leads to damaged eyes

This doesn’t mean that picky eaters aren’t worth working with. Most veggies aren’t going to give kids a dangerous dose of toxins, and eating them may just save kids from serious vitamin deficiencies. A boy in Canada was recently brought to a hospital because his vision was deteriorating at an alarming rate; he could only make out a blur of movement if objects were dangled a foot in front of his face. Dry patches were found around the edge of his iris, and his cornea was somewhat disfigured.

Doctors eventually realized that he was severely vitamin A deficient, thanks to an extremely limited diet of lamb, pork, potatoes, apples, cucumbers and Cheerios. Without a trace of carrots, sweet potatoes, spinach or fish in his diet, the boy had essentially starved himself of a nutrient most of us don’t need to worry much about. Instead of eating his vitamin A, he was left to receive multiple doses of it intravenously, which restored much of his vision, but not all of it. At least the apples and Cheerios are helping the poor kid get some fiber.

Source: Got a picky eater? How 'nature and nurture' may be influencing eating behavior in young children, MedicalXpress.com

On September 12th, 2017 we learned about

DNA test defies long-held assumptions by revealing that a decorated Viking warrior was, in fact, female

It’s weird when a fantasy series for kids ends up beating actual archaeologists to a historical fact. As it turns out though, thanks to characters like Astrid in the How to Train Your Dragon series, my kids are apparently more comfortable with the notion of a female Viking warrior than most scholars have been for the past 137 years. A grave in Birka, Sweden was discovered with a considerable amount of equipment suitable for a high-ranking warrior, but nobody even really considered the idea that this warrior was a woman until this year, when genetic testing firmly established her XX chromosomes.

The grave in question is known as BJ581, and is somewhat famous as an example of a successful warrior’s grave. In addition to the human skeleton, the grave also contained the bodies of a male and a female horse, a sword, an axe, a spear, armor-piercing arrows, a battle knife, two shields and a board game. Many of these items have since been found to be representative of warrior burial practices in the Middle Ages, but the board game stands out. The chess-like game is thought to be a sign that this particular warrior was able, and probably expected, to have a mastery of strategy and tactics, quite likely as a commander on the battlefield. The conclusion is therefore that the occupant of grave BJ581 was not only a fighter, but a skilled and accomplished one at that.

Weapons aren’t for women?

This same collection of grave goods has long been used as evidence that this warrior was male, because people were used to that idea, more or less. Researchers from the 1880s onward have essentially assumed that anyone able to wield a weapon must be male, most likely because of the researchers’ own cultural standards. In most of the studies of this grave, there was little investigation into the skeleton’s sex because it was considered a foregone conclusion. Only recently did Anna Kjellström actually investigate the sex of the body, a task made easier with modern DNA analysis. In addition to the discovery of two X chromosomes, isotope samples from the skeleton’s tooth root and upper arm also revealed that this woman had probably moved to Birka from elsewhere sometime between the ages of four and nine.

As definitive as the chromosomes are in this case, there’s still push-back in the academic community against accepting this correction. In some cases, people point to the idea that women may have been buried with weapons that were heirlooms, or for ceremonial purposes. Women may have been buried alongside male warriors and their weapons. People have even wanted to rewrite the meaning of the game pieces, suggesting that maybe this longstanding sign of tactical prowess was actually, after 100 years of agreement, included because the deceased might have enjoyed playing games. This is all despite other historical records pointing to women’s martial abilities in Viking societies, even beyond kids’ movies and cartoons.

Accepting female fighters

The warrior from grave BJ581 isn’t the first woman to face this kind of resistance. The most famous example is probably the Scythian women more commonly known as Amazons. Famous even in their own time, myths and exaggerations created some doubt about whether these warriors really did fight with bows and arrows, spears and swords from horseback, as written about and depicted in contemporaneous artwork. Today we have the physical evidence, including genetic testing, to confirm many of these legends, and it seems that the reputation of female Viking warriors may be on a similar track. With this new knowledge from Birka, people are now wondering how many other Viking women have been misidentified in other graves, and further tests should help settle some doubts about who exactly was fighting in Viking armies.


My third grader asked: Women have smaller bones than men?

Part of what sparked the interest in testing this warrior’s skeleton’s DNA was that the bones were proportionally a bit slighter than one might expect for a male. Hips and shoulders are usually more obvious hints at a skeleton’s sex, but studies have also found that males are more likely to have slightly thicker bones, such as around the tibia.

Source: First Female Viking Warrior Proved Through DNA by Kristina Killgrove, Forbes

On August 21st, 2017 we learned about

External stimuli set off a slate of unexpected gene expression in stickleback fish brains

In a tense situation, your body can spring into action, altering various physiological processes to help you respond to potential danger. Your heart rate may spike, you might breathe faster, and the release of hormones like adrenaline can help sharpen your vision and dull pain. It’s a pretty amazing biological tool kit, and scientists have recently found that it may be more complicated than previously understood. Studies of bees, mice and now stickleback fish have found that a fleeting encounter with an intruder may trigger a flurry of activity based around the animal’s DNA, a bit of anatomy not normally associated with temporary responses to daily stimuli.

The primary function of DNA is to act as an archive or schematic of your body. Each cell has an incredibly tightly coiled series of nucleotides that are usually only unpacked when a cell is dividing and needs to help build a new cell, or when specific proteins are needed to help the body function. The complexity of the molecules involved reinforced the assumption that this was always a slow process, but researchers tracking gene expression in stickleback brains found that DNA may be more accessible than we understood. Even interactions external to the body, such as encountering an intruder in one’s territory, were enough to trigger specific genes to start producing new proteins in the fish’s brains.

Specific sequence on a schedule

There seemed to be a reliable formula to the observed gene expression. Within 30 minutes of an encounter with a potential threat, genes relating to hormone production were accessed and activated. Sixty minutes after the encounter, genes that help control metabolism were active, followed by genes related to immune function and homeostasis at 120 minutes. The fact that this activity was observed in the telencephalon and diencephalon brain centers, which are related to learning, memory and social information, may suggest that all of it is meant to help the fish learn, or at least remember, their run-ins with trouble.
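
Laid out as data, the schedule is easy to scan. This is just the timeline above rearranged into a small Python lookup table, with nothing from the study beyond those numbers.

```python
# The reported stickleback gene-expression schedule after a territorial
# encounter: minutes elapsed -> categories of genes observed to be active.
expression_timeline = {
    30: ["hormone production"],
    60: ["metabolism control"],
    120: ["immune function", "homeostasis"],
}

for minutes, gene_groups in sorted(expression_timeline.items()):
    print(f"{minutes:>3} minutes after encounter: {', '.join(gene_groups)}")
```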

There’s likely more to this than simply being a reinforcement of more obvious activity between neurons. Sticklebacks are very territorial, and seem to claim any space they can control as their own. However, that means that they’re starting this chain of genetic activity many times a day, which seems energetically costly if each encounter requires unpacking sections of DNA on top of other responses. It may be that the DNA is not as immediately tied to the events themselves, but is actually priming the brain to help it learn more easily. More research is needed into how this all affects the fish’s memory, but it seems that DNA is relied upon more frequently than anyone expected.

Source: Brief Interactions Spur Lasting Waves Of Gene Activity In The Brain, Scienmag

On May 4th, 2017 we learned about

Age and “sweet tooth” genes can make eating sugar less satiating

Apologies if this makes me a bad parent, but I’m not actually sure how much sugar my kids eat each day. I do know that it makes them very excited, and so every possible spike in sucrose and fructose in their daily routine is something to be negotiated, connived or at least celebrated. In the case of my four- and eight-year-old, a lot of this love for sweets is probably tied to their ages: kids’ taste receptors don’t work the same way adults’ do, and their growth seems to help them use those calories too. If these preferences last past their 16th birthdays though, their mom and I may be to blame, not because of parenting, but because of genetics.

Dessert-oriented DNA

Danish researchers recently isolated what they believe to be a “sweet tooth” gene, FGF21. Two variations in this gene were associated with significantly higher daily sugar consumption among the 6,500 people who participated in the study. The more common variations of the gene help produce hormones that calm neurological reward responses, making sugar less exciting to our brains after a certain amount has been eaten. People with this genetic sweet tooth don’t seem to have that same cap, and happily consume more sugar without feeling sated by it. More troubling, this reward connection may mean these people are also more likely to consume more alcohol and cigarettes, although that hasn’t been explicitly proven yet.

Before you start blaming FGF21 for the last candy bar you ate, don’t forget the other sweet tooth gene, SLC2A2. Identified in 2008, this gene produces a protein called GLUT2, which helps move glucose around the body and helps us feel full after our blood sugar levels are normalized. In lab experiments, mice with a mutation on the FGF21 gene were prone to eating more food than other mice, and there may be a correlation with Type 2 Diabetes. Overall, a change in a single amino acid correlated with consuming as much as 25 more grams of sugar per day than people without the sweet tooth mutation.

Caloric counterbalance

Importantly, neither sweet tooth gene mutation really synced up with serious health problems (although these test participants’ dentists may have a different opinion on that). People with FGF21 mutations actually had lower body mass indexes on average, so if they were somehow eating more calories due to extra sugar, they were also making up for it elsewhere in their diets. People with SLC2A2 mutations were similar: while they may have eaten anywhere from 3 to 15 more grams of sugar than other people, they weren’t consuming extra calories as a result. They were just making sugar a bigger proportion of their diet. This may be problematic if the remaining calories aren’t providing enough vitamins, antioxidants and fiber, but by itself a sweet tooth isn’t necessarily a bad thing.
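
To make that “bigger proportion, not more calories” point concrete, here’s some quick back-of-the-envelope arithmetic. The 2,000-calorie total and 90-gram baseline are assumptions chosen purely for illustration; the 15-gram bump is the top of the range mentioned above.

```python
# If total calories stay fixed, extra sugar displaces other food instead of
# adding to the day's intake. The baseline diet values here are assumed, not
# taken from the study; sugar provides roughly 4 kcal per gram.
KCAL_PER_GRAM_SUGAR = 4
TOTAL_KCAL = 2000                        # assumed daily intake
baseline_sugar = 90                      # assumed grams of sugar per day
sweet_tooth_sugar = baseline_sugar + 15  # top of the 3-15 g range

for label, grams in [("baseline", baseline_sugar), ("sweet tooth", sweet_tooth_sugar)]:
    share = grams * KCAL_PER_GRAM_SUGAR / TOTAL_KCAL
    print(f"{label}: {grams} g sugar = {share:.0%} of a {TOTAL_KCAL} kcal day")
# Both days total 2,000 kcal; the extra 60 kcal of sugar simply replaces
# 60 kcal of something else in the diet.
```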

Source: Crave Sugar? Maybe It's in Your Genes by Dina Fine Maron, Scientific American