On February 12th, 2018 we learned about

Cheetahs’ high speeds are viable thanks to their uniquely-sized inner ears

Cheetahs don’t run fast to set records. They run to catch food. From that perspective, it makes sense that some of their most important adaptations aren’t only found in their long legs or flexible spines, but in their heads, where they help guide the cats towards their prey. A study of the big cats’ inner ears has found that those ears may be the most advanced, and most recently developed, features that help cheetahs not just run after, but actually catch their elusive prey.

The parts of the inner ear in question aren’t used for hearing. They’re specialized structures in the skull that tell vertebrates how their heads are oriented, like the accelerometers in your phone or video game controller. Each inner ear consists of three semicircular canals that contain fluid and sensitive hair cells. When the head turns, the fluid’s inertia makes it lag behind, tickling some hair cells but not others. With each canal handling a different direction of motion, such as side-to-side versus tilt, the combined data can be used by the brain to figure out not only how a head is oriented, but how it’s moving through space. This kind of information is crucial for movement and balance, especially when that movement is coming from a 65 mile-per-hour sprint after an unpredictable gazelle.

Steering at top speed

For a cheetah (Acinonyx jubatus) to stay on target, it needs to maintain visual contact with its prey. Even when its body is careening in a new direction, a cheetah does a great job of keeping its head steady so it can keep track of how its prey is moving. Researchers suspected that cheetahs’ inner ears were somehow enhanced to make this possible, and started scanning the skulls of cheetahs and other big cats with X-ray computed tomography, giving them a detailed, 3D view of how each animal’s ears worked. As expected, cheetahs were found to have larger and longer inner ear canals than other cats, which would give the fluid and hair cells more granularity in the signals they could provide. To put it another way, these super-sized sensors could pick up smaller variations in tilt and movement, plus have a slightly bigger range before those signals were “maxed out” at one extreme or the other.
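To make that sensor analogy concrete, here’s a toy sketch of a tilt sensor with a fixed measuring range and step size. All the numbers are invented for illustration (they aren’t measurements from cheetah ears); the point is just that a “larger” sensor can offer both finer steps and a wider range before clipping.

```python
# Illustrative analogy only: treat a semicircular canal like a simple
# sensor that clips at some maximum tilt and reports in discrete steps.

def read_tilt(true_tilt_deg, max_tilt_deg, levels):
    """Simulate a sensor that clips at +/-max_tilt_deg and quantizes
    readings into a fixed number of distinguishable steps."""
    clipped = max(-max_tilt_deg, min(max_tilt_deg, true_tilt_deg))
    step = (2 * max_tilt_deg) / levels   # smallest detectable change
    return round(clipped / step) * step, step

# A "smaller" canal: narrow range, coarse steps
reading, step = read_tilt(12.3, max_tilt_deg=30, levels=60)
print(reading, step)   # 12.0 1.0

# A "larger" canal: wider range before maxing out, finer steps
reading, step = read_tilt(12.3, max_tilt_deg=45, levels=180)
print(reading, step)   # 12.5 0.5
```

The second sensor both resolves the same tilt more precisely and keeps reporting up to 45 degrees, where the first one would have pegged at 30.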

By including the skulls of extinct cat species in this study, researchers were also able to estimate when cheetahs acquired this degree of sophistication in their inner ears. An extinct relative, Acinonyx pardinensis, was also specialized for running, but its ear canals wouldn’t have provided the same feedback found in a modern cheetah. That bulkier relative lived only a few hundred thousand years ago, showing just how recently this line of cats evolved this degree of sensitivity. That timing may make sense, since A. pardinensis probably wasn’t as fast as a modern cheetah either, and thus didn’t need quite as much control and maneuverability as today’s record-holding runners. As top speeds increased over time, it seems that the sensors and feedback systems that help the animal steer grew as well.

My third-grader asked: If balance is connected to your ear, do deaf people have problems with balance?

They can. One estimate says that as many as 30 percent of deaf people may experience some kind of persistent balance problem. Apparently the cause of the hearing loss, such as meningitis versus something like Usher syndrome, can affect how severe the balance issues may be.

Source: Cheetahs' inner ear is one-of-a-kind, vital to high-speed hunting by American Museum of Natural History, EurekAlert!

On January 18th, 2018 we learned about

Survey of species parses the prerequisites for wielding a weaponized tail

It’s hard to fight with your rear end. Aside from a creature like a horse chasing away flies with its tail, few creatures can be said to be intimidating from behind. There are exceptions to this rule, and they’re unusual enough that researchers have studied every tail-swinging tough-guy to see why those species evolved to fight with their rears instead of their claws or faces. Even though using one’s head or limbs would seem perilous in its own way, the number of anatomical requirements needed to make a tail weapon functional may have curtailed (ahem) their popularity in both living and extinct animals.

There’s obviously some variety to how a weaponized tail can operate. Dinosaurs like Stegosaurus carried spiked thagomizers, while some ankylosaurs ended up with bludgeoning clubs. Living species are generally a tad less ornamented, but the quills on a porcupine or the speedy snap of a monitor lizard’s tail can still make a foe think twice about attacking. Once a list of 286 species was assembled, researchers categorized them by an array of traits, from diet to size to the presence of bone or spikes on the tail’s tip. At that point, the data could be sorted and sifted to find what common threads united the reptiles, mammals and dinosaurs that have ever purposely backed into a fight.

Two traits proved to be universal among all the tail-swingers. Every animal that did battle with its rump was an herbivore, and as such these weapons weren’t being used for hunting. Most living species with dangerous derrieres only use them defensively, although exceptions, like the rainbow agama lizard, pick fights to compete for mates. Still, it seems that teeth and claws have an edge (sorry) for predatory activities, beating out clubbed behinds as the anatomy of choice for predators around the world.

Other traits that were tied to tail weapons weren’t quite as obvious. Creatures with enlarged tail tips, like Ankylosaurus, Glyptodon or Shunosaurus, were all huge, with specially formed vertebrae in their tails. Stiff tails seemed to require a smaller minimum body size, but were also linked to a likelihood of osteoderms, or bony skin armor. Tail spikes, such as those found on a Stegosaurus or meiolaniid turtles, were tied to more head ornamentation, wide hips and some kind of bony but discontinuous armor on the torso. All the extinct species were also found to share osteoderms enveloping their tail tips, as well as an unusually stiff trunk, possibly to provide better leverage for lateral tail-swings. Living backside-battlers still follow many of these rules, although the minimum size requirement has shrunk down to around three feet in length since the age of dinosaurs.

Putting these parts in perspective

This may seem like a very roundabout way to describe animals we already know of, but organizing these traits helps bring a few things into focus. Requirements like stiff bodies and protective skin coverings had to evolve before tails were weaponized, since a tail club or spike is the less common feature, and there’s evidence that anatomy like osteoderms actually fused together to create those weapons in the first place. However, most large herbivores found other ways to protect themselves before actually developing a rear-mounted defense mechanism. This isn’t to say that tail weapons were ineffective at their jobs, but that they require so many other anatomical conditions first that they’re simply unlikely to evolve at all.

It also seems that, as cool as a tail bludgeon or thagomizer may have been, the complete package of bulk, armor and bone didn’t necessarily out-compete other forms of defense once they were assembled. Instead, brawling with one’s butt remains a bit of a niche adaptation that evolution keeps playing with, but rarely fully commits to.

Source: Where did all the tail clubs go? by Victoria Arbour, Pseudoplocephalus

On November 14th, 2017 we learned about

Ancient otter had a bite big enough to put mollusks, mammals, birds and turtles on its menu

Paleontologists may have identified the most dangerous, and by extension, cutest, otter to ever live. Siamogale melilutra is estimated to have grown to 110 pounds, possibly acting as the apex predator in its local environment. This assessment isn’t just based on the otter’s wolf-sized body though, as researchers have also simulated its jaw strength, suggesting that this fuzzy critter could have probably munched just about any animal it got its mouth on.

S. melilutra lived around six million years ago in what is now southwestern China. It would have been a wet, swampy area with thick plant growth and plenty for an otter to eat, assuming these carnivores ate similarly to their modern river otter cousins. That’s a wide range of possibilities, as modern otters are known to eat everything from plants to rodents to clams. Once researchers factored in how the size of S. melilutra must have influenced its bite strength, they had to expand the possibilities to include thick-shelled mollusks, birds and even the local turtles.

Baby-faced with big bites

Otter bites are generally pretty impressive, being driven by large muscles in the face that also create the cute cheeks people find so adorable. To create a more detailed picture of S. melilutra’s bite, researchers scanned fossilized jaws with computed tomography (CT), getting a picture of the ancient bones inside and out. They calculated the range of the jaw’s movement and how much force it could exert on a crunchy snack, then compared that to similar calculations based on modern otters to see if the results fit with better-known otter anatomy. The overall trend was that smaller otters tend to have smaller, sturdier jaws than their larger kin. However, S. melilutra broke that trend, as its jaws were six times stronger than you’d expect for its size. Basically, its bite was even bigger than its already large body.

While calculating the physical forces a skull can exert is helpful, otters tend to complicate matters with their behavior. Most animals’ bites are correlated with the food they need to break apart more than their overall size. If this were strictly true in otters, clam-eating sea otters would have even more enormous jaws than they already do in order to break open hard shells. By using rocks, the otters have come up with an alternative to bigger teeth and jaws, and so their anatomy doesn’t need to match their diets so closely, allowing for the closer correlations with body size. Even allowing for this otter-specific weirdness, the fact that S. melilutra had both larger and stronger jaws still suggests that it was a top predator, as its powerful bite would have put just about everything in its environment on the menu, even without figuring out how to break shells with stones.
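That “six times stronger than you’d expect for its size” claim boils down to comparing a measurement against a scaling trend fit to living relatives. Here’s a minimal sketch of that kind of comparison; the power-law coefficients and the “observed” value are placeholders I’ve made up, not numbers from the study.

```python
# Hypothetical allometric baseline: jaw strength predicted from body mass
# via a power law, y = a * mass^b. In a real study, a and b come from a
# regression over modern otter species; these values are invented.

def expected_jaw_strength(mass_kg, a=2.0, b=0.67):
    """Predicted jaw strength (arbitrary units) for a given body mass."""
    return a * mass_kg ** b

mass = 50.0                                   # roughly 110 pounds
observed = 6 * expected_jaw_strength(mass)    # stand-in "measured" value

# How far above the trend line the fossil sits
print(round(observed / expected_jaw_strength(mass), 6))   # 6.0
```

The interesting number is the ratio, not the raw strength: a ratio near 1 would mean the fossil fits the trend for its size, while S. melilutra’s reported ratio of about 6 is what marks it as an outlier.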

Source: A giant, prehistoric otter's surprisingly powerful bite by Charlotte Hsu, University at Buffalo News Center

On October 24th, 2017 we learned about

The strange but important reasons for your body’s Sonic Hedgehog gene

After a week of playing Acapella Science’s Evo-Devo in the car to my kids, I thought it would be worth walking through the song to try and explain what it’s talking about. The song is dense, packing in a ton of information about the relationship between individual animals’ growth and species’ evolutionary change. As far as I could tell, my four- and eight-year-old followed along as well as could be expected:

Genes provide instructions to build specific body parts. Got it!
Changing when those instructions are used can put anatomy in new locations. Ok!
That’s Sonic the Hedgehog, a video game character known for collecting gold rings and magic gems. What?

The glimpse of SEGA’s Sonic the Hedgehog is brief and unexplained in the song’s video, and my kids certainly weren’t the first people to be confused by the character’s inclusion in a discussion of genetics. As it turns out, Sonic is included as a reference to a specific gene pathway named after the video game character, much to the chagrin of scientists and doctors ever since its discovery. This is largely because the Sonic Hedgehog gene turned out to be an incredibly crucial ingredient in embryonic development, and mutations can lead to serious and often lethal health problems. Nobody wants to hear about a sassy video game mascot when facing life-and-death medical issues, and as you’d imagine, the name was originally picked with a completely different context in mind.

Figuring out names for fruit fly genes

To try and make sense of this, we need to first back up to fruit flies. Drosophila melanogaster is a fruit fly species commonly used in laboratory experiments due to its relatively simple genome, manageable size, and quick life cycle. The flies reproduce and grow up quickly, allowing scientists to make changes to their genes and then see the effects of those alterations in action as the next batch of flies grows up. As the effect of a specific gene was isolated, researchers would give it a name that described its function. An example is antennapedia, a gene that regulates when flies grow legs and antennae, as both types of anatomy are grown from the same underlying genetic instructions. Decreasing antennapedia’s functionality leads to legs growing from the fly’s face, while exaggerated versions of the gene will cause all the fly’s legs to grow as antennae.

At some point, researchers added a bit of irreverence to their naming conventions. This was a fun way to liven up hours spent raising deformed flies, but it also helped these names stick in people’s memories. Strings of prefixes and quantities may be accurate and inoffensive, but they won’t help you remember that a certain gene mutation can cause a fly to develop without a heart in the same way the name “Tinman” can. Flies with mutations in the “Out Cold” gene lose coordination in low temperatures. “Groucho Marx” mutations cause an excess of facial bristles. The list of silly names is long, but a group that has risen to prominence is the “hedgehog” genes, a name picked because mutations cause fly larvae to sprout unusually dense coverings of hair-like denticles on their backs while growing shorter and squatter overall, looking a lot like a hedgehog.

From “hedgehog” genes to human health

The hedgehog genes proved to be complex and important to various lines of research. To differentiate between them, specific variations were named after types of hedgehogs, including Indian hedgehog and desert hedgehog. This seemed fine until Dr. Robert Riddle found a new variation in 1991, and opted to name his hedgehog after the video game character which was debuting in the United States around that time. In that moment, it probably made sense, carrying on both the hedgehog group name and the tradition of adding humor to a gene’s name. The name certainly didn’t consider what this particular gene meant to an organism’s health, or that 75 percent of fruit fly disease genes also turn up in mammals like humans, meaning it was going to be relevant to a much wider audience.

When working properly, the Sonic Hedgehog gene pathway creates proteins that regulate what anatomy gets built where in a developing embryo. Higher concentrations of these proteins at one end of an embryo lead to different outcomes than lower concentrations elsewhere, allowing the pathway to serve multiple purposes in a developing body. Many of these functions are associated with making mirrored anatomy— you have two lungs, two lobes in your brain, etc. Eyes, for example, initially grow as a single “eye field” in the middle of an animal’s face, but Sonic Hedgehog signaling gets the body to split that tissue into two distinct eyes. When these functions are disrupted, one result is holoprosencephaly (HPE). Mild cases of HPE may lead to a single, fused incisor instead of the usual two front teeth, but more severe cases often lead to stillborn offspring with cycloptic eyes and underdeveloped brains. This is one example of when a doctor doesn’t want to be discussing a fictional blue hedgehog, although bizarrely there is a small, unintentional connection between the character and HPE— Sonic is usually drawn with one giant sclera containing two pupils, rather than two distinct eyeballs. By some total fluke of character design, it appears that Sonic himself may be displaying symptoms of HPE.
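The concentration-dependent part of that description matches a classic developmental-biology idea, often called the “French flag” model: cells read the local level of a signaling protein and commit to different fates at different thresholds. Here’s a toy version; the decay rate and thresholds are invented for illustration, not measured Sonic Hedgehog values.

```python
import math

# Toy morphogen gradient: concentration falls off exponentially with
# distance from the source, and two thresholds split the tissue into
# three different cell fates. All numbers are made up for illustration.

def fate(position, source_conc=1.0, decay=0.5, hi=0.6, lo=0.2):
    """Return the cell fate at a given distance from the signal source."""
    conc = source_conc * math.exp(-decay * position)
    if conc > hi:
        return "fate A"
    elif conc > lo:
        return "fate B"
    return "fate C"

# Cells closest to the source commit to one fate, farther cells to others
print([fate(x) for x in range(6)])
# ['fate A', 'fate A', 'fate B', 'fate B', 'fate C', 'fate C']
```

One gradient, read at different positions, patterns three tissues at once, which is roughly how a single pathway can serve so many purposes in one embryo.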

Calls to stop controversial gene names

With all the confusion and controversy surrounding the Sonic Hedgehog gene, it seems like it should have been the last “silly” name that could be tied to a medical condition. It certainly stands out, but it wasn’t the last time fictional characters were attached to a gene’s name. Beatrix Potter’s Mrs. Tiggy-Winkle is the namesake for another hedgehog gene, making studies like “Sonic Hedgehog and Tiggy-Winkle Hedgehog Cooperatively Induce Zebrafish Branchiomotor Neurons” a reality instead of just word salad. Not every silly name gets through though, as Pokemon USA threatened to sue researchers who wanted their gene discovery to be dubbed the Pokemon gene, as the company understandably didn’t want its characters becoming associated with cancer. Future discoveries are likely to receive duller names as well, thanks to nomenclature committees looking to crack down on joke names that are more likely to reach the public’s ear than in the past. Theoretically, hearing about a disorder based on the “LFNG O-fucosylpeptide 3-beta-N-acetylglucosaminyltransferase” gene should somehow be more dignified than learning about problems with Sonic, especially if you’re more of a Mario fan.

My four-year-old asked: I don’t have a bad [Sonic Hedgehog] gene, do I?

There’s no sign of that. The Sonic Hedgehog gene is really only active in embryos, since it’s needed to help properly assemble anatomy in the correct arrangement. Once you’re done growing, its primary job is done. That said, it may also play a role in some regenerative functions, having been associated with hair regrowth in rats, among other things.

Source: The sonic hedgehog gene by Anna Perman, The Guardian

On September 28th, 2017 we learned about

How theropod dinosaur heads were reshaped to build the beaks and brains of modern birds

How did evolution build a chicken out of Tyrannosaurus parts? There’s no doubt that modern birds evolved from theropod dinosaurs millions of years ago, but scientists still want to know exactly how that transition occurred. Anatomy usually doesn’t appear or disappear all at once, but is instead resized and reshaped into forms that will better serve a creature in its changing environment. However, when it comes to the transformation from a T. rex to a chicken, some more dramatic swaps somehow took place across dinosaurs’ skulls, particularly with vanishing teeth and the rise of big beaks.

Thorough investigations of swaths of reptiles, dinosaurs and birds have found that the transformation from a theropod’s toothy snout to a bird’s hard beak had its roots in the individual life-cycles of various theropods. Limusaurus inextricabilis was a theropod that was born with teeth, but lost them entirely by the time it was an adult. Computed tomography (CT) scans of Limusaurus skulls found that even after this change took place, the adults still had tooth sockets in their lower jaws. Their mouths became beaks because those sockets were basically covered up by a sheath of material called keratin.

Developing a beak from birth

Keratin is a fibrous material that all kinds of animals use to grow a huge variety of structures on their bodies. Keratin can form hair, horns, feathers, claws and fingernails, all with the possible advantage of being flexible at smaller sizes, to help avoid breaking, while also being easier to replenish than harder materials like enamel-covered teeth. So while a creature like Limusaurus gave up meat-slicing teeth as it aged, it was gaining a strong, repairable beak that could still help it capture all the calories it needed.

Of course, modern birds don’t have to wait until they’ve fledged to enjoy the benefits of a beak. Researchers believe that over time, avian dinosaurs started the process of growing their keratin sheaths earlier and earlier in their lives, eventually starting before hatchlings even exited their eggs. At some point, this meant that these dinosaurs skipped their toothy phase altogether, jumping straight to a beak from the beginning. Unsurprisingly, the gene that seems to spur beak growth, BMP4, has also been found to suppress the development of teeth, putting this whole transition together in a single mechanism.

Shaping skulls for bird brains

There’s more to being a bird than a hard beak though. Our modern feathered friends also generally have proportionally bigger brain cases than their ancient relatives, generally with a rounder, roomier shape overall. A separate study again looked at skull anatomy across reptiles, dinosaurs and birds, and found that while reptile heads don’t show a ton of change in the fossil record, there’s a bigger jump from non-avian dinosaurs to modern birds. Researchers believe that as bird brains got bigger, the frontal and parietal bones in the skull basically had to balloon out to accommodate them. They don’t yet have anything as concrete as a single regulatory gene to point to, but the advantages of a bigger brain seem like an obvious enough evolutionary pressure to warrant reshaping the back of a species’ skull.

Source: How did dinosaurs evolve beaks and become birds? Scientists think they have the answer by Michael J. Benton, Phys.org

On September 4th, 2017 we learned about

Figuring out the functional advantage of an ostrich’s four kneecaps

What makes ostriches the fastest bird on land? Is it their nearly four-foot-long legs? Their single-toed running? Or maybe the secret to their 43-mile-per-hour speeds is their four kneecaps? The latter is an interesting possibility, partially because ostriches are the only animals to pack two patellae into each knee, suggesting that the extra bones help the birds achieve their record-holding speeds. Thanks to a well-flexed ostrich cadaver, we now have a few more answers.

Seeing the inner workings of a kneecap isn’t easy when it’s running around on a nine-foot bird, but researchers from the Royal Veterinary College in London were able to take a closer look at the knees of a dead ostrich donated for study. The bird’s legs were repeatedly flexed in walking and running motions while the knees were monitored with biplanar fluoroscopy, a sort of advanced form of X-ray imaging. The upper patella, or kneecap, looked very similar to what’s found in our own legs, wrapping around tendons that join the thigh to the shin. The lower patella was more closely mounted on the shin bone, presumably to protect the same tendons as they straddled the knee joint.

Not exactly the knees you know

The weird part was how all these pieces came together when the leg moved. In your knee, the kneecap helps guard tendons, but more importantly it holds those tendons forward, away from the joint’s pivot point, past where your thigh bone ends. This small shift in distance allows the thigh muscles to use a lot less effort to straighten the lower leg, trading distance for force. In ostriches, however, this isn’t the case, and their knees seem designed for nearly the opposite goal.

The ostrich kneecaps were found to actually demand more effort from the thigh muscles to straighten the joint. Researchers don’t think that ostriches have gotten this far on defective knees though, pointing to the advantage in reversing how the tendons can be moved by the thigh muscles. They may require more effort in this configuration, but they seem to allow the leg to straighten faster, possibly bestowing a bit of extra speed in the process. The fact that ostriches can dish out potentially fatal kicks may also be a factor here, as the faster delivery of force will impart more damage to a target than a slower leg would.
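This trade-off is basic lever mechanics, and it can be sketched in a couple of lines. The forces, speeds, and moment arms below are invented numbers purely to show the direction of the effect: shrinking the tendon’s effective moment arm costs torque but buys joint speed.

```python
# Torque and speed at a hinge joint for a given muscle, as a function of
# the tendon's moment arm (its distance from the joint's pivot). Values
# are illustrative, not ostrich measurements.

def joint_torque(muscle_force_n, moment_arm_m):
    """Torque = force x moment arm."""
    return muscle_force_n * moment_arm_m

def joint_speed(tendon_speed_m_s, moment_arm_m):
    """Angular velocity = tendon shortening speed / moment arm."""
    return tendon_speed_m_s / moment_arm_m

# Larger moment arm (human-like patella): more torque, slower rotation
print(joint_torque(1000, 0.05), joint_speed(0.1, 0.05))    # 50.0 2.0
# Smaller effective arm (ostrich-like trade): less torque, faster rotation
print(joint_torque(1000, 0.025), joint_speed(0.1, 0.025))  # 25.0 4.0
```

Same muscle, same shortening speed: the second configuration straightens the leg twice as fast while demanding the muscle work harder per unit of torque, which matches the "more effort, but faster" finding described above.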

Studying exclusive anatomy

It’s hard to really isolate the benefits of these specialized knees, because there’s not really a good point of comparison in the natural world. Ostriches’ closest relatives, like emus or cassowaries, actually lack kneecaps altogether, so they’re not even a half-way point for looking at how patellae change these large birds’ sprints. It’s not just an issue with large, flightless birds though, as other birds in the same ratite family, like kiwis, do have “normal” kneecaps. While simulations of theoretical anatomy may someday prove the exact benefit of four kneecaps, researchers are also looking at larger questions, such as when the first patella evolved, and if all these new iterations somehow stem from a long-lost ancestor.

My third grader asked: Wait- their knee is up by their body?

Yep. As with many non-humans, ostrich thighs aren’t that long. The main joint we see when they run is the ankle, which is why it bends “backwards” when they flex it. What looks like our ankle is actually an elevated toe joint. It’s all the same bones, but they’re proportioned, and thus function, just a bit differently.

Source: Why the ostrich is the only living animal with four kneecaps by Michael Marshall, Zoologger

On September 3rd, 2017 we learned about

Simulating the stresses that make single-toed horse feet make sense

Farriers would have had a much more difficult job 55 million years ago. Making a modern horseshoe isn’t necessarily easy, but the famous “U” shape is a lot simpler than what an ancient horse’s four- and three-toed feet would have needed. Of course, ancient animals like Hyracotherium probably wouldn’t have required a metal shoe in the first place. They were much smaller animals that scurried through tropical forests on feet that looked a lot like your average dog’s, give or take a toe. That obviously changed over time, and while scientists are confident about the course of change that created modern one-toed horses, zebras and donkeys, they’ve had a much harder time understanding how these changes provided much of an advantage over retaining more digits.

Where to put the weight

To dig into potential biomechanical advantages of the modern horse hoof, researchers made detailed scans of fossils and bones to really understand the structure of each species’ legs. More ancient animals, like Hyracotherium, weighed only around 20 pounds. Nonetheless, their weight was distributed over multiple digits on each foot, with each digit ending in a small hoof. This arrangement was similar to a modern tapir’s, and meant that no single toe had to bear too much stress with each step.

Over time, the central toe took on a bigger and bigger role. In Pseudhipparion, a horse ancestor that lived in the Miocene epoch, the middle toe was found to hold up to the stresses of running as well as a foot with every toe pitching in equally. As species continued to grow larger and larger, the central toe became more and more robust to handle the load. We sometimes put shoes on horses today, but a healthy hoof, combined with thicker, slightly hollowed leg bones, can handle a lot of stress without a problem.
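The load-sharing idea lends itself to a quick back-of-the-envelope check. All of the masses and hoof contact areas below are invented for illustration; the takeaway is just that the pressure on any one toe depends both on how many toes share the load and on how big each contact surface is.

```python
# Back-of-the-envelope stress comparison: body weight spread over several
# small hooves versus carried on one robust central toe. The masses and
# contact areas are made-up example values, not measurements.

def stress_per_toe(body_mass_kg, n_toes, toe_area_cm2, g=9.8):
    """Average pressure (N/cm^2) on each toe, assuming equal sharing."""
    return (body_mass_kg * g) / (n_toes * toe_area_cm2)

# Hyracotherium-like: light body, four small toes per foot
print(stress_per_toe(9, n_toes=4, toe_area_cm2=2))     # 11.025
# Modern-horse-like: far heavier, one toe, but a much larger hoof
print(stress_per_toe(500, n_toes=1, toe_area_cm2=80))  # 61.25
```

Even with a single much larger hoof, the heavier animal carries far more pressure per unit area, which is why a single toe only works once it becomes robust enough, backed by thicker leg bones.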

Dropping unneeded digits

With a center toe taking so much weight successfully, the other digits on a horse’s foot may have simply become dead weight. Once horses moved out of forests and began specializing in running through open spaces like grassy plains, their legs grew longer and speed became a bigger issue. Extra toes may not seem like a big deal, but if they’re not really being used, they’re essentially just more tissue requiring energy to grow, keep healthy, and move around. With a center digit strong enough to hold their weight, it seems that horses just didn’t gain anything from keeping their other toes, and so evolution purged them over time.

My third grader asked: So all the one-toed animals are horses, zebras and donkeys? Cows and gazelles aren’t related?

This evolutionary path has only been followed by members of Equidae, which is today represented by horses, zebras and donkeys. While cows and gazelles have hard hooves, they are actually more distantly related to horses than tapirs and rhinos. The fact that so few animals have abandoned nearly all of their digits is part of why scientists have been so curious about equids’ unusual feet.

Source: How the horse became the only living animal with a single toe by Nicola Davis, The Guardian

On August 17th, 2017 we learned about

Chilesaurus diegosuarezi’s plant-digesting gut and the origins of ornithischian dinosaurs

When we first heard about Chilesaurus diegosuarezi, the unusual dinosaur was being labeled as a rare, herbivorous theropod. The creature’s leaf-shaped teeth just didn’t look up to the task of tearing meat compared to the pointed and sometimes serrated chompers seen in most theropods. There was speculation that this strange hodgepodge creature was a weird hiccup in theropod history, but new analysis suggests that C. diegosuarezi wasn’t a plant-eating theropod, but a bipedal ornithischian. If correct, this dinosaur may represent the beginnings of the family tree we now associate with Stegosaurus, Ankylosaurus, and various duck-billed dinosaurs like Edmontosaurus.

C. diegosuarezi’s interest in eating plants isn’t being questioned. If anything, it’s thought to be an important factor in why its body has some more ornithischian traits. The big differentiator between theropods and ornithischians is usually the shape of their hip bones. Theropods, both before and after C. diegosuarezi, have what’s been called a “lizard” hip, because the pubis bone faces forward like on modern lizards. In contrast, ornithischians have “bird” hips, where the pubis faces backward. It’s all a bit confusing, because modern birds are actually theropods, even with that ornithischian-looking hip bone. With C. diegosuarezi being classified as an ornithischian too, it means that a rear-facing pubis bone must have evolved at least twice— once when the ornithischians first branched off from the theropods, then again with the development of birds.

What pressures moved the pubis bone

This may seem like a lot of arbitrary changes in anatomy, but there are explanations for why they would help each lineage survive. The plant-based diet of ornithischians would require bigger, more complex guts than chomped flesh would, and so they’d literally need more space in their abdomens. A forward-facing pubis probably didn’t leave enough room for their digestive demands, giving an advantage to herbivores with “bird” hips.

While we now have birds that eat plant-based foods like nectar and seeds, digestion probably isn’t the reason a crow or sparrow ended up with a rear-facing pubis. In that case, the evolutionary pressure may have been balance. Modern birds don’t have the thick, muscular tails older theropods had, and so as their tails were reduced to the feather-covered stumps we know today, their center of gravity shifted. To avoid tipping over too much, the pubis bone evolved to face backwards, taking a bit of weight with it.

Obviously, C. diegosuarezi’s hips weren’t putting it on the road to flight, but they were making room for a bigger tummy. Combined with a mouth well suited for eating plants, this suggests that adapting to an herbivorous diet was a driving factor in the split between theropods and ornithischians. That then raises questions about what was happening in these creatures’ environment to make plant-eating so enticing that such transitions would occur. Changes in the continents were likely leading to more moisture on land, making the world a much more attractive salad bar for creatures that were trying out eating more than meat.

Source: One of the Most Puzzling Dinosaurs Ever Discovered Just Got a Major Rebrand by George Dvorsky, Gizmodo

On August 15th, 2017 we learned about

Even the dimmed sunlight from the solar eclipse can pose a danger to your eyes

Odds are that you’ve never directly viewed a solar eclipse, and you shouldn’t start any time soon, for the sake of your eyeballs. While the eclipse does have interesting effects on our atmosphere, there’s nothing about the Moon blocking the Sun that magically transforms good sunlight into something dangerous. Sunlight is actually always dangerous, but most of the time it’s bright enough to remind us not to try and gawk at it. Even what seems like a small amount of light can be a health hazard to your eyes, so it’s very important to protect your peepers from the sun when things go dark on August 21st.

Our bodies are bathed in sunlight whenever we’re outside, and it’s obviously not such an immediate problem. Most skin can withstand short exposure to ultraviolet (UV) light without too much wear and tear, and our eyes handle indirect UV light pretty well (although wearing sunglasses is certainly a good idea). The reason this all compounds when viewing an eclipse is that you’re looking right at the Sun, and that light gets focused through the lens of your eye. Like a magnifying glass focusing sunlight to start a fire, your lenses concentrate light on the back of your eye, at the retina. The intensity of directly-focused sunlight can quickly damage your cells by creating reactive molecules called free radicals, which then go on to kill the cell.

Safer ways to stare at the Sun

In most cases, this kind of damage is somewhat limited. The retina will basically be left with gaps where cells have been killed, and you will have a new set of blind spots in your eye to contend with. Sometimes people recover from this damage, but sometimes they’re left legally blind, able to see only with the peripheral vision that wasn’t torched by the sun.

This isn’t to say that the only way to enjoy an eclipse is to avoid it. While your sunglasses are in no way up to the task of protecting your eyes when viewing an eclipse, solar-viewing glasses are designed to allow only a safe amount of light through, around 0.00032 percent of normal sun exposure. Alternatively, you can view the eclipse the same way you usually take in sunlight: indirectly. A simple pinhole camera will let you safely watch a projected image of the Sun as it gets blocked out, all without staring right into the sky. If you want a closer look, don’t use your favorite telescope or binoculars unless you have specific solar filters for them as well, since otherwise they’ll focus sunlight at your retina even more effectively than your own eye’s lens can.
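To put that 0.00032 percent in perspective: solar filters are often rated by optical density (OD), where the transmitted fraction of light is 10 to the power of negative OD. A quick illustrative calculation (the OD values here are examples for the sketch, not a safety specification) shows how dark these filters really are:

```python
# Illustrative only: relating a filter's optical density (OD) to the
# percentage of incident light it lets through. Transmission = 10 ** (-OD).

def transmission_percent(optical_density):
    """Percent of incident light transmitted by a filter of the given OD."""
    return 10 ** (-optical_density) * 100

# An OD-5 filter passes 0.001% of incoming light; the ~0.00032% figure
# quoted above works out to an OD of roughly 5.5.
for od in (3.0, 5.0, 5.5):
    print(f"OD {od}: {transmission_percent(od):.5f}% transmitted")
```

In other words, eclipse glasses block all but a few millionths of the Sun’s light, which is why ordinary sunglasses (closer to OD 3, or 0.1 percent transmitted) don’t come close.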

My third grader asked: Isn’t the sunlight blocked enough to be less of a problem?

It takes very little sunlight to harm your eyes, especially when it’s being focused into your eyeball. However, once the Moon completely blocks the Sun during totality, it’s recommended that you take off your protective eyewear, as things will otherwise be too dark to see. With luck, you’ll get a peek at the Sun’s atmosphere around the outer rim of the Moon, and this faint light won’t be enough to cause harm. As soon as the Moon starts to move out of the way, though, get your glasses back on, since any direct sunlight can be a problem.

Source: If the Sun Is 93 Million Miles Away, Why Can't We Look Directly at It? by Rachael Rettner, Live Science

On August 8th, 2017 we learned about

The ups and downs of a deer’s annual investment in disposable antlers

For all of the underlying biology we share with other animals, it’s hard to relate to antlers. They grow on heads, but they’re not hair. They’re not the cellular equivalent of fingernails that we find in rhino horns or porcupine quills. Instead, antlers are weird, bony growths that sprout anew every year, demonstrating just how much of a strain and specialization a body can go through in the name of sexual selection.

Exhausting anatomy

An antler starts growing on a deer’s head in early spring each year. Unlike the inert keratin that makes up your fingernails or hair, antlers are made of living cells, and grow inside a fuzzy layer of skin called “velvet.” As the antlers develop over the summer, they’re composed of active blood vessels, nerves and bone cells, and can grow up to three-quarters of an inch per day. Keeping this tissue alive isn’t free though, and deer will often have to strip nutrients from other anatomy to keep their growth on track. On top of everything else, it’s an investment that deer make annually, since unlike the horns of rams or rhinos, antlers are shed every year.

Great… or good enough

Theoretically, it’s all worth it though. Deer courtship places a lot of emphasis on antlers both as display structures and weapons. Male deer will butt heads and lock antlers to demonstrate their fitness. Like other famous bits of animal anatomy, bigger antlers help attract and impress mates while staving off potential challengers.

There’s a limit to all that “fitness” though, and studies have found that sporting the biggest rack is not always the winning reproductive strategy. The increased metabolic demands and risks associated with bigger antlers seem to have given rise to a more subtle population of deer that get by just fine with more modestly-sized head ornaments. The assumption is that smaller antlers are just big enough to catch a mate’s eye without demanding too much upkeep. The fact that they’re less likely to get caught on a tree branch may also help keep their owner alive to try to mate again another year.

Cellular secrets

This isn’t meant to diminish how impressive these head-bones are, though. Regardless of an antler’s size, scientists have been studying the cellular properties that let antlers grow so quickly while also being so resilient. The fibrous structures that compose the bone grow in a staggered pattern that helps them stand up to stress without being damaged. Antlers have been transplanted to other body parts, and even onto other animals like mice, and they keep growing as if they were still on a deer’s noggin. Scientists aren’t looking to affix spikes to people’s heads exactly, but the fact that nerves can grow so quickly in an antler may make it a model for human therapeutics someday.

My third grader asked: Do only boy deer grow antlers?

Outside of some unusual anomalies, it’s fair to say that antlers are a male appendage in just about every species of deer. The notable exception is reindeer, as both males and females grow antlers each year. It’s thought that the antlers help females claim territory that might hold precious bits of food. Coupled with a scarcity of predators on the tundra for the deer to hide from, it seems that having antlers ends up being a good thing for each and every reindeer.

Source: Antlers Are Miraculous Face Organs That Could Benefit Human Health by Jason Bittel, Smithsonian