On July 9th, 2018 we learned about

Crows strike first against ravens, preempting the risk supposedly posed by their fellow corvids

No bird wants to see more corvids move into their neighborhood. Aside from the likelihood that a magpie, crow or raven would outsmart their feathered kin, these birds are also likely to prey on smaller birds and their eggs. As much as that may threaten an individual chicken or swallow, it turns out that the corvids don’t really make a dent in overall bird populations; birds that corvids don’t catch are equally likely to be captured by some other predator. Nonetheless, it seems that some corvids have taken their dangerous reputation to heart, which may be why they attack each other so vigorously when there’s a nest that might need defending.

Aggressively preventing predation

After researchers analyzed reports from thousands of amateur observers, a clear pattern emerged in crow and raven interactions. Without any prompting, many of these observers noted how crows would seek out and drive ravens away from their territory, even before the ravens had made any kind of threatening move of their own. In fact, the encounters were so one-sided that researchers found the American and Northwestern crows were the aggressors in 97 percent of them. The smaller crows did seem to avoid one-on-one confrontations, preferring to form small teams to harass the larger ravens.

The aggression was most likely tied to nesting. Crows were most aggressive during breeding season, although they started to ramp up their attacks in the preceding winter months as well. Researchers believe that this was likely to defend a chosen territory and nesting site, blocking the ravens from gaining a foothold or access to resources anywhere near the crows’ eventual nursery.

Safe and sound?

Unfortunately, the actual predation of young crows by ravens likely isn’t as conspicuous as a team of crows driving a raven away, and so it’s less clear how much of an impact the crows’ behavior makes. If the pattern for corvids versus non-corvids holds true, it would suggest that while the crows may succeed in saving their young from the ravens, there’s a good chance some other threat will balance out their numbers nonetheless.

Source: Crows are always the bullies when it comes to fighting with ravens, Science Daily

On July 9th, 2018 we learned about

Prehistoric pink pigments found in fossils of world’s most ancient organisms

Beauty, and by extension coloration, is only skin deep. It’s a frustrating fact for paleontologists, who can often only guess at what colors extinct creatures like dinosaurs may have been millions of years ago. Allowing for a few unusual exceptions, it’s just very unlikely for the color-producing pigments from an animal’s skin to be preserved as a fossil. Unless, apparently, that organism is so ancient and simple that there was never skin to worry about, in which case we can say with confidence that some of the world’s original organisms were all pink.

This conclusion is the result of oil drilling in the Sahara Desert. Some of the extracted shale was found to contain microscopic fossils from 1.1 billion years ago, well before any multi-cellular organism ever wriggled or swam across the Earth. When analyzed further, researchers realized that the fossilized cells were preserved well enough to carry molecules from the organisms’ pigmentation, and that the pigment would have given each cell a light pink hue. That pink was likely part of an early version of chlorophyll, which helped researchers identify exactly what kind of organism produced it.

Large numbers, tiny size

Cosmetic concerns aside, these fossils were identifiable as an ancient form of cyanobacteria. Their concentration was high enough to suggest that these tiny organisms likely dominated their ecosystem to such an extent that they may have been holding other forms of life in check. Until algae really spread throughout the oceans, an ecosystem flooded with minuscule cyanobacteria wouldn’t have provided much nutrition for larger, more predatory organisms. In fact, they’re so small that many have been appropriated into larger organisms’ cells, making up the chloroplasts found in plant cells today.

More complex organisms still had a long time to wait though. These tiny, pink cells would continue to dominate the planet for another 450 million years after this particular batch started becoming fossils, which is to say until roughly 650 million years ago.


My fourth-grader asked: What are cyanobacteria? What were they eating then?

Cyanobacteria were likely the first organisms on the planet, and they’re still alive today. They generally live in water, and produce their own food through photosynthesis, which is why some now live inside plants as mentioned above. Thankfully for the rest of us, cyanobacteria’s primary waste product is oxygen, meaning their metabolism is actually the reason we have air to breathe today.

Source: Scientists discover world's oldest colour – bright pink by Luke Henriques-Gomes, The Guardian

On July 8th, 2018 we learned about

The sugar content of plants signals brown planthoppers to specialize in flight instead of reproduction

As a human, I know that eating extra sugar in my diet will generally only help me produce extra fat on my body. There’d be some neurological pleasure associated with eating the sugar, but once I covered my caloric needs for the day, I wouldn’t gain anything more interesting than a bit of pudge around my waistline. As unglamorous as that is, it’s even worse when you compare it to what brown planthoppers (Nilaparvata lugens) do with their extra sweets: instead of getting fat, these small insects turn their extra sugar into longer wings, allowing them to travel to new homes when their plant of origin has been depleted.

Soaring when things get sweet

Strictly speaking, it’s not that the planthoppers need sugar to grow wings, but that the sugar is a signal that it’s time to migrate to a new plant. Unlike humans and other mammals, the planthoppers aren’t actually terribly interested in consuming sugary glucose from the rice plants they live on, as they’d rather chow down on amino acids. As a rice plant ages, it produces fewer of the planthoppers’ favorite proteins and more glucose, which the insects apparently use as a signal that it’s time to move on to a new edible home.

That migration isn’t easy though, and only a subset of planthoppers will ever be capable of moving on. This is because planthoppers can grow into one of two body types by the time they reach adulthood; they can either have stubby wings but large, productive ovaries, or they can have longer, flight-suitable wings and smaller ovaries. Researchers only recently confirmed that the insects’ glucose consumption was the key factor in which body type each bug ended up with. In fact, planthopper anatomy is so tightly controlled by glucose that injecting the young insects with glucose could get them to grow larger wings, even if they were raised in a lab without rice plants at all.
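To make that one-way developmental switch concrete, here’s a toy model in Python. The threshold value and units are my own invention; the study only established that glucose intake tips nymphs toward one morph or the other, not an exact cutoff dose.

```python
def adult_morph(glucose_intake: float, threshold: float = 1.0) -> str:
    """Toy model of the planthopper's irreversible developmental switch.

    The real cue is rising glucose (relative to amino acids) in an aging
    rice plant; the threshold here is a made-up placeholder, since no
    simple cutoff value was published.
    """
    if glucose_intake >= threshold:
        # High glucose: commit to the migratory body plan.
        return "long wings, smaller ovaries (built to migrate)"
    # Low glucose: commit to the stay-and-breed body plan.
    return "short wings, larger ovaries (built to reproduce)"

# Once a nymph matures, the choice can't be reversed, which is exactly
# what makes this switch a tempting pest-control target.
print(adult_morph(1.5))  # long wings, smaller ovaries (built to migrate)
print(adult_morph(0.2))  # short wings, larger ovaries (built to reproduce)
```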

Wings at the wrong time?

Of course, none of this was being investigated to help the planthoppers. The insects are considered pests, as they consume large amounts of rice needed for human consumption, and so the hope is that understanding exactly how they mature into a winged form will help people devise a way to control their proliferation. Since a planthopper can’t reverse its growth once it’s started down one path or the other, researchers are looking for a way to use the bugs’ specialization against them. If they could be exposed to extra glucose, for instance, they might be spurred to migrate and reproduce less, sparing plants from being completely consumed. Or if the glucose signal could somehow be suppressed, they might end up staying on an older plant long after its amino acid supply had been depleted.

Source: Host plants tell insects when to grow longer wings and migrate, WSU Insider

On July 8th, 2018 we learned about

Triple star system demonstrates that gravity is agnostic to the density of neutron stars

To test an idea scientifically, you should aim to have an experimental condition that you can compare to a control. So for example, if you want to see if food coloring helps your flowers grow faster, you’d want to grow the same kind of flowers without food coloring for comparison. Now, setting up experimental conditions isn’t so tough when you’re talking about small plants in a flowerbed, but what if you’re interested in something that could only be observed in some of the most extreme conditions in the universe? Maybe something like the nature of gravity in ultra-dense neutron stars? In that case, you may need to get lucky and find PSR J0337+1715, a star system apparently ready-made for exactly those kinds of questions.

Different types of dying stars

The unfortunately-named PSR J0337+1715 is a triple star system some 4,200 light years from Earth. Two of these stars are white dwarfs, meaning they’re the decaying cores of larger stars from long ago. They no longer have the temperatures needed to enable nuclear fusion and push their mass outwards, and as such have collapsed in on themselves. This means that a white dwarf with the mass of our Sun would have only about the volume of Earth, making these stars significantly denser than anything in our solar system, although a lot of their shrinkage is thanks to material being lost altogether.

While this is clearly different from our Sun, in the case of PSR J0337+1715 the white dwarfs are providing the normal baseline for researchers’ observations. The neutron star, on the other hand, is where the major questions lie. Instead of slowly degrading like a white dwarf, a neutron star is the result of a much larger star collapsing all at once. It packs a lot more mass into a lot less space, leaving it with incredible density and equally extreme gravitational and magnetic fields. The gravitational field is so strong that an object dropped from three feet above a neutron star’s surface would hit it at over three million miles per hour, being spaghettified by tidal forces along the way. There was also a chance that these extremes would cause the neutron star to interact with gravity in a different way than less-dense objects, such as some conveniently-positioned white dwarf stars.
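That impact speed is easy to sanity-check with a bit of Python. The mass and radius below are my own "textbook" assumptions (roughly 1.4 solar masses packed into a 10-kilometer ball), since the article doesn’t quote figures for this particular neutron star:

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30     # solar mass, kg

# Assumed "typical" neutron star; these are not figures from the study.
mass = 1.4 * M_SUN   # kg
radius = 10_000.0    # m

# Surface gravity: g = GM / r^2
g = G * mass / radius**2

# Speed after a 3-foot (~0.91 m) drop; the drop height is tiny compared
# to the radius, so treating g as constant over the fall is safe.
h = 0.91
v = math.sqrt(2 * g * h)   # m/s
v_mph = v * 2.23694        # convert to miles per hour

print(f"surface gravity: {g:.2e} m/s^2")  # ~1.9e12 m/s^2
print(f"impact speed: {v_mph:,.0f} mph")  # ~4 million mph
```

Under those assumptions the answer lands above four million miles per hour, comfortably clearing the "over three million" figure quoted above.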

Detailed tracking of next to no deviation

As it happened, the neutron star of PSR J0337+1715 is paired with one of the white dwarf stars in its orbit around the second white dwarf. This means that researchers could track how each of these dead stars moved around the same object, looking for differences in their acceleration that would point to a difference in how gravity was affecting the neutron star. If that weren’t convenient enough, this particular neutron star is actually a pulsar, meaning it emits a strong blast of radio waves 366 times per second. This broadcast could then be used like a tracking device, allowing the Green Bank Telescope to follow the neutron star’s movement with incredible fidelity. Even though it was over 4,000 light years away, the neutron star’s location could be tracked to within a few hundred feet. While they couldn’t say that the neutron star’s orbit behaved in absolutely the same manner as its white dwarf partner’s, researchers could at least be confident that any variation was less than three parts per million.

As perfect as this triple star system was for measuring this kind of movement in a neutron star, none of this data is particularly surprising. While there have been proposals that super-dense objects like neutron stars would bend the rules of gravity, the observations from PSR J0337+1715 basically support predictions Albert Einstein made in his general theory of relativity. Known as the Equivalence Principle, the idea is that all objects interact with gravity in the same way, even if the mass involved is dense enough to crush and strip an object into goo at over three million miles an hour.
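In Newtonian terms, the cancellation at the heart of the Equivalence Principle is visible right in the algebra: the falling body’s own mass drops out of its acceleration.

$$F = ma = \frac{GMm}{r^2} \quad\Rightarrow\quad a = \frac{GM}{r^2}$$

The stricter "strong" version of the principle tested here asks whether that cancellation still holds when the falling body’s own gravity is enormous, which is exactly what the neutron star’s tightly-measured orbit appears to confirm.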

Source: Even phenomenally dense neutron stars fall like a feather by Green Bank Observatory, Science Daily

On July 1st, 2018 we learned about

Raccoons that forage on humans’ high-calorie foods are boosting their blood sugar to unhealthy levels

“Maybe we should make raccoon-healthy food and disguise it as cake before we put it in the compost?”

While many raccoons might appreciate my daughter’s idea, I’m guessing that most wildlife specialists would prefer we just feed raccoons less food. After a few more minutes of brainstorming, even my kids realized that stocking compost and trash bins with food tailored to a raccoon’s dietary needs wasn’t really the solution to a growing problem in urban areas: raccoons that feed on our leftovers are getting bigger and fatter on our high-calorie diets, just like we are.

Foraging on our sweet, fatty foods

Wild raccoons should be living on hearty, raw foods like fruit, nuts, eggs, bugs and crayfish. And to be fair, a compost bin is likely to have a decent assortment of those items, albeit alongside a lot of processed foods that people tossed out. This essentially makes too many calories too easy for a hungry animal to get a hold of, which would likely lead to increased fat production and other changes in metabolism. To see if human diets were making an impact on raccoons’ health, biologists looked beyond anecdotal reports of ‘fat raccoons’ and started collecting blood samples for analysis.

Blood samples from three populations of raccoons were compared to see if environmental conditions, specifically access to human leftovers, were making a difference in the animals’ blood sugar levels. As one might expect, rural raccoons had much lower levels of glycated serum proteins (GSP), which are commonly used to monitor glucose levels in diabetics. The elevated blood sugar of urban raccoons suggests that they could easily be at risk of developing diabetes, metabolic disorders, and of course obesity. The catch is that most wild raccoons don’t live long enough for these problems to develop, thanks to other hazards like being hit by cars. So unless the raccoons are getting so fat they can’t dodge traffic, they might have no reason to mind a few extra french fries and some ranch dressing in their diet.

Who benefits from bigger raccoons?

That’s not to say that our high-calorie leftovers aren’t creating a problem though. There’s actually a chance that these calorie-rich foods are boosting raccoons’ size and reproduction rates, which could create more conflict with humans. A larger raccoon with more kits may be more likely to break into our trash, fight with pets, and possibly spread parasites and disease. So even if city-dwelling raccoons don’t mind putting on some extra weight, the people sharing the neighborhood might really want to keep their leftovers to themselves.

Source: Raccoons experiencing high blood sugar levels from eating our food by Brandie Weikle, CBC Radio-Canada

On July 1st, 2018 we learned about

The controversial origins of the cross-cultural California roll

California rolls are a bit of an enigma wrapped in mystery, then covered in rice. They helped introduce sushi to westerners in the 1970s, even though they contain avocado and crab meat instead of fish. They’re made inside-out, tucking the nori seaweed inside the roll to hide its texture. They were long considered a perversion of traditional sushi, and yet their invention has earned one chef the title of cultural ambassador to Japan. Of course, it would be easier to make sense of these peculiar contradictions if we knew the true origins of the dish, which is difficult because California rolls have somehow been invented at least twice.

Created in California

The first American sushi restaurant opened in Los Angeles, California in 1966. There wasn’t a lot of demand for dishes based around fresh cuts of raw fish, and so the Kawafuku restaurant kept its main kitchen busy with more familiar dishes like teriyaki chicken. Beyond Americans’ concerns with raw fish, there was also hesitation over the use of seaweed as an outer wrapper. While American kids today might be happy to snack on nori on its own, few western diners in the 60s were entirely ready to wrap seaweed around their raw fish without squirming about it.

The first step towards a California roll was essentially to eliminate the fish. Chef Ichiro Mashita, lacking sufficient supplies of tuna, substituted avocado at the Tokyo Kaikan restaurant in Los Angeles. A giant green fruit may not seem like a good substitute for tuna, but its rich, fatty flavor and soft texture were fairly successful, especially when paired with crab meat to bring back a bit more of a fishy flavor. While offering Californians something as familiar as avocado was a big move on its own, these rolls didn’t really take off until the rice was moved to the outside of the nori wrapping. This change was fairly blasphemous, as the crisp texture of the nori was normally a point of pride for sushi chefs, but hiding it proved to be the secret of the California roll’s success.

Born in British Columbia

You could say that the rest was history, if not for an alternative history that took place in British Columbia around the same time. Chef Hidekazu Tojo had opened a restaurant in Vancouver in 1971, and immediately ran into some of the same difficulties faced by Mashita in Los Angeles. Truly fresh fish was hard to come by, necessitating experimentation with ingredients like avocado and salmon skin to fill the roles of tuna and eel, respectively. Customers in Canada were also reluctant to give sushi much of a chance, spurring Tojo to hide the nori under a layer of rice in what he dubbed an inside-out roll.

Tojo refers to these rolls as Tojo rolls in his restaurant today, although in the 1970s a steady stream of customers from Los Angeles supposedly earned them the name California rolls. While we may have thus been deprived of a name like the “Angeleno roll,” the impact these rolls made was immense. While the alterations to traditional sushi-making may have been compromises at first, they truly succeeded at making sushi accessible to a much wider audience. In recognition of his continued innovation, Chef Tojo was named one of just 13 cultural ambassadors of Japan in 2016, lending further weight to his version of the birth of the California roll and other inside-out forms of sushi.


My fourth-grader asked: So are avocados from California?

Unlike the strangely undocumented history of California rolls, we do have a pretty good idea about where avocados got their start. They’re originally from Mexico, although avocado pits have been found buried alongside mummies from 750 BC as far south as Peru. Those mummies probably didn’t enjoy the fruit too much though, as the trees weren’t cultivated until 500 BC.

Modern avocados were brought to California in 1871, with a variety of varieties vying for people’s plates until the 1950s. Today’s most popular variety, Hass avocados, got their start in 1926, traceable to a single tree in La Habra Heights. The tree died in 2002, and while it’s not exactly helping bridge cultures, some of its wood is being preserved for commemoration.

Source: Who Invented The California Roll? by Michelle Woo, OC Weekly

On June 28th, 2018 we learned about

Fossil record suggests that solitary primates regrew claws to help with grooming

No matter how badly you may want them, no matter how long you let your nails grow out, you will never grow claws. Humans and most other primates seem to have traded in proper claws for our flimsy nails around 56 million years ago. It might seem like we gave up some crucial anatomy in this trade, as claws can be used as everything from weaponry to climbing gear, although the fact that we now have touch screen-friendly fingers certainly helps. Nonetheless, paleontologists are now realizing that not every primate has been well served by the loss of claws, which is why some genera have apparently regrown one on each hand, although not for the more adventurous uses listed above. The task they couldn’t give up was grooming, which raises questions about why our fashion-conscious species is able to get by with nails alone.

The long-standing assumption about primate claws is that they were lost to enable our ancestors’ mobility. As arboreal climbing, leaping and grasping became increasingly important to these animals, thinner, flatter nails likely provided an advantage over more pronounced hooks and knobs. Nails would be less likely to snag on small branches, allowing a hand to rotate around a branch as an animal swung through the trees in a way embedded claws just wouldn’t allow for. As far as anyone knew, once a common ancestor switched to nails, there was no going back.

Caring for hair with claws

Except, of course, for all the living primates that still sport a tiny grooming claw on each hand. Lemurs, lorises, galagos and tarsiers all have a tiny claw on their second digits, enabling them to pick debris and parasites like lice and ticks out of their thick fur. These claws aren’t exactly fierce, and they had long been thought to be a hold-over from an alternate branch of the primate family tree. A new survey of fossil fingers has started to unravel some of this model though, suggesting that some modern grooming claws have actually developed again, restoring anatomy previously lost to fingernails.

As researchers pieced together all the distal phalanges, or finger-tip bones, held in various collections, they realized a pattern was emerging that might help explain why some species needed a grooming claw while others didn’t. More solitary species, such as the modern titi and owl monkeys, need these claws to groom themselves. Primates that live in social groups don’t need that extra tool, because they can rely on each other to clean up their fur. This study can’t prove conclusively that this is why some primates have regrown their claws while others haven’t, but if this trend holds true, it opens up a new avenue for understanding extinct animals from their fossils alone. Because the grooming claw may be tied to the social structure of a species, finding the right bony digit could presumably reveal a lot more about how an animal lived than just the size of its fingers.

Source: Fossils show ancient primates had grooming claws as well as nails by Natalie Van Hoose, Phys.org

On June 28th, 2018 we learned about

Ancient organisms originally grew bigger to boost the distribution of their offspring

Long before any mammals, dinosaurs or even fish existed on Earth, the most advanced forms of life looked a bit like single fern leaves growing along the sea floor. These organisms, called rangeomorphs, literally stood out among more primitive life forms, growing larger and taller than most other life at the time. It’s easy to dismiss this difference as bigger being better, but that vague assertion doesn’t really provide insight into why these organisms would have ever started growing larger in the first place.

If not for the moving water of the ocean, the world of a rangeomorph would be pretty dull. Fossils of these 600 million-year-old organisms show no hints of mouths, organs or any anatomy that would enable mobility. They apparently stood passively on the ocean floor, soaking up nutrients from the water around them without fear of being eaten or disturbed. This may sound a bit like plants growing in a modern forest, but that’s not really an accurate comparison: a sapling, for instance, will have to race its neighbors for access to sunshine, send out roots for water and soil nutrients, and avoid being killed by herbivores and parasites. As far as the fossil record shows, rangeomorphs worried about none of these things, making their varying sizes even weirder.

Size for the sake of their offspring

Fortunately for paleontologists, the lack of activity in the Ediacaran-era ocean allowed entire communities of rangeomorphs to be preserved as fossils. This let researchers compare not only the sizes of individual organisms, but also the distribution of where each stalk grew. Once competition for resources and defensive positioning were ruled out, researchers took another look at where the tallest stalks grew in relation to their larger ecosystem.

What they realized is that being taller apparently helped rangeomorphs distribute their spores, suckers or whatever form of propagule they depended on to reproduce. By growing taller, individual rangeomorphs could reach slightly faster currents in the water that would then carry offspring out across a larger range of territory. Essentially, growing taller helped these rangeomorphs spread out faster than their shorter kin.

Source: Why life on Earth first got big, Phys.org

On June 27th, 2018 we learned about

New Zealand agriculture aims to invent a lucrative market for red deer milk

In the 19th century, Europeans imported red deer to New Zealand so that they could be hunted for sport. Apparently that hasn’t held people’s interest sufficiently, and New Zealanders like Steve Carden have been looking at new ways to fit these deer into people’s lives. Without much demand for venison, Carden and the state agricultural company Pāmu are now marketing deer milk as a new dairy option for human consumption. Even with supposedly intense demand from top chefs, the milk likely won’t be putting cows out of a job any time soon, instead joining the ranks of interesting but still unusual forms of dairy that people harvest from our fellow mammals.

Difficulties of milking deer

Every mammal seems to have its own recipe for milk, and red deer (Cervus elaphus) are no exception. While cow’s milk is noted for its bland sweetness with enough fat to separate out into cream or cheese, deer milk is much closer to cream from the start. The low-sugar, high-fat, high-protein milk probably wouldn’t be too attractive as a beverage, but it is proving to be very attractive to chefs. The expectation is that the milk will be positioned as a niche product to excite foodies, even if it comes with a premium price.

That marketing angle isn’t just because of the way the milk tastes. Thousands of years of domestication, plus the naturally robust output of larger animals like cows, means that dairy cows can produce over 100 pounds of milk a day. The deer, on the other hand, aren’t about to line up to be milked, and when they do cooperate, only offer around a half-pound of milk a day. So even if people with well-trained palates really appreciate this new source of calcium, economics will keep most of us from ever tasting it.
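The economics become obvious with a little arithmetic on the round numbers above (a sketch only; real yields vary by breed, season and how cooperative the deer are feeling):

```python
# Daily yields quoted above, in pounds of milk per animal.
cow_lb_per_day = 100.0
deer_lb_per_day = 0.5

# How many milking deer it takes to match one dairy cow.
deer_per_cow = cow_lb_per_day / deer_lb_per_day
print(f"One cow out-produces roughly {deer_per_cow:.0f} red deer")  # ~200
```

A herd of roughly 200 deer to match a single cow is why the milk is being pitched as a luxury ingredient rather than a cow replacement.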

Milking many types of mammals

Red deer may lie on the fringe of non-cow dairy production for humans, but this isn’t the first attempt at getting milk from deer. Reindeer used to be milked in Scandinavia, and moose are still milked in parts of Russia today. Sheep, goats, camels, buffalo and horses are all milked in various parts of the world, although the qualities of each species’ milk sometimes limit how it gets used: if the sugar or fat content is wrong, the milk might not be usable in things like cream or cheese. To come full circle, Pāmu does have another niche to target if their red deer milk doesn’t get used in cooking all that often: the high protein content apparently also makes it a candidate for use in cosmetic products, which sounds oh so delicious.

Source: Deer Milk Is Apparently a Real Thing (in New Zealand) by Dan Nosowitz, Modern Farmer

On June 27th, 2018 we learned about

How fat cells function as the body’s squishy, insulating, and scalable energy reserve

I know that exercising will help me ‘build’ muscle. I also know that that’s a kind of weird way to describe a systematic tearing and repair of muscle tissue, eventually resulting in more muscle mass overall. I should also be able to ‘burn’ fat or ‘lose’ weight, although those terms are a little more opaque. It sounds like the amount of fat in my body will be reduced, but does that mean smaller fat cells? Fewer fat cells? What does it mean to get fat in the first place, and why do our bodies even bother?

Storing energy to ensure survival

Fat does a number of jobs in an animal’s body, from providing insulation from the elements to padding and protecting more sensitive anatomy deeper in the body. Most humans aren’t hitting the gym because they’re too comfortable swimming in cold water though. The issue we struggle with today is the way fat can store energy. It’s an ability that has helped organisms survive intermittently low food supplies for millions of years, but thanks to modern farming and food storage, it now works better than our bodies really need it to. As we eat more food than we can use in a day, our bodies try to store the extra energy in fat cells for a hungrier time that just never seems to arrive.

To store energy, your body packages excess sugars into molecules called fatty acids, stuffing them into fat cells for storage. As your personal energy reserves continue to grow, your body will increase both the size and number of fat cells you’re carrying. As normal deposits are stuffed full, fat cells will even get deposited on muscles and major organs, leaving you with more fatty acids than you’re likely to need in most modern circumstances.

Saving more than your body can spend

Carrying fat obviously doesn’t make you immune to hunger, and your body seems to only tap into these reserves of fatty acids in certain circumstances. Highly aerobic activities, such as fleeing from a predator on foot, are one way to gain access to the energy stored in fat. On a longer timescale, reducing your overall calorie intake can also convince your body to start cracking open those cells to make use of your stored energy. This is done by releasing fatty acids into the bloodstream so your muscles, heart and lungs can literally break them apart to make use of the energy stored in their molecular bonds. The leftover molecular debris is eventually expelled from the body as a waste product, in both our breath and our urine. So you sort of ‘burn’ fat, but you’re really breaking it apart and exhaling it.
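The bookkeeping behind that exhaling can be written out as chemistry. Using the “average” human triglyceride formula often cited in the metabolism literature (an illustrative figure, not one taken from the article above), complete oxidation looks like this:

$$\mathrm{C_{55}H_{104}O_6} + 78\,\mathrm{O_2} \longrightarrow 55\,\mathrm{CO_2} + 52\,\mathrm{H_2O} + \text{energy}$$

Working through the atomic masses in that equation, roughly 84 percent of the mass of metabolized fat ultimately leaves the body as carbon dioxide through the lungs, with the remaining 16 percent or so departing as water.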

As efficient a system as that may be, it only seems to work when we regularly dip into our fat reserves. Once we have an excessive number of fat cells, they weirdly become harder to use. These cells, known as adipocytes, are often oversized and produce inflammation-causing hormones. They also end up storing extra energy, and releasing it to our muscles and organs at an abnormally slow rate. In a way, they become too good at their jobs, hindering the original function of fat cells as a way to get through lean times.

Source: How does your body 'burn' fat? by David Prologo, The Conversation