On May 21st, 2017 we learned about

When sumptuary laws limited the wearing of silk, lace and purple in people’s wardrobes

Today, wearing flashy outfits may garner rolled eyes and snide comments about needing a visit from the “fashion police,” but nobody will really make you give up your new silk pantaloons or rhinestone-encrusted jacket. Even a forged designer piece won’t be risky to carry, even if it wasn’t legally produced. We should appreciate this luxury, as ostentatious, or simply ambitious, fashion choices were once illegal. Dating back to the ancient Romans, sumptuary laws aimed to control the items people could buy, largely depending on who those people were.

Identity vs. income

Sumptuary laws varied a lot over the years, and supposedly benefited society by helping curb irresponsible spending by individuals as well as normalize the supply of rarer goods on the market. For instance, a poor peasant may have been banned from buying a horse when they could barely afford to feed themselves first. By the Elizabethan era, this form of paternalistic sumptuary law probably wasn’t enforced very often, partially for practicality’s sake.

Many more details exist around laws that were intended to maintain the stratification of economic classes, often surrounding one’s dress. For example, British sumptuary laws of the 14th century spelled out exactly what job titles and income levels were allowed to wear veils, velvet, satin, sable fur, gold and purple garments, etc. There was a great concern that people might dress “above their station,” which could destabilize power structures— if a merchant wore nicer clothes than a lord, it might make the lord seem less important. If the wife of a knight wore ermine, how could anyone tell her apart from actual nobility? This tension may seem silly, but in some cases it reshaped societies— merchants in the Edo period of Japan amassed significant wealth, and with it influence in society, even though they were technically ranked below samurai and nobility in the social hierarchy.

The new standard of scholars

Some of these rules are indirectly observed today, reversing their original intent. While people graduating from colleges don black robes and colored hoods to mark their accomplishments, those robes were originally just the required garb of poor students. They started as a point of practicality and warmth, were then enforced as law to identify students, later continued at specific universities as a sort of student uniform, and are now worn only to mark a student’s ascension to higher learning. The extra stripes on the arms of PhDs, fancy tassels and other detailing would probably make the heads of Medieval sumptuary lawmakers spin, but so would the fact that the average person can now wear nearly any clothing they can get their hands on without fear of legal repercussions.

Source: Sumptuary Laws of the Middle Ages, Lords and Ladies.org

On April 2nd, 2017 we learned about

Extending wildlife conservation beyond cute, cuddly and charismatic creatures

After a rainstorm, my kids will happily scour the sidewalk to rescue earthworms from careless feet, hungry birds and the heat of the sun. I’ve noticed that my second grader’s enthusiasm for fetching worms seems to have declined a little bit, and I can’t help but wonder if she’s noticed that worms aren’t terribly popular, at least in comparison to more squeal-inducing critters like hummingbirds, dolphins or nearly any fuzzy baby mammal. This isn’t all that surprising, as people show strong biases toward saving their favorite animals over others, a status rarely bestowed on an animal many people think exists to be fish bait.

Worms’ public image certainly has room for improvement, although there’s not much they can do to reshape it themselves. Conservationists have noted that the public, and therefore a lot of public funding, favors a small range of animals, often based on looks. It’s much easier to capture people’s attention and sympathy when presenting cute, merchandising-friendly animals like a panda than something with scales, warts, or a tendency to scavenge for food. If that’s not an option, being especially big or frightening, or having some stand-out ability, helps. Finally, people are likely to find an animal worth saving if we think it’s worth eating, like salmon or bluefin tuna. Certainly not a lot of ins for worms there, but most earthworms you encounter in North America are thriving as an invasive species anyway.

Home-town heroes

With all of these biases in place, it’s good to note that we’re not necessarily doomed to a world with no warthogs, toads or spiders. The preference for “charismatic megafauna” may not be completely ubiquitous, as a study has found it to be influenced by where a child grew up. Kids that grew up in the continental United States were more likely to pick “traditional” favorite animals, like wolves or bears. However, kids that grew up on the island of Andros in the Bahamas showed more interest in birds, lizards, fish and insects, most likely thanks to their exposure to these species in their local environment. The downside to that same concept was an affinity for the invasive feral cats, dogs and pigs on their island, but overall this shows that it’s possible for people to appreciate a wider variety of animal species (sorry, plants).

Familiarity breeds… concern?

Assuming familiarity and exposure to species matters, there are efforts under way to show the world the value of the planet’s less attractive creatures. The Ugly Animal Preservation Society (UAPS) has sponsored a contest for the ugliest animal, run specifically to get people looking at creatures that aren’t so easy on the eyes but are still very important to ecosystems. They’re not likely to start pulling conservation dollars away from elephants any time soon, but the fact that people are even considering the existence of a blobfish is helpful, if only to get them asking questions about how these creatures live.

In some cases, the PR problem an animal faces isn’t that people don’t know the creature, but that they’re convinced they don’t like it. Insects and arachnids probably fall into this category more often than any other fauna, with spiders in particular scaring people beyond rationality. To help defuse this, researchers have been getting these bugs some new glamour shots, perching the six- or eight-legged critters on their faces. Aside from being quite eye-catching, these images show that we needn’t be afraid of everything that doesn’t somehow resemble a baby human. That said, I still haven’t seen anyone with worms draped all over their face, but I can’t count my kids out of that race just yet.


My four-year-old asked: How are we protecting salmon if we’re eating them?

On an individual basis, yes, that’s a good point. However, as a species, being considered “useful” to humans may be easier than explaining to lawmakers how keystone species, or ecology in general, works. Salmon wouldn’t be the first animal to be saved so we could eat it.

Source: Too cute to die? Experts say we’re too selective about species we choose to protect by Tom Spears, National Post

On March 12th, 2017 we learned about

Sleep deprivation makes us poor judges of the people around us

After the switch to daylight saving time each year, millions of Americans grumble about “losing an hour” of sleep, wonder why daylight saving time exists, and probably try to make up the difference with extra caffeine. It may seem like this annual ritual is just an especially bad Monday morning where we’re tired but otherwise unscathed. Continuing investigations into how sleep affects the brain are showing that abruptly shifting our schedule an hour costs us a lot more than just an extra coffee to start the day.

Assumed antagonism

While sleep deprivation is famously associated with difficulty concentrating or performing detail-oriented tasks, it’s been found to hit our emotional centers as well. In extreme cases, even something as presumably neutral as an image of a spoon could trigger spikes in activity in the amygdala, causing distraction and stress.

Even if you feel like you’ll be able to keep your cool with the cutlery, there’s a good chance that your tired brain will react harshly to the humans in your life as well. People in need of sleep have been found to have a harder time identifying the faces of people they don’t know, such as when matching a photo ID to the person in front of them. When looking at faces, sleep-deprived people also tend to interpret expressions more negatively than they really are. As if the brain were worried about missing a threat it didn’t have the energy to actually assess, tired people were found to judge friendly and neutral expressions as more threatening than their well-rested counterparts did.

Stronger Sentencing on Sleepy Monday

Obviously, increased stress and confusion about the people around you may make your day harder to endure, but these reactions can also have very long-term consequences. Analysis of a decade of criminal sentencing records has found that judges hand out tougher punishments right after the daylight saving adjustment. The judges probably weren’t as acutely sleep-deprived as the test subjects of the other studies mentioned here, but they were disturbed enough to make their sentencing about five percent harsher than normal. That might not sound like a lot at first, but an additional three months of incarceration for nothing other than happening to be sentenced on “Sleepy Monday” is significant.

Even if you’re not a judge who should really rest up before daylight saving time switches over, all of this will hopefully serve as a reminder to treat your Sleepy Monday a bit more gently. Teachers, bosses and parents are all in positions of power, and need to mind their own reactions as well as give a bit of leeway to the people around them, who are probably tired too. It’d be nice if daylight saving time switched on a Saturday to give everyone an extra day to adjust our circadian rhythms, but in the meantime just try to rest up as best you can— the best way to get your face/emotion/etc. interactions back to normal is some good, deep sleep.

Source: Switching to daylight saving time may lead to harsher legal sentences, Science Daily

On November 7th, 2016 we learned about

The Electoral College: elections for electors who pick the next President

What feels like a decade of campaign-oriented stress and exhaustion is wrapping up soon, but it’s technically incorrect to say it’s all settled by the time we go to bed on November 8th. While the average voter’s role in the election should be settled, the next president won’t have the job once those votes are tallied, because technically, none of us voted for president. All the fighting, canvassing and more was really to get you to vote for a ticket to be acted on by a group of people whose names you probably don’t even know. This isn’t actually some kind of conspiracy or subterfuge; it’s just the complicated system known as the Electoral College.

Voting for a voter to vote

Even though voting can feel like a very personal response to candidates, everyone on election day is technically voting for a Presidential ticket for a particular party. It’s not clear on every state’s ballot, but that ticket also carries a list of names of electors that will vote in the Electoral College later on. The number of electors per state matches that state’s Congressional representation— California will have 55 electors, while smaller states like Wyoming will choose just 3 per ticket. These people are chosen for a variety of reasons, usually with ties or loyalties to the particular political party running the ticket.

Once voters have cast their ballots, the votes are tallied to see which slate of electors will actually have a chance to vote for President. Most states look for a simple statewide majority to determine a winner, but Maine and Nebraska complicate things by awarding electors both to the overall winner and to the winners of individual congressional districts, allowing for split allocations of electors. Once the winning tickets are determined, the governors of each state certify the results and designate that state’s electors. Those electors then get to cast two votes— one for President, and one for Vice President, although this hasn’t always been true. Before the 12th Amendment was ratified in 1804, each elector voted for two Presidential nominees, with the winner becoming President and the runner-up becoming Vice President (which would have produced some amazing odd couples in recent elections!)

One consequence of this very indirect democracy is that electors don’t always have to vote according to the outcome of the election that made them electors in the first place. On 22 different occasions, so-called “faithless electors” voted for a candidate who didn’t win their state’s popular vote, although the majority of those votes were cast because the winning candidate died before they could be voted into office. Some states actually have laws against being a faithless elector, although nobody has ever been prosecuted.

As if this didn’t feel convoluted enough, Congress also gets involved before the election is over. If electors across the country somehow tie, the House of Representatives gets to pick the President. If the Vice Presidential vote somehow needs a tie broken, the Senate picks. Even if a tie is avoided thanks to a candidate receiving 270 or more electoral votes, a joint session of Congress is expected to count all the electoral votes, and the sitting Vice President, as President of the Senate, finally gets to officially announce the next President. Naturally, none of this is expected to be done for months, which is why things aren’t technically settled until January.

Timing arranged for agriculture

The timing of US Presidential elections is appropriately archaic to go along with all the various steps outlined above. Elections are held on the Tuesday after the first Monday in November. This date was selected when agriculture was a major occupation around the country: farmers needed time to finish harvesting, but couldn’t wait so deep into winter that travel became difficult. Tuesday was selected because it was supposed to be convenient, again for farmers. They usually delivered crops to market on Wednesdays, which meant everyone should have had a bit of time on Tuesday to fit in voting. Subsequent steps operate on a similar timescale, with electors voting in candidates in December, and Congress finalizing things in early January. We can only imagine the scheduling if our Constitution were written today to best accommodate cable news and Twitter.

The exception to all this layered, indirect action is the day of transition when the new President takes the oath of office. During the various inaugural activities, teams at the White House have no more than six hours to move the sitting President out of the building, clean and repaint walls, then move the new first family in. Teams of people have to be extremely quick and coordinated to make sure that the new President’s clothes are in the closets, books are on the shelves and everything is ready to go, because the President really doesn’t have time to dig through the last couple of boxes in the corner to find what they’re looking for. After over a year of campaigning, months of voting, and then six hours to move in, it’s finally time to go to work.

Source: What is the Electoral College?, Archives.gov

On October 5th, 2015 we learned about

The mishmash of meanings we’ve manufactured for “vegetables”

I’m probably going to regret sharing this with my kids, but… vegetables are a bit of a lie. Obviously carrots, potatoes, and broccoli are all real plants, but the idea that a “vegetable” is actually something more specific than “foods my kids will whine about” isn’t really true. The word itself originally just referred to plant life. It wasn’t until the 1700s that it took on the now common meaning of “plants we eat.”

Botanically bankrupt

While fruits are the structures meant to carry a plant’s seeds, the idea of a vegetable has sort of been mushed into a catch-all for “everything else.” From a botanical perspective, there’s no real relationship between carrots, lettuce, onions or mushrooms. As a root, a leaf, a bulb and a fungus-that’s-not-even-a-plant, it’s clear that the ingredients of a vegetable soup are there because of our taste preferences more than anything else. But even with that degree of flexibility, the categorization of different edible plants has been stretched to a degree where you might be tempted to define “vegetables” simply along the lines of “I know one when I see one.”

Legal logic

In 1883, the Tariff Act in the United States raised prices on imported vegetables, but not fruit. It would seem that this distinction would be pretty cut and dried, thanks to tomatoes carrying seeds and thus clearly being fruits. The Supreme Court disagreed though, basically saying that the common use of tomatoes as vegetables in most cooking trumped stricter definitions, and that the tariff could be imposed. The European Union has gone the other way, ruling that rhubarb, carrots and sweet potatoes are all fruit, at least if they’re being used in jams.

These legal rulings can matter to more than just tariffs though. In regulating the nutritional standards for school lunches, Congress has waded into the murky waters of defining vegetables, far beyond concerns over seeds, roots or leaves. In the 1980s, changes were made that would effectively allow ketchup to be counted as a serving of vegetables because of its tomato content (probably not thanks to the onion powder). Going further with this idea, in 2011 Congress allowed reduced portions of tomato paste to count as a serving of vegetables, which effectively allowed pizza to be the equivalent of a serving of carrots (or any other non-fruit flora).


My first grader, who like most kids her age has the instinct of a lawyer looking to follow the letter but not the spirit of a law, immediately forced us to discuss a new label for our requests that she eat her… plants. “Greens” was out for being too narrow, “non-fruit plants” brought up the whole tomato problem again, and we obviously had to avoid suggestions like “things that are yucky.”

For now we’re going with the pleasant notion of eating a “whole rainbow” of foods to push some variety, although I’m curious to see how much eggplant or grapes she’s willing to eat to cover her purple obligations.

Source: Do vegetables really exist? by Henry Nicholls, BBC Earth