On July 13th, 2017 we learned about

Searching for the start of teeter-totters, tilt boards and seesaws

My four-year-old is currently a big fan of the seesaw. Or is it the teeter-totter? The tilt board? There are many names for this piece of playground equipment, all of which refer to what’s essentially a lever. Levers as machines let us do a lot of otherwise difficult work, although in the case of most playgrounds, the perfectly symmetrical arrangement of a seesaw actually makes things a bit tricky for a large adult and a small child. I do extra work to avoid launching my child thanks to our equal distance from the fulcrum, although there’s a chance that bit of physics is what got seesaws started in the first place.
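That extra work comes straight from lever math: for a seesaw to balance, the torque (weight times distance from the fulcrum) on each side has to match. A quick sketch, with weights invented purely for illustration, shows why equal distances are a problem:

```python
# A balanced lever needs equal torques: weight x distance from the fulcrum
# must match on both sides. The weights below are made up for illustration.
def balance_distance(child_kg, child_dist_m, adult_kg):
    """How far from the fulcrum an adult must sit to balance a child."""
    return child_kg * child_dist_m / adult_kg

# A 15 kg child sitting 1.5 m out is balanced by a 75 kg adult at only 0.3 m,
# which is why sitting at equal distances takes constant leg work instead.
print(balance_distance(15, 1.5, 75))  # 0.3
```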

Korean catapult

With something as fundamental as a lever, it’s not easy to pin down the first time anyone decided they were fun to sit on. One line of thinking connects modern seesaws to 널뛰기, or neolttwigi, a device from Korea that dates back at least a few hundred years. A neolttwigi is like a low seesaw intended to launch someone in the air. One person jumps onto the empty side to boost their playmate straight up. The story is that this device was first developed so that young women could boost each other high enough to see over the walls surrounding their homes, although since then acrobatics and props like jump-ropes have been added to the mix.

Teetering, tilting and trembling

The various names for a seesaw may hold clues as well. Americans supposedly prefer the term “teeter-totter,” although “teeter” and “totter” existed separately before being brought to the playground. Teeter is related to titter, probably from the Old Norse word titra, meaning “to shake, shiver, totter or tremble,” which seems fairly appropriate. Those definitions are fairly broad, though, which doesn’t help pin down the exact device in question. French terms like balançoire only mean to balance, although the theory that seesaw is a bastardization of ci-ça (“this-that”) is at least more playful.

Syncing saws

The most specific connections are based around the name seesaw and how it intersects with hard labor. Before engines sped everything up, sawyers were people who sawed logs and trees. For bigger jobs, two men would hold handles on either side of a large saw, lunging back and forth in an even rhythm to efficiently cut through the wood. Another variation was the pit-saw, where one sawyer was elevated above the other, sawing down at an angle. In either case, the work went better if the back-and-forth motion remained in sync, and so sawyers would sing or chant to coordinate their movement.

These songs introduced the term “see-saw” not so much for a specific meaning as for its rhythm. It turns up in print in the 1630s in phrases like “see-saw-sack a down,” and eventually “See Saw, sacaradown, / Which is the way to London town?” in 1685. Playing on levers probably predates these chants, but the name seesaw made it to the playground by 1704. Kids may have been pretending to be sawyers, chanting along as they rocked up and down. While we’re not as familiar with two-man saws today, at the time it was probably a very descriptive term, especially compared to “equidistant class one lever.”

Source: See-saw by Michael Quinion, World Wide Words

On March 5th, 2017 we learned about

The perpetual difficulty of picking what is and isn’t a planet

The ancient Greeks originally described planets as asteres planetai, or “wandering stars,” thanks to the way they moved more than other specks of light in the sky. Today, we can observe a swath of differences between planets and stars, but that original definition does point out how names can be outgrown as we get more information about the universe we live in. While we still hang on to the word “planet,” its definition has changed a lot, even explicitly ruling out the idea of wandering, much less being stars. It’s also far from a settled question, and that’s not just because people feel their memories have been violated thanks to Pluto being classified as a dwarf planet.

Past planetary rosters

For a long time, the debate was focused on the original count of five or seven planets. The obvious objects in the sky included Mercury, Venus, Mars, Jupiter and Saturn, as well as the Moon and the Sun. Depending on who you asked, the Sun and Moon were lumped in the category of planets, since they were all things that, in a geocentric model of the solar system, orbited the Earth. By the Middle Ages, western astronomers were coming to realize that the Sun and Moon did not behave like planets exactly, and they were referred to more frequently as something distinct from those heavenly bodies. The next big step in breaking down these definitions was the revelation of a heliocentric model of the solar system, which then forced the issue that the Earth itself is indeed a planet.

Details of the definition

Fast-forwarding through centuries of astronomical discoveries and refinements, the International Astronomical Union (IAU) currently defines a planet according to three criteria, none of which seem totally satisfying at this point. The first rule is that a planet has to orbit not just a star, but our Sun specifically. This of course bars any of the growing number of exoplanets we’ve discovered in the past few years from being planets, as well as otherwise planet-like objects that may be truly wandering the galaxy outside a star’s orbit. The second rule is that the object is big enough for its gravity to have crushed itself into a nearly round shape (that is, to have reached hydrostatic equilibrium). Basically, smaller objects like lumpy, lopsided asteroids are too small to count, although people have questioned what the exact, and necessarily arbitrary, threshold is for “nearly round.” The final rule is that a planet’s “neighborhood” is cleared of other, potentially intersecting objects. So a planet can have moons that travel the same route, but Pluto is disqualified because of its proximity to other Kuiper Belt objects like Eris, plus the fact that its solar orbit is so tied to Neptune’s gravity.
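The three criteria read like a checklist, which a toy function can mirror. The boolean parameter names below are my own shorthand, not IAU terminology:

```python
def is_iau_planet(orbits_our_sun, nearly_round, cleared_neighborhood):
    """Toy rendering of the IAU's 2006 three-part definition of a planet."""
    return orbits_our_sun and nearly_round and cleared_neighborhood

print(is_iau_planet(True, True, True))   # True: Earth passes all three tests
print(is_iau_planet(True, True, False))  # False: Pluto fails only the last one
```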

Fortunately, the ongoing discussions about what constitutes a planet are trying to take into account all the new data we have about objects in our solar system and beyond. When we discovered Eris and other Kuiper Belt objects, Pluto seemed more closely related to them than to larger objects like Jupiter. However, the “cleared neighborhood” rule is proving difficult to work with in practice. Written with freshly discovered Kuiper Belt objects in mind, the rule doesn’t hold up well under different parameters: for instance, even an Earth-sized planet in Pluto’s position would be entangled with Neptune’s orbit, which seems to go against the sort of filtering people were aiming for. Proposals for new definitions have been offered, mainly to strip the final rule from the current definition. This would then reclassify Pluto and Eris, as well as every other dwarf planet and moon in our solar system, as planets. We’d go from eight (maybe nine) planets to 102, which is such a dramatic shift that you might wonder about the point of such a definition in the first place.

Public perceptions

The primary job of a good label like “planet” would be to help us sort and understand these objects as clearly as possible. However, the word ‘planet’ now also carries some baggage with it, including a sense of gravitas that people inside and especially outside the scientific community value greatly. When Pluto was reclassified as a “dwarf planet” in 2006, people referred to it as a “demotion,” and even protested the change. Even if scientists are more concerned with accuracy, the public’s take on the importance of being a planet can’t be overlooked. Securing funding and interest in sending a probe to a planet is still much easier than to a dwarf planet, much less a “small bodied object.” It seems that we all think that it’s cool to be a planet, even if we’re not completely in agreement about what that means.

Source: Should Pluto Be a Planet After All? Experts Weigh In by Mike Wall, Space.com

On January 15th, 2017 we learned about

Picking through the past of the word ‘poop’

Poop Week!

The word ‘poop’ was first written down over 600 years ago, in reference to the rear deck of a ship. Much to my children’s disappointment, this name had nothing to do with feces, instead being connected to French and Latin terms for ‘stern.’ So at that point, the smell of a ‘poop’ would have only been salty sea air. If you wanted to talk about stinky poop as we now know it, you’d have to wait until at least 1721, and even then it wouldn’t quite be the bit of potty talk you’re probably thinking of.

Happily, poop’s non-Latin origins align much more closely with the more comedic facets of our favorite, child-friendly, word for excrement. In Middle English, the verb poupen meant to make an abrupt sound, or to blow or toot a horn. You can probably guess where we’re going with this, as the opportunity for onomatopoeia was apparently not missed by history, and ‘poop’ started seeing use to describe a fart. By 1744, in what is probably the most appropriate etymological evolution ever, poop progressed past passing gas and finally found its calling as a term for feces. Interestingly, pooping wasn’t a thing for another century and a half, turning up in print as a verb in 1903.

Other terms for number two

So what was the world doing in the toilet before pooping was an option? We obviously have a slew of slang and medical terms for excrement, but the most common alternative is some form of kakka. Caca, kacka, kaka and more are common, if sometimes vulgar, terms for doodoo, with roots going back to some of the earliest Indo-European languages. And while I’m not getting into it with my second grader or preschooler, I’d be remiss to not mention shit, which was in use by the 14th century, having come from terms like Old High German’s scīzan and Old English’s scēadan. Shit doesn’t quite have the fart tie-in, but with original meanings concerning defecation or “separation” from waste, it’s always been a surprisingly precise word.

Source: Poop, Online Etymology Dictionary

On December 14th, 2016 we learned about

Sugar plums: from seeded sugar to sweetness itself

There’s something pleasantly vague about the name “sugar plum” to the modern ear. When we hear about these treats in ‘Twas the Night Before Christmas, or The Nutcracker, they sound much better than modern terms that would be more descriptive, but sound a lot less magical. Children aren’t going to get excited about “visions of digestive aids,” and the “Dance of the Gobstopper Fairy” doesn’t sound terribly whimsical either.

Sugar to suck on

Sugar plums, or comfits as they were more commonly called in their heyday, are actually a very old candy that likely originated in the Middle East. There was some variety in their recipes, but the core concept involved a lot of sugar being coated around a seed or nut. Sometimes the seed could be substantial, like an almond, but they were often something tiny like a celery seed, basically acting like an anchor for sugar to start collecting around. Once an appropriate seed or speck was selected, it was then repeatedly covered in layers of sugar or syrup. The comfits would be heated, coated, and then cooled over and over again, with the entire process taking days to complete. Flavors or colors could be added, but basically you ended up with lumpy sticks of sugar, or smoother balls of dried syrup to suck on, not terribly unlike a gobstopper.

Of course, the labor involved in cooking comfits meant that these candies were not doled out carelessly. They were originally treated like an after-dinner digestive aid, meant to accompany some spiced wine, preventing indigestion and reducing flatulence. Since ‘Twas the Night Before Christmas puts sugar plums in children’s dreams in 1823, it’s clear that these candies eventually moved beyond the formal banquet table to a place where kids could enjoy them for the balls of sugar they were. Technological advances in the 1860s, like steam and mechanized pans, made comfit production significantly cheaper and more accessible to the general public.

From candy to culture

By the time The Nutcracker premiered in 1892, sugar plums would have been widely known as a sugary treat, even transcending their origin as literal candies. From the 17th century onward, ‘sugar plums’ could be anything sweet and lovely, or even sweet and slightly devious if someone had a “mouth full of sugar plums.” These meanings were even shortened to just “plum” sometimes, removing the last traces of any descriptive words that would help a modern reader understand what was going on. Not that that’s stopping us from enjoying our own visions of sugar plums, as disconnected as they may be.

Source: Sugar-Plums and Comfits, Historic Food

On December 4th, 2016 we learned about

Official ways of declaring distress when you can’t directly ask for “help”

At a certain point in life, we all have to learn that yelling “mommy” or “daddy” isn’t the best way to request assistance. Depending on the scenario, there are even times that shouts of “help!” won’t do the trick, no matter the language you say it in. If this sounds unfamiliar, it’s probably because you don’t spend much time driving boats, trains or planes. Each of these forms of transportation has official distress calls, some of which are more intuitive than others.

SOS: Struggling ships at sea

Ships at sea have a variety of ways to call for help. These range from firing guns at one minute intervals to orange-colored smoke signals. In 1857, the International Code of Signals was established, which designated an official distress flag with a square and a ball above it, but all of these visually oriented signs have been eclipsed by SOS. SOS was officially put into international use in 1908 for wireless telegraph communications, but it has since permeated pop culture as a generic term for “help!”

At first glance it seems like SOS should stand for something as an initialism, but that’s not actually the case. The three letters were picked by the German government in 1905 because they’re easy to send via Morse code (dot-dot-dot, dash-dash-dash, dot-dot-dot). Assumptions that the three letters stand for “save our ship” or “save our souls” have all been invented by English speakers after the fact, presumably trying to link it to some spontaneous exclamation from a sinking ship long ago.
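In Morse code, S and O are about as simple as signals get, which a tiny lookup table makes clear (only the two letters needed for SOS are included in this sketch):

```python
# Morse code for the two letters in SOS; chosen for simplicity, not meaning.
MORSE = {"S": "...", "O": "---"}

def to_morse(word):
    """Spell a word in Morse code, separating letters with spaces."""
    return " ".join(MORSE[letter] for letter in word)

print(to_morse("SOS"))  # ... --- ...
```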

Mayday: Planes with a problem

Every form of transportation breaks down at some point, and airplane pilots realized they needed a distress signal as well. By 1923, pilots were communicating via radio, so picking something easy to punch out in Morse code wasn’t really necessary. Simply shouting “help” into the radio was ruled out, because it was too likely to come up in situations that weren’t true emergencies. While simply shouting “SOS” probably didn’t come up in day-to-day communications, “mayday,” and eventually “mayday mayday mayday” became the designated way to call for aid.

Unlike SOS, there was some intentional, secondary meaning to “mayday.” Senior radio officer Frederick Stanley Mockford came up with the term while working at the Croydon Airport, which regularly interacted with traffic from France. Mayday didn’t sound like an English word, but did sound like m’aider, or the end of the phrase “can you help me” in French, which was considered a plus. By the same logic, less imminent danger can be reported by radioing “pan-pan, pan-pan, pan-pan,” which happens to sound a lot like panne, the word for “broken” in French.

Toot toot toot: Trains in trouble

Train engineers have their own emergency signal that they can make with their horns. Because the horns aren’t used to communicate with rail yards or railway control rooms, the primary goal of this distress call is to alert any bystanders of danger. The signal is basically a series of repeated, short toots, which presumably will be loud and frantic enough to grab people’s attention so they notice where the troubled train is headed. For less dire situations, trains have a set of horn signals based around long or short toots, almost like the dots and dashes of Morse Code. You’re most likely to hear a “long long short long” set of toots, which indicates that the train is approaching a public crossing.

Source: The origins of SOS and Mayday, OxfordWords blog

On November 23rd, 2016 we learned about

The multitude of misunderstandings behind (mis)titling turkeys

This Thanksgiving, many Americans will sit down to eat a large bird with fluffy plumage and a relatively small head. Your safest bet is to refer to this animal as Meleagris gallopavo, but that probably won’t do you a lot of good at the grocery store. You’d probably get the bird you expect by asking for a ‘turkey,’ but historically that’s been a pretty convoluted mess, since these natives of Mexico have been confused with multiple other species in a handful of languages, including English, French, Spanish and Portuguese. In English, at least, the naming confusion started because people couldn’t identify where a completely different bird came from.

Trading birds and brands

The originally misnamed bird is the helmeted guineafowl (Numida meleagris), which looks a lot like a North American turkey. They have fluffy dark plumage, blue and red skin on their heads, and spend most of their time on the ground. To Europeans, these birds seemed like exotic creatures, and Guinea seemed like an appropriately exciting point of origin, which at least put them on the right continent, since they are actually natives of Africa. Some of these details were probably muddled in a game of international-trade-telephone, since the African birds were being sold by Turkish traders, prompting lazy Europeans to call them turkey-cocks and turkey-hens.

Things got more complicated when Spanish explorers found M. gallopavo strutting around Mexico in 1523. When these birds were brought to Spain and North Africa, people noticed how much they looked like the birds they were coming to know as turkeys. Beyond that, calling these new birds turkeys implied that they also had a bit of that eastern mystique, which somehow made eating them more satisfying on Christmas day.

Turkey in other tongues

Not everyone had the same relationship with Turkish bird traders, but that didn’t stop other cultures from just making up their own points of reference. Rather than retread the imaginary Guinea connection, the French started calling turkeys poule d’Inde, or ‘chicken from India,’ which was later shortened to just dinde. The Turks knew turkeys weren’t really theirs, so they followed France’s lead and called the birds hindi, meaning ‘Indian.’ The Portuguese at least had the right hemisphere in mind when they named them Peru birds. Indonesians flipped the whole ‘exotic’ concept, naming them ayam belanda, or ‘Dutch chickens.’

With so many people picking new points of origin for M. gallopavo, you might hope that the original Spanish Conquistadors that exported the birds from Mexico would stick to the facts a bit more, but that didn’t work out. The primary word for turkey in Spanish today is pavo, which comes from the Latin word pavus, or ‘peacock.’ A closer look at Latin American terms finally seems to weed out other continents, countries and birds for a reference point. Depending on your locality, guanajo, cócono, chompipe and guajolote are all names for turkey, with the last most likely being the closest to the original Aztec term. These terms don’t eliminate all confusion though, since they can also be used to refer to a stupid person.

Enjoy your _____, and have a happy Thanksgiving.

Source: The Etymologicon: A Circular Stroll Through the Hidden Connections of the English Language by Mark Forsyth, Berkeley Books

On September 7th, 2016 we learned about

The not-so-cordial connotations of “nice”

I need to stop asking my kids to be nice to each other. Aside from the desperation in my voice by the time I’m making such a request, the word “nice” itself can actually undermine my real intentions. While people mostly use it to indicate that something is pleasant, or perhaps at least harmless, this is a relatively modern twist on a word that used to be used primarily in a derogatory manner.

Starting from scire

The twisted origin of the word “nice” starts with the Latin word nescius. Nescius basically meant “not knowing,” and so by the 13th century, being called nice in Old French meant that you were foolish and senseless. There was an element of gentle ignorance in this meaning though, and nice transitioned from stupidity to timidness, then fussiness, before getting a bit more of a neutral to positive spin when it came to mean “dainty or delicate.”

So when did it become nice to call something nice? In the 1500s, nice finally took on a more familiar meaning by standing for precision and care, such as when something was “nice and neat.” Since people usually like those traits, you can find nice meaning “agreeable” in 1769, and finally the modern “kind or thoughtful” concept popping up in the 1830s. When laid out like this, it actually seems like a fairly smooth, logical progression from foolish to friendly, but that’s mainly thanks to the clarity afforded by hindsight.

Too many meanings

At this point, “nice” can still be defined by nearly all of its previous meanings, plus a few more tangential concepts as well. Nice can be used to mean lewd, modest, refined, fastidious, trivial, pleasant, enjoyable or even profane, and that’s all before you start layering on the sarcasm. With so many contradictory meanings, it’s obviously not the right thing to request of two children hitting each other in the back seat of the car. Maybe requesting kindness is a better option? Although since that’s coming from Old English’s gecynde, meaning “with the feeling of relatives for each other,” there’s not much hope for arguing siblings, since sharing a family is what got them fighting in the first place.

Source: What Does 'Nice' Mean, Anyway?, Word History

On April 18th, 2016 we learned about

How the word ‘toilet’ passed from preening to plumbing

Hundreds of years ago, toilets were just cloth. Not cloth diapers or some kind of sanitary hammock, but any kind of cloth covering, because originally, the word toilet had nothing to do with a trip to the potty. Originating from the French word toile, for cloth, toilets could be cloth to bundle your clothing in, a shawl, or even some kind of fringed bonnet. Most importantly, it could also refer to the cloth draped over a lady’s dressing table, which is where the word started its journey from the world at large to the confines of our bathrooms.

Primping, not pooping

This transition started in the 1660s, when women would sit at a table, most likely with a mirror, to get made up for the day. The collection of grooming activities, like brushing one’s hair and applying makeup, was eventually bundled up as “taking one’s toilet,” possibly since the most consistent part of any individual’s routine was where it took place. While there was some leeway in what a person’s daily toilet could include, it didn’t involve relieving one’s bowels or bladder in the bedroom (for that, you might have had a commode with a piss-pot sitting in it). A private place for such relief would actually first be invented in public locations, when buildings started installing toilet rooms.

In the nineteenth century, public buildings in the United States were likely to have toilet rooms. Carrying over from the toilet you had at home, these rooms were ostensibly for touching up one’s appearance, which may have just been a way for people to talk about them without mentioning the fact that they also housed privies. The ruse seems to have worn thin though, and by the early twentieth century references to a toilet were very likely to be aimed at the porcelain throne itself, not just the room that housed it.

Maintaining room for imagination

The desire to be indirect about this critical piece of plumbing likely gave rise to the variety of new terms for what was once the toilet room. Lavatory, water closet, bathroom, loo and restroom all let us bring up a universal need without going into specifics, unless you’re just needing to go powder your nose of course. This desire for discretion isn’t new though, as the term privy stemmed from the Latin word privatus, for privacy. On the other hand, names that seem to be more bluntly connected to the task at hand, like the crapper, are merely fortunate coincidences. While Thomas Crapper did own a number of plumbing-related patents, he did not invent the flush toilet, and the word crap was already coined as a synonym for poop before his time.


My three-year-old said: “HAHAHAHAHAHAHA! TOILET!”

“HAHAHAHAHAHA!”

Source: The origin of ‘toilet’ by Edmund Weiner, Oxford Dictionaries Language Matters

On February 28th, 2016 we learned about

Learning to count despite English’s linguistic leftovers

I’ve gotten a lot of dirty looks from my kids helping them learn English. And confusion. And, almost like various stages of grief, denial, as they’d occasionally just decide that Dad must be nuts, because that “correction” doesn’t match any of the other words or phrases they’d already learned. I don’t blame them for these reactions, because English is a mess, and it’s not fun to have to learn a bunch of irregulars just because “that’s the way it is.” Those weird spellings and constructions do sometimes have origins though, and in the case of how we deal with the names of numbers, they’re not “the way it is” as much as “the way it was.”

Ten more than what?

As an adult you stop thinking about it, but as kids learn to count, things get a bit wonky after “ten.” Why don’t we say onety-one, or at least oneteen? Eleven and twelve just seem to come out of nowhere, and then we switch to teens, and then over to the much more obvious twenty, twenty-one, etc. Higher numbers are pretty obviously based around the number of 10s and then the number of ones you’re describing. The teens follow a different logic system, but they at least seem to explicitly reference a quantity of 10 plus a smaller number. To be more specific, they actually say “10 more than x,” which used to be -tene in Old English, and -tekhuniz before that in Proto-Germanic. So while fourteen isn’t constructed like most of our names for numbers, at least it’s easier to write than fedwōrtekhuniz.
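The mixed logic of the teens is easy to see if you try to generate them programmatically; the irregular list below covers every form that refuses to follow the “unit + teen” pattern:

```python
# Most teens are "unit + teen," but eleven, twelve and a few shifted stems
# (thir-, fif-, eigh-) have to be special-cased.
UNITS = ["one", "two", "three", "four", "five", "six", "seven", "eight", "nine"]
IRREGULAR = {11: "eleven", 12: "twelve", 13: "thirteen", 15: "fifteen", 18: "eighteen"}

def teen_name(n):
    """English name for a number from 11 to 19."""
    return IRREGULAR.get(n, UNITS[n - 11] + "teen")

print([teen_name(n) for n in range(11, 20)])
# ['eleven', 'twelve', 'thirteen', 'fourteen', 'fifteen', 'sixteen',
#  'seventeen', 'eighteen', 'nineteen']
```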

The leavings of spears

While saying that something is “ten more than four” seems a bit backwards, it’s actually very similar to how we ended up with eleven and twelve. Looking back to Viking battles with Anglo-Saxons, the “leavings of spears” on a battlefield were described as daropa laf. That laf, as a leftover, went through a number of pronunciation shifts over the years, including lif and elf. These were then attached to versions of “one” and “two” in the Old English endleofan and twelf, respectively, which, if you say them out loud, clearly tie into modern English’s eleven and twelve. But while that helps with pronunciation, those two words basically mean “one left” and “two left.” Why is there no reference to ten anywhere?

Finding the missing ten in the meaning of eleven is where things finally get to the point of “that’s the way it was.” As the mixed origins above indicate, not all numbers were given names at once, and many languages were functional with no word for numbers higher than 10. If that sounds impractical, keep in mind that the mathematical concept for 0 didn’t reach Europe until the 11th century, so while counting 12 ladybugs at the ladybug picnic seems obvious now, these terms are actually the result of centuries of iteration. While Lithuanian held onto the “two-left” construction for their numbers a bit longer, most of English switched over to teens, and then the even more mathematically consistent twenties and thirties (allowing for a few more pronunciation tweaks). However, it seems that eleven and twelve held on because they were the most commonly used terms, and people just didn’t feel like giving them up.

To get even nerdier with this discussion, word usage seems to be a lot like software adoption. Familiarity and ubiquity can be powerful enough forces to avoid updating to a better designed system, even if it means dragging around bugs and other inconsistencies. Not that my kids want to hear that explanation either.

Source: Why Is It 'Eleven, Twelve' Instead of 'Oneteen, Twoteen'? by Arika Okrent, mental_floss

On October 5th, 2015 we learned about

The mishmash of meanings we’ve manufactured for “vegetables”

I’m probably going to regret sharing this with my kids, but… vegetables are a bit of a lie. Obviously carrots, potatoes, and broccoli are all real plants, but the idea that a “vegetable” is actually something more specific than “foods my kids will whine about” isn’t really true. The word itself originally just referred to plant life. It wasn’t until the 1700s that it took on the now common meaning of “plants we eat.”

Botanically bankrupt

Compared to fruit, which are structures meant to carry a plant’s seeds, the idea of a vegetable has sort of been mushed into a catch-all for “everything else.” From a botanical perspective, there’s no real relationship between carrots, lettuce, onions or mushrooms. As a root, leaf, bulb and a fungus-that’s-not-even-a-plant, it’s clear that the ingredients of a vegetable soup are there because of our taste preferences more than anything else. But even with that degree of flexibility, the categorization of different edible plants has been stretched to a degree where you might be tempted to define “vegetables” simply along the lines of “I know one when I see one.”

Legal logic

In 1883, the Tariff Act in the United States was raising prices on imported vegetables, but not fruit. It would seem that this distinction would be pretty cut and dried, thanks to tomatoes carrying seeds and thus clearly being fruits. The Supreme Court disagreed though, basically saying the common usage of tomatoes as vegetables in most cooking trumped stricter definitions, and that the tariff could be imposed. The European Union has gone the other way, ruling that rhubarb, carrots and sweet potatoes are all fruit, at least if they’re being used in jams.

These legal rulings can matter to more than just tariffs though. In regulating the nutritional standards for school lunches, Congress has waded into the murky waters of defining vegetables, far beyond concerns over seeds, roots or leaves. In the 1980s, changes were made that would effectively allow ketchup to be counted as a serving of vegetables because of the tomato content (probably not thanks to the onion powder). Going further with this idea, in 2011 Congress allowed reduced portions of tomato paste to count as a serving of vegetables, which effectively allowed pizza to be the equivalent of a serving of carrots (or any other non-fruit flora).


My first grader said: Well, since kids this age have the instinct of a lawyer looking to follow the letter but not the spirit of a law, we immediately had to discuss a new label for our requests that she eat her… plants. “Greens” was out for being too narrow, “non-fruit plants” brought up the whole tomato problem again, and we obviously had to avoid suggestions like “things that are yucky.”

For now we’re going with the pleasant notion of eating a “whole rainbow” of foods to push some variety, although I’m curious to see how much eggplant or grapes she’s willing to eat to cover her purple obligations.

Source: Do vegetables really exist? by Henry Nicholls, BBC Earth