On July 23rd, 2018 we learned about

The invention of trapeze, and the tights that went with it

“Maybe someone needed a better way to swing in the branches of a tree?”

That’s probably not a concern most people worry about, but then again, it’s hard to tie something like a trapeze to any practical purpose. Even after a week at circus camp, my nine-year-old was clearly stretching to figure out what could have inspired the design of such a simple but specific device.

“They didn’t have airplanes then, so was it part of a blimp?”

To be fair, the origin story of the trapeze isn’t necessarily intuitive. A young man named Jules, who had grown up learning to climb and tumble in his father’s gymnasium, saw ropes hanging over the accompanying swimming pool. He placed a cross bar between two ropes, supposedly to use as a chin-up bar. Apparently it didn’t take long for other uses of a swinging bar to be found, because within a year this acrobat had put together a performance in his home town of Toulouse, France.

Within three years of that first performance in 1856, Jules had not only moved past his would-be career as a lawyer to work in the Cirque Napoleon in Paris, but he had a second act figured out as well. On November 12th, he swung from one trapeze to another for the first time, gripping audiences’ attention as never before. While the swimming pool was traded for a set of mattresses, more trapezes were added to the performance, allowing Jules to do back-to-back somersaults between five different swings.

“Was his name Mr. Trapeze?”

No, the name trapeze probably came from the Latin trapezium, referring to an “irregular quadrilateral,” like a trapezoid. That’s not to say that young Jules’ name was really forgotten though, since his last name was Leotard.

To safely swing and climb among the ropes and bars of his trapezes, Jules Leotard created a form-fitting body suit out of wool. While it was probably quite hot to wear, the elastic nature of wool allowed him to move without being hindered or snagged on any equipment. It also helped Mr. Leotard hold the attention of many audience members, as the tight outfit revealed his physique in a way unheard of at the time. The combined spectacle of the trapeze and costuming helped make Leotard quite successful, earning him plenty of money and notoriety, including the song “The Daring Young Man On The Flying Trapeze.”

Leotard’s body suit wasn’t referred to as a ‘leotard’ until around ten years after he died in 1870. As eye-popping as the garment was when it was first created, it’s now fairly standard for athletes and dancers around the world. That’s not to say that Jules’ first passion has died out, though: as recently as 2013, performer Han Ho Song performed five consecutive somersaults off a trapeze in Stuttgart, Germany. The key difference is that he updated Leotard’s trick from 1859, making all five revolutions in a single jump.

Source: Trapeze origins, Vertical Wise

On March 12th, 2018 we learned about

We started saying ‘hello’ to attract attention on the telephone

Saying hello is now synonymous with greeting someone. It’s a word that, with some necessary spelling changes, is spoken in languages around the world, and at first glance seems appropriate for all kinds of contexts. However, its history is actually pretty short, going back only around 150 years. It’s not that people haven’t always greeted each other when they met, but that seeing someone in person wasn’t nearly as alarming as hearing their voice on a telephone.

We take the ability to speak on the phone for granted now, to the point that many of us avoid using a device that was once considered a marvel of modern technology. When it was invented, the phone was a totally new social space for people to make sense of, requiring its own set of etiquette to go with it. Alexander Graham Bell felt that this long-distance communication should borrow nautical terms for some reason, and always answered the phone with ahoy. Thomas Edison, perhaps noting that ships aren’t known for facilitating direct conversation, pushed the idea of hello, although that word carried a bit more impact in the 19th century than it does today.

Hello’s history as hail

The word hello comes from variations of hail, and as such is closely related to holler. It was used as a greeting, but often in bigger, public declarations than a one-on-one conversation, such as “hail to the king!” By the 1800s, hello’s exclamatory value was central to its usage, and it would be used mostly to show surprise or draw attention to something exciting, such as “Hello! What’s this?!” So Edison’s use of hello on a phone wasn’t really a socially bland greeting as much as a punctuated call for an answer from a distant shore. Aside from ahoy, which wasn’t going over with anyone, hello was actually competing with phrases like “Are you there?”

Hello is now used around the world, but mostly in the context of phone calls, showing how hearing a voice over a wire still occupies a particular space in our brains. Popular alternatives to hello or allo often retain the flavor of “are you there.” In Spanish, diga is technically asking if you can be heard on the other end of the line. In Russian, я слушаю is basically saying “I’m listening,” again drawing attention to the fact that you’re in a conversation with someone that can’t see you. As forms of written communication like text messaging occupy larger portions of our telecommunications, will hello simply shed this notion of an excited shout through the void, or will it fade away altogether?

Hi and hey

If hello really has peaked, it’s a safe bet that hi will be a replacement. As comfortable as hi feels in a text message or email, it actually has a very similar history to hello. It’s thought to be a variation on hey, which of course could also be shouted to draw attention to something of interest. Nonetheless, hi has brevity on its side, making it an easier word to use now that you can trust the phone itself to have gotten the other person’s attention for you.

Source: Where Does 'Hello' Come From?, Words at Play

On January 3rd, 2018 we learned about

The word pineapple made more sense when everything was apples

Pineapples obviously don’t resemble apples, but they used to. This isn’t because pineapples used to have thin peels and white flesh before humans started cultivating them, but because people’s concept of what an apple is has changed since the fruit was first identified by European explorers. For hundreds of years, the word apple could be used for just about any unknown fruit, even if they were in no way related to plants in the Malus genus. Following this logic, peaches were first known as “Persian apples,” and bananas were once “finger apples.” This then leaves us with the “pine” in the name pineapple, which still doesn’t make much sense.

Since pineapples don’t even grow on trees, it’s clear that nobody thought these tropical fruits were somehow growing on spruce or redwoods. What they did think is that pineapples looked a lot like pine cones, which, in the 16th century, were still called pine apples themselves, since they produced seeds while growing on a tree. So from a 16th century perspective, pineapples do look like their namesake, at least before we changed the name of the plants that were being referred to. In the case of pine cones, the use of the word cone was borrowed from the Greek kōnos, and eventually passed from more academic botanical discussions to common English by the 18th century.

When is an apple not a fruit?

Apple is obviously no longer a generic term, now used exclusively for expensive computer hardware or the fruit of the Malus pumila tree. The word fruit itself is now our best generic term for tree-bound produce, although it too has a narrower definition than it used to. In the late 12th century, fruit could be any “useful” portion of a plant, with etymological roots in words for enjoyment and satisfaction. At some point, it could even be used for any product grown from soil, from nuts to veggies, and so the idea of a reward being the “fruit of one’s labor” barely qualifies as a metaphor.

Today, fruit covers a lot of the ground that the word apple used to, but it has also gained a strict, botanical definition. The funny thing is that since fruits are “seed-bearing structures in angiosperms formed from the ovary after flowering,” pine cones, as the original pineapples, can’t really be considered fruit at all. With pineapple’s linguistic connections being erased bit by bit, maybe it’s time English speakers simply joined the rest of the world and picked up the word Ananas. It wouldn’t be any more confusing than what we’re saying now.

Source: A Pineapple Is An Apple (Kind Of), Merriam-Webster Word History

On November 20th, 2017 we learned about

Tracing the origins of turkeys’ supposed stupidity

Nobody wants to be a turkey. Even ignoring the fact that Americans will eat 87 million of these birds on holidays alone, there’s just no glory to be had for these iconic birds. Sure, Benjamin Franklin famously tried to promote their social standing, but today turkeys are solidly associated with incompetence, incoherence, and stupidity. Are they actually a particularly pathetic species, or should we be calling fowl (sorry!) over turkeys’ maligned reputation?

To be clear, turkeys are unlikely to really compete with a crow or parrot on an IQ test. They’re not known for being especially cunning birds, but they’re certainly not helpless animals either. Wild turkeys (Meleagris gallopavo) will generally live in social groups of up to 200 individuals, usually composed of hens and their young broods. For protection, they rely on camouflage and the eyes and ears of their flock, like many other social birds. At night, they can overcome their relatively heavy bone structure to fly into trees, staying off the ground where they’d be more easily discovered. To further compensate for their limited flight, they’ve also been known to swim, fanning out their tails to increase buoyancy.

Unwanted associations

The above features were successful enough to see turkeys flourish across North America, but they weren’t enough to really impress humans. Once humans realized that their seed-fed meat was pretty tasty, we started hunting turkeys, taking advantage of their relatively limited mobility. This led to the idea of a “turkey shoot,” as the birds posed little challenge for a well-aimed gun. Limited flight also led to the name “turkey” being used as an insult. In the 1920s, a stage show or movie that performed badly was called a turkey, since sales failed to “take off.” This sense of general failure has probably fed into the idea that turkeys are stupid, although popular evidence for their poor intelligence is actually a misunderstanding.

Stupidity or neck spasms?

Turkeys are supposedly so stupid that they will stand with their mouths agape, looking up into the sky as it rains, even if it means they drown themselves. Assuming this isn’t evidence of suicidal birds looking for a way out, it does seem fair to criticize animals that can’t be bothered to keep themselves alive. What’s not fair is judging intelligence when the real issue is a genetic condition that causes uncontrollable muscle spasms.

The condition is called tetanic torticollar spasming, and it can be spontaneous, or triggered by external stimuli like loud noises. It can be fatal in hatchlings, as it can interfere with getting food and water, but is more survivable in turkeys with later onset of symptoms. If it seems incompatible with living in the wild, that may be because it’s only associated with domestic turkeys. In manipulating their genomes to maximize muscle growth, humans have helped boost the prevalence of the recessive genes that cause the spasms. These poor birds look stupid because of a disability humans have unknowingly promoted, which for some folks may make eating turkeys a bit easier on their conscience.

None of the above necessarily demands that we completely reevaluate our opinion of turkeys. Domestic turkeys in particular do have a life few would be envious of, but maybe we should start associating them with the pitfalls of being delicious instead of just being dumb. Ok, and maybe a bit weird.

Source: Are turkeys really the dumbest animals? by Valerie Strauss, The Washington Post

On October 9th, 2017 we learned about

The history of waffles, from the Stone Age to street food to your local supermarket

Humanity’s first shot at waffles was probably cooked on a rock. They weren’t a happy discovery made during a camping trip, but were probably some cutting-edge cuisine back in the Neolithic, or New Stone Age, over four thousand years ago. These proto-waffles certainly lacked refinements like an indented grid or maple syrup, but the core of what’s now a favorite breakfast food was still there: a cereal-based batter that was neither baked nor fried, but seared on both sides. Like many cultural inventions, it seems safe to say that we’ve improved on this recipe over time, but clearly our ancestors knew they were on to something big, even before they could slather their dish in whipped cream and strawberries.

Once people started mastering metal in the Iron Age, rocks were traded out for metal plates, often held at the end of sticks to more easily reach into ovens. These early griddles basically sandwiched the batter as it was placed in heat, allowing it to be cooked in half the time. This concept would be picked up by the ancient Greeks, who called the resulting flat cakes obleios, or wafers. Like wafers you find in stores today, obleios weren’t especially cake-like, instead being cooked to a flatter, crispier consistency. They also flavored them as a savory food, using cheese and herbs instead of sugar and syrups. Europeans continued munching wafer-styled waffles well through the Middle Ages, with the only major innovations being to make them larger and sell them from street vendors.

Standing out with indented squares

The 13th century ushered in a new era of two-sided flat-cake cuisine, with waffles finally gaining their signature grid patterning, plus the name that we know today. A blacksmith enhanced the traditional iron griddle plates, hinging them together and putting in raised patterning that would increase the cakes’ surface area, allowing for more efficient heat distribution. More importantly, the indented squares that now marked waffles added a fun design element that caught people’s imaginations, earning the name wafel as a reference to a section of a bee’s honeycomb or woven webs. Other designs have been created since then, including coats of arms and even landscapes, but none are as iconic as the square pits we now associate with waffles.

Waffles’ popularity continued to grow throughout Europe. In the late 1500s, France’s King Charles IX even had to issue regulations concerning how close waffle vendors could cluster together to cut down on the number of fights in the streets. Recipes showed some divergence as well, with lower classes eating flatter, crispier waffles made from flour and water, while upper classes added more cake-like textures to their waffles with milk and eggs in the mix. This interest in softer, puffier waffles eventually grew into what we now think of as a Belgian waffle, which generally includes yeast to complete the effect. As popular as this concept seems now, it was certainly not part of the original flat cake recipe.

Sweeter, slower, faster and frozen in America

Waffles were brought to North America first with Pilgrims, and then again with Thomas Jefferson after a trip to France. Maple syrup finally found its home among the square divots in the 1800s, which, alongside molasses, pushed waffles closer to the sweet side of the flavor spectrum. In 1869, Cornelius Swarthout of New York made his contribution to waffle history, patenting yet another improvement on the waffle griddle concept. Obviously the squares had to stay, but Swarthout’s design allowed waffles to be cooked over a stove top as long as the chef was willing to slow the process down, flipping the new waffle iron over to cook both sides. In a way, it was a step backwards from earlier technology that aimed to cook waffles faster, but it made waffle-cooking accessible in a new way, earning the date of the patent its own holiday in the form of National Waffle Day on August 24th of each year.

Waffles’ ties to technology meant that they kept changing over the 20th century. 1904 saw the invention of the waffle cone for ice cream, although considering waffles’ origins as flat, crispy wafers, you might not call this a major innovation. Alongside many other domestic objects, waffle irons were electrified in 1911, removing the need for the stove top, but often retaining the need for flipping. “Froffles” made frozen waffles available to consumers in 1953, although their egg-heavy flavor would eventually see them renamed as today’s Eggo products that are now part of a $211 million market in the United States. Finally, Americans got a proper introduction to Belgian waffles in 1964, although originally as “Brussels waffles.” Again, marketing concerns led to a new name, and the vanilla and yeast-infused recipe was eventually circulated simply as Belgian waffles.

Blueberry, gluten-free, fried-chicken flat cakes

Today it seems that we have a huge range of waffle variations to choose from. Waffles can be found with fruit, whole wheat, chocolate, ice cream, pumpkin and of course, fried chicken. It seems that throughout this grand history, we’ve really only given up two aspects of previous waffles: the speed of cooking both sides at once, and the opportunity to buy waffles on the street from a horde of competing waferers when walking about town.

Source: Waffle History Page 1: The Origin & Evolution Of Waffles, The Nibble

On July 13th, 2017 we learned about

Searching for the start of teeter-totters, tilt boards and seesaws

My four-year-old is currently a big fan of the seesaw. Or is it the teeter-totter? The tilt board? There are many names for this piece of playground equipment, all of which refer to what’s essentially a lever. Levers as machines let us do a lot of otherwise difficult work, although in the case of most playgrounds, the perfectly symmetrical arrangement of a seesaw actually makes things a bit tricky for a large adult and a small child. I do extra work to avoid launching my child thanks to our equal distance from the fulcrum, although there’s a chance that bit of physics is what got seesaws started in the first place.
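That bit of physics comes down to torque: each rider’s pull on the seesaw is their weight times their distance from the fulcrum. Here is a minimal sketch of that balance in Python, using hypothetical weights and distances chosen purely for illustration:

```python
# Minimal sketch of seesaw physics: the side with the greater torque
# (weight x distance from the fulcrum) rotates downward.
# All weights (kg) and distances (m) below are hypothetical examples.

def seesaw_tilt(weight_a, dist_a, weight_b, dist_b):
    """Return which rider sinks, based on torque about the fulcrum."""
    torque_a = weight_a * dist_a
    torque_b = weight_b * dist_b
    if torque_a > torque_b:
        return "A sinks"
    if torque_b > torque_a:
        return "B sinks"
    return "balanced"

# On a symmetrical playground seesaw both riders sit the same distance out,
# so an 80 kg adult easily overpowers a 20 kg child:
print(seesaw_tilt(80, 1.5, 20, 1.5))      # A sinks

# Scooting the adult in toward the fulcrum restores balance,
# since 80 * 0.375 equals 20 * 1.5:
print(seesaw_tilt(80, 0.375, 20, 1.5))    # balanced
```

This is exactly the “extra work” a parent does on a symmetrical seesaw: since the distances are fixed and equal, the only way to keep things fair is to add or subtract force by hand.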

Korean catapult

With something as fundamental as a lever, it’s not easy to pin down the first time anyone decided they were fun to sit on. One line of thinking connects modern seesaws to 널뛰기, or neolttwigi, a device from Korea that dates back at least a few hundred years. A neolttwigi is like a low seesaw intended to launch someone in the air. One person jumps onto the empty side to boost their playmate straight up. The story is that this device was first developed so that young women could boost each other high enough to see over the walls surrounding their homes, although since then acrobatics and props like jump-ropes have been added to the mix.

Teetering, tilting and trembling

The various names for a seesaw may hold clues as well. Americans supposedly prefer the term “teeter-totter,” although those terms existed separately before being brought to the playground. Teeter is related to titter, which probably comes from an Old Norse word titra, which meant “to shake, shiver, totter or tremble,” which seems fairly appropriate. However, those definitions are fairly broad, which doesn’t help describe the exact device in question. French terms like balançoire only mean to balance, although the theory that seesaw is a bastardization of ci-ça (see-saw), which means “this-that,” is at least more playful.

Syncing saws

The most specific connections are based around the name seesaw and how it intersects with hard labor. Before engines sped everything up, sawyers were people who sawed logs and trees. For bigger jobs, two men would hold handles on either side of a large saw, lunging back and forth in an even rhythm to efficiently cut through the wood. Another variation was the pit-saw, where one sawyer was elevated above the other, sawing down at an angle. In either case, the work went better if the back-and-forth motion remained in sync, and so sawyers would sing or chant to coordinate their movement.

These songs introduced the term “see-saw” not for a specific meaning as much for rhythm. They turn up in print in the 1630s with phrases like “see-saw-sack a down” and eventually “See Saw, sacaradown, / Which is the way to London town?” in 1685. Playing on levers probably predates these chants, but the name seesaw made it to the playground by 1704. Kids may have been pretending to be sawyers, chanting along as they rocked up-and-down. While we’re not as familiar with two-man saws today, at the time it was probably a very descriptive term, especially compared to “equidistant class one lever.”

Source: See-saw by Michael Quinion, World Wide Words

On March 5th, 2017 we learned about

The perpetual difficulty of picking what is and isn’t a planet

The ancient Greeks originally described planets as asteres planetai, or “wandering stars,” thanks to the way they moved more than other specks of light in the sky. Today, we can observe a swath of differences between planets and stars, but that original definition does point out how names can be outgrown as we get more information about the universe we live in. While we still hang on to the word “planet,” its definition has changed a lot, even explicitly ruling out the idea of wandering, much less being stars. It’s also far from a settled question, and that’s not just because people feel their memories have been violated thanks to Pluto being classified as a dwarf planet.

Past planetary rosters

For a long time, the debate was focused on the original count of five or seven planets. The obvious objects in the sky included Mercury, Venus, Mars, Jupiter and Saturn, as well as the Moon and the Sun. Depending on who you asked, the Sun and Moon were lumped in the category of planets, since they were all things that, in a geocentric model of the solar system, orbited the Earth. By the Middle Ages, western astronomers were coming to realize that the Sun and Moon did not behave like planets exactly, and they were referred to more frequently as something distinct from those heavenly bodies. The next big step in breaking down these definitions was the revelation of a heliocentric model of the solar system, which then forced the issue that the Earth itself is indeed a planet.

Details of the definition

Fast-forwarding through centuries of astronomical discoveries and refinements, the International Astronomical Union (IAU) currently defines a planet according to three criteria, none of which seem totally satisfying at this point. The first rule is that a planet has to orbit not just a star, but our Sun specifically. This of course bars any of the growing number of exoplanets we’ve discovered in the past few years from being planets, as well as otherwise planet-like objects that may be truly wandering the galaxy outside a star’s orbit. The second rule is that the object is big enough for its gravity to have crushed itself into a nearly round shape (the point of hydrostatic equilibrium). Basically, smaller objects like lumpy, lopsided asteroids are too small to count, although people have questioned what the exact, and necessarily arbitrary, threshold is for “nearly round.” The final rule is that a planet’s “neighborhood” is cleared of other, potentially intersecting objects. So a planet can have moons that travel the same route, but Pluto is disqualified because of its proximity to other Kuiper Belt objects like Eris, plus the fact that its solar orbit is so tied to Neptune’s gravity.
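The three criteria read almost like a decision procedure, and can be sketched as one. This is a toy Python encoding of a simplified reading of the rules; the yes/no flags are assumptions supplied by the caller rather than measurements, and the hard judgment calls, especially “nearly round” and “cleared,” are exactly the contested parts of the real definition:

```python
# Toy sketch of the 2006 IAU planet definition as a decision procedure.
# The boolean inputs stand in for judgments the IAU actually has to make;
# nothing here measures roundness or orbital dominance.

def iau_classify(orbits_our_sun, nearly_round, cleared_neighborhood):
    """Classify a body under a simplified reading of the IAU's three rules."""
    if not orbits_our_sun:
        return "excluded (exoplanets and free-floating bodies)"
    if not nearly_round:
        return "small solar system body"
    if not cleared_neighborhood:
        return "dwarf planet"
    return "planet"

print(iau_classify(True, True, True))    # e.g. Earth -> planet
print(iau_classify(True, True, False))   # e.g. Pluto -> dwarf planet
print(iau_classify(True, False, True))   # e.g. a lumpy asteroid
```

Dropping the final rule, as some proposals suggest, amounts to deleting the `cleared_neighborhood` check, which is what would swell the planet count so dramatically.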

Fortunately, the ongoing discussions about what constitutes a planet are trying to take into account all the new data we have about objects in our solar system and beyond. When we discovered Eris and other Kuiper Belt objects, Pluto seemed more closely related to them than larger objects like Jupiter. However, the practicality of the “cleared neighborhood” rule is actually proving to be difficult to work with. Written with freshly-discovered Kuiper Belt objects in mind, the rule doesn’t hold up well with different parameters: for instance, even an Earth-sized planet in Pluto’s position would be entangled with Uranus’ orbit, which seems to go against the sort of filtering people were aiming for. Proposals for new definitions have been offered, mainly to strip the final rule from the current definition. This would then reclassify Pluto and Eris, along with every other dwarf planet and moon in our solar system, as planets. We’d go from eight (maybe nine) planets to 102, which is such a dramatic shift that you might wonder about the point of such a definition in the first place.

Public perceptions

The primary job of a good definition of ‘planet’ would be to help us sort and understand these objects as clearly as possible. However, the word ‘planet’ now also carries some baggage with it, including a sense of gravitas that people inside and especially outside the scientific community value greatly. When Pluto was reclassified as a “dwarf planet” in 2006, people referred to it as a “demotion,” and even protested the change. Even if scientists are more concerned with accuracy, the public’s take on the importance of being a planet can’t be overlooked. Securing funding and interest in sending a probe to a planet is still much easier than for even a dwarf planet, much less a “small bodied object.” It seems that we all think that it’s cool to be a planet, even if we’re not completely in agreement about what that means.

Source: Should Pluto Be a Planet After All? Experts Weigh In by Mike Wall, Space.com

On January 15th, 2017 we learned about

Picking through the past of the word ‘poop’

Poop Week!

The word ‘poop’ was first written down over 600 years ago, in reference to the rear deck of a ship. Much to my children’s disappointment, this name had nothing to do with feces, instead being connected to French and Latin terms for ‘stern.’ So at that point, the smell of a ‘poop’ would have only been salty, sea air. If you wanted to talk about stinky poop as we now know it, you’d have to wait until at least 1721, and even then it wouldn’t quite be the bit of potty talk you’re probably thinking of.

Happily, poop’s non-Latin origins align much more closely with the more comedic facets of our favorite, child-friendly, word for excrement. In Middle English, the verb poupen meant to make an abrupt sound, or to blow or toot a horn. You can probably guess where we’re going with this, as the opportunity for onomatopoeia was apparently not missed by history, and ‘poop’ started seeing use to describe a fart. By 1744, in what is probably the most appropriate etymological evolution ever, poop progressed past passing gas and finally found its calling as a term for feces. Interestingly, pooping wasn’t a verb for another century and a half, turning up in print in 1903.

Other terms for number two

So what was the world doing in the toilet before pooping was an option? We obviously have a slew of slang and medical terms for excrement, but the most common alternative is some form of kakka. Caca, kacka, kaka and more are common, if sometimes vulgar, terms for doodoo, with roots going back to some of the earliest Indo-European languages. And while I’m not getting into it with my second grader or preschooler, I’d be remiss to not mention shit, which was in use by the 14th century, having come from terms like Old High German’s scīzan and Old English’s scēadan. Shit doesn’t quite have the fart tie-in, but with original meanings concerning defecation or “separation” from waste, it’s always been a surprisingly precise word.

Source: Poop, Online Etymology Dictionary

On December 14th, 2016 we learned about

Sugar plums: from seeded sugar to sweetness itself

There’s something pleasantly vague about the name “sugar plum” to the modern ear. When we hear about these treats in ‘Twas the Night Before Christmas, or The Nutcracker, they sound much better than modern terms that would be more descriptive, but sound a lot less magical. Children aren’t going to get excited about “visions of digestive aids,” and the “Dance of the Gobstopper Fairy” doesn’t sound terribly whimsical either.

Sugar to suck on

Sugar plums, or comfits as they were more commonly called in their heyday, are actually a very old candy that likely originated in the Middle East. There was some variety in their recipes, but the core concept involved a lot of sugar being coated around a seed or nut. Sometimes the seed could be substantial, like an almond, but they were often something tiny like a celery seed, basically acting like an anchor for sugar to start collecting around. Once an appropriate seed or speck was selected, it was then repeatedly covered in layers of sugar or syrup. The comfits would be heated, coated, and then cooled over and over again, with the entire process taking days to complete. Flavors or colors could be added, but basically you ended up with lumpy sticks of sugar, or smoother balls of dried syrup to suck on, not terribly unlike a gobstopper.

Of course, the labor involved in cooking comfits meant that these candies were not doled out carelessly. They were originally treated like an after-dinner digestive aid, meant to accompany some spiced wine, preventing indigestion and reducing flatulence. Since ‘Twas the Night Before Christmas puts sugar plums in children’s dreams in 1823, it’s clear that these candies eventually moved beyond the formal banquet table to a place where kids could enjoy them for the balls of sugar they were. Technological advances in the 1860s, like steam and mechanized pans, made comfit production significantly cheaper and more accessible to the general public.

From candy to culture

By the time The Nutcracker premiered in 1892, sugar plums would have been widely known as a sugary treat, even transcending their origin as literal candies. From the 17th century onward, ‘sugar plums’ could be anything sweet and lovely, or even sweet and slightly devious if someone had a “mouth full of sugar plums.” These meanings were even shortened to just “plum” sometimes, removing the last traces of any descriptive words that would help a modern reader understand what was going on. Not that that’s stopping us from enjoying our own visions of sugar plums, as disconnected as they may be.

Source: Sugar-Plums and Comfits, Historic Food

On December 4th, 2016 we learned about

Official ways of declaring distress when you can’t directly ask for “help”

At a certain point in life, we all have to learn that yelling “mommy” or “daddy” isn’t the best way to request assistance. Depending on the scenario, there are even times that shouts of “help!” won’t do the trick, no matter the language you say it in. If this sounds unfamiliar, it’s probably because you don’t spend much time driving boats, trains or planes. Each of these forms of transportation has its own official distress calls, some of which are more intuitive than others.

SOS: Struggling ships at sea

Ships at sea have a variety of ways to call for help. These range from firing guns at one minute intervals to orange-colored smoke signals. In 1857, the International Code of Signals was established, which designated an official distress flag with a square and a ball above it, but all of these visually oriented signs have been eclipsed by SOS. SOS was officially put into international use in 1908 for wireless telegraph communications, but it has since permeated pop culture as a generic term for “help!”

At first glance it seems like SOS should stand for something as an initialism, but that’s not actually the case. The three letters were picked by the German government in 1905 because they’re easy to send via Morse Code (dot-dot-dot, dash-dash-dash, dot-dot-dot). Assumptions that the three letters stand for “save our ship” or “save our souls” have all been invented by English speakers after the fact, presumably trying to link it to some spontaneous exclamation from a sinking ship long ago.
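The pattern’s simplicity is easy to see when you spell it out. A minimal sketch in Python, using only the standard International Morse patterns for the two letters involved:

```python
# Minimal sketch: why SOS is so easy to key in Morse Code.
# Only the two letters the signal actually needs are included here.
MORSE = {"S": "...", "O": "---"}

def to_morse(word):
    """Spell a word as Morse patterns separated by spaces."""
    return " ".join(MORSE[letter] for letter in word)

print(to_morse("SOS"))  # ... --- ...
```

Three identical short signals, three identical long ones, then three short again: a rhythm that is hard to mistake for anything else, even over a noisy wireless connection.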

Mayday: Planes with a problem

Every form of transportation breaks down at some point, and airplane pilots realized they needed a distress signal as well. By 1923, pilots were communicating via radio, so picking up something easy to punch out in Morse Code wasn’t really necessary. Simply shouting “help” into the radio was ruled out, because it was too likely to come up in situations that weren’t true emergencies. While simply shouting “SOS” probably didn’t come up in day-to-day communications, “mayday,” and eventually “mayday mayday mayday” became the designated way to call for aid.

Unlike SOS, there was some intentional, secondary meaning to “mayday.” Senior radio officer Frederick Stanley Mockford came up with the term while working at the Croydon Airport, which regularly interacted with traffic from France. Mayday didn’t sound like an English word, but did sound like m’aider, or the end of the phrase “can you help me” in French, which was considered a plus. Along the same lines of logic, less imminent danger can be reported with radioing “pan-pan, pan-pan, pan-pan,” which happens to sound a lot like panne, the word for “broken” in French.

Toot toot toot: Trains in trouble

Train engineers have their own emergency signal that they can make with their horns. Because the horns aren’t used to communicate with rail yards or railway control rooms, the primary goal of this distress call is to alert any bystanders of danger. The signal is basically a series of repeated, short toots, which presumably will be loud and frantic enough to grab people’s attention so they notice where the troubled train is headed. For less dire situations, trains have a set of horn signals based around long or short toots, almost like the dots and dashes of Morse Code. You’re most likely to hear a “long long short long” set of toots, which indicates that the train is approaching a public crossing.

Source: The origins of SOS and Mayday, OxfordWords blog