Talkie Tuesday: Nine English words people just made up

Languages evolve the same way anything else does: growth and change. What becomes useful survives, and what doesn’t dies. English is no different, although it’s a language that loves to grow via consumption — taking in bits of other languages or finding new uses for old words.

Sometimes the words are just made up by writers in their works — and sometimes, those words go on to become a part of the language.

The words below are listed in alphabetical order, but don’t go looking for Shakespeare’s name. While he’s often credited with creating hundreds of new English words, he really didn’t. Rather, he was remarkably good at collecting them from what he heard, then using them in his plays to make the dialogue sound realistic — it was how people were talking in the streets.

The problem happened when the first Oxford English Dictionary was created, and the entries had to include an attestation to first printed use. Well, at that time, guess who that often was? And so Shakespeare wound up being credited as the source of words that he, at best, curated.

That doesn’t diminish his genius one bit, though. Now here are the words.


Blatant

Source: Edmund Spenser’s poem, The Faerie Queene

This word first appears in the form of the Blatant Beast, who works for Envie and Detraction, two allegorical figures. Of them, the poem says:

Vnto themselues they gotten had

A monster, which the Blatant beast men call,

A dreadfull feend of gods and men ydrad.

While the original Blatant Beast represented the worst sort of slander that could be spread about a person, the word eventually lost its beastly origins and came to mean offensive or in-your-face — “a blatant disregard for the truth.”

Chortle

Source: Lewis Carroll’s poem, Jabberwocky

‘Twas brillig, and the slithy toves did gyre and gimble in the wabe…

These lines open Carroll’s brilliant gibberish poem, which is nevertheless somewhat understandable because the grammar and parts of speech follow English rules and rhythm perfectly, even if the words don’t quite. In that opening line, it’s quite obvious that “brillig” refers to something about the setting, and slithy toves are creatures (nouns) that do actions (verbs) in a particular place — the wabe.

While Carroll’s Alice books were more likely than not a satire of “modern” math written by a rather conservative mathematician, they nonetheless also reflected his fascination and extreme talent with words. Jabberwocky also uses familiar structure — traditional math — with nonsense expressions standing in for all the standard variables as a reflection of Carroll’s disdain for what was happening to math at the time.

As for “chortle,” it appears in this sentence, after the hero has slain the Jabberwock: “’O frabjous day! Callooh! Callay!’ he chortled in his joy.” Here, the word can be taken as a portmanteau of chuckle and snort.


Cyberspace

Source: William Gibson’s short story Burning Chrome

This one took off and took on meaning fast, becoming deeply entrenched in our culture particularly after the rise of the internet. The short story was first published in the magazine OMNI in 1982.

The first instance in the story is here: “I knew every chip in Bobby’s simulator by heart; it looked like your workaday Ono-Sendai VII, the ‘Cyberspace Seven’…”

The term didn’t really take off until two years later, when he used it in his novel, Neuromancer, where it’s defined thusly: “The matrix has its roots in primitive arcade games… Cyberspace. A consensual hallucination experienced daily by billions of legitimate operators, in every nation.”

Sounds like the internet, doesn’t it? Sort of, but in Gibson’s vision, it went a little bit farther. Think Ready Player One — VR that’s interactive to the point that your own mind is projected into it.

Neuromancer was not the first work of cyberpunk, but Gibson became the genre’s most famous author, and he boosted the aesthetic into the zeitgeist.


Freelance

Source: Sir Walter Scott’s novel Ivanhoe

You’ll probably figure out the origin of this one as soon as I explain the premise of Scott’s 1819 novel. Set in England in the Middle Ages, a little over a century after the Norman Conquest, it tells the story of one of the last remaining Anglo-Saxon noble families in the country.

The story is contemporaneous with Robin Hood, and the hero, Sir Wilfred of Ivanhoe, is a knight. This is one of the first modern stories to popularize the whole idea of chivalry and jousting tournaments within the English-speaking world, and by now you’ve probably guessed how the word “freelance” came up in the work.

Since these knights jousted and fought with lances, a “freelance” was someone who held no allegiance to a king or prince but, rather, was available for hire.


Mondegreen

Source: Sylvia Wright’s magazine article The Death of Lady Mondegreen

This one is almost charming, but the word itself has taken on an entire life online. The term refers to terribly misheard song lyrics, with one of the most cited being people hearing Jimi Hendrix’s “Excuse me while I kiss the sky” as “Excuse me while I kiss this guy.”

In Wright’s case, it was her mother who had misread a poem to her when she was a girl. The poem was “The Bonny Earl o’ Moray,” from Percy’s Reliques, and the correct line, “layd him on the green,” came out of her mother’s mouth as “Lady Mondegreen.”

Needless to say, it wasn’t until years later that Wright figured out the error and wrote her article, ushering the word into common usage — and making it a great source of memes.


Nerd

Source: Dr. Seuss’s book If I Ran the Zoo

While Seuss himself didn’t define this one, it popped up in college slang with its modern meaning in 1951, just a year after the book’s publication. Merriam-Webster seems to think this argues against Seuss inventing it, but it actually makes perfect sense.

Families were bigger then and babysitting was a ubiquitous occupation, so it’s quite plausible that a high school senior or college freshman picked it up from reading to a younger sibling or babysitting client, and the word made its way from there.

Here’s its original appearance in the book:

And then, just to show them, I’ll sail to Ka-troo

And bring back an It-kutch, a Preep and a Proo,

A Nerkle, a Nerd, and a Seersucker, too!

Again, it’s not defined, although “seersucker” was and still is a well-known fabric with somewhat square and nerdy connotations, so that may have helped define it for the college kids who ran with it.


Pandemonium

Source: John Milton’s epic Paradise Lost

Milton was the author who went ahead and wrote the origin story for Dante’s Inferno (okay, Divine Comedy, but no one ever reads the other two parts), and here he tells the story of Satan, the war in Heaven, and all that yadda yadda.

To him, Pandemonium was the capital of Hell, and the word was derived pretty simply: from the Greek prefix pan-, meaning all, and demonium, referring to the realm of the demons. So the word simply meant “Place of all demons.”


Scaredy-cat

Source: Dorothy Parker’s short story The Waltz

It’s the “scaredy” part that she coined here, although it’s been firmly welded to the word cat, so that it never appears separately or in any other compound. You’ll never hear “scaredy-dog,” after all.

The modern definition is somebody who’s afraid of everything, but in the context of the story, it has the typical Dorothy Parker sarcastic bite to it. In the story, she’s a woman at a dance feeling sorry for another woman who is currently dancing with a man she doesn’t want to dance with.

But then Parker’s character realizes she’s probably going to be asked to dance herself and doesn’t want to. She visualizes all kinds of scenarios for getting out of it, from seeing him in hell first to having labor pains, but she concludes with, “Oh, yes, do let’s dance together — it’s so nice to meet a man who isn’t a scaredy-cat about catching my beri-beri…”

Ultimately, she realizes that she has no choice but to politely comply.


Yahoo

Source: Jonathan Swift’s satirical novel Gulliver’s Travels

Long before it was an internet company that’s pretty much past its prime, “yahoo” was coined by Jonathan Swift as the name of a depraved and filthy race of creatures Lemuel Gulliver encounters in his travels.

They are obsessed with digging through filth and mud to find pretty stones, making them stand-ins for the author to mock the petty materialism and elitism of 18th century Britain. There’s also one wild theory that their appearance in the book was based on contemporary reports of the Sasquatch coming from Native Americans at the time, but that could be specious.

And there you have it. Nominated for the list but cut upon investigation: the claim that Sylvia Plath coined “dreamscape,” which is fairly badly attested, along with claims that Alexandre Dumas fils created “feminist.” Not only did the word exist before he used it in 1872, but he used it in a pamphlet that was extraordinarily misogynistic, so no credit to him.

What are your favorite invented words? Let us know in the comments.

Forces of nature

If you want to truly be amazed by the wonders of the universe, the quickest way to do so is to learn about the science behind it.

And pardon the split infinitive in that paragraph, but it’s really not wrong in English, since it became a “rule” only after a very pedantic 19th century grammarian, John Comly, declared that it was wrong to do so — although neither he nor his contemporaries ever called it that. Unfortunately, he based this on the grammar and structure of Latin, to which that of English bears little resemblance.

That may seem like a digression, but it brings us back to one of the most famous modern split infinitives that still resonates throughout pop culture today: “To boldly go where no one has gone before,” and this brings us gracefully back to science and space.

That’s where we find the answer to the question “Where did we come from?” But what would you say exactly is the ultimate force that wound up directly creating each one of us?

One quick and easy answer is the Big Bang. This is the idea, derived from the observation that everything in the universe seems to be moving away from everything else, that at one time everything must have been in the same place. That is, what became the entire universe was concentrated into a single point that then somehow expanded outward into, well, everything.

But the Big Bang itself did not instantly create stars and planets and galaxies. It was way too energetic for that. So energetic, in fact, that matter couldn’t even form in the immediate aftermath. Instead, everything that existed was an incredibly hot plasma of unbound quarks.

Don’t let the words daunt you. The simple version is that elements are made up of atoms, and an atom is the smallest unit of any particular element — an atom of hydrogen, helium, carbon, iron, etc. Once you move down to the subatomic particles that make up the atom, you lose the properties that make the element unique, most of which come down to the number of protons in its nucleus and the electrons wrapped around it.

Those atoms, in turn, are made up of electrons that are sort of smeared out in a statistical cloud around a nucleus made up of at least one proton (that one is hydrogen), working up through larger collections of protons (positively charged), an often but not always equal number of neutrons (no charge), and a number of electrons (negatively charged) that may or may not equal the number of protons.

Note that despite what you might have learned in school, an atom does not resemble a mini solar system in any particular way at all, with the electron “planets” neatly orbiting the “star” that is the nucleus. Instead, the electrons live in what are called orbitals and shells, but they have a lot more to do with energy levels and probable locations than they do with literal placement of discrete dots of energy.

Things get weird on this level, but they get weirder if you go one step down and look inside the protons and neutrons. These particles are themselves made up of smaller particles that were named quarks by Nobel Prize winner Murray Gell-Mann as a direct homage to James Joyce. The word comes from a line in Joyce’s book Finnegans Wake, which itself is about as weird and wonderful as the world of subatomic science: “Three quarks for Muster Mark!”

The only difference between a proton and a neutron is the configuration of quarks inside. I won’t get into it here except to say that if we call the quarks arbitrarily U and D, a proton has two U’s and one D, while a neutron has two D’s and one U.
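The charge arithmetic behind those configurations works out neatly: up quarks carry +2/3 of the elementary charge and down quarks carry −1/3. A quick sketch, using the same arbitrary U and D labels as above:

```python
from fractions import Fraction

# Electric charge of each quark flavor, in units of the elementary charge e.
QUARK_CHARGE = {
    "u": Fraction(2, 3),   # up quark: +2/3 e
    "d": Fraction(-1, 3),  # down quark: -1/3 e
}

def total_charge(quarks):
    """Sum the charges of a list of quarks."""
    return sum(QUARK_CHARGE[q] for q in quarks)

proton = ["u", "u", "d"]   # two U's and one D
neutron = ["u", "d", "d"]  # two D's and one U

print(total_charge(proton))   # 1 -> the proton's +1 charge
print(total_charge(neutron))  # 0 -> the neutron is neutral
```

Swap one up quark for a down quark and the +1 charge vanishes, which is all the “only difference” between a proton and a neutron amounts to, charge-wise.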

And for the first fraction of a second after the Big Bang, the universe was an incredibly hot soup of all these U’s and D’s flying around, unable to bind to each other because the particles that tie them together, gluons, couldn’t get a grip. The universe was also opaque, because photons couldn’t move through it freely.

Eventually, as things started to cool down, the quarks and gluons started to come together, creating protons and neutrons. The protons, in turn, started to hook up with free electrons to create hydrogen. (The neutrons, not so much at first, since unbound neutrons tend not to last long.) Eventually, the protons and neutrons did start to hook up and lure in electrons, creating helium. This stage — which took until roughly 380,000 years in, when the free electrons were finally locked up inside atoms — is also when the universe became transparent, because now the photons could move through it freely.

But we still haven’t quite gotten to the force that created all of us just yet. It’s not the attractive force that pulled quarks and gluons together, nor is it the forces that bound electrons and protons. That’s because, given just those forces, the subatomic particles and atoms really wouldn’t have done much else. But once they reached the stage of matter — once there were elements with some appreciable (though tiny) mass to toss around, things changed.

Vast clouds of gas slowly started to fall into an inexorable dance as atoms of hydrogen found themselves pulled together, closer and closer, and tighter and tighter. The bigger the cloud became, the stronger the attraction until, eventually, a big enough cloud of hydrogen would suddenly collapse into itself so rapidly that the hydrogen atoms in the middle would slam together with enough force to overcome the electrical repulsion between their positively charged nuclei and fuse them together. And then you’d get… more helium, along with a gigantic release of energy.

And so, a star is born. A bunch of stars. A ton of stars, everywhere, in great abundance and with great energy. This was the first generation of stars in the universe and, to quote Blade Runner, “The light that burns twice as bright burns half as long.” These early stars were so energetic that they didn’t last long, but they managed to really squish things together. You see, after a star turns hydrogen into helium, the same process turns helium into heavier elements, like lithium, carbon, neon, oxygen, and silicon. And then, once it starts to fuse atoms into iron, a funny thing happens: the process suddenly stops producing energy, the star collapses into itself, and then it goes boom, scattering those elements back out into the universe.

This process happens to stars that don’t burn as brightly, too; it just takes longer. The first stars lasted a few hundred million years. A star like our sun is probably good for about ten billion, and we’re only about halfway along.

But… have you figured out yet which force made these stars create elements and then explode and then create us, because that was the question: “What would you say exactly is the ultimate force that wound up directly creating each one of us?”

It’s the same force that pulled those hydrogen atoms together to create heavier elements and then made stars explode to blast those elements back out into the universe to create new stars and planets and us. It’s the same reason we have not yet mastered nuclear fusion: we cannot control this force and don’t really know yet what creates it. It’s the same force that is keeping your butt in your chair this very moment.

It’s called gravity. Once the universe cooled down enough for matter to form — and hence mass — this most basic of laws took over, and anything with mass started to attract everything else with mass. That’s just how it works. And once enough mass got pulled together, it came together tightly enough to overcome any other force in the universe. Remember: those atoms fused because the electrical repulsion between their nuclei was nowhere near strong enough to resist gravity, and neither was the nuclear force between protons and neutrons.
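To put rough numbers on that chair-bound force, Newton’s law of universal gravitation gives F = G·m₁·m₂/r². A minimal sketch; the 70 kg body mass is an arbitrary assumption for illustration:

```python
# Newton's law of universal gravitation: F = G * m1 * m2 / r^2
G = 6.674e-11        # gravitational constant, N*m^2/kg^2
M_EARTH = 5.972e24   # mass of the Earth, kg
R_EARTH = 6.371e6    # mean radius of the Earth, m

def gravitational_force(m1, m2, r):
    """Attractive force in newtons between two masses r meters apart."""
    return G * m1 * m2 / r**2

# Force on a 70 kg person sitting at the Earth's surface.
person = 70.0
force = gravitational_force(M_EARTH, person, R_EARTH)
print(f"{force:.0f} N")  # roughly 687 N, i.e. the person's weight
```

Divide that force by the 70 kg mass and you recover the familiar 9.8 m/s² surface gravity.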

Let gravity grow strong enough, in fact, and it can mash matter so hard that it squeezes a star’s protons and electrons together into neutrons, and the result is called a neutron star. Squash it even harder, and you get a black hole, a very misunderstood (by lay people) object that nonetheless seems to be the anchor (or one of many) that holds most galaxies together.

Fun fact, though. If our sun suddenly turned into a black hole (unlikely because it’s not massive enough) the only effect on the Earth would be… nothing for about eight minutes, and then it would get very dark and cold, although we might also be fried to death by a burst of gamma radiation. But the one thing that would not happen is any of the planets suddenly getting sucked into it.
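Those eight minutes are just the light travel time from the Sun to the Earth, and since changes in gravity also propagate at the speed of light, the orbits would carry on normally for the same interval. The arithmetic is easy to check:

```python
AU = 1.495978707e11  # mean Earth-Sun distance, meters (one astronomical unit)
C = 2.99792458e8     # speed of light in a vacuum, m/s

# Time for light (or a change in gravity) to cross one AU.
delay_seconds = AU / C
print(f"{delay_seconds / 60:.1f} minutes")  # about 8.3 minutes
```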

Funny thing about black holes. When a star collapses like that and becomes one, its radius may change drastically, say from sun-sized to New York-sized, but its gravity at a distance doesn’t change at all.

But I do digress. Or maybe not. Circle back to the point of this story: The universal force that we still understand the least also happens to be the same damn force that created every single atom in every one of our bodies. Whether it has its own particle or vector, or whether it’s just an emergent property of space and time, is still anybody’s guess. But whichever turns out to be true, if you know some science, then the power of gravity is actually quite impressive.

Friday Free-for-All #16

In which I answer a random question generated by a website. Here’s this week’s question. Feel free to give your own answers in the comments.

What piece of technology brings you the most joy?

This one is actually very simple. It’s the lowly but very important integrated circuit, or IC. ICs combine a host of functions previously performed by much larger and more complicated devices — mostly transistors, resistors, and capacitors — and with them you can create all sorts of tiny components: logic gates, microcontrollers, microprocessors, sensors, and on and on.

In the old pre-ICs days, transistors, resistors, and capacitors all existed on a pretty large scale, as in big enough to pick up with your fingers and physically solder into place.

Before that, old school “integrated circuits” were big enough to hold in your hand and resembled very complicated lightbulbs. These were vacuum tubes, and essentially performed the same functions as a transistor — as either an amplifier or a switch. And yes, they were considered analog technology.

The way vacuum tubes worked was actually via heat. A piece of metal would be warmed up to release electrons, which was also the reason for the vacuum. This meant that there were no air molecules to get in the way as the electrons flowed from one end (the cathode) to the other (the anode), causing the current to flow in the other direction. (Not a typo. It’s a relic from an early misconception about how electricity works that was never corrected.)

The transition away from vacuum tubes to transistorized TV sets began in 1960, although the one big vacuum tube in the set — the TV screen itself — stuck around until the early 2000s.

But back to the vacuum tube function. Did it seem off that I described transistors as either amplifiers or switches? That’s probably because you might think of the former in terms of sound and the latter in terms of lights, but what we’re really talking about here is voltage.

Here’s the big secret of computers and other modern electronic devices. The way they really determine whether a bit value is 0 or 1 is not via “on” or “off” of a switch. That’s a simplification. Instead, what they really use is high or low voltage.

Now, granted, those voltages are never that “high,” since logic levels are typically just a few volts, but the point is that it’s the transistor that decides whether to boost a voltage before passing it along, or which of an A/B pair of inputs gets passed down which circuit.

Meanwhile, resistors are sort of responsible for the math because they either slow down currents, so to speak, or let them pass as-is. Finally, capacitors are analogous to memory, because they store a received current for later use.

Put these all together, and that’s how you get all of those logic gates, microcontrollers, microprocessors, sensors, and on and on. And when you put all of these together, ta-da: electronics.
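As a rough illustration of how transistor switches stack up into logic, here is a sketch that models each signal as a high or low voltage level and builds everything from a NAND gate, which in hardware is just two transistors in series pulling the output low. NAND is “universal” in this sense: every other gate can be derived from it.

```python
HIGH, LOW = True, False  # stand-ins for high and low voltage levels

def nand(a, b):
    """A NAND gate: the output goes low only when both inputs are high."""
    return not (a and b)

# Every other basic gate can be built from NAND alone.
def not_(a):
    return nand(a, a)

def and_(a, b):
    return nand(nand(a, b), nand(a, b))

def or_(a, b):
    return nand(nand(a, a), nand(b, b))

# Truth table for AND, built purely from NANDs:
for a in (LOW, HIGH):
    for b in (LOW, HIGH):
        print(int(a), int(b), "->", int(and_(a, b)))
```

Chain enough of these together and you have adders, multiplexers, and eventually a processor, which is exactly the “put these all together” step above.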

These can be as simple as those dollar store calculators that run on solar power and can only do four functions, or as complicated as the fastest supercomputers in the world. (Note: Quantum computers don’t count here because they are Next Gen, work in an entirely different way, and probably won’t hit consumer tech for at least another thirty years.)

So why do ICs give me joy? Come on. Look around you. Modern TVs; LCD, LED, and OLED screens; eReaders; computers; cell phones; GPS; synthesizers; MIDI; CDs, DVDs, BluRay; WiFi and BlueTooth; USB drives and peripherals; laser and inkjet printers; microwave ovens; anything with a digital display in it; home appliances that do not require giant, clunky plugs to go into the wall; devices that change to or from DST on their own; most of the sensors in your car if it was built in this century; the internet.

Now, out of that list, a trio stands out: computers, synthesizers, and MIDI, which all sort of crept into the consumer market at the same time, starting in the late 70s and on into the 80s. The funny thing, though, is that MIDI (which stands for Musical Instrument Digital Interface) is still around and mostly unchanged. Why? Because it was so incredibly simple and robust.

In a way, MIDI was the original HTML — a common language that many different devices could speak in order to reproduce information in mostly similar ways across platforms and instruments. It started with sixteen channels, and it has proven to be a ridiculously robust and backwards-compatible system.
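That common language is very concrete. A MIDI 1.0 channel message is just a status byte, with the command in the high nibble and the channel (0 through 15) in the low nibble, followed by one or two 7-bit data bytes. A minimal sketch of building a Note On message:

```python
def note_on(channel, note, velocity):
    """Build a 3-byte MIDI 1.0 Note On message.

    Status byte: 0x90 (Note On) in the high nibble, channel (0-15) in
    the low nibble. The two data bytes are 7-bit values (0-127).
    """
    assert 0 <= channel <= 15
    assert 0 <= note <= 127 and 0 <= velocity <= 127
    return bytes([0x90 | channel, note, velocity])

# Middle C (note number 60) at moderate velocity on channel 1 (index 0):
msg = note_on(0, 60, 64)
print(msg.hex())  # "903c40"
```

Any conforming instrument, from an early-80s keyboard to a modern workstation, interprets those same three bytes the same way, which is the whole trick.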

Over time, the number of channels and the bit depth have increased, but a MIDI keyboard from the early 80s will still communicate with a device using MIDI 2.0. You can’t say the same for, say, storage media and readers from different eras. Good luck getting an early-80s 5.25-inch floppy disk to work with any modern device.

What’s really remarkable about MIDI is how much data it can transfer in real time. Even more amazing is that it has been adapted to more than just musical instruments. It can also be used for things like show control, meaning that a single MIDI system runs the lights, the sound systems and, in some cases, even the practical effects in a concert or stage production.

And, again, while MIDI 1.0 was slowly tweaked over time between 1982 and 1996, it still went almost 25 years before it officially went from version 1.0 to 2.0, in January 2020. Windows 1.0 was released on November 20, 1985, although it was really just an overlay of MS-DOS. It lasted until December 9, 1987, when Windows 2.0 came out. This was also when Word and Excel first happened.

Apple has had a similar history with its OS, and in about the same span of time that MIDI has been around, both companies have gone through ten major versions with lots of incremental changes along the way.

Now, granted, you’re not going to be doing complex calculations or spreadsheets or anything like that with MIDI, and it still doesn’t really have a GUI beyond the independent capabilities of the instruments you’re using.

However, with it, you can create art — anywhere from a simple song to a complex symphony and, if you’re so inclined, the entire stage lighting and sound plot to go along with it.

And the best part of that is that you can take your musical MIDI data, put it on whatever kind of storage device is currently the norm, then load that data back onto any other MIDI device.

Then, other than the specific capabilities of its onboard sound-generators, you’re going to hear what you wrote, as you wrote it, with the same dynamics.

For example, the following was originally composed on a fairly high-end synthesizer with really good, realistic tone generators. I had to run the MIDI file through an online MIDI to audio site that pretty much uses the default PC cheese-o-phone sounds, but the intent of what I wrote is there.

Not bad for a standard that has survived, even easily dumping its proprietary 5-pin plug and going full USB without missing a beat. Literally. Even while others haven’t been able to keep up so well.

So kudos to the creation of ICs, and eternal thanks for the computers and devices that allow me to use them to be able to research, create, and propagate much more easily than I ever could via ancient analog techniques.

I mean, come on. If I had to do this blog the old way, I’d be typing everything out on paper, using Wite-Out or other correction fluid constantly to fix typos, then deciding whether it was worth having it typeset and laid out (probably not), and debating whether to have it photocopied or mimeographed.

Then I’d have to charge all y’all to get it via the mail, maybe once a month — and sorry, my overseas fans, but you’d have to pay a lot more and would probably get it after the fact, or not at all if your postal censors said, “Oh, hell noes.”

Or, thanks to ICs, I can sit in the comfort of my own isolation on the southwest coast of the middle country in North America, access resources for research all over the planet, cobble together these ramblings, and then stick them up to be blasted into the ether to be shared with my fellow humans across the globe, and all it costs me is the internet subscription fee that I would pay anyway, whether I did this or not.

I think we call that one a win-win. And if I could go back and tell my first-grade self, who was just having his first music lessons on a decidedly analog instrument, that in a couple of years science was going to make all this a lot easier and more interesting, he probably would have shit his pants.

Okay. He probably would have shit his pants anyway. Mainly by realizing, “Wait, what. You’re me? Dude… you’re fucking old!”

Oh well.

Image (CC BY 3.0) by user Mataresephotos.

