Wondrous Wednesday: 5 Things that are older than you think

Quarantine is hard, so rather than post nothing at all, here’s a blast from the past: an article posted in February 2019, but which is still relevant today.

A lot of our current technology seems surprisingly new. The iPhone is only twelve years old, for example, although the first BlackBerry, a more primitive form of smartphone, came out in 1999. The first actual smartphone, IBM’s Simon Personal Communicator, was introduced in 1992 but not available to consumers until 1994. That was also the year that the internet started to really take off with people outside of universities or the government, although public connections to it had been available as early as 1989 (remember CompuServe, anyone?), and the first experimental internet nodes were connected in 1969.

Of course, to go from room-sized computers communicating via acoustic modems along wires to handheld supercomputers sending their signals wirelessly via satellite took some evolution and development of existing technology. Your microwave oven has a lot more computing power than the system that helped us land on the moon, for example. But the roots of many of our modern inventions go back a lot further than you might think. Here are five examples.

Alarm clock

As a concept, alarm clocks go back to the ancient Greeks, frequently involving water clocks. These were designed to wake people up before dawn, in Plato’s case to make it to class on time, which started at daybreak; later, they woke monks in order to pray before sunrise.

From the late Middle Ages, church towers became town alarm clocks, with the bells set to strike at one particular hour per day, and personal alarm clocks first appeared in 15th-century Europe. The first American alarm clock was made by Levi Hutchins in 1787, but he only made it for himself since, like Plato, he got up before dawn. Antoine Redier of France was the first to patent a mechanical alarm clock, in 1847. Production stopped during WWII, when metal and machine shops were appropriated for the war effort, and older clocks wore out in the meantime; as a result, alarm clocks became one of the first consumer items to return to mass production even before the war ended. Atlas Obscura has a fascinating history of alarm clocks that’s worth a look.

Fax machine

Although it’s pretty much a dead technology now, the fax machine was the height of high tech in offices in the 80s and 90s. Nowadays you’d be hard pressed to find one that isn’t part of the built-in hardware of a multi-purpose networked printer, and that’s only because it’s such a cheap legacy feature to include. But it might surprise you to know that the prototypical fax machine, originally an “Electric Printing Telegraph,” dates back to 1843. Basically, as soon as humans figured out how to send signals down telegraph wires, they started to figure out how to encode images, and you can bet that the second image ever sent that way was a dirty picture. Or a cat photo. Still, it took until 1964 for Xerox to finally figure out how to use this technology over phone lines and create the Xerox LDX. The scanner/printer combo was available to rent for $800 a month (the equivalent of around $6,500 today) and could transmit pages at a blazing 8 per minute. The second-generation fax machine weighed only 46 lbs and could send a letter-sized document in only six minutes, or ten pages per hour. Whoot, progress! You can actually see one of the Electric Printing Telegraphs in action in the 1948 movie Call Northside 777, in which it plays a pivotal role in sending a photograph cross-country in order to exonerate an accused man.

In case you’re wondering, the title of the film refers to a telephone number from back in the days before what was originally called “all digit dialing.” Up until then, telephone exchanges (what we now call prefixes) were identified by the first two letters of a word, and then another digit or two or three. (Once upon a time, in some areas of the US, phone numbers only had five digits.) So NOrthside 777 would resolve itself to 66-777, with 66 (NO) being the prefix. This system started to be phased out in 1958, and a lot of people didn’t like that.

Of course, with the advent of cell phones, prefixes and even area codes have become pretty meaningless, since people tend to keep the number from their home town regardless of where they move, and a “long distance call” is mostly a dead concept now as well, which is probably a good thing.


CGI

When do you suppose the first computer animation appeared on film? You may have heard that the original 2D computer-generated imagery (CGI) used in a movie was in 1973, in the original film Westworld, inspiration for the recent TV series. Using very primitive equipment, the visual effects designers simulated pixelation of actual footage in order to show us the POV of the robotic gunslinger played by Yul Brynner. It turned out to be a revolutionary effort.

The first 3D CGI happened to be in this film’s sequel, Futureworld in 1976, where the effect was used to create the image of a rotating 3D robot head. However, the first ever CGI sequence was actually made in… 1961. Called Rendering of a planned highway, it was created by the Swedish Royal Institute of Technology on what was then the fastest computer in the world, the BESK, driven by vacuum tubes. It’s an interesting effort for the time, but the results are rather disappointing.

Microwave oven

If you’re a Millennial, then microwave ovens have pretty much always been a standard accessory in your kitchen, but home versions don’t predate your birth by much. Sales began in the late 1960s. By 1972 Litton had introduced microwave ovens as kitchen appliances. They cost the equivalent of about $2,400 today. As demand went up, prices fell. Nowadays, you can get a small, basic microwave for under $50.

But would it surprise you to learn that the first microwave ovens were created just after World War II? In fact, they were the direct result of it, due to a sudden lack of demand for magnetrons, the devices used by the military to generate radar in the microwave range. Not wanting to lose the market, their manufacturers began to look for new uses for the tubes. The idea of using radio waves to cook food went back to 1933, but those devices were never developed.

Around 1946, engineers accidentally realized that the microwaves coming from these devices could cook food, and voilà! In 1947, the technology was developed, although only for commercial use, since the devices were taller than an average man, weighed 750 lbs, and cost the equivalent of $56,000 today. It took 20 years for the first home model, the Radarange, to be introduced for the mere sum of $12,000 in today’s dollars.

Music video

Conventional wisdom says that the first music video ever aired went out on August 1, 1981 on MTV, and it was “Video Killed the Radio Star” by The Buggles. As is often the case, conventional wisdom is wrong. It was the first to air on MTV, but the concept of putting visuals to rock music as a marketing tool goes back a lot farther than that. Artists and labels were making promotional films for their songs from the early 1960s on, with the Beatles a prominent example. Before these, though, was the Scopitone, a jukebox that could play films in sync with music, popular from the late 1950s to the mid-1960s, and its predecessor was the Panoram, a similar concept popular in the 1940s which played short programs called Soundies. However, these programs played on a continuous loop, so you couldn’t choose your song. Soundies were produced until 1946, which brings us to the real predecessor of music videos: Vitaphone Shorts, produced by Warner Bros. as sound began to come to film. Some of these featured musical acts and were essentially miniature musicals themselves. They weren’t shot on video, but they introduced the concept all the same. Here, you can watch a particularly fun example from 1935 in 3-strip Technicolor that also features cameos by various stars of the era in a very loose story.

Do you know of any things that are actually a lot older than people think? Let us know in the comments!

Photo credit: Jake von Slatt

Friday Free-for-All #12

In which I answer a random question generated by a website. Here’s this week’s question. Feel free to give your own answers in the comments.

What is the best path to find truth: Science, math, art, philosophy, or something else?

I suppose it depends upon how you define “truth,” but if we take it to mean objective facts that cannot be refuted by any subjective evidence, then the hands down answer is math, period.

Yes, our terminology for things is arbitrary, but what’s happening beneath it all is objectively true. 1 + 1 = 2, although you could just as easily express it as pine cone + pine cone = melon, blarf + blarf = smerdge, or whatever.

Note that those are metaphorical pine cones and melons, of course. The idea is that the symbol for a single thing plus the symbol for another single thing equals a total of a double thing.

The circumference of a circle has an absolute and fixed ratio to its diameter, easy as pie. The sides of a right triangle will always compare to each other in the same way in Euclidean geometry, and likewise with trigonometric functions. And it doesn’t matter what kind of numbering system or base you use.

When it comes to simple math, you’ve probably seen those online puzzles that will show something like two ice cream cones equal ten; an ice cream cone and a hamburger equals seven, and so on. Well, this is just simple algebra, except that the typical Xs, Ys, and Zs are replaced with emojis.

That doesn’t make any difference, and you’re still going to get the same answer once you solve it all out.

Let’s try one right now — although since I can’t embed emojis easily here, we’ll stick with the classics. Just imagine hotdogs, eggplants, peaches, whatever. Solve the last equation:

X + X + X = 15

X + X + Y = 13

Y + Y + Z = 10

X + Y = Z + ?

It’s all a lot simpler with reductions. The first equation is the same as 3X = 15, so X is obviously 15/3, or 5. In the second, 2*5 + Y = 13 is exactly the same as Y = 13 - 2*5, so Y = 13 - 10 = 3. In the third, 2Y + Z = 10, or Z = 10 - 2*3, so Z = 4.

And in the last equation, 5 + 3 = 8, which is 4 + 4, or Z + Z.
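
The same reductions can be written out in a few lines of code, a minimal sketch of the puzzle above:

```python
# Solve the puzzle by the same reductions described above.
X = 15 / 3        # X + X + X = 15  ->  3X = 15
Y = 13 - 2 * X    # X + X + Y = 13  ->  Y = 13 - 2X
Z = 10 - 2 * Y    # Y + Y + Z = 10  ->  Z = 10 - 2Y

# X + Y = Z + ?  ->  ? = X + Y - Z
answer = X + Y - Z
print(X, Y, Z, answer)  # 5.0 3.0 4.0 4.0
```

Swap emojis in for the variable names and nothing changes; the arithmetic is identical.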

Math like this has given us a way to measure the world, but it doesn’t give us the “why” behind any of it, just the “what.” This is where the next step to truth comes in, and that is science, which stands on the back of math.

The job of science is to ask questions, and then use all of those irrefutable truths of math to get to the next level of truth, which is not objectively true, but which is demonstrably true until falsified.

Note that math gives us a way to measure, because that is very important in science. Science is all about measuring. It’s about coming up with the hypothesis of “The degree to which A happens is affected by both B and C,” and then creating an experiment to test that, then measuring the results over and over.

For example: The hypothesis is dead cats bounce higher if the person who dropped them donated to the Calico party.

How to test it: Get a bunch of people to drop a bunch of dead cats over and over. Record which party they donated to, correlate to how high the dead cats bounced, gather enough data points to establish a pattern, publish results.

Preliminary theory: Yes, donating to the Calico party seemed to have an effect that made the dead cats bounce higher.

But let’s say you’re skeptical of that result. How to make sure it’s true? Time for a double-blind study. First, we make sure that the people dropping the cats have no idea that we have any interest in which party they donated to, so we ask them a ton of innocuous questions for “demographic purposes.”

We might even lead them to think that we’re interested in their hair color.

Second, we make sure that the people recording the results have no idea what we’re looking for either.

Finally, we make sure that we don’t know who falls into which category by issuing each test subject a random and anonymous ID that is tagged to their party, but locked away until later.

Then the cat dropping commences.

And guess what? Once the results are tabulated back to the data on party donations, it actually turns out that party donation has absolutely no effect whatsoever on how high the dead cat bounces.
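
The final correlation step can be sketched in a few lines. The data here is entirely made up (randomly generated, with party donation and bounce height drawn independently of each other), so the near-zero correlation is exactly what the mock study found:

```python
import random

random.seed(42)

# Hypothetical data: 1 = dropper donated to the Calico party, 0 = didn't.
donated = [random.randint(0, 1) for _ in range(1000)]
# Bounce height in cm, generated independently of party affiliation.
bounce = [random.gauss(30, 5) for _ in range(1000)]

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

r = pearson(donated, bounce)
print(f"correlation: {r:.3f}")  # close to zero: no effect
```

With 1,000 independent samples, any apparent correlation stays tiny; that is the pattern the tabulated results would show.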

But at this level, in order to get to the truth, it took a lot of maneuvering around human bias and whatnot to find it. And — surprise — all those steps in creating the double blind procedures came from… math.

And you hated it in eighth grade? Don’t worry. So did I. It took me a long time to understand why it’s so important.

Anyway… with enough of the scientific method going on, we can get to a pretty good semblance of the objective truth. Never quite all the way there, but a whole lot of it sticks.

For example, the theory of gravity. You’re not about to step off of a tall building to test it, are you? Nope. You’re going to trust that this would just lead to a short, fast fall, a hard splat, and death.

This brings us to art and philosophy, and I’ll frankly dismiss the latter as just so much intellectual jerking off, no matter who’s doing it. The only school of thought I could ever come close to agreeing with was Empiricism, which basically felt that knowledge could only come from direct experience.

Or, in other words, I can only know it if I’ve experienced it through my senses, or humans can only know it if they’ve measured it. That is, science. So the empiricists basically managed to establish their own field as complete BS. Nice job, really.

As for art, it will never discover any objective truths, because that’s not what it’s about. But what it can do is take the objective truths of math and science and turn them into relatable and subjective truths for their audiences, and do it by creating an emotional reaction in that audience.

The scientists who have spread the truth the best have also been artists, in that they have performed and created an emotional reaction. Just look at Carl Sagan and how he inflamed interest in science with his series Cosmos, or how Neil deGrasse Tyson repeated that success in the 2010s. And everybody loves Bill Nye, the Science Guy.

But, again, why? Because art swoops in to popularize science. And while art only ever leads to subjective truths, art in service of science education will always lead to objective truth.

So… what is the best path to find truth? If you happen to be mathematically or scientifically inclined, then those. But if you’re artistically inclined, follow those artists who create a lot of stuff about science, and you’ll get led back eventually.

Most definitely, though, ignore the person on the soapbox who is saying that their way is the only way without backing it up, because they are a philosopher, and they are just yapping to hear themselves talk.

Trust me. I met their kind at university, and it wasn’t pretty.

Wednesday Wonders: Baby, it’s cold inside

April 8, 1911: Heike Kamerlingh Onnes makes an interesting discovery while futzing around with very low temperatures. It’s a discovery that will lead to many modern innovations that affect us just over a century later.

Strange things happen as the temperature drops toward absolute zero, which is basically the temperature equivalent of the speed of light in a vacuum (c) being the velocity limit for anything with mass. Oh, we’ve gotten really close to absolute zero (within nanokelvins), and in theory could get really close to the speed of light, although that would take ridiculous amounts of energy.

But… where matter can’t be is right at these two figures: exactly absolute zero, exactly c. There’s nothing in the equations, though, that says that objects with mass cannot move faster than the speed of light or be colder than absolute zero.

Practically speaking, it would require infinite energy to jump from 99.99999% to 100.00001% of c, so that’s not possible, but scientists in Germany think they may have achieved temperatures below absolute zero.

Of course, these create weird situations like negative temperatures in an absolute sense, and not just as measured. That is, while we can say that it’s 24 below zero outside, that really isn’t a negative temperature by strict definition. It’s just a temperature that’s negative on the scale we’re using.

Remember: 1 K on the Kelvin scale (which doesn’t use degree signs) is actually –457.87ºF.
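
The conversion is simple arithmetic, easy to sketch in code:

```python
def kelvin_to_fahrenheit(k: float) -> float:
    """Convert kelvins to degrees Fahrenheit: F = K * 9/5 - 459.67."""
    return k * 9 / 5 - 459.67

print(kelvin_to_fahrenheit(1.0))  # about -457.87
print(kelvin_to_fahrenheit(4.2))  # about -452.11, the liquid helium temperature below
```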

These kinds of negative temperatures are actually beyond that absolute physical limit, and so they represent thermal energy that behaves in the opposite way from temperatures above absolute zero. And, in all likelihood, an object moving faster than light would also travel backwards in time, thanks to the time dilation effect being reversed.

These, though, are theoretical arguments. What we do know is that things get weird as the temperature drops. At a few nanokelvin, the only energy left in the system is quantum, and so these strange effects take over on a massive scale, pun intended.

The key here is that as atoms lose energy and cool down, they stop moving as much, eventually reaching a point where they’re just sitting there. But… there’s a principle in physics, Heisenberg’s uncertainty principle, which says that there is a fundamental limit to the precision with which you can measure two connected properties of any particle.

For example, if you measure position precisely, you can’t measure momentum with much accuracy, and vice versa. The sharper one measurement is, the fuzzier the other one becomes. Not to get too deep into the science of it, but there are two classes of elementary particle: fermions and bosons.

Fermions are elitists, and when they’re in a bunch, they don’t like to occupy the same quantum energy state. Electrons are totally fermions, which is why that whole concept of an atom as electrons neatly orbiting a nucleus like planets orbit the Sun is only a metaphor and very inaccurate.

Each electron in an atom occupies a different quantum energy state, which is why there’s the concept of electron “shells” filling up, but the location of each electron is not a unique point that changes over time. It’s a statistical probability of a particular electron being in a particular place at any given time, and so the “shape” of those shells can vary from a sphere to two squashed and joined spheres to distended ovoid shapes, and so on.

Bosons, on the other hand, are egalitarians who don’t mind sharing the same quantum energy state. In fact, they love to do it. This leads to a very interesting form of matter known as a Bose-Einstein condensate.

Basically, at a low enough temperature, a bunch of atoms can suddenly coalesce into a single quantum particle with the same energy state and even become visible to a regular microscope.

Why? Because when we stop their movement, we can measure their momentum as near zero. Therefore, our ability to measure where they are becomes very inaccurate. It’s like the bosons all gather together and then balloon up into one entity in order to hide their individual locations.

This would be the equivalent of a bunch of people preventing GPS tracking of each of them by leaving their phones in one room and then all of them heading out in opposite directions in a big circle. Or sphere, if they can manage that.

The discovery that Onnes made in 1911 is related to this phenomenon. In his case, he dipped a solid mercury wire into liquid helium at 4.2 K and discovered that all electrical resistance went away. That is, he discovered a property of matter known as superconductivity.

The same principle at low temperature changes how the electromagnetic force interacts with matter: electrons (fermions) pair up and collectively behave like bosons, and the superconductor pushes magnetic fields out of its interior entirely (the Meissner effect), keeping them at arm’s length, as it were.

This can lead to all sorts of interesting effects, like levitation.

This is the technology taking maglev trains to the next level. But superconductivity is also used in things like medical imaging devices, motors, generators, transformers, and computer parts.

But the holy grail of the field is finding the so-called “room temperature” superconductor. All right, in some ways “room temperature” is a bit of a misnomer, since the warmest superconductor yet found has a transition temperature of –23ºC. But a more promising substance could be a superconductor at 53ºC. That’s the good news. The bad news is that it requires ridiculously high pressures to do it, in the range of a million or more times the pressure at sea level.

Oh, well.

Of course, the U.S. Navy did file a patent for a “room temperature” superconductor just over a year ago, but it’s not clear from the patent whether they used the “not 0 K” definition of room temperature or the popular press definition of about 77ºF.

It makes sense, though, that barring low temperature, some other extreme would be needed to achieve the effect. Nature just seems to work like that, whether it’s extremely low temperatures or very high pressures required to create superconductivity, or the extreme gravity and energy conversion required to create that other holy grail so beloved of alchemy: transmutation of matter, specifically turning lead into gold.

Ah, yes. If those alchemists only knew that every star was constantly transmuting elements every second of every day strictly through the power of enormous gravity and pressure — hydrogen to helium and so on, right down to iron — then who knows. One of them might have managed fusion centuries ago.

Okay, not likely. But just over a century ago, superconductivity was discovered, and it’s been changing the world ever since. Happy 109th anniversary!

Wednesday Wonders: Fooled by famous frauds and fakes

It’s April 1st, but given the state of the world at the moment, I would hope that everyone refrains from any kind of pranks or jokes today in honor of the occasion. Instead, let’s look at five times in the past that scientific types have passed off a fake as reality.

I’ll take it in (mostly) chronological order.

The Mechanical Turk

In 1769, the Habsburg empress Maria Theresa invited her trusted servant, Wolfgang von Kempelen, to a magic show. Von Kempelen knew his physics, mechanics, and hydraulics. The empress wanted to see what he’d make of a stage illusionist.

In short, he was not impressed, and said so in front of the court, claiming that he could create a better illusion. The empress accepted his offer and gave him six months off to try.

In 1770, he returned with his results: An automaton that played chess. It was in the form of a wooden figure seated behind a cabinet with three doors in front and a drawer in the bottom. In presenting it, von Kempelen would open the left door to show the complicated clockwork inside, then open a back door and shine a lantern through it to show that there was nothing else there.

When he opened the other two doors, it revealed an almost empty compartment with a velvet pillow in it. This he placed under the automaton’s left arm. The chess board and pieces came out of the drawer, and once a challenger stepped forward, von Kempelen turned a crank on the side to start it up, and the game was afoot.

Called the Mechanical Turk, it was good, and regularly defeated human opponents, including Benjamin Franklin and Napoleon Bonaparte, although Napoleon is reported to have tried to cheat, to which the Turk did not respond well.

Neither its creator nor second owner and promoter revealed its secrets during the machine’s lifetime, and it was destroyed by a fire in 1854. Although many people assumed that it was actually operated by a human and was not a machine, playing against it did inspire Charles Babbage to begin work on his difference engine, the mechanical precursor to the modern computer.

In the present day, a designer and builder of stage illusions built a replica of the Turk based on the original plans, and watching it in action is definitely uncanny.

Moon-bats and Martians!

This is actually a twofer. First, in August 1835, the New York Sun ran a six-part series on discoveries made by the astronomer John Herschel on the Moon. The problem: the paper flat-out made it all up, reporting all kinds of fantastical creatures Herschel had allegedly seen and written about, everything from unicorns to flying bat-people, all thanks to the marvel of the fabulous new telescope he had created. When Herschel found out about it, he was not pleased.

The flipside of this came sixty years later in 1895, when the astronomer Percival Lowell first published about the “canals of Mars,” which were believed to be channels of water that ran into the many oceans on the planet.

In reality, they were just an optical illusion created by the limitations of telescopes of the time. This didn’t stop Lowell, though, and he went on in the early 20th century to write books that postulated the existence of life on Mars.

Of course, Lowell was not trying to perpetrate a fraud. He just had the habit of seeing what he wanted to see, so it was more self-delusion than anything else.

The Cardiff Giant

This would be Cardiff. The one in New York, not the capital of Wales. The year is 1869. The “giant” was a petrified 10-foot-tall man that had been dug up on a farm belonging to William C. “Stub” Newell. People came from all around to see it, and that did not stop when Newell started charging fifty cents a head to have a look. That’s the equivalent of about ten bucks today.

The statue was actually created by George Hull, who was a cousin of Newell’s. An atheist, Hull had gotten into an argument with a Methodist minister who said that everything in the Bible had to be taken literally. Since the Bible said that there had been giants in those days, Hull decided to give him one, and expose the gullibility of religious types at the same time.

Cardiff, after all, wasn’t very far from where Joseph Smith had first started the Mormon religion, and that sort of thing was not at all uncommon in the area during the so-called Second Great Awakening.

It was such a huge hit with the public that P.T. Barnum created his own fake giant, though the Chicago Tribune eventually published an exposé with confessions from the stonemasons. That didn’t seem to make one bit of difference to the public, who still flocked to see the statues. Hull and his investors made a fortune off of the whole adventure.

Piltdown Man

Less innocuous was a hoax that actually sent a couple of generations of anthropologists and evolutionists down the wrong path in tracing the ancestry of humans. In 1912, Charles Dawson, an amateur archaeologist, claimed to have discovered the fossilized remains of a hitherto unknown human species in Piltdown, Sussex, England.

The key part was that while the skull had a human-like cranium, it had an ape-like mandible, or lower jaw. In other words, having traits of both species, it could easily have been the long-sought “missing link,” a transitional form that provides the evolutionary bridge between two species.

The first so-called missing link, Java Man, had been discovered twenty years prior to Dawson’s. Unlike Dawson’s Piltdown Man, Java Man, now known as Homo erectus, has been accepted as a legitimate transitional form between ape and man.

Dawson’s downfall came after the discovery of more transitional forms and improved testing methods that authenticated many of these. When researchers finally turned their attention back to the original Piltdown Man fossils, they determined that the skull was only about 500 years old, the jaw, only a few decades. Both had been stained to simulate age.

In 1953, they published their findings, which were reported in Time magazine, but the damage had been done, setting back anthropological studies, because more recent, legitimate discoveries were doubted because they conflicted with the fake evidence.

It seems likely that Dawson was the sole hoaxer. What was his motive? Most likely, he wanted to be nominated to the Royal Society for his archaeology, but hadn’t been yet because of a lack of significant findings.

In 1913, he was nominated because of Piltdown, proving yet again that it’s possible for a fraud to profit — if they’re white and connected.

Vaccines and autism

We’re still feeling the repercussions of this fraud, which was first perpetrated in 1998 by a researcher named Andrew Wakefield. This was when he published the results of studies he had carried out which, he said, showed an undeniable link between childhood vaccinations, particularly the measles, mumps, and rubella (MMR) vaccine, and autism.

In Wakefield’s world, “undeniable link” meant “cause and effect,” and a whole bunch of parents proceeded to lose their minds over the whole thing. We’re still dealing with the fallout from it today, with diseases like measles and whooping cough, which should have been eradicated, suddenly causing mini-epidemics.

Eventually, when they could not be replicated, it came out that Wakefield had flat-out falsified his results, and his papers and findings were withdrawn and repudiated by medical journals.

What was his motive for falsifying information without any regard for the lives he endangered? Oh, the usual motive. Money. He had failed to disclose that his studies “had been funded by lawyers who had been engaged by parents in lawsuits against vaccine-producing companies.”

But, as with Piltdown Man, we’re still seeing the effects and feeling the damage a generation later. This is why now, more than ever, we need to rely on actual scientific findings that have been replicated through peer review instead of rumors, myths, or memes.

Happy April 1st!

Talky Tuesday: Language is (still) a virus

I used this Burroughs quote as a post title a couple of years ago in an entirely different context, but the idea has taken on new relevance, as I’m sure the world can now agree.

This post’s title comes from a William S. Burroughs quote which reads in full as, “Language is a virus from outer space.”

What he meant by the first part is that words enter a host, infect it, and cause a change in it. Just as a virus hijacks a host’s cells in order to become little factories to make more virus to spread a disease, words hijack a host’s neurons in order to become little factories to make more words to spread ideas and belief systems.

As for the “outer space” part, I think that Burroughs was being metaphorical, with the idea being that any particular language can appear totally alien to any other. While, say, Mongolian and Diné may both come from humans on Earth, if a speaker of either encountered someone who only spoke the other, they might as well be from different planets because, for all intents and purposes, they are from different worlds, unable to communicate with words.

And the language we use can quite literally create and shape our perceptions of the world, as I discussed in my original Language is a virus post. One of the most striking examples I cited in that link was Guugu Yimithirr, an aboriginal language that has no words for relative direction. Instead, they refer to everything based upon where it is relative to actual cardinal directions.

In other words, if you ask someone who speaks this language where you should sit, they won’t say, “In the chair on your left.” Rather, they’ll say something like, “In the chair to the north.” Or south, or east, or west. And a speaker of the language will know what that means, whether they can see outside or not.

Quick — right now, if someone said “Point east,” could you do it without thinking?

And that is how languages can change thoughts and perceptions.

But, sometimes — honestly, far too often — language can change perceptions to create tribalism and hostility, and during this plague year, that has suddenly become a huge topic of debate over a simple change of one C word in a phrase.

I’m writing, of course, about “coronavirus” vs. “Chinese virus.” And the debate is this: Is the latter phrase racist, or just a statement of fact?

One reporter from a rather biased organization did try to start the “it’s not” narrative with the stupidest question ever asked: “Mr. President, do you consider the term ‘Chinese food’ to be racist because it is food that originated from China?”

There are just two problems with this one. The first is that what Americans call “Chinese food” did not, in fact, originate in China. It was the product of Chinese immigrants in America who, being mostly men, didn’t know how to cook, and didn’t have access to a lot of the ingredients from back home. So… they improvised and approximated, and “Chinese food” was created by Chinese immigrants starting in San Francisco in the 19th century.

Initially, it was cuisine meant only for Chinese immigrants because racist Americans wouldn’t touch it, but when Chinatowns had sprung up in other cities, it was New York’s version that finally lured in the hipster foodies of the day to try it, and they were hooked.

In short, “Chinese food” was a positive and voluntary contribution to American culture, and the designation here is merely descriptive, so not racist. “Chinese virus” is a blatant misclassification at best and a definite attempt at a slur at worst, with odds on the latter.

But we’ve seen this with diseases before.

When it comes to simple misidentification of place of origin, let’s look back to almost exactly a century ago, when the Spanish flu went pandemic. From 1918 to 1919, it hit every part of the world, infected 500 million people and killed 50 million.

A little perspective: At the time, the world’s population was only 1.8 billion, so this represents an infection rate of about 28% and a mortality rate among the infected of about 10% — which works out to roughly 2.8% of everyone alive. If COVID-19 were to hit with similar statistics — and it seems it could — then this pandemic would infect around 2.2 billion people and kill over 210 million.
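Working those rates out from the raw 1918 numbers is simple arithmetic — here’s a quick sketch; note that the 2020 world-population figure of roughly 7.8 billion is an assumption for illustration:

```python
# 1918 pandemic figures cited above: 500 million infected,
# 50 million dead, out of a world population of 1.8 billion.
infected_1918 = 500e6
deaths_1918 = 50e6
population_1918 = 1.8e9

infection_rate = infected_1918 / population_1918       # ~0.278, i.e. ~28%
fatality_among_infected = deaths_1918 / infected_1918  # ~0.10, i.e. ~10%
share_of_population_killed = deaths_1918 / population_1918  # ~0.028, i.e. ~2.8%

# Projecting the same rates onto an assumed 2020 population of ~7.8 billion:
population_2020 = 7.8e9
projected_infected = population_2020 * infection_rate             # ~2.2 billion
projected_deaths = projected_infected * fatality_among_infected   # ~217 million
```

The key distinction the arithmetic makes plain: 10% of the *infected* died, which is the same thing as 2.8% of *everyone then alive*.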

By the way, while the 1918 pandemic was especially fatal to children under 5 and adults over 65, it also hit one other demographic particularly hard: 20- to 40-year-olds.

So if you’re in that age range and think that COVID-19 won’t kill you, think again — particularly if you smoke or vape or have asthma, and don’t take the quarantine seriously. And remember: the rich and world leaders are not immune either — not now and not then.

The president of the United States, Woodrow Wilson, caught the flu in 1919, during the later waves of the H1N1 pandemic, and while he survived, this assault on his health probably led to the stroke he had late that year, an incident that was covered up by his wife, with the help of the president’s doctor. The First Lady became de facto president for the remainder of his second term.

In modern times, the first world leader to test positive for coronavirus was Prince Albert II of Monaco, followed not long after by Prince Charles and Boris Johnson. Of course, I’m writing these words a bit ahead of when you’ll read them, so who knows what will have happened by then.

In medical circles, the name “Spanish Flu” has been abandoned, and that particular pandemic is now known as H1N1, which I’m sure looks really familiar to you, because this has been the standard nomenclature for flu viruses for a while: H#N#, for the virus’s two surface proteins, hemagglutinin and neuraminidase — sans location, animal, or occupation, more on which in a minute.

But first, let’s get to the reasons behind naming a disease after a place. The H1N1 Pandemic was a simple case of mistaken identity and also contingent upon that whole “Great War” stuff going on in Europe.

See, other countries had been hit by it first, but in the interests of the old dick-waving “Gotta appear strong before our enemies” toxic masculinity, all of them suppressed the news. It wasn’t until Spain started seeing it in their citizens and, because they were neutral, they announced outbreaks, that the world suddenly pointed fingers and said, “Ooh… this came from Spain. Hence, it’s the Spanish flu.”

Except, not. Ironically, it now seems that the Spanish flu may have originated in… China — at least according to some historians. Virologists, on the other hand, have linked it to an H1 strain later identified in pigs in Iowa in the U.S.

Either way, all of the countries involved in WW I, aka “The Great War,” kept mum about it.

So the name “Spanish flu” was a simple mistake. On the other hand, the names of other diseases actually are outright xenophobic or racist, and we only have to look as far as syphilis to see why.

Basically, syphilis is an STI that was the most feared of its kind until… AIDS, because syphilis was not treatable or curable until penicillin was discovered in 1928 — although it was not produced on a mass scale until 1945, thanks to needs created by WW II, and facilitated by the War Production Board.

Hm. Sound familiar?

But the reason it became known as the French disease outside of France was that it began to spread after Charles VIII of France invaded Italy in 1494-95 to reclaim a kingdom he thought should be his. It was eventually so devastating that Charles had to take his troops home, and so it began to spread in France and across Europe.

Since it first showed up in French soldiers, it was quickly dubbed the French disease in Italy and England, although the French preferred to call it the Italian disease. In reality, it most likely originated in the New World, and was brought back to Europe by… Columbus and his Spanish soldiers, who then somehow managed to spread it to the French as they worked for them as mercenaries.

Hm. STI. A bunch of male soldiers. I wonder how that worked, exactly.

And I am totally destroying my future google search suggestions by researching all of this for you, my loyal readers, just so you know! Not to mention that I can’t wait to see what sort of ads I start getting on social media. “Confidential STI testing!” “Get penicillin without a prescription.” “These three weird tricks will cure the STI. Doctors hate them!”

But the naming of diseases all came to a head almost five years ago when the World Health Organization (WHO) finally decided, “You know what? We shouldn’t name diseases after people, places, animals, occupations, or things anymore, because that just leads to all kinds of discrimination and offense, and who needs it?”

This directly resulted from the backlash against the naming of the last disease ever named for a place, despite the attempt to obfuscate that in its official terminology. Remember MERS, anyone? No? That one came about in 2012, was first identified in Saudi Arabia, and was named Middle East respiratory syndrome.

Of course, it didn’t help when things were named swine flu or avian flu, either. A lot of pigs and birds died over those designations. So away went such terminology, especially because of the xenophobic and racist connotations of naming a disease after an entire country or people.

Of course, some antiquated folk don’t understand why it’s at the least racist and at the most dangerous to name diseases the old way, as evinced by the editorial tone of this article from The Federalist, a right-wing publication. But they actually kind of manage to prove the point that yes, such terminology is out of fashion, because the only 21st-century example they can list is the aforementioned MERS.

The most recent one before that? Lyme disease, named for Lyme, Connecticut, and designated in… the mid-70s. Not exactly the least racist of times, although this disease was named for a pretty white-bread area.

The only other examples of diseases named for countries on their list: the aforementioned Spanish flu; Japanese encephalitis, first recorded in 1871 (seriously, have you ever heard of that one?); and the German measles, identified in the 18th century, although more commonly known as rubella.

So, again — it’s a list that proves the exact opposite of what it set out to do, and calling coronavirus or COVID-19 the “Chinese virus” or “Chinese disease” is, in fact, racist as hell. And it won’t save a single life.

But calling it that will certainly endanger lives by causing hate crimes — because language is a virus, and when people are infected by malignant viruses, like hate speech, the results can be deadly.

Inoculate yourselves against them with education if possible. Quarantine yourselves against them through critical thinking otherwise. Most of all, through these trying times, stay safe — and stay home!

Image source: Coronavirus Disease 2019 Rotator Graphic for af.mil. (U.S. Air Force Graphic by Rosario “Charo” Gutierrez)

Theatre Thursday: The house is dark tonight

As of now, Los Angeles is six days into the lockdown, it has been eighteen days since I last worked box office for ComedySportz L.A., and seventeen days since I’ve done improv on stage, and I have to tell you that the last two have been the hardest part of the whole social distancing and isolation process.

Not that I’m complaining, because shutting down all of the theaters, bars, clubs, sporting events, and other large gatherings, as well as limiting restaurants to take-out only, are all good things. Yes, it does cost people jobs — I’m one of the affected myself, and dog knows I have a ton of friends who are servers or bartenders — but California has also stepped up in making unemployment and disability benefits much more readily available.

And maybe we’ll all get $1,000 from the Federal government, maybe not. The down-the-road side benefit of this human disaster is that it may just finally break our two-party system in the U.S. and wreak havoc with entrenched power structures elsewhere. And, remember, quite a lot of our so-called lawmakers also happen to belong to the most at-risk group: senior citizens. So there’s that.

But what is really hurting right now is not the loss of the extra money I made working CSz box office (although if you want to hit that tip jar, feel free — blatant hint.)

Nope. The real loss is in not being able to see and hang out with my family regularly: the Main Company, College League, and Sunday Team; as well as doing improv with the Rec League every Monday night.

And with every week that passes when I don’t get to take to that stage, I feel a bit more separated from the outside world, a bit less creative, a bit less inspired.

I know that I shouldn’t, but honestly, improv in general and the Rec League in particular have added so much to my life over the last two and a half years that having to do without them is tantamount to asking me to deal with having no lungs. And no heart.

185 coronaviruses walk into a bar and the bartender says, “Sorry, we’re closed.”

The coronaviruses say, “As you should be.”

And no one laughs. It’s not a time for laughter, but it is a time for support. And while I can’t do improv in real life with this wonderful funny family of mine, I can at least reach out to them all and say, “Hey. How are you doing?” I can also reach out to my loyal readers here and ask the same question.

It’s been amazing, because several of my improviser pals have started doing podcasts or the like. I can’t name names or link here, but I’ve got at least one improv friend who has been doing virtual shows in which he somehow manages to broadcast phone-to-phone routines through what must be a third phone.

Another friend of mine has been reading various scripts, screenplays, or fan fiction live online while also getting twisted on various intoxicating substances, and it’s been hilarious. Then again, he’s hilarious, and although he’s fairly new to the company, he quickly became one of my favorite players.

Okay, so the upside is that I’m now free Friday through Monday evenings again. Yay?

Maybe. The downside? I still don’t know who, out of all my friends and loved ones, is going to die. And that includes me.

But when you have fiscal conservatives like Mitt Romney suddenly advocating for what is pretty much the Universal Basic Income idea supported by (but not created by) Andrew Yang, you can easily come to realize that what we are going through right now, in real time, is an enormous paradigm shift.

More vernacularly, that’s what’s known as a game-changer.

The current crisis has the clear potential to change the way society does things. It may accelerate the race that had already been happening to make all of our shopping virtual, as well as delivering everything with autonomous vehicles or drones. In the brick-and-mortar places that do remain, you may be seeing a lot fewer actual cashiers and a lot more automated kiosks.

This is particularly true in fast food places. McDonald’s alone has been on a push to add kiosks to 1,000 stores per quarter since mid-2018. Compare that to Wendy’s, which the year before set a goal of putting the machines in only 1,000 stores total.

They’re even developing the technology to let AI make recommendations based on various factors, like the weather, or how busy the location is.

But as these jobs go away, ideas like Universal Basic Income and cranking up the minimum wage become much more important — especially because people in these minimum wage jobs are, in fact, not the mythical high schooler making extra cash. Quite a lot of them are adults, many of them with children and families to support.

We are also already seeing immediate and positive effects on the environment due to massive shutdowns of transportation and industry. Scientists had already shown how airline travel contributes to global warming because the shutdown of flights for three days after September 11 gave them a unique living lab to study it in.

And remember: that was pretty much limited to flights in and out of the U.S. What’s happening now is truly global. We’re suddenly dumping fewer pollutants into the atmosphere, using less fossil fuel, and generating lower levels of greenhouse gases — and this has already been going on for longer than three days, and will be for a lot longer than that.

One of the most sublime effects, though, has been in one of the hardest-hit countries. In Italy, the waters in the canals of Venice are running clear for the first time in anyone’s memory, although this didn’t bring the dolphins to them nor make the swans return to Burano. The dolphins were in the port at Sardinia and the swans are regulars.

While some of the specific environmental recoveries are real, a lot of the claims are not. Even NBC was taken in by the hoax that National Geographic debunked.

There’s something poetic in the irony that, as humans have been forced to shut themselves inside, animals do have the opportunity to come back into the niches we displaced them from, even if only temporarily.

It’s not always a good thing, though. In Bangkok, the lack of tourists — an abundant source of free food — led to an all-out monkey war between two different tribes.

All of this is just a reminder that all of us — human, animal, and plant alike — live on and share the same planet, and what one does affects all of the others.

The ultimate example of that, of course, is a pandemic. It now seems likely it all began with patient zero, a 55-year-old man from Hubei province, who was the first confirmed case, back on November 17, 2019. But the most likely reservoir from which the virus jumped to humans was probably the pangolin — just more proof that it’s the cute ones you always have to beware of.

It may seem strange to start on the topic of theatre and veer hard into science via politics, but like everything else on the planet, it’s all interconnected. Art, politics, and science are opposite faces of an icosahedral die that never stops being thrown by the hand of fate.

Or by completely random forces. Or it’s a conspiracy. Or it’s all predictable if you have enough data.

Stay safe out there by staying in, wherever you are. See you on the other side but I hope to keep seeing you through it on a daily basis. I’m not going anywhere, dammit.

Image Source: Fairmont Theater, (CC BY-ND 2.0) 2009 Jon Dawson. Used unchanged.

Wednesday Wonders: A busy day in space

Happy New Year! And happy first day of spring!

Wait, what… you say those things aren’t today, March 25th? That the latter was six days ago and the former was almost four months ago?

Well… you’d be right in 2020, but jump back in history to when the Julian calendar was still around, and things were dated differently. This led to the adoption of the new Gregorian calendar, but since it was sponsored by the Pope, not everyone switched over right away. Long story short, Catholic countries like Spain, Portugal, and Italy adopted it immediately in 1582. Protestant countries held out, so that places like England (and the colonies) didn’t switch until 1752.

That was also when England moved New Year’s Day back to January 1 — which is itself ironic, since it was the Catholic Church that had moved the day from January 1 to March 25 at the Council of Tours in 567, considering the prior date pagan. That charge was probably accurate: the Romans had moved New Year’s from March to January 1st, a change formalized by Julius Caesar’s own calendar reform.

The practical reason for switching calendars was that the Julian calendar ran slow by about 11 minutes a year, which added up fast — a full day of drift every 128 years or so — until, by the 16th century, the calendar was ten days out of step with the seasons. The Gregorian calendar is much more accurate, although some 3,000 years from now it will have lost a day.
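That drift falls right out of the mean year lengths. A quick sketch in Python — the tropical-year figure of 365.2422 days is the standard approximation, assumed here:

```python
# Mean calendar-year lengths, in days.
TROPICAL_YEAR = 365.2422   # the actual seasonal year, approximately
JULIAN_YEAR = 365.25       # leap day every 4 years, no exceptions
GREGORIAN_YEAR = 365.2425  # skip leap day in century years not divisible by 400

# Drift per year, converted to minutes (1 day = 1,440 minutes):
julian_drift_min = (JULIAN_YEAR - TROPICAL_YEAR) * 1440       # ~11.2 minutes/year
gregorian_drift_min = (GREGORIAN_YEAR - TROPICAL_YEAR) * 1440  # ~0.4 minutes/year

# Years until each calendar has drifted a full day:
julian_day_off = 1 / (JULIAN_YEAR - TROPICAL_YEAR)        # ~128 years
gregorian_day_off = 1 / (GREGORIAN_YEAR - TROPICAL_YEAR)  # ~3,300 years
```

The exact “day off” horizon for the Gregorian calendar varies with which tropical-year value you plug in, which is why published estimates range from roughly 3,000 to 3,300 years.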

By the way, the religious reasoning for picking March 25th is that it was the Feast of the Annunciation, meaning the day that the Archangel Gabriel appeared to Mary to let her know that she was going to get knocked up by god — although it doesn’t get mentioned canonically until a century after the ol’ calendar switch-a-roo.

Anyway, the math isn’t hard to do. March 25th is exactly nine months before Christmas. And in strictly astronomical terms, the former is the first day of spring and the latter is the first day of winter. Just psychologically, the Vernal Equinox, which is now closer to the 19th or 20th, is the better New Year’s Day option because it’s when days start to get longer than nights, vegetation starts to grow anew, and nature awakes from its slumber.

Note: Your mileage in 2020 may vary.

It’s kind of ironic, then, that today marks the birth of a German astronomer and mathematician, Christopher Clavius, who was instrumental in doing the calculations necessary to figure out how much in error the Julian calendar had become, and then to come up with a calendar to fix it and a method to transition.

This is where the Catholic Church came into it, because Easter, being a moveable feast based on the Julian lunar calendar, had been slipping later and later into the year, threatening to move from the spring to summer. Clavius’s job was to bring it back toward the vernal equinox.

He succeeded to the degree of accuracy noted above — only about a day off in 3,000 years. Not bad. This was also when New Year’s Day went back to January 1st, per the old Roman style, and while this is attributed to Pope Gregory XIII, I can’t help but think that Clavius had a hand in implementing the change.

I mean, come on. You’re handed a chance by the most powerful person in the western world at the time to move a major holiday off of your birthday so that your day is finally special on its own? Who wouldn’t do that given the power?

Good ol’ Chris did make other discoveries and get some nice presents, like a crater on the moon named after him, as well as the moon base in the movie 2001.

Still, even if the equinox did move away from March 25, the date still keeps bringing special things for astronomers. It was on this day in 1655 that the Dutch physicist and astronomer Christiaan Huygens discovered Saturn’s largest moon, Titan.

Huygens also has another time connection, though. Where Clavius gave us a calendar accurate to over 3,000 years, Huygens gave us a clock that was the most accurate for the next 300 years. His innovation? Put a pendulum on that thing and let it swing. He literally put the “tick tock” in clock.

Why was this possible? Because the swing of a pendulum follows the rules of physics and, for small swings, is very nearly isochronous. Even as friction and drag slow it down, it covers a shorter distance at a correspondingly slower pace, so that the time between tick and tock remains the same.

The pendulum itself would advance a gear via a ratchet that would turn the hands of the clock, and adding kinetic energy back into that pendulum was achieved through a spring, which is where that whole “winding the clock” thing came in. Tighten the spring and, as it unwinds, it drives that gear every time the pendulum briefly releases it, but thanks to physics, that pendulum will always take the exact same time to swing from A to B, whether it’s going really fast or really slow.
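For small swings, the period depends only on the pendulum’s length and gravity, not on how wide it swings — which is the whole trick. A sketch in Python (the “seconds pendulum” length below is derived from the formula, not a quoted figure):

```python
import math

def pendulum_period(length_m, g=9.81):
    """Small-angle period of a simple pendulum: T = 2 * pi * sqrt(L / g)."""
    return 2 * math.pi * math.sqrt(length_m / g)

# A "seconds pendulum" ticks once per second -- a full period of 2 seconds.
# Inverting the formula, L = g * (T / (2 * pi))**2, gives a length of ~0.994 m,
# which is roughly the pendulum length that longcase clocks settled on.
seconds_pendulum = 9.81 * (2 / (2 * math.pi)) ** 2
```

Note that amplitude appears nowhere in that formula: whether the bob is swinging wide or barely moving, the tick-to-tock interval stays put, which is what made Huygens’s clock so accurate.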

Back to Huygens’s discovery, though… Titan is quite a marvel itself. It is the second largest natural satellite in our solar system, taking a back seat (ironic if you know your mythology) only to Jupiter’s Ganymede. It is half again as big as our own Moon and 80% more massive. It’s even bigger than the planet Mercury, but only 40% as massive, mainly because Mercury is made of rock while Titan may have a rocky core but is mostly composed of layers of different forms of water ice combined with ammonia, and a possible sub-surface ocean.
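That bigger-but-lighter mismatch comes down to density, which you can check from rough published figures — the masses and radii below are approximate values assumed for illustration:

```python
import math

def density(mass_kg, radius_m):
    """Mean density = mass / volume of a sphere, in kg per cubic meter."""
    return mass_kg / ((4 / 3) * math.pi * radius_m ** 3)

# Approximate published figures (assumed here):
titan = density(1.345e23, 2.575e6)    # ~1,880 kg/m^3 -- mostly ice
mercury = density(3.301e23, 2.440e6)  # ~5,430 kg/m^3 -- mostly rock and iron
moon = density(7.342e22, 1.737e6)     # ~3,340 kg/m^3

mass_ratio = 1.345e23 / 3.301e23      # Titan is ~40% of Mercury's mass
```

Despite Titan’s slightly larger radius, Mercury packs in nearly three times the density, which is exactly the rock-versus-ice split described above.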

Titan also has a thick, nitrogen-rich atmosphere, the only other atmosphere in the solar system besides Earth’s to have so much nitrogen in it. In case you’re wondering, Earth’s atmosphere is almost 80% nitrogen — OMG, you’re breathing it right now! But this also makes the aliens’ Achilles heel in the movie Mars Attacks! kind of ridiculous, since the whole deal was that they could only survive in a nitrogen atmosphere. We have that, Mars doesn’t. Mars is mostly carbon dioxide, but not even much of that. But don’t get me started.

Despite all that, it’s still a fun film.

And Titan, next to Jupiter’s moon Europa, is one of the more likely places we might find life in our solar system.

One final bit of March 25th news in space for this day: In 1979, OV-102, aka Space Shuttle Columbia, was delivered to NASA. It was the first shuttle completed, and its delivery date, after a flight that had begun on March 24th, came four years to the day after fabrication of the fuselage began. Sadly, it was also the last shuttle to be lost in flight, so there was a strange sort of symmetry in that.

While I warned you about the Ides of March, the 25th should be full of nothing but anticipation, even in a plague year. It’s a date for exploration and discovery, whether out into the cosmos, or within the confines of whatever space you’re in right now. Make good with what you have, create all you can, and take advantage of our wonderful technology to share and connect.

After all, that’s what worked for Clavius and Huygens. They worked with the tech they had, then networked once they had an idea, and look how well that worked out.

Hint: It worked out very well, for them and for us.

Image Source: Titan, by NASA.