Wednesday Wonders: Facing the music

For some reason, face morphing in music videos really took off, and the whole thing was launched with Michael Jackson’s video for Black or White in 1991. If you’re a 90s kid, you remember a good solid decade of music videos using face-morphing left and right.

Hell, I remember picking up a face-morphing app in the five dollar bin at Fry’s back then, and although it ran slow as shit on the PC I had at the time, it did the job and morphed faces. Luckily, it never got killed by the “Oops, Windows isn’t backward compatible with this” problem, so it runs fast as hell now. Well, it did whenever I last used it, and it’s been a hot minute.

If you’ve never worked with the software, it basically goes like this. You load two photos, the before and after. Then, you mark out reference points on the first photo.

These are generally single dots marking common facial landmarks: inside and outside of each eye, likewise the eyebrows and mouth, bridge of the nose, outside and inside of the nostrils, top and bottom of where the ear hits the face, major landmarks along the hairline, and otherwise places where there are major changes of angle.

Next, you play connect the dots, at first in general, but then it becomes a game of triangles. If you’re patient enough and do it right, you wind up with a first image that is pretty closely mapped with a bunch of little triangles.

Meanwhile, this entire time, your software has been plopping that same mapping onto the second image. But, at least with the software I was working with then (and this may have changed), it only plops those points relative to the boundaries of the image, not the features in it.

Oh yeah — first essential step in the process: Start with two images of identical dimensions, and faces placed about the same way in each.

The next step in the morph is to painstakingly drag each of the points overlaid on the second image to its corresponding face part. Depending upon how detailed you were in the first image, this can take a long, long time. At least the resizing of all those triangles happens automatically.

When you think you’ve got it, click the magic button, and the first image should morph into the second, based on the other parameters you gave it, which are mostly frame rate and how long the transition lasts.

And that’s just for a still image. For a music video, repeat that for however many seconds any particular transition takes, times 24 frames per second. Ouch!
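Out of curiosity, here’s roughly what that magic button is doing under the hood: a minimal sketch in Python with OpenCV and SciPy, not the actual Fry’s-bin software, and with purely hypothetical image and landmark inputs.

```python
# A rough sketch of triangle-based morphing, assuming two same-sized images and two
# hand-marked landmark lists (pts1, pts2) in matching order. Hypothetical inputs only.
# Requires: pip install numpy opencv-python scipy
import cv2
import numpy as np
from scipy.spatial import Delaunay

def morph_frame(img1, img2, pts1, pts2, alpha):
    """One in-between frame: alpha=0.0 is the first image, alpha=1.0 is the second."""
    pts1, pts2 = np.float32(pts1), np.float32(pts2)
    mid = (1 - alpha) * pts1 + alpha * pts2      # interpolated landmark positions
    tris = Delaunay(mid).simplices               # the "connect the dots" triangles
    h, w = img1.shape[:2]
    out = np.zeros((h, w, 3), dtype=np.float32)
    for tri in tris:
        # Affine warps that drag each source triangle onto the interpolated triangle
        m1 = cv2.getAffineTransform(pts1[tri], mid[tri])
        m2 = cv2.getAffineTransform(pts2[tri], mid[tri])
        w1 = cv2.warpAffine(img1.astype(np.float32), m1, (w, h))
        w2 = cv2.warpAffine(img2.astype(np.float32), m2, (w, h))
        # Keep only the pixels inside this triangle, cross-faded between the two warps
        mask = np.zeros((h, w), dtype=np.uint8)
        cv2.fillConvexPoly(mask, np.int32(mid[tri]), 255)
        mask = (mask[..., None] / 255.0).astype(np.float32)
        out = out * (1 - mask) + ((1 - alpha) * w1 + alpha * w2) * mask
    return out.astype(np.uint8)

# A two-second transition at 24 fps is just 48 calls to morph_frame with alpha 0 -> 1.
# In practice you'd also add the image corners to both landmark lists so the
# background gets triangulated and warped instead of going black.
```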

I think this will give you a greater appreciation of what Jackson’s producers did.

However… this was only the first computerized attempt at the effect in a music video. Six years earlier, in 1985, the English duo Godley & Creme (one half of 10cc so… 5cc?) released their video for Cry, and its face-morphing effect was full-on analog. They didn’t have the advantage of powerful (or even wimpy) computers back then. Oh, sure, some early CGI effects had been pulled off for TRON in 1982, but those simple graphics were nowhere near good enough to swap faces.

So Godley & Creme did it the old-fashioned way, and anyone who has ever worked in old school video production (or has nerded out over the Death Star firing sequence in Episode IV) will know the term “Grass Valley Switcher.”

Basically, it was a hardware video mixer that could take the input from two or more video sources, as well as generate its own video in the form of color fields and masks, and then cut between them or transition one into the other.

And this is what they did in their music video for Cry.

Although, to be fair, they did it brilliantly because they were careful in their choices. Some of their transitions are fades from image A to B, while others are wipes, top down or bottom up. It all depended upon how well the images matched.
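The switcher did all of this with analog video signals, but the two basic moves are easy to picture in code. A quick digital sketch, with hypothetical same-sized frame arrays:

```python
# Cross-fade and top-down wipe between two frames, the digital equivalent of the
# switcher moves described above. frame_a and frame_b are H x W x 3 NumPy arrays.
import numpy as np

def crossfade(frame_a, frame_b, t):
    """Fade from A to B; t runs from 0.0 (all A) to 1.0 (all B)."""
    return ((1 - t) * frame_a + t * frame_b).astype(frame_a.dtype)

def wipe_top_down(frame_a, frame_b, t):
    """Top-down wipe: the top t fraction of the frame already shows B."""
    out = frame_a.copy()
    split = int(t * frame_a.shape[0])
    out[:split] = frame_b[:split]
    return out
```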

In 2017, the group Elbow did an intentional homage to this video with their song Gentle Storm, using the same technique well into the digital age, and with a nod from Benedict Cumberbatch.

And now we come to 2020. See, all of those face morphing videos from 1991 through the early 2000s still required humans to sit down and mark out the face parts and those triangles and whatnot, so it was a painstaking process.

And then, this happens…

These face morphs were created by a neural network that basically looked at the mouth parts and listened to the syllables of the song, and then kind of sort of found other faces and phonemes that matched, and then yanked them all together.

The most disturbing part of it, I think, is how damn good it is compared to all of the other versions. Turn off the sound or don’t understand the language, and it takes Jackson’s message from Black or White into the stratosphere.

Note, though, that this song is from a band named for its lead singer, Lil’ Coin (translated from Russian), and the song itself, titled Everytime, is about crime and corruption in Russia in the 1990s. So… without cultural context, the reason for the morphing is ambiguous.

But it’s still an interesting note that 35 years after Godley & Creme first did the music video face morph, it’s still a popular technique with artists. And, honestly, if we don’t limit it to faces or moving media, it’s a hell of a lot older than that. As soon as humans figured out that they could exploit a difference in point of view, they began making images change before our eyes.

Sometimes, that’s a good thing artistically. Other times, when the changes are less benevolent, it’s a bad thing. It’s especially disturbing that AI is getting into the game, and Lil’ Coin’s video is not necessarily a good sign.

Oh, sure, a good music video, but I can’t help but think that it was just a test launch in what is going to become a long, nasty, and ultimately unwinnable cyber war.

After all… how can any of you prove that this article wasn’t created by AI? Without asking me the right questions, you can’t. So there you go.

Image: (CC BY-SA 2.0) Edward Webb

Momentous Monday: Meet the Middletons

Thanks to boredom and Amazon Prime, I watched a rather weird movie from the 1930s tonight. While it was only 55 minutes long, it somehow seemed much longer because it was so packed with… all kinds of levels of stuff.

The title is The Middleton Family at the New York World’s Fair, and while the content is exactly what it says on the tin, there are so goddamn many moving parts in that tin that this is one worth watching in depth, mainly because it’s a case study in how propaganda can be sometimes wrong, sometimes right and, really, only hindsight can excavate the truth from the bullshit.

While it seems like a feature film telling the fictional story of the (snow-white, but they have a black maid!) Middleton Family from Indiana, who go back east ostensibly to visit grandma in New York but really to attend the New York World’s Fair of 1939, this was in truth nothing more than a piece of marketing and/or propaganda created by the Westinghouse Corporation, major sponsors of the fair, poised on the cusp of selling all kinds of new and modern shit to the general public.

Think of them as the Apple or Microsoft of their day, with solutions to everything, and the World’s Fair as the biggest ThingCon in the world.

Plus ça change, right?

But there’s also a second, and very political, vein running through the family story. See, Dad decided to bring the family to the fair specifically to convince 16-year-old son Bud that, despite the bad economic news he and his older friends have been hearing about there being no job market (it is the Great Depression, after all), there are, in fact, glorious new careers waiting out there.

Meanwhile, Mom is hoping that older daughter Babs will re-connect with high school sweetheart Jim, who had previously moved to New York to work for (wait for it) Westinghouse. Babs is having none of it, though, insisting that she doesn’t love him but, instead, is in love with her art teacher, Nick.

1939: No reaction.

2020: RECORD SCRATCH. WTF? Yeah, this is one of the first of many disconnect moments that are nice reminders of how much things have changed in the 82 years since this film happened.

Girl, you think you want to date your teacher, and anyone should be cool with that? Sorry, but listen to your mama. Note: in the world of the film, this relationship will become problematic for other reasons but, surprise, the reason it becomes problematic then is actually problematic in turn now. More on which later.

Anyway, the obviously richer-than-fuck white family travels from Indiana to New York (they’re rich because Dad owns hardware stores, and they brought their black maid with them) but is too cheap to spring for a hotel, instead jamming themselves into Grandma’s house. That place is pretty ritzy as well, which says grandma has money too, and it’s clearly close enough to Flushing Meadows in Queens to make the World’s Fair a daily day trip over the course of a weekend.

But it’s okay — everyone owned houses then! (Cough.)

And then it’s off to the fair, and this is where the real value of the film comes in because when we aren’t being propagandized by Westinghouse, we’re actually seeing the fair, and what’s really surprising is how modern and familiar everything looks. Sure, there’s nothing high tech about it in modern terms, but if you dropped any random person from 2020 onto those fairgrounds, they would not feel out of place.

Well, okay, you’d need to put them in period costume first and probably make sure that if they weren’t completely white they could pass for Italian or Greek.

Okay, shit. Ignore that part, let’s move along — as Jimmy, Babs’ high school sweetheart and Westinghouse Shill character, brings us into the pavilion. And there are two really weird dynamics here.

First is that Jimmy is an absolute cheerleader for capitalism, which is jarring without context — get back to that in a moment.

The other weird bit is that Bud seems to be more into Jimmy than Babs ever was, and if you read too much gay subtext into their relationship… well, you can’t read too much, really. Watch it through that filter, and this film takes on a very different and subversive subplot. Sure, it’s clear that the family really wishes Jimmy was the guy Babs stuck with, but it sure feels like Bud wouldn’t mind calling him “Daddy.”

But back to Jimmy shilling for Westinghouse. Here’s the thing: Yeah, sure, he’s all “Rah-Rah capitalism!” and this comes into direct conflict with Nicholas, who is a self-avowed communist. But… the problem is that in America, in 1939, capitalism was the only tool that socialism could use to lift us out of depression and, ultimately, create the middle class.

There’s even a nod to socialism in the opening scene, when Bud tells his dad that the class motto for the guys who graduated the year before was, “WPA, here we come!” The WPA was the government works program designed to create jobs with no particular aim beyond putting people to work.

But once the WPA and programs like it partnered with corporations like Westinghouse, boom. Jobs. And this was the beginning of the creation of the American Middle Class, which led to the ridiculous prosperity for (white) people from the end of WW II until the 1980s.

More on that later, back to the movie now. As a story with relationships, the film actually works, because we do find ourselves invested in the question, “Who will Babs pick?” It doesn’t help, though, that the pros and cons are dealt with in such a heavy-handed manner.

Jimmy is amazing in every possible way — young, tall, intelligent, handsome, and very knowledgeable at what he does. Meanwhile, Nicholas is short, not as good-looking (clearly cast to be more Southern European), obviously a bit older than Babs, and has a very unpleasant personality.

They even give him a “kick the puppy” moment when Babs introduces brother Bud, and Nicholas pointedly ignores the kid. But there’s that other huge issue I already mentioned that just jumps out to a modern audience and yet never gets any mention by the other characters. The guy Babs is dating is her art teacher. And not as in past art teacher, either. As in currently the guy teaching her art.

And she’s dating him and considering marriage.

That wouldn’t fly more than a foot nowadays, and yet in the world of 1939 it seems absolutely normal, at least to the family. Nowadays, it would be the main reason to object to the relationship. Back then, it isn’t even considered.

Wow.

The flip side of the heavy-handed comes in some of Jimmy’s rebukes of Nicholas’ claims that all of this technology and automation will destroy jobs. While the information Jimmy provides is factual, the way his dialogue here is written and delivered comes across as condescending and patronizing to both Nicholas and the audience, and these are the moments when Jimmy’s character seems petty and bitchy.

But he’s also not wrong, and history bore that out.

Now this was ultimately a film made to make Westinghouse look good, and a major set piece involved an exhibit at the fair that I actually had to look up because at first it was very easy to assume that it was just a bit of remote-controlled special effects set up to pitch an idea that didn’t really exist yet — the 1930s version of vaporware.

Behold Elektro! Here’s the sequence from the movie and as he was presented at the fair. Watch this first and tell me how you think they did it.

Well, if you thought remote operator controlling movement and speaking lines into a microphone like I did at first, that’s understandable. But the true answer is even more amazing: Elektro was completely real.

The thing was using sensors to actually interpret the spoken commands and turn them into actions, which it did by sending light signals to its “brain,” located at the back of the room. You can see the lights flashing in the circular window in the robot’s chest at around 2:30.

Of course, this wouldn’t be the 1930s if the robot didn’t engage in a little bit of sexist banter — or smoke a cigarette. Oh, such different times.

And yet, in a lot of ways, the same. Our toys have just gotten a lot more powerful and much smaller.

You can probably guess which side of the argument wins, and while I can’t disagree with what Westinghouse was boosting at the time, I do have to take issue with one explicit statement. Nicholas believes in the value of art, but Jimmy dismisses it completely, which is a shame.

Sure, it’s coming right out of the Westinghouse corporate playbook, but that part makes no sense, considering how much of the world’s fair and their exhibit hall itself relied on art, design, and architecture. Even if it’s just sizzle, it still sells the steak.

So no points to Westinghouse there but, again, knowing what was about to come by September of 1939 and what a big part industry would have in ensuring that the anti-fascists won, I can sort of ignore the tone-deafness of the statement.

But, like the time-capsule shown in the film, there was a limited shelf-life for the ideas Westinghouse was pushing, and they definitely expired by the dawn of the information age, if not a bit before that.

Here’s the thing: capitalism as a system worked in America when… well, when it worked… and didn’t when it didn’t. Prior to about the early 1930s, when it ran unfettered, it didn’t work at all — except for the super-wealthy robber barons.

Workers had no rights or protections, there were no unions, or child-labor laws, or minimum wages, standard working hours, safety rules, or… anything to protect you if you didn’t happen to own a big chunk of shit.

In other words, you were management, or you were fucked.

Then the whole system collapsed in the Great Depression and, ironically, it took a member of the 1% Patrician Class (FDR) being elected president to then turn his back on his entire class and dig in hard for protecting the workers, enacting all kinds of jobs programs, safety nets, union protections, and so on.

Or, in other words, capitalism in America didn’t work until it was linked to and reined-in by socialism. So we never really had pure capitalism, just a hybrid.

And, more irony: this socio-capitalist model was reinforced after Pearl Harbor Day, when everyone was forced to share and work together and, suddenly, the biggest workforce around was the U.S. military. It sucked in able-bodied men between 17 and 38, and the weird side-effect of the draft stateside was that suddenly women and POC were able to get jobs because there was no one else to do them.

Manufacturing, factory jobs, support work and the like boomed, and so did the beginnings of the middle class. When those soldiers came home, many of them returned to benefits that gave them cheap or free educations, and the ability to buy homes.

They married, they had kids, and they created the Boomers, who grew up in the single most affluent time period in America ever.

Side note: There were also people who returned from the military who realized that they weren’t like the other kids. They liked their own sex, and couldn’t ever face returning home. And so major port towns — San Francisco, Los Angeles, Long Beach, San Diego, Boston, New York, Miami, New Orleans — were flooded with the seeds of future GLB communities. Yes, it was in that order when the community finally got a name, later, in the 60s, and TQIA+ hadn’t been brought into the fold yet. In the 1940s, there really wasn’t a name for it or a community at all.

In the 60s, because the Boomers had grown up with affluence, privilege, and easy access to education, they were also perfectly positioned to rebel their asses off because they could afford to, hence all of the protests and whatnot of that era.

And this sowed the seeds of the end of this era, ironically.

The socio-capitalist model was murdered, quite intentionally, beginning in 1980, when Ronald fucking Reagan became President, and he and his cronies slowly began dismantling everything created by every president from FDR through, believe it or not, Richard Nixon. (Hint: EPA.)

The mantra of these assholes was “Deregulate Everything,” which would take things right back to what the world was like in the era before FDR.

Just one problem, though. Deregulating any business is no different from getting an alligator to not bite you by removing their muzzle and then saying to them, “You’re not going to bite me, right?”

And then believing them when they swear they won’t before wondering why you and everyone you know has only one arm.

Still, while it supports an economic system that just isn’t possible today without a lot of major changes, The Middletons provides a nice look at an America that did work because it focused on invention, industry, and manufacturing not as a way to enrich a few shareholders, but as a way to enrich everyone by creating jobs, enabling people to actually buy things, and creating a rising tide to lift all boats.

As for Bud, he probably would have wound up in the military, learned a couple of skills, finished college quickly upon getting out, and then would have gone to work for a major company, possibly Westinghouse, in around 1946, starting in an entry-level engineering job, since that’s the skill and interest he picked up during the War.

Along the way, he finds a wife, gets married and starts a family, and thanks to his job, he has full benefits — for the entire family, medical, dental, and vision; for himself, life insurance to benefit his family; a pension that will be fully vested after ten years; generous vacation and sick days (with unused sick days paid back every year); annual bonuses; profit sharing; and union membership after ninety days on the job.

He and the wife find a nice house on Long Island — big, with a lot of land, in a neighborhood with great schools, and easy access to groceries and other stores. They’re able to save long-term for retirement, as well as for shorter-term things, like trips to visit his folks in Indiana or hers in Miami or, once the kids are old enough, all the way to that new Disneyland place in California, which reminds Bud a lot of the World’s Fair, especially Tomorrowland.

If he’s typical for the era, he will either work for Westinghouse for his entire career, or make the move to one other company. Either way, he’ll retire from an executive level position in about 1988, having been in upper management since about 1964.

With savings, pensions, and Social Security, he and his wife decide to travel the world. Meanwhile, their kids, now around 40 and with kids about to graduate high school, aren’t doing so well, and aren’t sure how they’re going to pay for their kids’ college.

They approach Dad and ask for help, but he can’t understand. “Why don’t you just do what I did?” he asks them.

“Because we can’t,” they reply.

That hopeful world of 1939 is long dead — although, surprisingly, the actor who played Bud is still quite alive.

Image: Don O’Brien, Flickr, 2.0 Generic (CC BY 2.0), the Middleton Family in the May 1939 Country Gentleman ad for the Westinghouse World’s Fair exhibits.

Wednesday Wonders: Red-blooded? Not necessarily

Previously, I wrote about various foods that aren’t actually their original natural colors for various reasons. These include cherries, oranges, margarine, wasabi, and Blue Curaçao. Now, I’m going to go for the flip side of that one.

When I ask, “What color is blood?” I’d guess that your immediate answer would be “red.” And if you’re a member of certain species, then that is true, those species being humans and most vertebrates.

But that’s not true of every species at all. It depends entirely upon chemistry.

Red

So, if you’re red-blooded, what does it really mean? It has nothing to do with courage, valor, patriotism, or any of those silly attributes. What? Goldfish have red blood. So do dogs and cats. But why is that the case?

It’s simple. Well, it’s actually ultimately complicated, but all you need to really know is that the hemoglobin in our blood, which is the molecule that binds to oxygen and circulates it through our body, contains an iron atom at the center of a ring structure.

This is what allows your red blood cells to circulate oxygen, out from your lungs, around your body, and back again as carbon dioxide.

If you’re wondering, “Okay, why red? I can’t see oxygen in the air,” think about this. Have you ever seen rust? What color is it? And what is rust? Oxidized iron.

In reality, the blood leaving your lungs starts out bright red and winds up a duller and more rust-like color by the time it comes back. But it’s red because of that iron.

But blood doesn’t necessarily need to use iron.

Yellow

Swap the iron out for the metal vanadium, and you get yellow blood, which is found, for example, in beetles and sea squirts. Surprise, though: vanadium does nothing to circulate oxygen, so its presence is still a mystery.

Green

While you might associate green blood with a certain popular Star Trek character, one human did surprise surgeons by bleeding green during surgery, although that was due to a medication he was taking rather than alien origins.

Otherwise, it’s really not normal for humans. But there are a few species of lizard that are very green on the inside and, ironically, it’s due to the same chemical that our bodies produce as a waste-product of red blood cell death, but which would kill us if it built up to levels that would actually turn our blood green.

That chemical is biliverdin, which is filtered out by human livers as quickly as possible via conversion to bilirubin.

It’s not such a problem for these species of lizards discovered in New Guinea, which have levels of biliverdin more than twenty times any level ever seen in a human.

Blue

Figuratively, “blue blood” refers to a member of the noble class. The English expression is actually a direct translation of the Spanish sangre azul, and it came from the noble classes of Spain wanting to distinguish themselves from the darker skinned Moorish invaders.

The nobles of Spain claimed descent from the Visigoths, who were actually Germanic, and when one has paler skin, the veins that show through it appear blue, hence the term. Keep in mind, though, that while veins may appear blue, the blood in them actually isn’t.

It’s just a trick of light and refraction, much the same way that our Sun is actually white, but our atmosphere makes it look yellow and, in turn, makes the sky appear blue.

If you want to find real blue blood, you’ll have to seek out certain octopodes, crustaceans, snails and spiders. Instead of hemoglobin to transport oxygen, they use hemocyanin, and you can see the clue in the name: cyan is a particular shade of blue.

Instead of iron, hemocyanin uses copper as the oxygen-binding element. When copper oxidizes, it doesn’t rust. Rather, it corrodes, so while corroded copper picks up a green patina, when it carries oxygen in blood, it imparts a blue color.

One of the most famous blue animal bloods comes from horseshoe crabs, which until recently were harvested in order to collect their blood, because it could be used to test for bacteria, contamination, and toxins during the manufacture of any medicine or medical device intended to go inside of a human.

While the blood harvesting isn’t intended to harm the animals, many of them were still dying in the process, so scientists finally switched to an artificial substitute.

Purple

Finally, we come to the blood color that Romans would have considered the most noble, though we find it mostly in lowly worms. These animals use the molecule hemerythrin, which carries two iron atoms, to transport oxygen. Before it’s oxygenated, it’s transparent. Once it’s oxygenated, it turns light purple, almost violet.

So there’s a rainbow tour of blood, proving that we have plenty of “alien” biology already here on Earth, and that the simplest of molecular changes can make a huge difference in surface appearance.

Image via (CC BY-SA 4.0)

Wonderous Wednesday: 5 Things that are older than you think

A lot of our current technology seems surprisingly new. The iPhone is only about fourteen years old, for example, although the first Blackberry, a more primitive form of smart phone, came out in 1999. The first actual smart phone, IBM’s Simon Personal Communicator, was introduced in 1992 but not available to consumers until 1994. That was also the year that the internet started to really take off with people outside of universities or the government, although public connections to it had been available as early as 1989 (remember Compuserve, anyone?), and the first experimental internet nodes were connected in 1969.

Of course, to go from room-sized computers communicating via acoustic modems along wires to handheld supercomputers sending their signals wirelessly via satellite took some evolution and development of existing technology. Your microwave oven has a lot more computing power than the system that helped us land on the moon, for example. But the roots of many of our modern inventions go back a lot further than you might think. Here are five examples.

Alarm clock

As a concept, alarm clocks go back to the ancient Greeks, frequently involving water clocks. These were designed to wake people up before dawn, in Plato’s case to make it to class on time, which started at daybreak; later, they woke monks in order to pray before sunrise.

From the late Middle Ages, church towers became town alarm clocks, with the bells set to strike at one particular hour per day, and personal alarm clocks first appeared in 15th-century Europe. The first American alarm clock was made by Levi Hutchins in 1787, but he only made it for himself since, like Plato, he got up before dawn. Antoine Redier of France was the first to patent a mechanical alarm clock, in 1847. Because production stopped during WWII, with metal and machine shops appropriated for the war effort, and because so many older clocks broke down during the war, alarm clocks became one of the first consumer items to be mass-produced again just before the war ended. Atlas Obscura has a fascinating history of alarm clocks that’s worth a look.

Fax machine

Although it’s pretty much a dead technology now, the fax machine was the height of high tech in offices in the 80s and 90s. Nowadays you’d be hard pressed to find one that isn’t part of the built-in hardware of a multi-purpose networked printer, and that’s only because it’s such a cheap legacy feature to include. But it might surprise you to know that the prototypical fax machine, originally an “Electric Printing Telegraph,” dates back to 1843.

Basically, as soon as humans figured out how to send signals down telegraph wires, they started to figure out how to encode images — and you can bet that the second image ever sent in that way was a dirty picture. Or a cat photo.

Still, it took until 1964 for Xerox to finally figure out how to use this technology over phone lines and create the Xerox LDX. The scanner/printer combo was available to rent for $800 a month — the equivalent of around $6,500 today — and it could transmit pages at a blazing 8 per minute. The second generation fax machine only weighed 46 lbs and could send a letter-sized document in only six minutes, or ten pages per hour. Whoot — progress!

You can actually see one of the Electric Printing Telegraphs in action in the 1948 movie Call Northside 777, in which it plays a pivotal role in sending a photograph cross-country in order to exonerate an accused man.

In case you’re wondering, the title of the film refers to a telephone number from back in the days before what was originally called “all digit dialing.” Up until then, telephone exchanges (what we now call prefixes) were identified by the first two letters of a word, and then another digit or two or three. (Once upon a time, in some areas of the US, phone numbers only had five digits.) So NOrthside 777 would resolve itself to 667-77, with 667 being the prefix. This system started to end in 1958, and a lot of people didn’t like that.
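If you want to play with the old exchange names yourself, the letter groups on a North American rotary dial make the translation mechanical. A small sketch (the prefix-dash-rest formatting choice is mine):

```python
# Convert an old-style exchange name like "NOrthside 777" into dialed digits using
# the classic rotary-dial letter groups (Q and Z didn't appear on those dials).
KEYPAD = {'ABC': '2', 'DEF': '3', 'GHI': '4', 'JKL': '5',
          'MNO': '6', 'PRS': '7', 'TUV': '8', 'WXY': '9'}
LETTER_TO_DIGIT = {ch: d for letters, d in KEYPAD.items() for ch in letters}

def exchange_to_digits(name_number: str) -> str:
    letters = [c for c in name_number if c.isalpha()][:2]   # only the first two letters count
    digits = [c for c in name_number if c.isdigit()]
    all_digits = ''.join(LETTER_TO_DIGIT[c.upper()] for c in letters) + ''.join(digits)
    return f"{all_digits[:3]}-{all_digits[3:]}"

print(exchange_to_digits("NOrthside 777"))  # -> 667-77
```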

Of course, with the advent of cell phones, prefixes and even area codes have become pretty meaningless, since people tend to keep the number they had in their home town regardless of where they move to, and a “long distance call” is mostly a dead concept now as well, which is probably a good thing.

CGI

When do you suppose the first computer animation appeared on film? You may have heard that the original 2D computer generated imagery (CGI) used in a movie was in 1973 in the original film Westworld, inspiration for the recent TV series. Using very primitive equipment, the visual effects designers simulated pixelation of actual footage in order to show us the POV of the robotic gunslinger played by Yul Brynner. It turned out to be a revolutionary effort.

The first 3D CGI happened to be in this film’s sequel, Futureworld in 1976, where the effect was used to create the image of a rotating 3D robot head. However, the first ever CGI sequence was actually made in… 1961. Called Rendering of a planned highway, it was created by the Swedish Royal Institute of Technology on what was then the fastest computer in the world, the BESK, driven by vacuum tubes. It’s an interesting effort for the time, but the results are rather disappointing.

Microwave oven

If you’re a Millennial, then microwave ovens have pretty much always been a standard accessory in your kitchen, but home versions don’t predate your birth by much. Sales began in the late 1960s. By 1972 Litton had introduced microwave ovens as kitchen appliances. They cost the equivalent of about $2,400 today. As demand went up, prices fell. Nowadays, you can get a small, basic microwave for under $50.

But would it surprise you to learn that the first microwave ovens were created just after World War II? In fact, they were the direct result of it, due to a sudden lack of demand for magnetrons, the devices used by the military to generate radar in the microwave range. Not wanting to lose the market, their manufacturers began to look for new uses for the tubes. The idea of using radio waves to cook food went back to 1933, but those devices were never developed.

Around 1946, engineers accidentally realized that the microwaves coming from these devices could cook food, and voilà! In 1947, the technology was developed, although only for commercial use, since the devices were taller than an average man, weighed 750 lbs and cost the equivalent of $56,000 today. It took 20 years for the first home model, the Radarange, to be introduced for the mere sum of $12,000 of today’s dollars.

Music video

Conventional wisdom says that the first music video to ever air went out on August 1, 1981 on MTV, and it was “Video Killed the Radio Star” by The Buggles. As is often the case, conventional wisdom is wrong. It was the first to air on MTV, but the concept of putting visuals to rock music as a marketing tool goes back a lot farther than that.

Artists and labels were making promotional films for their songs back at almost the beginning of the 1960s, with the Beatles a prominent example. Before these, though, was the Scopitone, a jukebox that could play films in sync with music popular from the late 1950s to mid-1960s, and their predecessor was the Panoram, a similar concept popular in the 1940s which played short programs called Soundies.

However, these programs played on a continuous loop, so you couldn’t choose your song. Soundies were produced until 1946, which brings us to the real predecessor of music videos: Vitaphone Shorts, produced by Warner Bros. as sound began to come to film. Some of these featured musical acts and were essentially miniature musicals themselves. They weren’t shot on video, but they introduced the concept all the same. Here, you can watch a particularly fun example from 1935 in 3-strip Technicolor that also features cameos by various stars of the era in a very loose story.

Do you know of any things that are actually a lot older than people think? Let us know in the comments!

Photo credit: Jake von Slatt

Sunday Nibble Flashback: #9 Don’t pan(dem)ic!

I originally wrote this piece five days before the first lockdown in Southern California started, although by that point my theater company had already shut down and people were just waiting for the pandemic to hit. The lockdown began one year ago yesterday, March 20, 2020, so I thought that this was a good occasion to take a look back at where things were just before everything changed.

I’m actually writing these words a little over a week before you’ll read them — hey, that’s how it goes when you get ambitious and want to publish every day. Still, this past week has been… weird, and I can only assume that the week between when I wrote this and when you read it will be equally weird.

I do my regular grocery shopping on Thursday nights. This came about because at the previous full-time job I had, we got paid every other Thursday, so it was just a natural thing to do a weekly budget two weeks at a time, and get groceries for the week on the evening of payday and a week later.

It was also great because I’d go to the store after 8 p.m., so there’d hardly ever be crowds, and I don’t buy a whole lot because, honestly, I’m a cheap date. That’s because ever since the events of August 2016, I’ve been cooking my lunch for the week at home, usually on Sunday afternoons, so that I could avoid processed and pre-packaged foods, and control the nutritional content. In my case, this largely means cutting down the sodium.

After I was laid off from that job because the company went tits up, I moved into the land of living off of savings and unemployment, but kept the same schedule. And even as I moved into my very part-time job with ComedySportz LA, with paydays on the 10th and 25th, and then into my new full-time gig in the wonderful world of Medicare (which really fascinates me) with Paul Davis Insurance Services, where payday is every other Friday, I kept the exact same schedule. Grocery time on Thursday night.

And it worked out well and regularly right up until Thursday, March 12, and then I had flashbacks to the day the L.A. Riots started, when scared whypipo also stripped the grocery stores bare for no damn good reason. Those MoFos stocked up for months when it turned out that the city was only under martial law for a week.

So, anyway, I headed out to my regular Ralphs at my regular time that Thursday only to find that the normally easy parking lot resembled any Trader Joe’s anywhere on a normal day. So I noped out of that one and headed to my second choice because it’s not as fancy even though it’s the closer Ralphs, managed to find a spot in the parking lot, headed inside, saw the length of the lines and, again, thought, “Okay. I’ll try later.”

About an hour and a half later, I came back, and while the lines weren’t as long, a quick stroll through the store showed me that the meat department, canned goods, paper goods, and beverages had been stripped bare. What was the point? Despite my short list, I wasn’t going to find anything, so I got the hell out of there.

Friday night: No need to report to the theater to work because they’ve cancelled all remaining shows for March, but there was a check waiting for me, so I headed out, driving by the aforementioned down-market Ralphs only to realize, “Nope. It’s still crazy.” Got my check and then swung by a stand-by market that shares my first name. The lines weren’t as bad, but… all the same departments stripped to the shelves.

I headed down the street a couple of miles to a market that almost shares my first name, only to find almost the same situation. I was literally only able to find one item on my shopping list there.

Fortunately, because for some unknown reason Ralphs abruptly discontinued carrying the particular types of dog food that my Sheeba demands, I had already changed to a PetSmart that is a mere block from home, and they have not been subject to the same panic buying.

So my fur kid gets to eat better than I do.

Or not. I wound up inadvertently stockpiling enough canned tuna to last through a few weeks, but I also did it over a few weeks because Ralphs has been having this insane sale in the first place — 4 cans for $4.00 — but then a coupon on top of that for $2.00 off 4 cans. Or, in other words, 8 cans for $4.00, half a buck a can. Since the stuff has a pretty long shelf-life, I figured, okay, why not?

And all of this was entirely before Storpocalypse hit. Or is that Bumwadgeddon? I’m not sure what all this panic buying has been dubbed yet. All I know is that I’ve got three weeks’ worth of tuna in the cupboard. Oh yeah — since Ralphs likes to occasionally send me coupons for a free jar of the brand of mayonnaise that is not my first choice, I have two of those in the fridge.

Tuna salad for days, y’all! And I already had two weeks’ of bum-wad on hand. So this panic didn’t really affect me other than the inability to buy meat.

That was kind of a problem because my tradition, between my Saturday day job and Saturday theater job, was to go get nine ounces of ground sirloin at Ralphs and bring it home to make an amazing cheeseburger.

But that option was taken off the table since the meat departments in every grocery store I went to were completely empty. On a hunch on the way home from work on a Saturday, I stopped by a small carnicería in Van Nuys. Not only did they have plenty of meat, but unlike at Ralphs, I got to watch the butcher grind it for me, and it was basically the same price.

So try those little neighborhood mom and pop places if there’s something you can’t find at the big store — just don’t buy more than you need right now, but do give them the business. And they probably have toilet paper, but don’t be greedy, okay?

And FFS, don’t panic. The world isn’t ending. China already got this, and the U.S. may have acted quickly enough. And the economy may actually be fine, just like it has been after other nation-wide disasters.

There is nothing to fear but fear itself, and this is a line from the inaugural address of one of our best presidents ever. So… stop hoarding out of fear. Calm down, take a deep breath, and look at the actual statistics.

There’s no damn reason at all that you need three 24 packs of TP, 6 cases of bottled water, 18 cans of soup, a shit-ton of other canned goods, and enough bread to prove that your whining about being gluten-free was absolute bullshit.

The next several weeks will be crucial, and we may all wind up stuck at home, so yes, by all means make sure that you have two to three weeks worth of food stocked up. But you don’t need three months worth or enough for a household five times the size that yours is.

Take every precaution you need to, but don’t go crazy with the panic buying. You’re just hurting your friends and neighbors by taking more than you need.

Remember: six feet apart, and wash your hands often.


Wednesday Wonders: Baby, it’s cold inside

A hundred and ten years ago, in 1911, Heike Kamerlingh Onnes made an interesting discovery while futzing around with very low temperatures. It’s a discovery that would lead to many modern innovations that affect us just over a century later.

Strange things happen as the temperature drops toward absolute zero, which is basically the temperature equivalent of the speed of light in a vacuum (C) being the velocity limit for anything with mass. Oh, we’ve gotten really close to absolute zero — within nanokelvins — and in theory we could get really close to the speed of light, although that would take ridiculous amounts of energy.
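To put a number on “ridiculous amounts of energy,” here’s a back-of-the-envelope sketch of relativistic kinetic energy for a one-kilogram test mass as it creeps toward C:

```python
# Relativistic kinetic energy KE = (gamma - 1) * m * c^2 for a 1 kg test mass.
# Each extra "9" in the speed multiplies the energy bill; at exactly c, gamma blows up.
import math

C = 299_792_458.0  # speed of light in a vacuum, m/s
M = 1.0            # test mass, kg

def kinetic_energy(fraction_of_c):
    gamma = 1.0 / math.sqrt(1.0 - fraction_of_c ** 2)
    return (gamma - 1.0) * M * C ** 2  # joules

for f in (0.9, 0.99, 0.9999, 0.999999, 0.99999999):
    print(f"{f:>12}: {kinetic_energy(f):.3e} J")
```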

But… where matter can’t be is right at those two figures: exactly absolute zero, and exactly C. There’s nothing in the equations, though, that says that objects with mass cannot move faster than the speed of light or be colder than absolute zero.

Practically speaking, it would require infinite energy to jump from 99.99999% to 100.00001% of C, so that’s not possible, but scientists in Germany think they may have achieved temperatures below absolute zero.

Of course, these create weird situations like negative temperatures in an absolute sense, and not just as measured. That is, while we can say that it’s 24 below zero outside, that really isn’t a negative temperature by strict definition. It’s just a temperature that’s negative on the scale we’re using.

Remember: 1 kelvin is actually –457.87ºF.

These kinds of negative temperatures are actually below that absolute physical limit, and so they represent thermal energy that behaves in the opposite way from temperatures above absolute zero. And, in all likelihood, an object moving faster than light would also travel backwards in time, thanks to the time dilation effect being reversed.

These, though, are theoretical arguments. What we do know is that things get weird as the temperature drops. At a few nanokelvin, the only energy left in the system is quantum, and so these strange effects take over on a massive scale, pun intended.

The key here is that as atoms lose energy and cool down, they stop moving as much, eventually reaching a point where they’re just sitting there. But… there’s a principle in physics, Heisenberg’s uncertainty principle, which says that there is a fundamental limit to the precision with which you can measure two connected properties of any particle.

For example, if you measure position precisely, you can’t measure momentum with much accuracy, and vice versa. The sharper one measurement is, the fuzzier the other one becomes. Not to get too deep into the science of it, but there are two classes of elementary particle: fermions and bosons.

Fermions are elitists, and when they’re in a bunch, they don’t like to occupy the same quantum energy state. Electrons are totally fermions, which is why that whole concept of an atom as electrons neatly orbiting a nucleus like planets orbit the Sun is only a metaphor and very inaccurate.

Each electron in an atom occupies a different quantum energy state, which is why there’s the concept of electron “shells” filling up, but the location of each electron is not a unique point that changes over time. It’s a statistical probability of a particular electron being in a particular place at any given time, and so the “shape” of those shells can vary from a sphere to two squashed and joined spheres to distended ovoid shapes, and so on.

Bosons, on the other hand, are egalitarians and don’t mind sharing the same quantum energy state. In fact, they love to do it. This leads to a very interesting form of matter known as a Bose-Einstein Condensate.

Basically, at a low enough temperature, a bunch of atoms can suddenly coalesce into a single quantum particle with the same energy state and even become visible to a regular microscope.

Why? Because when we stop their movement, we can measure their momentum at near zero. Therefore, our ability to measure where they are becomes very inaccurate. It’s like the bosons all gather together and then balloon up into one entity in order to hide their individual locations.

This would be the equivalent of a bunch of people preventing GPS tracking of each of them by leaving their phones in one room and then all of them heading out in opposite directions in a big circle. Or sphere, if they can manage that.
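Here’s a back-of-the-envelope version of that trade-off, using rubidium-87 (a typical condensate atom) and purely illustrative temperatures; the point is just that as the momentum spread shrinks, the minimum position spread balloons up toward microscope-visible sizes.

```python
# Heisenberg trade-off for an ultracold atom: dx >= hbar / (2 * dp), with the thermal
# momentum spread estimated roughly as dp ~ sqrt(m * kB * T). Illustrative numbers only.
import math

HBAR = 1.054571817e-34            # reduced Planck constant, J*s
KB = 1.380649e-23                 # Boltzmann constant, J/K
M_RB87 = 86.909 * 1.66053907e-27  # mass of a rubidium-87 atom, kg

def min_position_spread(temperature_k):
    dp = math.sqrt(M_RB87 * KB * temperature_k)  # rough thermal momentum spread
    return HBAR / (2.0 * dp)

for t in (300.0, 1e-3, 1e-6, 1e-9):  # room temp, millikelvin, microkelvin, nanokelvin
    print(f"T = {t:9.1e} K -> minimum position spread ~ {min_position_spread(t):.2e} m")
```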

The discovery that Onnes made in 1911 is related to this phenomenon. In his case, he dipped a solid mercury wire into liquid helium at 4.2 degrees Kelvin and discovered that all electrical resistance went away. That is, he discovered a property of matter known as superconductivity.

The same principle kicks in at these low temperatures: the electrons, which are fermions, pair up into Cooper pairs that behave like bosons, and the material starts treating electricity and magnetism very differently, even shoving magnetic fields out of its interior entirely (the Meissner effect).

This can lead to all sorts of interesting effects, like levitation.

This is the technology taking maglev trains to the next level. But superconductivity is also used in things like medical imaging devices, motors, generators, transformers, and computer parts.

But the holy grail of the field is finding the so-called “room temperature” superconductor. All right, “room temperature” is a bit of a misnomer in some ways: the warmest superconductor yet found has a transition temperature of –23ºC, though a more promising substance could be a superconductor at 53ºC. That’s the good news. The bad news is that it requires ridiculously high pressures to do it, in the range of a million or more times the atmospheric pressure at sea level.

Oh, well.
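For reference, here’s where those transition temperatures land on the other scales, with the popular-press idea of room temperature being roughly 20 to 25ºC, or 68 to 77ºF. A quick conversion sketch:

```python
# Put the transition temperatures quoted above onto the Kelvin and Fahrenheit scales.
def c_to_k(c): return c + 273.15
def c_to_f(c): return c * 9.0 / 5.0 + 32.0

for label, c in (("warmest superconductor found so far", -23.0),
                 ("high-pressure candidate", 53.0)):
    print(f"{label}: {c:.0f} C = {c_to_k(c):.2f} K = {c_to_f(c):.1f} F")
```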

Of course, the U.S. Navy did file a patent for a “room temperature” superconductor just over a year ago, but it’s not clear from the patent whether they used the “not 0 K” definition of room temperature or the popular press definition of about 77ºF.

It makes sense, though, that barring low temperature, some other extreme would be needed to achieve the effect. Nature just seems to work like that, whether it’s extremely low temperatures or very high pressures required to create superconductivity, or the extreme gravity and energy conversion required to create that other holy grail so beloved of alchemy: transmutation of matter, specifically turning lead into gold.

Ah, yes. If those alchemists only knew that every star was constantly transmuting elements every second of every day strictly through the power of enormous gravity and pressure — hydrogen to helium and so on, right down to iron — then who knows. One of them might have managed fusion centuries ago.

Okay, not likely. But just over a century ago, superconductivity was discovered, and it’s been changing the world ever since. Happy 110th anniversary!

Why astrology is bunk

This piece, which I first posted in 2019, continues to get constant traffic and I haven’t had a week go by that someone hasn’t given it a read. So I felt that it was worth bringing to the top again.

I know way too many otherwise intelligent adults who believe in astrology, and it really grinds my gears, especially right now, because I’m seeing a lot of “Mercury is going retrograde — SQUEEEE” posts, and they are annoying and wrong.

The effect that Mercury in retrograde will have on us: Zero.

Fact

Mercury doesn’t “go retrograde.” Mercury, on its faster inner orbit, catches up with and then passes us, so it only looks like it’s moving backwards from here. It’s an illusion, and entirely a function of how planets orbit the sun and how things look from our vantage point. If Mars had (semi)intelligent life, they would note periods when the Earth was in retrograde, and it’d be for the exact same reason: in that case, we’d be the ones doing the passing.
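If you want to watch the illusion fall out of nothing but geometry, here’s a toy model in Python: circular, coplanar orbits, so the dates won’t match a real ephemeris, but the backwards-looking stretches show up all the same.

```python
# Toy model of apparent retrograde motion: Mercury and Earth on circular, coplanar
# orbits. When Mercury's apparent direction as seen from Earth decreases for a
# stretch of days, that's "retrograde" -- purely geometry, no mystery force.
import math

R_MERCURY, R_EARTH = 0.387, 1.0     # mean orbital radii, AU
P_MERCURY, P_EARTH = 88.0, 365.25   # orbital periods, days

def apparent_longitude(day):
    am = 2 * math.pi * day / P_MERCURY
    ae = 2 * math.pi * day / P_EARTH
    mx, my = R_MERCURY * math.cos(am), R_MERCURY * math.sin(am)
    ex, ey = R_EARTH * math.cos(ae), R_EARTH * math.sin(ae)
    return math.degrees(math.atan2(my - ey, mx - ex))  # direction of Mercury from Earth

def daily_change(day):
    delta = apparent_longitude(day) - apparent_longitude(day - 1)
    return (delta + 180) % 360 - 180  # smallest signed change, handles the 360 wraparound

retro_days = [day for day in range(1, 366) if daily_change(day) < 0]
print(f"Apparent retrograde on {len(retro_days)} of 365 simulated days")
```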

Science

What force, exactly, would affect us? Gravity is out, because the gravitational effect of anything else in our solar system or universe is dwarfed by the Earth’s. When it comes to astrology at birth, your OB/GYN has a stronger gravitational effect on you than Mars does.

On top of that, the Sun has 99.9% of the mass of our solar system, and mass is what gravity scales with, so the Sun has by far the greatest gravitational influence on all of the planets. We only get a slight exception because of the size of our Moon and how close it is, but that’s not a part of astrology, is it? (Not really. They do Moon signs, but it’s not in the day-to-day.)
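Here are the rough numbers behind that claim, using Newton’s law of gravitation; the 80 kg obstetrician standing half a meter away is, of course, a made-up illustrative figure.

```python
# Gravitational acceleration a = G * M / r^2 toward a few different bodies.
G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

bodies = {
    # name: (mass in kg, distance in m)
    "Sun": (1.989e30, 1.496e11),
    "Moon": (7.35e22, 3.84e8),
    "Mars at closest approach": (6.42e23, 5.6e10),
    "Obstetrician, 80 kg at 0.5 m": (80.0, 0.5),
}

for name, (mass, dist) in bodies.items():
    print(f"{name:<30} {G * mass / dist ** 2:.2e} m/s^2")

# The Sun and Moon dominate (hence tides), but the doctor standing at the
# delivery table out-pulls Mars.
```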

Some other force? We haven’t found one yet.

History

If astrology were correct, then one of two things would be true. A) It would have predicted the existence of Uranus and Neptune, and possibly Pluto, long before they were discovered, since astrology goes back to ancient times but those discoveries happened in the modern era; or B) It would not have allowed for the addition of those three planets (and then the removal of Pluto) once discovered, since all of the rules would have been set down. And it certainly would have accounted for Ophiuchus, the 13th constellation the Sun passes through, which astrology has simply ignored all along.

So…stop believing in astrology, because it’s bunk. Mercury has no effect on us whatsoever, other than when astronomers look out with telescopes and watch it transit the Sun, and use its movements to learn more about real things, like gravity.

Experiment

The late, great James Randi, fraud debunker extraordinaire, did a classroom exercise that demolishes the accuracy of those newspaper horoscopes, and here it is — apologies for the low quality video.

Yep. Those daily horoscopes you read are general enough to be true for anyone, and confirmation bias means that you’ll latch onto the parts that fit you and ignore the parts that don’t (although, again, they’re designed to fit anyone). No one is going to remember the generic advice or predictions sprinkled in, and anyone who does will, again, apply confirmation bias and count only the ones they think came true.

“You are an intuitive person who likes to figure things out on your own, but doesn’t mind asking for help when necessary. This is a good week to start something new, but be careful on Wednesday. You also have a coworker who is plotting to sabotage you, but another who will come to your aid. Someone with an S in their name will become suddenly important, and they may be an air sign. When you’re not working on career, focus on home life, although right now your Jupiter is indicating that you need to do more organizing than cleaning. There’s some conflict with Mars, which says that you may have to deal with an issue you’ve been having with a neighbor. Saturn in your third house indicates stability, so a good time to keep on binge watching  your favorite show, but Uranus retrograde indicates that you’ll have to take extra effort to protect yourself from spoilers.”

So… how much of that fit you? Or do you think will? Honestly, it is 100% pure, unadulterated bullshit that I just made up, without referencing any kind of astrological chart at all, and it could apply to any sign because it mentions none.
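In the same spirit, you can fake the whole genre in a dozen lines. A toy generator with made-up phrase banks, no chart required:

```python
# Tiny "horoscope" generator that strings together Barnum statements. Every output
# sounds personal; none of it references any astrological chart.
import random

OPENERS = [
    "You are an intuitive person who likes to figure things out on your own.",
    "You have a creative streak that you don't always get to use.",
    "People rely on you more than they admit.",
]
ADVICE = [
    "This is a good week to start something new, but be careful midweek.",
    "Focus on home life once the workday ends.",
    "An unexpected message will change your plans.",
]
WILDCARDS = [
    "Someone whose name contains an S will become suddenly important.",
    "A small financial surprise is coming.",
    "Watch out for spoilers.",
]

def daily_horoscope():
    return " ".join(random.choice(bank) for bank in (OPENERS, ADVICE, WILDCARDS))

print(daily_horoscope())
```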

Plus, for the record, Uranus really does appear to go retrograde from Earth’s point of view for months every year, for the same catch-up-and-pass reason as Mercury, and it means exactly as little.

Conclusion

If you’re an adult, you really shouldn’t buy into this whole astrology thing. The only way any of the planets would have any effect at all on us is if one of them suddenly slammed into the Earth. That probably only happened once, if ever, but it’s probably what created the Moon. So ultimately not a bad thing… except for anything living here at the time.

Wednesday Wonders: How the world almost ended once

I happen to firmly believe that climate change is real, it is happening, and humans are contributing to and largely responsible for it, but don’t worry — this isn’t going to be a political story. And I’ll admit that I can completely understand some of the deniers’ arguments. No, not the ones that say that “global warming” is a hoax made up so that “evil liberals” in government can tax everyone even more. The understandable arguments are the ones that say, “How could mere humans have such a big effect on the world’s climate?” and “Climate change is cyclic and will happen with or without us.”

That second argument is actually true, but it doesn’t change the fact that our industrialization has had a direct and measurable impact in terms of more greenhouse gases emitted and the planet heating up. Also note: Just because you’re freezing your ass off under the polar vortex doesn’t mean that Earth isn’t getting hotter. Heat just means that there’s more energy in the system and with more energy comes more chaos. Hot places will be hotter. Cold places will be colder. Weather in general will become more violent.

As for the first argument, that a single species, like humans, really can’t have all that great an effect on this big, giant planet, I’d like to tell you a story that will demonstrate how wrong that idea is, and it begins nearly 2.5 billion years ago with the Great Oxygenation Event.

Prior to that point in time, the Earth was mostly populated by anaerobic organisms — that is, organisms that do not use oxygen in their metabolism. In fact, oxygen is toxic to them. The oceans were full of bacteria of this variety. The atmosphere at the time was about 30% carbon dioxide and close to 70% nitrogen, with perhaps a hint of methane, but no oxygen at all. Compare this to the atmosphere of Mars today, which is 95% carbon dioxide, 2.7% nitrogen, and less than 2% other gases. Side note: This makes the movie Mars Attacks! very wrong, because a major plot point was that the Martians could only breathe nitrogen, which is currently 78% of our atmosphere but almost absent in theirs. Oops!

But back to those anaerobic days and what changed them: A type of microbe called cyanobacteria (the blue-green algae) figured out the trick to photosynthesis — that is, producing energy not from food, but from sunlight and a few neat chemical processes. (Incidentally, this was also the first step on the evolutionary path to eyes.) Basically, these microscopic organisms would take in water and carbon dioxide, use the power of photons to break some bonds, and then release oxygen gas, nearly all of it stripped from the water, while keeping the carbon and hydrogen to build themselves. (The overall reaction, in short: carbon dioxide plus water plus light yields sugar plus oxygen.)

At first, things were okay because oxygen tended to be trapped by organic matter (any carbon containing compound) or iron (this is how rust is made), and there were plenty of both floating around to do the job, so both forms of bacteria got along fine. But there eventually came a point when there were not enough oxygen traps, and so things started to go off the rails. Instead of being safely sequestered, the oxygen started to get out into the atmosphere, with several devastating results.

First, of course, was that this element was toxic to the anaerobic bacteria, and so it started to kill them off big time. They just couldn’t deal with it, so they either died or adapted to a new ecological niche in low-oxygen environments, like the bottom of the sea. Second, though, and more impactful: All of this oxygen wound up taking out whatever atmospheric methane was left and converting it into carbon dioxide. The former is a more powerful greenhouse gas, and so had been keeping the planet warm; the latter was and still is less effective. The end result of the change was a sudden and very long ice age known as the Huronian glaciation, which lasted for 300 million years — the oldest and longest ice age to date. The result of this was that most of the cyanobacteria died off as well.

So there you have it. A microscopic organism, much smaller than any of us and without any kind of technology or even intelligence to speak of, almost managed to wipe out all life forms on the planet and completely alter the climate for tens of millions of years, and they may have tipped the balance in as little as a million years.

We are much, much bigger than bacteria — about a million times, actually — and so our impact on the world is proportionally larger, even though those bacteria vastly outnumbered our current population of around 7.5 billion. But these tiny, mindless organisms managed to wipe out most of the life on Earth at the time and change the climate for far longer than humans have even existed.

Don’t kid yourself by thinking that humanity cannot and is not doing the same thing right now. Whether we’ll manage to turn the planet into Venus or Pluto is still up for debate. Maybe we’ll get a little of both. But trying to hand-wave it away by claiming we really can’t have that much of an impact is the road to perdition. If single-celled organisms could nearly destroy the entire ecosystem, imagine how much worse we can do with our roughly 30 to 40 trillion cells apiece, and then do your best not to contribute to that destruction.

Talky Tuesday: Language is (still) a virus

The following is a repost from the end of March 2020, just eleven days after the lockdown started in Los Angeles. This was the view from the other side of the pandemic, before we knew anything.

I used this Burroughs quote as a post title a couple of years ago in an entirely different context, but the idea has taken on new relevance, as I’m sure the world can now agree.

This post’s title comes from a William S. Burroughs quote which reads in full as, “Language is a virus from outer space.”

What he meant by the first part is that words enter a host, infect it, and cause a change in it. Just as a virus hijacks a host’s cells in order to become little factories to make more virus to spread a disease, words hijack a host’s neurons in order to become little factories to make more words to spread ideas and belief systems.

As for the “outer space” part, I think that Burroughs was being metaphorical, with the idea being that any particular language can appear totally alien to any other. While, say, Mongolian and Diné may both come from humans on Earth, if a speaker of either encountered someone who only spoke the other, they might as well be from different planets because, for all intents and purposes, they are from different worlds, unable to communicate with words.

And the language we use can quite literally create and shape our perceptions of the world, as I discussed in my original Language is a virus post. One of the most striking examples I cited in that link was Guugu Yimithirr, an Aboriginal Australian language that has no words for relative direction. Instead, its speakers refer to everything by where it sits relative to the actual cardinal directions.

In other words, if you ask someone who speaks this language where you should sit, they won’t say, “In the chair on your left.” Rather, they’ll say something like, “In the chair to the north.” Or south, or east, or west. And a speaker of the language will know what that means, whether they can see outside or not.

Quick — right now, if someone said “Point east,” could you do it without thinking?

And that is how languages can change thoughts and perceptions.

But, sometimes — honestly, far too often — language can change perceptions to create tribalism and hostility, and during this plague year, that has suddenly become a huge topic of debate over a simple change of one C word in a phrase.

I’m writing, of course, about “coronavirus” vs. “Chinese virus.” And the debate is this: Is the latter phrase racist, or just a statement of fact?

One reporter from a rather biased organization did try to start the “it’s not” narrative with the stupidest question ever asked: “Mr. President, do you consider the term ‘Chinese food’ to be racist because it is food that originated from China?”

There are just two problems with this one. The first is that what Americans call “Chinese food” did not, in fact, originate in China. It was the product of Chinese immigrants in America who, being mostly men, didn’t know how to cook, and didn’t have access to a lot of the ingredients from back home. So… they improvised and approximated, and “Chinese food” was created by Chinese immigrants starting in San Francisco in the 19th century.

Initially, it was cuisine meant only for Chinese immigrants because racist Americans wouldn’t touch it, but when Chinatowns had sprung up in other cities, it was New York’s version that finally lured in the hipster foodies of the day to try it, and they were hooked.

In short, “Chinese food” was a positive and voluntary contribution to American culture, and the designation here is merely descriptive, so not racist. “Chinese virus” is a blatant misclassification at best and a definite attempt at a slur at worst, with odds on the latter.

But we’ve seen this with diseases before.

When it comes to simple misidentification of place of origin, let’s look back to almost exactly a century ago, when the Spanish flu went pandemic. From 1918 to 1919, it hit every part of the world, infected 500 million people and killed 50 million.

A little perspective: At the time, the world’s population was only 1.8 billion, so this represents an infection rate of about 28% and a mortality rate among the infected of about 10% (which works out to 2.8% of everyone alive at the time). If COVID-19 has similar statistics — and it seems to — then that means this pandemic would infect 2.1 billion people and kill around 210 million.
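
If you want to check that napkin math yourself, here’s a minimal sketch in Python, using the commonly cited 1918 figures from above and a rough 7.5 billion for today’s world population. It’s an order-of-magnitude extrapolation, not an epidemiological model.

```python
# Scaling the 1918 flu pandemic's rates up to today's population.
# A back-of-the-envelope sketch, not an epidemiological model.

world_1918 = 1.8e9        # world population in 1918
infected_1918 = 500e6     # commonly cited number of infections
deaths_1918 = 50e6        # commonly cited number of deaths

infection_rate = infected_1918 / world_1918    # share of everyone who caught it (~28%)
fatality_rate = deaths_1918 / infected_1918    # share of the infected who died (~10%)

world_now = 7.5e9         # rough current world population
projected_infected = world_now * infection_rate
projected_deaths = projected_infected * fatality_rate

print(f"Infection rate: {infection_rate:.0%}, fatality rate: {fatality_rate:.0%}")
print(f"Projected: {projected_infected / 1e9:.1f} billion infected, "
      f"{projected_deaths / 1e6:.0f} million dead")
```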

By the way, while the 1918 pandemic was very fatal to children under 5 and adults over 65, it also hit one other demographic particularly hard: 20 to 40 year-olds.

So if you’re in that age range and think that COVID-19 won’t kill you, think again — particularly if you smoke or vape or have asthma, and don’t take the quarantine seriously. And remember: the rich and world leaders are not immune either — not now and not then.

The president of the United States, Woodrow Wilson, caught that same H1N1 flu in 1919, and while he survived, the assault on his health probably contributed to the stroke he had late that year, an incident that was covered up by his wife with the help of the president’s doctor. The First Lady became de facto president for the remainder of his second term.

In modern times, the first world leader to test positive for coronavirus was Prince Albert II of Monaco, followed not long after by Prince Charles and Boris Johnson. Of course, I’m writing these words a bit ahead of when you’ll read them, so who knows what will have happened by then.

In medical circles, the name “Spanish Flu” has been abandoned, and that particular pandemic is now known as H1N1, which I’m sure looks really familiar to you, because this has been the standard nomenclature for flu viruses for a while: H#N#, sans location, animal, or occupation, more on which in a minute.

But first, let’s get to the reasons behind naming a disease after a place. The H1N1 Pandemic was a simple case of mistaken identity and also contingent upon that whole “Great War” stuff going on in Europe.

See, other countries had been hit by it first, but in the interests of the old dick-waving “Gotta appear strong before our enemies” toxic masculinity, all of them suppressed the news. It wasn’t until Spain started seeing it in their citizens and, because they were neutral, they announced outbreaks, that the world suddenly pointed fingers and said, “Ooh… this came from Spain. Hence, it’s the Spanish flu.”

Except, not. Ironically, it seems now that the Spanish flu originated in… China. Although that’s according to historians. Virologists, on the other hand, have linked it to an H1 strain later identified in pigs in Iowa in the U.S.

Either way, all of the countries involved in WW I, aka “The Great War,” kept mum about it.

So the name “Spanish flu” was a simple mistake. On the other hand, the names of other diseases actually are outright xenophobic or racist, and we only have to look as far  as syphilis to see why.

Basically, syphilis is an STI that was the most feared of its kind until… AIDS, because syphilis was not treatable or curable until penicillin was discovered in 1928 — although it was not produced on a mass scale until 1945, thanks to needs created by WW II, and facilitated by the War Production Board.

Hm. Sound familiar?

But the reason it became known as the French disease outside of France was that it began to spread after Charles VIII of France invaded Italy in 1494-95 to reclaim a kingdom he thought should be his. It was eventually so devastating that Charles had to take his troops home, and so it began to spread in France and across Europe.

Since it first showed up in French soldiers, it was quickly dubbed the French disease in Italy and England, although the French preferred to call it the Italian disease. In reality, it most likely originated in the New World and was brought back to Europe by… Columbus and his Spanish soldiers, who then managed to spread it to the French while serving them as mercenaries.

Hm. STI. A bunch of male soldiers. I wonder how that worked, exactly.

And I am totally destroying my future google search suggestions by researching all of this for you, my loyal readers, just so you know! Not to mention that I can’t wait to see what sort of ads I start getting on social media. “Confidential STI testing!” “Get penicillin without a prescription.” “These three weird tricks will cure the STI. Doctors hate them!”

But the naming of diseases all came to a head almost five years ago when the World Health Organization (WHO)  finally decided, “You know what? We shouldn’t name diseases after people, places, animals, occupations, or things anymore, because that just leads to all kinds of discrimination and offense, and who needs it?”

This directly resulted from the backlash against the naming of the last disease ever named for a place, despite the attempt to obfuscate that in its official terminology. Remember MERS, anyone?  No? That one came about in 2012, was first identified in Saudi Arabia, and was named Middle East respiratory syndrome.

Of course, it didn’t help when things were named swine flu or avian flu, either. A lot of pigs and birds died over those designations. So away went such terminology, especially because of the xenophobic and racist connotations of naming a disease after an entire country or people.

Of course, some antiquated folk don’t understand why it’s at the least racist and at the most dangerous to name diseases the old way, as evinced by the editorial tone of this article from The Federalist, a right-wing publication. But they actually kind of manage to prove the point that yes, such terminology is out of fashion, because the only 21st-century example they can list is the aforementioned MERS.

The most recent one before that? Lyme disease, named for Lyme, Connecticut, and designated in… the mid-70s. Not exactly the least racist of times, although this disease was named for a pretty white-bread area.

The only other examples of diseases named for countries on their list: the aforementioned Spanish flu; Japanese encephalitis, named in 1871 (seriously, have you ever heard of that one?); and German measles, identified in the 18th century, although more commonly known as rubella.

So, again — it’s a list that proves the exact opposite of what it set out to do, and calling coronavirus or COVID-19 the “Chinese virus” or “Chinese disease” is, in fact, racist as hell. And it won’t save a single life.

But calling it that will certainly endanger lives by causing hate crimes — because language is a virus, and when people are infected by malignant viruses, like hate speech, the results can be deadly.

Inoculate yourselves against them with education if possible. Quarantine yourselves against them through critical thinking otherwise. Most of all, through these trying times, stay safe — and stay home!

Image source: Coronavirus Disease 2019 Rotator Graphic for af.mil. (U.S. Air Force Graphic by Rosario “Charo” Gutierrez)

Momentous Monday: Questions that plague us

From March 2020, three days into the first COVID-19 lockdown, before we knew how far the plague would spread or how long the lockdowns and social distancing would last.

It can easily be argued that Europe conquered the Americas not through armed assault, but via unintended biological warfare. While Christopher Columbus and those who came after arrived in the New World with plants, animals, and diseases, it’s the latter category that had the most profound effect.

This transfer of things between the Old World and the New has been dubbed the Columbian Exchange. Thanks to the European habit, starting the next century, of stealing Africans to enslave, diseases from that continent were also imported to the Americas.

Of course, in Europe and Africa, everyone had had time to be exposed to all of these things: measles, smallpox, mumps, typhus, whooping cough, malaria, and yellow fever. As a result, those diseases either killed off a large number of children before the age of six or left the survivors with natural immunity.

Influenza, aka flu, was the one exception that no one became immune to because that virus kept mutating and evolving as well.

Depending upon the area, the death rates of Native Americans were anywhere from 50 to 99 percent of the population. And they didn’t really send as many diseases back as they were “gifted with” by us, although Columbus’ men did bring syphilis home to Europe thanks to their habit of fucking sheep.

Of course, conquest through infection and violence is nothing new, as Jared Diamond’s 1997 book Guns, Germs, and Steel posits.

Nothing will freak out a human population faster than a deadly disease, especially one that just won’t go away, and the plague, aka The Black Death, regularly decimated Europe for three hundred years. It had a profound effect on art during its reign, which stretched all the way through the Renaissance and on into the Age of Reason.

But one of the positive side effects of that last visit of the plague to London in 1665 is that it led to the Annus Mirabilis, or “year of wonders,” for one Isaac Newton, a 23-year-old (when it started) mathematician, physicist, and astronomer.

Just as so many students are experiencing right now, Newton’s university shut down in the summer of 1665 to protect everyone from the plague, and so he self-isolated at his family home in Woolsthorpe for a year and a half, where he came up with his theories on calculus, optics, and the law of gravitation.

He basically kick-started modern physics. His ideas on optics would lead directly to quantum physics, and his ideas on gravitation would inspire Einstein to come up with his general and special theories of relativity.

Meanwhile, calculus gave everyone the tool they would need to deal with all of the very complicated equations that would lead to and be born from the above mentioned subjects.

And if Isaac Newton hadn’t been forced to shelter in place and stay at home for eighteen months, this might have never happened, or only happened much later, and in that case, you might not even have the internet on which to read this article.

In case you didn’t realize it, communicating with satellites — which relay a lot of internet traffic — and using GPS to find you both rely on that physics: the atomic clocks these systems run on are quantum-mechanical devices, and their timing is so precise that relativistic effects come into play. Clocks on satellites in orbit run at a slightly different rate than clocks down here, and we need to do the math to account for it.
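
To give a sense of the size of that math, here’s a quick back-of-the-envelope sketch in Python using rough published values for Earth’s gravity and the GPS orbit. The real corrections account for more than this (Earth’s rotation, orbital eccentricity, and so on), so treat the numbers as ballpark.

```python
# Rough estimate of the relativistic clock drift on a GPS satellite.
# Approximate textbook values only -- a sketch, not the actual GPS correction.

GM = 3.986004418e14    # Earth's gravitational parameter, m^3 / s^2
C = 299_792_458.0      # speed of light, m / s
R_EARTH = 6.371e6      # mean Earth radius, m
R_ORBIT = 2.656e7      # GPS orbital radius (about 20,200 km altitude), m
DAY = 86_400           # seconds per day

# General relativity: weaker gravity at altitude makes the satellite clock run faster.
gr_shift = (GM / C**2) * (1 / R_EARTH - 1 / R_ORBIT)

# Special relativity: the satellite's orbital speed makes its clock run slower.
v = (GM / R_ORBIT) ** 0.5      # circular orbital speed, roughly 3.9 km/s
sr_shift = -v**2 / (2 * C**2)

print(f"GR gain:   {gr_shift * DAY * 1e6:+.1f} microseconds per day")
print(f"SR loss:   {sr_shift * DAY * 1e6:+.1f} microseconds per day")
print(f"Net drift: {(gr_shift + sr_shift) * DAY * 1e6:+.1f} microseconds per day")
# The net comes out to roughly +38 microseconds per day -- ignore it, and GPS
# position fixes would drift by kilometers within a single day.
```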

Plus we never would have been able to stick those satellites into the right orbits at the right velocities in the first place without knowing how gravity works, and without the formulae to do all the necessary calculations.

There’s a modern example of a terrible pandemic ultimately leading to a greater good, though, and it’s this. America and a lot of the western world would not have same-sex marriages or such great advances in LGBTQ+ rights without the AIDS crisis that emerged in 1981.

AIDS and the thing that causes it, HIV, are actually a perfect parallel for the terms you’ve been hearing lately. The novel coronavirus is the virus, like HIV; neither one becomes the serious problem until a person develops the condition it causes, which is COVID-19 in one case and AIDS in the other.

But getting back to how the AIDS crisis advanced gay rights, it began because the federal government ignored the problem for too long and people died. Hm. Sound familiar? And, as I mentioned above, nothing will make people flip their shit like a life-threatening disease, especially one that seems to be an incurable pandemic.

And so the gay community got down to business and organized, and groups like ACT-UP and Queer Nation took to the streets and got loud and proud. In 1987 in San Francisco (one of the places hardest hit by AIDS), the NAMES Project began creation of the AIDS Memorial Quilt, commemorating all of the people who died of the disease.

And a funny thing happened going into the 90s. All of a sudden, gay characters started to be represented in a positive light in mainstream media. And then gay performers started to come out — Scott Thompson of The Kids in the Hall fame being one of the early notable examples, long before Ellen did.

Around the time Thompson came out, of course, a famous straight person, Magic Johnson, announced in 1991 that he was HIV positive, and that’s when people who were not part of the LGBTQ+ community freaked the fuck out.

Note, though, that Magic is still alive today. Why? Because when he made his announcement, straight people got all up on that shit and figured out ways to reduce viral loads and extend lifespans and turn an HIV diagnosis into something other than the death sentence it was almost 30 years ago.

And almost 40 years after the crisis started, we seem to have finally created a generation of young people (whatever we’re calling the ones born from about 1995 to now) who are not homo- or transphobic, really aren’t into labels, and don’t try to define their sexualities or genders in binary terms in the first place.

On the one hand, it’s terrible that it took the deaths of millions of people to finally get to this point. On the other hand, maybe, just maybe, this current pandemic will inspire a similar kind of activism that might just lead to all kinds of positives we cannot even predict right now, but by 2040 or 2050 will be blatantly obvious.

Stay safe, stay at home, wash your hands a lot, and figure out your own “Woolsthorpe Thing.” Who knows? In 2320, your name could be enshrined in human culture for any number of things.