Wondrous Wednesday: 5 Things that are older than you think

Quarantine is hard, so rather than post nothing at all, here’s a blast from the past: an article posted in February 2019 that is still relevant today.

A lot of our current technology seems surprisingly new. The iPhone is only twelve years old, for example, although the first BlackBerry, a more primitive form of smartphone, came out in 1999. The first actual smartphone, IBM’s Simon Personal Communicator, was introduced in 1992 but not available to consumers until 1994. That was also the year that the internet really started to take off with people outside of universities and government, although public connections to it had been available as early as 1989 (remember CompuServe, anyone?), and the first experimental internet nodes were connected in 1969.

Of course, to go from room-sized computers communicating via acoustic modems along wires to handheld supercomputers sending their signals wirelessly via satellite took some evolution and development of existing technology. Your microwave oven has a lot more computing power than the system that helped us land on the moon, for example. But the roots of many of our modern inventions go back a lot further than you might think. Here are five examples.

Alarm clock

As a concept, alarm clocks go back to the ancient Greeks, frequently involving water clocks. These were designed to wake people before dawn: in Plato’s case, to make it on time to class, which started at daybreak. Later, they woke monks so that they could pray before sunrise.

From the late Middle Ages, church towers became town alarm clocks, with the bells set to strike at one particular hour per day, and personal alarm clocks first appeared in 15th-century Europe. The first American alarm clock was made by Levi Hutchins in 1787, but he only made it for himself since, like Plato, he got up before dawn. Antoine Redier of France was the first to patent a mechanical alarm clock, in 1847. Production halted during WWII, as metal and machine shops were appropriated for the war effort, and with older clocks breaking down in the meantime, alarm clocks became one of the first consumer items to return to mass production, just before the war ended. Atlas Obscura has a fascinating history of alarm clocks that’s worth a look.

Fax machine

Although it’s pretty much a dead technology now, the fax machine was the height of office high tech in the 80s and 90s. Today, you’d be hard-pressed to find a fax machine that isn’t just part of the built-in hardware of a multi-purpose networked printer, and that’s only because it’s such a cheap legacy feature to include. But it might surprise you to know that the prototypical fax machine, originally an “Electric Printing Telegraph,” dates back to 1843. Basically, as soon as humans figured out how to send signals down telegraph wires, they started to figure out how to encode images, and you can bet that the second image ever sent that way was a dirty picture. Or a cat photo. Still, it took until 1964 for Xerox to finally figure out how to use this technology over phone lines and create the Xerox LDX. The scanner/printer combo was available to rent for $800 a month, the equivalent of around $6,500 today, and it could transmit pages at a blazing eight per minute. The second-generation fax machine only weighed 46 lbs and could send a letter-sized document in only six minutes, or ten pages per hour. Whoot, progress! You can actually see one of the Electric Printing Telegraphs in action in the 1948 movie Call Northside 777, in which it plays a pivotal role in sending a photograph cross-country in order to exonerate an accused man.

In case you’re wondering, the title of the film refers to a telephone number from back in the days before what was originally called “all digit dialing.” Until then, telephone exchanges (what we now call prefixes) were identified by the first two letters of a word, followed by another one, two, or three digits. (Once upon a time, in some areas of the U.S., phone numbers only had five digits.) So NOrthside 777 would resolve to 667-77, with 667 being the prefix. This system started to be phased out in 1958, and a lot of people didn’t like that.
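Just for fun, here’s a little sketch of my own (purely illustrative, not anything official from the phone company) showing how one of those old exchange names resolves to digits. It uses the standard lettering of old rotary dials, which usually omitted Q and Z:

```python
# Letter groups on a classic rotary dial (Q and Z were usually absent)
DIAL = {
    'ABC': '2', 'DEF': '3', 'GHI': '4', 'JKL': '5',
    'MNO': '6', 'PRS': '7', 'TUV': '8', 'WXY': '9',
}
LETTER_TO_DIGIT = {c: d for group, d in DIAL.items() for c in group}

def old_number_to_digits(number: str) -> str:
    """Convert an old-style number like 'NOrthside 777' to its dialed digits.

    Only the first two letters of the exchange name were actually dialed;
    the rest of the word was just there to make it memorable.
    """
    word, _, rest = number.partition(' ')
    prefix = ''.join(LETTER_TO_DIGIT[c] for c in word[:2].upper())
    return prefix + rest

print(old_number_to_digits("NOrthside 777"))  # prints 66777, i.e. 667-77
```

The same scheme turns the famous PEnnsylvania 6-5000 into 736-5000, which is why those old song titles still map onto dialable numbers today.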

Of course, with the advent of cell phones, prefixes and even area codes have become pretty meaningless, since people tend to keep the number from their home town regardless of where they move, and the “long distance call” is mostly a dead concept now as well, which is probably a good thing.

CGI

When do you suppose the first computer animation appeared on film? You may have heard that the first 2D computer-generated imagery (CGI) used in a movie was in 1973, in the original film Westworld, inspiration for the recent TV series. Using very primitive equipment, the visual effects designers simulated pixelation of actual footage in order to show us the POV of the robotic gunslinger played by Yul Brynner. It turned out to be a revolutionary effort.

The first 3D CGI happened to be in this film’s sequel, Futureworld, in 1976, where the effect was used to create the image of a rotating 3D robot head. However, the first ever CGI sequence was actually made in… 1961. Called Rendering of a planned highway, it was created at the Swedish Royal Institute of Technology on what was then the fastest computer in the world, the BESK, driven by vacuum tubes. It’s an interesting effort for the time, but the results are rather disappointing.

Microwave oven

If you’re a Millennial, then microwave ovens have pretty much always been a standard accessory in your kitchen, but home versions don’t predate your birth by much. Sales began in the late 1960s. By 1972 Litton had introduced microwave ovens as kitchen appliances. They cost the equivalent of about $2,400 today. As demand went up, prices fell. Nowadays, you can get a small, basic microwave for under $50.

But would it surprise you to learn that the first microwave ovens were created just after World War II? In fact, they were the direct result of it, due to a sudden lack of demand for magnetrons, the devices used by the military to generate radar in the microwave range. Not wanting to lose the market, their manufacturers began to look for new uses for the tubes. The idea of using radio waves to cook food went back to 1933, but those devices were never developed.

Around 1946, engineers accidentally realized that the microwaves coming from these devices could cook food, and voilà! In 1947, the technology was developed, although only for commercial use, since the devices were taller than an average man, weighed 750 lbs, and cost the equivalent of $56,000 today. It took 20 years for the first home model, the Radarange, to be introduced, for the mere sum of $12,000 in today’s dollars.

Music video

Conventional wisdom says that the first music video ever to air went out on August 1, 1981 on MTV, and that it was “Video Killed the Radio Star” by The Buggles. As is often the case, conventional wisdom is wrong. It was the first to air on MTV, but the concept of putting visuals to rock music as a marketing tool goes back a lot farther than that.

Artists and labels were making promotional films for their songs from almost the beginning of the 1960s, with the Beatles a prominent example. Before these, though, was the Scopitone, a jukebox that could play films in sync with music, popular from the late 1950s to the mid-1960s. Its predecessor was the Panoram, a similar concept popular in the 1940s, which played short programs called Soundies. These programs ran on a continuous loop, however, so you couldn’t choose your song.

Soundies were produced until 1946, which brings us to the real predecessor of music videos: Vitaphone Shorts, produced by Warner Bros. as sound began to come to film. Some of these featured musical acts and were essentially miniature musicals themselves. They weren’t shot on video, but they introduced the concept all the same. Here, you can watch a particularly fun example from 1935 in 3-strip Technicolor that also features cameos by various stars of the era in a very loose story.

Do you know of any things that are actually a lot older than people think? Let us know in the comments!

Photo credit: Jake von Slatt

Sunday Nibble #13: Taking pause

I don’t know what designation historians will come up with for the year 2020 — or even if it will be limited to just one year — but it will definitely be one of those great cultural markers that represents a hard stop, an irrefutable before and after point in human history.

It’s also going to have that significance in every single country and culture on the planet, and I can’t think of a precedent in all of human history. There are certainly hard stops that had profound though geographically limited effects, like the fall of the Roman Empire, the end of the Aztec Empire, and the Reconquista, to mention three that mostly affected the Western world.

Larger regions were affected by things like the Napoleonic Wars, and both the Great War and its unimaginatively named sequel World War II — but there were places that largely escaped the direct influence of those events. Asia, Australia, and most of Africa were untouched by Napoleon.

The World Wars may not have directly threatened every country on every continent, but they indirectly changed things even for the places they never touched. They certainly changed world politics forever, leaving us with the Cold War and its aftermath.

This current plague is different in that no country on the planet has escaped it, and no person in the world is unaffected, period.

It’s as if the entire planet has become London in 1665, when the entire city was shut down by plague. The bad news there is that the thing that effectively ended the outbreak was the Great Fire of London the following year, which destroyed densely populated and impoverished areas, driving out the rats that carried the fleas that were the ultimate cause of the disease. The true human death toll isn’t known.

Contemporary writers claimed that few people perished, but the fire burned so hot that entire communities could have been cremated without leaving any evidence behind.

It does feel, though, like we’re going to see another Great Fire in a metaphorical sense, as old institutions and ways collapse, never to exist again. If the lockdowns and lack of governmental help last long enough, then we may see widespread revolutions. At the very least, there may be general strikes that will starve the ruling classes of their income.

There is hope in the darkness, though, and I see it whenever I take the dog on a very limited walk and look up at the sky to see how clean it is. We’ve also had a lot more rain here than we’ve had for a while, and it’s unseasonal. It feels like the planet has decided to take a shower and clean up while we’re all inside.

I have friends who are at home sewing masks and others who are making videos or hosting shows on Zoom to keep people entertained. Still others are making sure that friends get things they need if they don’t have them, all while social distancing.

My improv group has been meeting regularly on Mondays via Zoom for some mutual self-care and to perform, and the main ComedySportz L.A. improv company itself has been having online shows that have been selling out every Saturday night.

I’ve seen very little in the way of stupid directly, and for the most part people are maintaining social distance and wearing masks. The few moments of stupid I have seen weren’t recent, and they were in the grocery store, when a large group of people, generally youngish and clearly not all living together, would come in to hit the liquor aisle and then all stand really close to each other.

Currently, the only stupid I’ve seen has come from the very few people who go to the grocery store without a mask or, extra special stupid, with a mask that’s pulled down so it doesn’t cover their nose.

Sigh.

I do think that there’s a special place in hell, though, for a few Instagram “influencers” I’ve noticed who are still going out into the world to shoot their “OMG this is so fucking important” bullshit. I won’t mention the names of the offenders, but one in particular was stupid enough to post time-stamped video of a bunch of unmasked people working in what I assume is some sort of communal office space, or riding as a group in the same van very close together.

Oh yeah, in that one, the person shooting also shows the speedometer, and ass-boy is doing 125 mph down the highway — while one of the group is standing in the back of the van.

I will mention one influencer who’s doing the right thing: Juanpa Zurita, who is stuck in isolation with his entire family somewhere in Mexico. They’ve been spending their time making masks and face guards for health care workers, not going outside, as well as pranking each other, and otherwise just being entertaining.

So, I don’t know. Maybe future historians will call this period “The Year When the World Stayed at Home,” or “The Great Pause,” or “The Global Reset.”

Another name for it might be “The Darwin Awards Ultimate World Championship.”

I am doing my best to not win any awards in that competition, and I hope that you are, too. Tomorrow was originally supposed to be the end of the lockdown here in L.A., but it was extended to May 15 over a week ago. I’m not holding out any hope that that date won’t be extended, either.

But whatever it takes to pull the planet through this, let’s just team up and do it.

Momentous Monday: Mkay…

That’s MK as in MK-Ultra, one of the few conspiracies that actually happened, although, of course, it didn’t stay secret forever. The program began on April 13, 1953, which is why I bring it up today. It was exposed by the Church Committee in 1975, meaning it stayed secret for 22 years.

That committee was formed by the U.S. Senate precisely to investigate illegal activities by the CIA, like spying on U.S. citizens. MK-Ultra, though, was even darker than that. Its goal was nothing less than mind-control, and it had its roots in Nazi concentration camps and Japan’s infamous Unit 731. The CIA even worked with officers and torturers from both places.

The Nazis had used mescaline on prisoners as a way of trying to make them compliant. Meanwhile, Japan had focused mostly on biological weapons, although they weren’t beyond using live vivisection as a method of torture.

In case you’re wondering: while the Nazis’ main (but not only) targets were Jews, Japan mostly went after the Chinese, of whom they’re still not big fans. They aren’t so fond of Koreans either, though. But that’s got nothing to do with MK-Ultra.

Oddly enough, it was the Korean War that was the catalyst for the whole project starting, as American POWs returning from there began making claims against the U.S. that were not true. Well, not true according to Allen Dulles, newly-appointed head of the CIA.

But the determination, and the warning, was that the “commies” on the other side of the Cold War had developed mind control and brainwashing, and the U.S. had to do the same to fight back.

Never mind whether that last part was true or not. And, by the way, it only took six years for this idea to leak into literature, with the publication in 1959 of The Manchurian Candidate, which became an amazing and chilling movie three years later. Here’s the opening scene. You should all go watch this film now.

Again, the program started three days after Dulles gave a speech about the dangers of brainwashing and, taking a cue from the Nazis, the CIA worked with LSD, which happened to be legal at the time and, in fact, was being investigated as a psychiatric medication. Even Cary Grant tripped balls.

Of course, the big difference was that in those studies, the subjects had informed consent. The CIA, on the other hand, was pretty much playing Bill Cosby and slipping the drugs to people without their knowledge or consent.

That’s probably where tips from the Japanese biowarfare programs came in, by the way — how to “infect” somebody with something without their knowledge — although the government was also kind of open about it, at least in secret, if that makes sense.

See, after MK-Ultra got started, a man named Sidney Gottlieb arranged for the CIA to pay almost a quarter million dollars in order to import the world’s entire supply of LSD to the U.S., and then (using front organizations) urged hospitals, clinics, prisons, and other such institutions to start experimenting with it and reporting the results.

There’s a 2019 book called Poisoner in Chief that details all of this. If you’re sitting around the house not doing anything else, you should read it. Basically, the government tricked a bunch of medical and corrections professionals into unknowingly carrying out very unethical experiments for them.

That Gottlieb link above is worth a read, too, because in excerpts from the book, it details how the CIA moved its MK-Ultra program offshore to go beyond clinical abuse of LSD and actually get into abduction, torture, and worse.

The goal of brainwashing was to destroy the existing mind and replace it with a new one, although whether it actually works is up for debate. It’s easy to destroy the existing mind — i.e. “ego” — but very difficult to build a new one, at least without consent.

But if you can get consent, you don’t need to destroy anything. The new mind will build itself for you.

I can attest to this from personal experience. When I was in high school, I fell under the influence of a very evil group called Young Life, which is an evangelical Christian organization that basically invades schools and tries to recruit your kids.

How my school, or any school, let it happen, I’ll never know, but their recruiter, a 28-year-old guy named Sandy, used to somehow regularly get access to campus and come hang out and talk to us during recess and lunch.

It all started innocuously enough, with Monday night meetings that were mostly fun hangouts with skits and singing and whatever, but then at the end there’d be the, “Hey, Jesus is cool” message. And at those meetings, it didn’t come with any of the collateral “But Jesus hates (fill in the blank)” crap.

Looking back as an adult, it’s clear that they targeted the awkward kids who didn’t fit in with the jocks and cheerleaders and whatnot. Marching band, for example, was lousy with Young Life members. And that was the brainwashing hook: “Hey, you’re cool here!”

I drank that Kool-Aid for almost two years. I went to a couple of sleep-away camps and worked (for free) for six weeks at one in Canada, and around the end of high school I started going to a fundie Pentecostal evangelical Foursquare church that openly preached the gospel of hatred against the LGBTQ community, Jews, liberals, and so on.

Thankfully, I was saved from this crap by… (wait for it) actually reading the Bible during my freshman year of college — ironically, at a Jesuit university — and halfway through the Old Testament I realized, “Holy crap, this is complete and utter bullshit.”

But the brainwashing pattern there is clear. Friend those who think they’re friendless. Make them feel needed and wanted. Reel them in.

Or… follow the government method, and drug or torture them into compliance. Come to think of it, that was the religious method too, until churches discovered marketing.

But not all of the MK-Ultra “experiments” took place in clinics. One incident in particular eventually led to the investigations of the Church Committee. In 1953, a man named Frank Olson died after a fall out of his 10th-floor hotel room window in New York City. He was actually an MK-Ultra insider and he knew all about various things, including the tortures overseas.

Nine days before the fall, he and a group of other members of the team had been dosed with LSD without their knowledge or consent by Gottlieb at a retreat for the CIA’s Technical Services staff. Well, Gottlieb did inform them, but only after they’d finished the spiked bottle of Cointreau.

It was not a great experience for several of the men, including Olson, who started considering resigning the next day. The problem was, as mentioned above, he knew everything about everything. It’s entirely likely that his trip out that hotel window was not a suicide.

Now, I’ve had personal experience with LSD, so I know what it can do. In the right doses and settings, it can be remarkable. But I can also see how somebody being given it without their knowledge and in very high amounts would easily freak out.

Without warning, it would feel like the sudden onset of acute psychosis, with hallucinations and even loss of a sense of self. Another big effect is hyper-awareness of everything, especially all of the minute sounds and smells your body produces. Yes, I’ve heard myself blink.

Your brain’s need to spot patterns goes into overdrive, and under the influence it isn’t limited to spotting faces in toast. Any random pattern, like white noise on a TV or a stucco ceiling, will suddenly turn into elaborate geometric patterns of astounding complexity and regularity.

Mine tended to follow the kaleidoscope pattern of six triangles joined in a hexagon, although your mileage may vary. As for the “stained glass windows” I would see when I closed my eyes, those colors would generally be what I can only describe as electric neon shades of pink, purple, and cyan.

Once, while listening to Pink Floyd’s Great Gig in the Sky, those stained glass patterns also included lots and lots of boobs, probably because of the female vocalist, but it was an odd touch considering that I’m mostly on the gay side of the Kinsey scale. Not completely, but close enough for jazz hands.

So do governments contemplate insanely heinous and unethical acts for the sake of national self-preservation? All the time. Do they carry them out often? Not really, because saner heads do prevail and do put the brakes on some of the more batshit insane ideas.

Ideas like Operation Northwoods, which would have used false-flag operations to justify an invasion of Cuba in the early 60s, or the 638 ideas for assassinating Fidel Castro that were considered but mostly never implemented.

Hm. The CIA seemed to have a boner for getting rid of Castro right before the Cuban Missile Crisis, but we know about all of that again thanks to the Church Committee. And they were so successful at it that the man died at 90 in 2016.

Keep that last part in mind the next time you think that there might be a government conspiracy going on. Governments are no good at them, and people are no good at keeping secrets. Ergo, most conspiracies fall apart quickly, and either never happen or are exposed.

As Ben Franklin said, “Three may keep a secret, if two of them are dead.”

Image source: Voice of America/public domain

Theatre Thursday: Fact and fiction

Six hundred and seven years ago today, Henry V was crowned king of England. You probably know him as that king from the movie with Kenneth Branagh, or the BBC series aired under the title The Hollow Crown.

Either way, you know him because of Shakespeare. He was the king who grew up over the course of Shakespeare’s second tetralogy. Yes, that’s a set of four plays, and since it was his second, Shakespeare sort of did the Star Wars thing first: he wrote eight plays on the dynastic struggles for the English crown that culminated in the Wars of the Roses.

And, much like Lucas, he wrote the original tetralogy first, then went back and did the prequels. Richard II, Henry IV, Part 1, Henry IV, Part 2 and Henry V were written after but happened before Henry VI, Part 1, Henry VI, Part 2, Henry VI, Part 3 and Richard III.

Incidentally, Henry VI, Part 1, is famous for having Joan of Arc (aka Joan la Pucelle in the play) as one of the antagonists. Funny thing is, that name wasn’t slander on Shakespeare’s part. That’s what she preferred to call herself.

Meanwhile, Richard III, of course, is the Emperor Palpatine of the series, although we never did get a Richard IV, mainly because he never existed in history. Well, not officially. Richard III’s successor was Henry VII, and Shakespeare never wrote about him, either, although he did gush all over Henry VIII, mainly because he was the father of the Bard’s patron, Elizabeth I. CYA.

If you’ve ever seen the film My Own Private Idaho, directed by Gus Van Sant and starring River Phoenix and Keanu Reeves, then you’ve seen a modern retelling of the two parts of Henry IV.

Now when it comes to adapting true stories to any dramatic medium, you’re going to run into the issue of dramatic license. A documentary shouldn’t have this problem and shouldn’t play with the truth, although it happens. Sometimes, it can even prove fatal.

But when it comes to a dramatic retelling, it is often necessary to fudge things, sometimes a little and sometimes a lot. It’s not at all uncommon for several characters to be combined into a composite just to make for a cleaner plot. After all, is it that big of a difference if, say, King Flagbarp IX in real life was warned about a plot against him in November by his chamberlain Norgelglap, but the person who told him the assassin’s name in February was his chambermaid Hegrezelda?

Maybe, maybe not, but depending on what part either of those characters plays in the rest of the story, as well as the writer’s angle, they may both be combined as Norgelglap or as Hegrezelda, or become a third, completely fictionalized character, Vlanostorf.

Time frames can also change, and a lot of this lands right back in Aristotle’s lap. He laid down the rules of drama long before hacks like the late Syd Field tried (and failed) to reinvent them, and Aristotle put it succinctly: every dramatic work has a beginning, a middle, and an end, and should have unity of place, unity of time, and unity of action.

A summary of the last three is this, although remember that Aristotle was writing about the stage. For film and video, your mileage will vary slightly.

The story takes place in one particular location, although that location can be a bit broad. It can be the king’s castle, or it can be the protagonist’s country.

It should take place over a fairly short period of time. Aristotle liked to keep it to a day, but there’s leeway, and we’ve certainly seen works that have taken place over an entire lifetime — although that is certainly a form of both unity of time and unity of place, if you consider the protagonist to be the location as well.

Unity of action is a little abstract, but in a nutshell it’s this: Your plot is about one thing. There’s a single line that goes from A to Z: What your protagonist wants, and how they get it.

Now, my own twist on the beginning, middle, and end thing is that this three-act structure gives us twenty-seven units. (Later classical critics, Horace in particular, were big on five acts, which Shakespeare used, but that’s long since fallen out of fashion.)

Anyway, to me, we have Act I, II, and III. Beginning, middle, and end. But each of those has its own beginning, middle and end. So now we’re up to nine: I: BME; II: BME; III: BME.

Guess what? Each of those subunits also has a beginning, middle, and end. I’m not going to break that one down further than this. The beginning of the beginning, Act I: B, has its own BME, repeat eight more times.

The end result is 3 x 3 x 3, or twenty-seven.
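If you like seeing that arithmetic enumerated, here’s a quick sketch of my own (purely illustrative, and the unit labels are my invention) that spells out all twenty-seven units:

```python
from itertools import product

acts = ["I", "II", "III"]  # Act I, II, III: the story's beginning, middle, end
parts = ["B", "M", "E"]    # each act's own beginning, middle, end

# Each act has three parts, and each part has three sub-parts: 3 x 3 x 3 = 27.
# A label like "II.M.B" means: the beginning of the middle of Act II.
units = [f"{act}.{part}.{sub}" for act, part, sub in product(acts, parts, parts)]

print(len(units))            # prints 27
print(units[0], units[-1])   # prints I.B.B III.E.E
```

The first unit, I.B.B, is the beginning of the beginning of Act I, and the last, III.E.E, is the end of the end of Act III, which is exactly the span a finished story has to cover.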

And that’s my entire secret to structure. You’re welcome.

But because of these little constraints, and because history is messy, it’s necessary to switch things up to turn a true story into a “based on true events” work. Real life doesn’t necessarily have neat beginnings, middles, and endings. It also doesn’t necessarily take place in one spot, or in a short period of time.

So it becomes the artist’s job to tell that story in a way that is as true to reality as possible without being married to the facts.

Although it is also possible to go right off the rails with it, and this is one of the reasons I totally soured on Quentin Tarantino films. It’s one thing to fudge facts a little bit, but when he totally rewrites history in Inglourious Basterds, ignores historical reality in Django Unchained, and then curb-stomps reality and pisses on its corpse in Once Upon a Time in Hollywood, I’m done.

Inglorious Misspelling is a particularly egregious example, because the industry does a great disservice in selling false history to young people who, unfortunately, aren’t getting the best educations right now.

Anecdotal moment: A few years back, an Oscar-winning friend of mine had a play produced that told the story of the 442nd Infantry Regiment, a unit composed almost entirely of second-generation Japanese Americans during WWII. Joining it was the alternative offered to going to an internment camp.

Of course, being racists, the U.S. government couldn’t send them to the Pacific Theatre to fight, so they sent them to Europe, and a lot of the play takes place in Italy, where the regiment was stationed. And, at intermission, my playwright friend heard two 20-something audience members talking to each other. One of them asked, “What was the U.S. even doing in Italy in World War II?” and the other just shrugged and said, “Dunno.”

So, yeah. If you’re going to go so far as to claim that Hitler was killed in a burning movie theater before the end of the war, just stop right there before you shoot a frame. Likewise with claiming that the Manson murders never happened because a couple of yahoos ran into the killers first.

Yeah, Quentin, you were old, you were there, you remember. Don’t stuff younger heads with shit.

But I do digress.

In Shakespeare’s case, he was pretty accurate in Henry V, although in both parts of Henry IV, he created a character who was both one of his most memorable and one of his more fictional: Sir John Falstaff. In fact, the character was so popular that, at the Queen’s command, Shakespeare gave him his own spinoff, The Merry Wives of Windsor. Hm. Shades of Solo in the Star Wars universe?

Falstaff never existed in real life, but was used as a way to tell the story of the young and immature Henry (not yet V) of Monmouth, aka Prince Hal.

Where Shakespeare may have played more fast and loose was in Richard III. In fact, the Bard vilified him when it wasn’t really deserved. Why? Simple. He was kissing up to Elizabeth I. She was a Tudor, daughter of Henry VIII who, as mentioned previously, was the son of Henry VII, the king who took over when Richard III lost the Wars of the Roses.

The other time that Shakespeare didn’t treat a king so well in a play? King John — which I personally take umbrage to, because I’m directly descended from him. No, really. But the idea when Willie Shakes did that was to draw a direct contrast to how Good Queen Bess did so much better in dealing with Papal interference. (TL;DR: He said, “Okay,” she said, “Eff off.”)

Since most of my stage plays have been based on true stories, I’ve experienced this directly many times, although one of the more interesting instances came with the production of my play Bill & Joan, because I actually, accidentally, got something right.

When I first wrote the play, the names of the Mexican cops who interrogated its subject, William S. Burroughs, were not included in the official biography, so I made up two fictional characters, Enrique and Tito. And so they stayed right up into pre-production in 2013.

Lo and behold, a new version of the biography of Burroughs I had originally used for research came out, and I discovered two amazing things.

First… I’d always known that Burroughs’ birthday was the day before mine, but I suddenly found out that his doomed wife actually shared my birthday. And the show happened to run during both dates.

Second… the names of the cops who interrogated him were finally included, and one of them was named… Tito.

Of course, I also compressed time, moved shit around, made up more than a few characters, and so forth. But the ultimate goal was to tell the truth of the story, which was: Troubled couple who probably shouldn’t have ever gotten together deals with their issues in the most violent and tragic way possible, and one of them goes on to become famous. The other one dies.

So yes, if you’re writing fiction it can be necessary to make stuff up, but the fine line is to not make too much stuff up. A little nip or tuck here and there is fine. But, outright lies? Nah. Let’s not do that.

Momentous Monday: Interesting times

There is an alleged Chinese curse that did not come from China at all, and which may not even have been meant as a curse when Joseph Chamberlain first mentioned it. The phrase goes like this: “May you live in interesting times.”

The implication, of course, is that interesting times are dangerous ones.

Right now, in the spring of 2020 C.E., the entire planet is living in interesting times, and I have a feeling that all of human history is going through a process of change that will be marked and noted by historians from here on out.

Congratulations, fellow humans. We are indeed living through a profound moment that will leave a different world behind, and those of us who survive it will be able to tell future generations, “Yeah. I was there. We never saw it coming, but it changed everything.”

At the moment, the day to day changes may seem weird and trivial — or not — but consider this. When was your last normal trip to the grocery store? When was the last time you found everything on your list? Why is there still no goddamn TP?

Or eggs and skim milk — those are the weird shortages, actually, because America just makes so goddamn much of both. Oh, sure, we’re lousy with over-priced “organic” bullshit eggs, as well as 2% and whole milk, but if you’re into non-fat, you’re out of luck.

And get away from me with recommending any kind of “milk” that didn’t come out of a mammal, because that’s not milk. Coconut, almond, soy, whatever? Yep. Not milk. You’re drinking nut juice.

How does that sound?

Gas prices have dropped but that’s okay, because no one is driving anywhere. Those of us who can work from home are maintaining. Those of us who can’t… well, it’s a whole new world.

Certain people seem to think we can end the American lockdown by Easter, which is April 12. Cooler heads say, “Hell no.” This may go on through May or June, and seeing as how the U.S. suddenly became the most infected country in the world on March 26th, the idea of “It’s all over by Easter” is irresponsible as hell.

And remember that this is a pandemic, as in “It doesn’t just affect your town or county or state or country.” This is worldwide. And, as I mentioned above, this one is going to go into the history books along with some of the greatest hits of Events that Changed Everything.

For example:

476 CE: Fall of the Western Roman Empire. The long-term result of this little collapse was the creation of what would become modern Europe. Freed from the yoke of one oppressive empire, various local tribes — which had been allowed to maintain their culture in exchange for providing fealty, soldiers, and taxes to the mothership in Rome — were suddenly free to discover their own identities.

1206 CE: Genghis Khan begins his conquest of Asia, and almost takes Europe as well. He wiped entire countries and civilizations off the map, and changed the course of history in Europe forever.

1492 CE: Columbus is allowed to begin the exploitation of the New World, which will lead to an eventual super power that will basically become the new Roman Empire. In effect, this is the continuation of what the Fall of Rome started in Europe.

1776 – 1815 CE: A motherlode, from the American Revolution through the French Revolution and the defeat of Napoleon at Waterloo. The monarchical system is basically ripped out of power forever. It starts when those pesky colonists (in the land conquered by the Europeans who existed because Rome fell) rebelled against their mother country and won. France followed by rebelling at home and winning, only to wind up launching the next would-be dictator because they let the “party purity” assholes take control of their revolution. That would-be dictator (Napoleon) was defeated by the British, who had lost the American Revolution. Monarchy in Europe was mostly told to fuck off from this point forward.

1917 – 1918 CE: Double whammy of the Russian Revolution and the Great War. The former would lead to the first successful, long-term revolutionary state (France didn’t make that cut for reasons noted above), while the “Great War” would lead to a sequel, WW II, which would lead to all kinds of things, including the Cold War between the aforementioned super power and the USSR.

1991 CE: The collapse of the USSR, apparently (but not really) ending the Cold War and pushing the U.S. into the number one spot.

2020 CE – ???: Worldwide pandemic and lack of leadership possibly ends in the collapse of the U.S., leaving China as the world’s last super power; and the independent Republic of California as a major player in the world economy, although we could also see the creation of the country of Pacifica, made up of Washington, Oregon, California, Nevada, and Arizona.

Or… we could somehow manage to get our shit together and survive this whole thing, but I’m not crossing my fingers at this point. We still have to figure out how to have a national election through all of this and, no matter what anyone might think, if the election doesn’t happen, the President doesn’t stay in power.

Rather, his time in office expires on January 20, 2021, along with the Vice President’s, and the Speaker of the House’s term ends as well. Rules of succession would then turn the presidency over to the President Pro Tem of the Senate.

If we still have a government by that point, of course. Enjoy your sheltering in place, and I’ll see you on the other side.

Talky Tuesday: Language is (still) a virus

I used this Burroughs quote as a post title a couple of years ago in an entirely different context, but the idea has taken on new relevance, as I’m sure the world can now agree.

This post’s title comes from a William S. Burroughs quote which reads in full as, “Language is a virus from outer space.”

What he meant by the first part is that words enter a host, infect it, and cause a change in it. Just as a virus hijacks a host’s cells in order to become little factories to make more virus to spread a disease, words hijack a host’s neurons in order to become little factories to make more words to spread ideas and belief systems.

As for the “outer space” part, I think that Burroughs was being metaphorical, with the idea being that any particular language can appear totally alien to any other. While, say, Mongolian and Diné may both come from humans on Earth, if a speaker of either encountered someone who only spoke the other, they might as well be from different planets because, for all intents and purposes, they are from different worlds, unable to communicate with words.

And the language we use can quite literally create and shape our perceptions of the world, as I discussed in my original Language is a virus post. One of the most striking examples I cited in that link was Guugu Yimithirr, an aboriginal language that has no words for relative direction. Instead, they refer to everything based upon where it is relative to actual cardinal directions.

In other words, if you ask someone who speaks this language where you should sit, they won’t say, “In the chair on your left.” Rather, they’ll say something like, “In the chair to the north.” Or south, or east, or west. And a speaker of the language will know what that means, whether they can see outside or not.

Quick — right now, if someone said “Point east,” could you do it without thinking?

And that is how languages can change thoughts and perceptions.

But, sometimes — honestly, far too often — language can change perceptions to create tribalism and hostility, and during this plague year, that has suddenly become a huge topic of debate over a simple change of one C word in a phrase.

I’m writing, of course, about “coronavirus” vs. “Chinese virus.” And the debate is this: Is the latter phrase racist, or just a statement of fact?

One reporter from a rather biased organization did try to start the “it’s not” narrative with the stupidest question ever asked: “Mr. President, do you consider the term ‘Chinese food’ to be racist because it is food that originated from China?”

There are just two problems with this one. The first is that what Americans call “Chinese food” did not, in fact, originate in China. It was the product of Chinese immigrants in America who, being mostly men, didn’t know how to cook, and didn’t have access to a lot of the ingredients from back home. So… they improvised and approximated, and “Chinese food” was created by Chinese immigrants starting in San Francisco in the 19th century.

Initially, it was cuisine meant only for Chinese immigrants because racist Americans wouldn’t touch it, but when Chinatowns had sprung up in other cities, it was New York’s version that finally lured in the hipster foodies of the day to try it, and they were hooked.

In short, “Chinese food” was a positive and voluntary contribution to American culture, and the designation here is merely descriptive, so not racist. “Chinese virus” is a blatant misclassification at best and a definite attempt at a slur at worst, with odds on the latter.

But we’ve seen this with diseases before.

When it comes to simple misidentification of place of origin, let’s look back to almost exactly a century ago, when the Spanish flu went pandemic. From 1918 to 1919, it hit every part of the world, infected 500 million people and killed 50 million.

A little perspective: At the time, the world’s population was only 1.8 billion, so this represents an infection rate of 28% and a mortality rate among the infected of 10%, or about 2.8% of the entire world population. If COVID-19 has similar statistics — and it seems to — then that means this pandemic will infect 2.1 billion people and kill around 210 million.
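If you want to check that napkin math yourself, a few lines of Python will do it. Note that the 2020 world-population figure of roughly 7.5 billion is my own assumption, and that the often-quoted 2.8% is deaths as a share of the whole 1918 population, while deaths among the infected work out to 10%:

```python
# Back-of-the-envelope check of the 1918 flu numbers.
world_1918 = 1.8e9        # world population in 1918
infected_1918 = 500e6     # estimated infections
deaths_1918 = 50e6        # estimated deaths

infection_rate = infected_1918 / world_1918  # share of the world infected, ~28%
fatality_rate = deaths_1918 / infected_1918  # deaths among the infected, 10%
overall_rate = deaths_1918 / world_1918      # deaths across everyone, ~2.8%

# Apply the same rates to an assumed 2020 world population of ~7.5 billion.
world_2020 = 7.5e9
projected_infected = world_2020 * infection_rate
projected_deaths = projected_infected * fatality_rate

print(f"infection rate: {infection_rate:.0%}")
print(f"fatality rate among infected: {fatality_rate:.0%}")
print(f"projected: {projected_infected / 1e9:.1f} billion infected, "
      f"{projected_deaths / 1e6:.0f} million dead")
```

Run it and you get roughly 2.1 billion infections and around 210 million deaths, which is where the scary projection above comes from.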

By the way, while the 1918 pandemic was very fatal to children under 5 and adults over 65, it also hit one other demographic particularly hard: 20 to 40 year-olds.

So if you’re in that age range and think that COVID-19 won’t kill you, think again — particularly if you smoke or vape or have asthma, and don’t take the quarantine seriously. And remember: the rich and world leaders are not immune either — not now and not then.

The president of the United States, Woodrow Wilson, was infected during the H1N1 pandemic in 1919, and while he survived, this assault on his health probably led to the stroke he had later that year, an incident that was covered up by his wife, with the help of the president’s doctor. The First Lady became de facto president for the remainder of his second term.

In modern times, the first world leader to test positive for coronavirus was Prince Albert II of Monaco, followed not long after by Prince Charles and Boris Johnson. Of course, I’m writing these words a bit ahead of when you’ll read them, so who knows what will have happened by then.

In medical circles, the name “Spanish Flu” has been abandoned, and that particular pandemic is now known as H1N1, which I’m sure looks really familiar to you, because this has been the standard nomenclature for flu viruses for a while: H#N#, sans location, animal, or occupation, more on which in a minute.

But first, let’s get to the reasons behind naming a disease after a place. The “Spanish flu” name was a simple case of mistaken identity, contingent upon that whole “Great War” stuff going on in Europe.

See, other countries had been hit by it first but, in the interests of the old dick-waving “Gotta appear strong before our enemies” toxic masculinity, all of them suppressed the news. It wasn’t until neutral Spain started seeing it in its citizens and openly announced the outbreaks that the world suddenly pointed fingers and said, “Ooh… this came from Spain. Hence, it’s the Spanish flu.”

Except, not. Ironically, it seems now that the Spanish flu originated in… China. Although that’s according to historians. Virologists, on the other hand, have linked it to an H1 strain later identified in pigs in Iowa in the U.S.

Either way, all of the countries involved in WW I, aka “The Great War,” kept mum about it.

So the name “Spanish flu” was a simple mistake. On the other hand, the names of other diseases actually are outright xenophobic or racist, and we only have to look as far as syphilis to see why.

Basically, syphilis is an STI that was the most feared of its kind until… AIDS, because syphilis was not treatable or curable until penicillin was discovered in 1928 — although it was not produced on a mass scale until 1945, thanks to needs created by WW II, and facilitated by the War Production Board.

Hm. Sound familiar?

But the reason it became known as the French disease outside of France was that it began to spread after Charles VIII of France invaded Italy in 1494-95 to reclaim a kingdom he thought should be his. It was eventually so devastating that Charles had to take his troops home, and so it began to spread in France and across Europe.

Since it first showed up in French soldiers, it was quickly dubbed the French disease in Italy and England, although the French preferred to call it the Italian disease. In reality, it most likely originated in the New World, and was brought back to Europe by… Columbus and his Spanish soldiers, who then somehow managed to spread it to the French as they worked for them as mercenaries.

Hm. STI. A bunch of male soldiers. I wonder how that worked, exactly.

And I am totally destroying my future google search suggestions by researching all of this for you, my loyal readers, just so you know! Not to mention that I can’t wait to see what sort of ads I start getting on social media. “Confidential STI testing!” “Get penicillin without a prescription.” “These three weird tricks will cure the STI. Doctors hate them!”

But the naming of diseases all came to a head almost five years ago when the World Health Organization (WHO) finally decided, “You know what? We shouldn’t name diseases after people, places, animals, occupations, or things anymore, because that just leads to all kinds of discrimination and offense, and who needs it?”

This directly resulted from the backlash against the naming of the last disease ever named for a place, despite the attempt to obfuscate that in its official terminology. Remember MERS, anyone? No? That one came about in 2012, was first identified in Saudi Arabia, and was named Middle East respiratory syndrome.

Of course, it didn’t help when things were named swine flu or avian flu, either. A lot of pigs and birds died over those designations. So away went such terminology, especially because of the xenophobic and racist connotations of naming a disease after an entire country or people.

Of course, some antiquated folk don’t understand why it’s at the least racist and at the most dangerous to name diseases the old way, as evidenced by the editorial tone of this article from The Federalist, a right-wing publication. But they actually kind of manage to prove the point that yes, such terminology is out of fashion, because the only 21st-century example they can list is the aforementioned MERS.

The most recent one before that? Lyme disease, named for Lyme, Connecticut, and designated in… the mid-70s. Not exactly the least racist of times, although this disease was named for a pretty white-bread area.

The only other examples of diseases named for countries on their list: the aforementioned Spanish flu; Japanese encephalitis, named in 1871 (seriously, have you ever heard of that one?); and the German measles, identified in the 18th century, although more commonly known as rubella.

So, again — it’s a list that proves the exact opposite of what it set out to do, and calling coronavirus or COVID-19 the “Chinese virus” or “Chinese disease” is, in fact, racist as hell. And it won’t save a single life.

But calling it that will certainly endanger lives by causing hate crimes — because language is a virus, and when people are infected by malignant viruses, like hate speech, the results can be deadly.

Inoculate yourselves against them with education if possible. Quarantine yourselves against them through critical thinking otherwise. Most of all, through these trying times, stay safe — and stay home!

Image source: Coronavirus Disease 2019 Rotator Graphic for af.mil. (U.S. Air Force Graphic by Rosario “Charo” Gutierrez)

Sunday Nibble #10: Plus ça change

It seems that any sudden societal upheaval in America follows the same basic pattern as the COVID-19 situation, as follows.

  1. Rumors of something bad coming, ignored.
  2. A little bit of the bad thing happens, the media starts to mention it.
  3. A couple more bad things happen, and suddenly the media turns it into a trend.
  4. Continue escalating hype until people freak.
  5. Store shelves stripped bare.
  6. The government fails to react.
  7. Shit gets real.
  8. The government finally sort of does… something?

Specifically, I’m thinking of the L.A. riots, which were nearly 30 years ago, but the same pattern seems to apply to the AIDS crisis (without the hoarding but with the freaking, I think) and it probably applies to the Watts Riots and the Spanish Flu and every other sudden crisis.

But I’m having a definite déjà vu over this one, even though I was a far younger and very naïve person (politically and otherwise) back on April 29, 1992. Okay, same day of the month as this post, a month early, totally unintended.

But that April day was when Los Angeles exploded in violence because the police officers who had beaten Rodney King for no reason were acquitted.

From what I remember, the story broke by the minute, and my dad freaked out about it as soon as he heard the verdict. Of course, he had lived here through the Watts Riots, so he had previous experience. I did not.

Time to stock up on everything, said he, and the stores were insane — much like they were a week before all of California shut down ten days ago.

Water and TP aisles empty, a lot of other essentials practically gone. Well, you know the drill. You all just lived through it.  At the time, though, the assholeishness of it didn’t occur to me because I was still working on installing that whole self-awareness subroutine, but, looking back… yeah. Even my dad had been a greedy asshole about it. Everyone had.

The shutdown due to the riots lasted all of about five days. And, on top of that, I realized that my dad really shouldn’t have been so worried. It was Woodland Hills, way out in the West Valley, aka “The place all the white people moved to in the 60s in order to avoid sending their kids to school with non-white people.”

Poetic justice: I went to school there with a lot of non-white people, and a lot of the part of Woodland Hills I grew up in, where my parents lived, is now heavily Hispanic. I love it. It was when this influx began that all the scared whypipo moved to Simi Valley. (My parents tried to join the exodus, but no one wanted to buy their house.)

As for Simi Valley, it’s the home of the Reagan Library, which tells you everything you need to know about it and its demographics. They wanted the place built there, even though the only real connection Reagan had to the city was that he was once governor of the state.

Oh, yeah. One other thing about Simi Valley: It was also the venue to which the trial of the cops who beat Rodney King was moved, apparently with the ultimate defense goal of finding a jury favorable to… the cops. Why would that jury be favorable? Because so many police officers lived there.

And then L.A. exploded into violence over a jury verdict delivered in a different county. But that explosion never got anywhere near Woodland Hills because, of course, it didn’t.

Now, the eight steps at the top of this article seemed to have taken place all in one day in the case of the L.A. riots — maybe because it threatened rich white people?

Other times, events have moved in much slower motion. Reading the history on it, in the case of the AIDS crisis it took well over a decade to go from point 1 to point 8, and point 6 was intentionally extended, most likely causing the deaths of tens of thousands of people.

And in our modern age, we’ve gone through the cycle in a hyper-fast manner. Still slower than the L.A. riots — or maybe not, because all of the trial drama and build-up for that one took months.

But when it came to Corona Lockdown, we went from 1 to 8 in about three months at most, also stalling for far too long at 6, and we all reacted in the same damn exact way.

Let’s be greedy little bitches and grab everything we can.

And that is wrong, wrong, wrong.

I think that the key, though, is in step 7, when shit gets real, but for the 1%. First off, that happens when they realize that they are not immune — and we’ve already had an A-list actor and spouse, several members of Congress, and various other celebrities test positive.

Second is when this realization makes them start spending their money on fixing shit, because they’re going to realize that they only caught the virus in the first place because the people they depend on do not have the same access to health care and income security that they do.

All the sheltering in place in the world does no good if their maid has to take public transportation because she can’t afford a car or insurance, and can’t take sick days off if nobody pays her for them.

If a billionaire can’t work for a month it makes no difference, because all of their passive and residual income from investments or rents and royalties keeps rolling in. Until, of course, the stock market tanks and their investments become a bit less valuable, and that’s another thing that makes them think about how helping others will help themselves.

Did I mention that the maid and all those other low-paid workers who interact closely with the billionaire probably don’t have the best health insurance or lowest deductible plan, if any?

And that Mr. or Ms. 1% doesn’t even really notice the help much, so that they certainly don’t notice when the maid is coughing all over the counters while cleaning them, or that they themselves have a habit of leaning over their personal assistant from much closer than six feet while telling her what they need her to schedule, all because they’re trying to stare down her top.

They won’t even put two and two together when they suddenly feel feverish, because the only way they’re going to decide to get tested is if they come down with full-blown symptoms or if they hear that someone in their social circle has tested positive or reported symptoms.

Even then, and even if they test positive, they aren’t going to do a thing to help anyone outside of their circles until the big red flag is hoisted.

That’s right. We won’t see really important action from the 1% until the grandest event of them all: Somebody in their class dies from this virus — and that is inevitable. Once that happens, you’re going to see mountains moved like never before to block the spread and find a cure.

Just look at how the straight community’s tune changed the second that Magic Johnson announced he was HIV positive. Hey, there’s a reason Magic is still alive and a year older than Rock Hudson was when AIDS killed him. You do the math.

Yep. Suddenly, death comes calling on their kind and the 1% goes socialist harder than your Bernie bro nephew who’s majoring in PoliSci at Berkeley.

“Pay the peons to stay home and the hell away from me! Give them all the health insurance they need for free so they don’t make my family sick. And let’s do something about all these homeless. No more evictions for now, everyone gets enough money to pay their rent. Ah, hell. Here’s property I bought and never developed, cover it in motor homes. Just keep the homeless the hell out of where I am, okay? And figure out how everybody who can works from home. Give ’em the equipment to do it.”

It’s Scrooge the morning after the four ghosts visit. Sad, but if they’re paying for your Christmas goose, just shut up and cash the checks, no matter how big an asshole your Scrooge was up until their sudden revelation.

Kind of ironic but fitting, really, that the deadly virus of “Trickle Down Economics” that Ronald Reagan foisted on America in the 80s — and which directly created the shitshow we’re living now — might actually start to trickle the hell down because of another deadly virus.

See, the big flaw with “trickle down economics” was the assumption that if you gave rich people more money, they would liberally toss it down on their subordinates, everyone would get raises, and it would be good times.

In reality? Not so much. The only trickle down the working class experienced was getting pissed on by the owners.

The fatal flaw of capitalism is that people — no matter their social status or personal wealth or lack thereof — tend to act, on an anonymous playing field, in their own best interests and no one else’s.

Yes, there are definitely altruistic human beings. Mr. Rogers’ “helpers” do exist, but they are few and far between.

In capitalism, which is a zero sum game, most of the players will only be altruistic when incentivized, and the incentive that works the best is to steer them toward an action that, while serving others instead of themselves, will ultimately cost them less in the long run.

Death is the great equalizer, after all. Not to mention that there is no one so rich that they wouldn’t trade their entire fortune to fend off death. If our modern robber barons can pull the same trick for only a quarter of their fortune, they will think it worth the price, and their selfishness might ultimately leave the world a better place.

We shall see.