Momentous Monday: Meet the Middletons

Thanks to boredom and Amazon Prime, I watched a rather weird movie from the 1930s tonight. While it was only 55 minutes long, it somehow seemed much longer because it was so packed with… all kinds of levels of stuff.

The title is The Middleton Family at the New York World’s Fair, and while the content is exactly what it says on the tin, there are so goddamn many moving parts in that tin that this is one worth watching in depth, mainly because it’s a case study in how propaganda can be sometimes wrong, sometimes right and, really, only hindsight can excavate the truth from the bullshit.

While it seems like a feature film telling the fictional story of the (snow-white, but they have a black maid!) Middleton Family from Indiana, who go back east ostensibly to visit grandma in New York but really to attend the New York World’s Fair of 1939, this was actually nothing more than a piece of marketing and/or propaganda created by the Westinghouse Corporation, a major sponsor of the fair, poised on the cusp of selling all kinds of new and modern shit to the general public.

Think of them as the Apple or Microsoft of their day, with solutions to everything, and the World’s Fair as the biggest ThingCon in the world.

Plus ça change, right?

But there’s also a second, and very political, vein running through the family story. See, Dad decided to bring the family to the fair specifically to convince 16-year-old son Bud that, despite the bad economic news he and his older friends have been hearing about there being no job market (it is the Great Depression, after all), there are, in fact, glorious new careers waiting out there.

Meanwhile, Mom is hoping that older daughter Babs will re-connect with high school sweetheart Jim, who had previously moved to New York to work for (wait for it) Westinghouse. Babs is having none of it, though, insisting that she doesn’t love him but, instead, is in love with her art teacher, Nick.

1939: No reaction.

2020: RECORD SCRATCH. WTF? Yeah, this is one of the first of many disconnect moments that are nice reminders of how much things have changed in the 81 years since this film was made.

Girl, you think you want to date your teacher, and anyone should be cool with that? Sorry, but listen to your mama. Note: in the world of the film, this relationship will become problematic for other reasons but, surprise, the reason it becomes problematic then is actually problematic in turn now. More on which later.

Anyway, the obviously richer-than-fuck white family travels from Indiana to New York (they’re rich because Dad owns hardware stores, and they brought their black maid with them), but they’re too cheap to spring for a hotel, instead jamming themselves into Grandma’s house, which is pretty ritzy as well, so Grandma has money too, and her place is clearly close enough to Flushing Meadows in Queens to make the World’s Fair a daily day trip over the course of a weekend.

But it’s okay — everyone owned houses then! (Cough.)

And then it’s off to the fair, and this is where the real value of the film comes in because when we aren’t being propagandized by Westinghouse, we’re actually seeing the fair, and what’s really surprising is how modern and familiar everything looks. Sure, there’s nothing high tech about it in modern terms, but if you dropped any random person from 2020 onto those fairgrounds, they would not feel out of place.

Well, okay, you’d need to put them in period costume first and probably make sure that if they weren’t completely white they could pass for Italian or Greek.

Okay, shit. Ignore that part, let’s move along — as Jimmy, Babs’ high school sweetheart and Westinghouse Shill character, brings us into the pavilion. And there are two really weird dynamics here.

First is that Jimmy is an absolute cheerleader for capitalism, which is jarring without context; we’ll get back to that in a moment.

The other weird bit is that Bud seems to be more into Jimmy than Babs ever was, and if you read too much gay subtext into their relationship… well, you can’t read too much, really. Watch it through that filter, and this film takes on a very different and subversive subplot. Sure, it’s clear that the family really wishes Jimmy was the guy Babs stuck with, but it sure feels like Bud wouldn’t mind calling him “Daddy.”

But back to Jimmy shilling for Westinghouse. Here’s the thing: Yeah, sure, he’s all “Rah-Rah capitalism!” and this comes into direct conflict with Nicholas, who is a self-avowed communist. But… the problem is that in America, in 1939, capitalism was the only tool that socialism could use to lift us out of depression and, ultimately, create the middle class.

There’s even a nod to socialism in the opening scene, when Bud tells his dad that the class motto for the guys who graduated the year before was, “WPA, here we come!” The WPA was the government works program designed to create jobs with no particular aim beyond putting people to work.

But once the WPA partnered with corporations like Westinghouse, boom. Jobs. And this was the beginning of the creation of the American Middle Class, which led to the ridiculous prosperity for (white) people from the end of WWII until the 1980s.

More on that later, back to the movie now. As a story with relationships, the film actually works, because we do find ourselves invested in the question, “Who will Babs pick?” It doesn’t help, though, that the pros and cons are dealt with in such a heavy-handed manner.

Jimmy is amazing in every possible way — young, tall, intelligent, handsome, and very knowledgeable at what he does. Meanwhile, Nicholas is short, not as good-looking (clearly cast to be more Southern European), obviously a bit older than Babs, and has a very unpleasant personality.

They even give him a “kick the puppy” moment when Babs introduces brother Bud, and Nicholas pointedly ignores the kid. But there’s that other huge issue I already mentioned that just jumps out to a modern audience and yet never gets any mention by the other characters. The guy Babs is dating is her art teacher. And not as in past art teacher, either. As in currently the guy teaching her art.

And she’s dating him and considering marriage.

That wouldn’t fly more than a foot nowadays, and yet in the world of 1939 it seems absolutely normal, at least to the family. Nowadays, it would be the main reason to object to the relationship. Back then, it isn’t even considered.


The flip side of the heavy-handedness comes in some of Jimmy’s rebukes of Nicholas’ claims that all of this technology and automation will destroy jobs. While the information Jimmy provides is factual, the way his dialogue here is written and delivered comes across as condescending and patronizing to both Nicholas and the audience, and these are the moments when Jimmy’s character seems petty and bitchy.

But he’s also not wrong, and history bore that out.

Now this was ultimately a film made to make Westinghouse look good, and a major set piece involved an exhibit at the fair that I actually had to look up because at first it was very easy to assume that it was just a bit of remote-controlled special effects set up to pitch an idea that didn’t really exist yet — the 1930s version of vaporware.

Behold Elektro! Here’s the sequence from the movie and as he was presented at the fair. Watch this first and tell me how you think they did it.

Well, if you thought remote operator controlling movement and speaking lines into a microphone like I did at first, that’s understandable. But the true answer is even more amazing: Elektro was completely real.

The thing was using sensors to actually interpret the spoken commands and turn them into actions, which it did by sending light signals to its “brain,” located at the back of the room. You can see the lights flashing in the circular window in the robot’s chest at around 2:30.

Of course, this wouldn’t be the 1930s if the robot didn’t engage in a little bit of sexist banter — or smoke a cigarette. Oh, such different times.

And yet, in a lot of ways, the same. Our toys have just gotten a lot more powerful and much smaller.

You can probably guess which side of the argument wins, and while I can’t disagree with what Westinghouse was boosting at the time, I do have to take issue with one explicit statement. Nicholas believes in the value of art, but Jimmy dismisses it completely, which is a shame.

Sure, it’s coming right out of the Westinghouse corporate playbook, but that part makes no sense, considering how much of the world’s fair and their exhibit hall itself relied on art, design, and architecture. Even if it’s just sizzle, it still sells the steak.

So no points to Westinghouse there but, again, knowing what was about to come by September of 1939 and what a big part industry would have in ensuring that the anti-fascists won, I can sort of ignore the tone-deafness of the statement.

But, like the time-capsule shown in the film, there was a limited shelf-life for the ideas Westinghouse was pushing, and they definitely expired by the dawn of the information age, if not a bit before that.

Here’s the thing: capitalism as a system worked in America when… well, when it worked… and didn’t when it didn’t. Prior to about the early 1930s, when it ran unfettered, it didn’t work at all — except for the super-wealthy robber barons.

Workers had no rights or protections, there were no unions, or child-labor laws, or minimum wages, standard working hours, safety rules, or… anything to protect you if you didn’t happen to own a big chunk of shit.

In other words, you were management, or you were fucked.

Then the whole system collapsed in the Great Depression and, ironically, it took a member of the 1% Patrician Class (FDR) being elected president to then turn his back on his entire class and dig in hard for protecting the workers, enacting all kinds of jobs programs, safety nets, union protections, and so on.

Or, in other words, capitalism in America didn’t work until it was linked to and reined-in by socialism. So we never really had pure capitalism, just a hybrid.

And, more irony: this socio-capitalist model was reinforced after Pearl Harbor Day, when everyone was forced to share and work together and, suddenly, the biggest workforce around was the U.S. military. It sucked in able-bodied men between 17 and 38, and the weird side-effect of the draft stateside was that suddenly women and POC were able to get jobs because there was no one else to do them.

Manufacturing, factory jobs, support work and the like boomed, and so did the beginnings of the middle class. When those soldiers came home, many of them returned to benefits that gave them cheap or free educations, and the ability to buy homes.

They married, they had kids, and they created the Boomers, who grew up in the single most affluent time period in America ever.

Side note: There were also people who returned from the military who realized that they weren’t like the other kids. They liked their own sex, and couldn’t ever face returning home. And so major port towns — San Francisco, Los Angeles, Long Beach, San Diego, Boston, New York, Miami, New Orleans — were flooded with the seeds of future GLB communities. Yes, it was in that order back then, and TQIA+ hadn’t been brought into the fold yet; that came later, starting in the 60s. There really wasn’t a name for it, or a community, in the 1940s.

In the 60s, because the Boomers had grown up with affluence, privilege, and easy access to education, they were also perfectly positioned to rebel their asses off because they could afford to, hence all of the protests and whatnot of that era.

And this sowed the seeds of the end of this era, ironically.

The socio-capitalist model was murdered, quite intentionally, beginning in 1980, when Ronald fucking Reagan became President, and he and his cronies slowly began dismantling everything created by every president from FDR through, believe it or not, Richard Nixon. (Hint: EPA.)

The mantra of these assholes was “Deregulate Everything,” which is exactly how the world worked in the era before FDR.

Just one problem, though. Deregulating any business is no different from getting an alligator to not bite you by removing their muzzle and then saying to them, “You’re not going to bite me, right?”

And then believing them when they swear they won’t before wondering why you and everyone you know has only one arm.

Still, while it celebrates an economic system that just isn’t possible today without a lot of major changes, The Middletons provides a nice look at an America that did work because it focused on invention, industry, and manufacturing not as a way to enrich a few shareholders, but as a way to enrich everyone by creating jobs, enabling people to actually buy things, and creating a rising tide to lift all boats.

As for Bud, he probably would have wound up in the military, learned a couple of skills, finished college quickly upon getting out, and then would have gone to work for a major company, possibly Westinghouse, in around 1946, starting in an entry-level engineering job, since that’s the skill and interest he picked up during the War.

Along the way, he finds a wife, gets married and starts a family, and thanks to his job, he has full benefits — for the entire family, medical, dental, and vision; for himself, life insurance to benefit his family; a pension that will be fully vested after ten years; generous vacation and sick days (with unused sick days paid back every year); annual bonuses; profit sharing; and union membership after ninety days on the job.

He and the wife find a nice house on Long Island — big, with a lot of land, in a neighborhood with great schools, and easy access to groceries and other stores. They’re able to save long-term for retirement, as well as for shorter-term things, like trips to visit his folks in Indiana or hers in Miami or, once the kids are old enough, all the way to that new Disneyland place in California, which reminds Bud a lot of the World’s Fair, especially Tomorrowland.

If he’s typical for the era, he will either work for Westinghouse for his entire career, or make the move to one other company. Either way, he’ll retire from an executive level position in about 1988, having been in upper management since about 1964.

With savings, pensions, and Social Security, he and his wife decide to travel the world. Meanwhile, their kids, now around 40 and with kids about to graduate high school, aren’t doing so well, and aren’t sure how they’re going to pay for their kids’ college.

They approach Dad and ask for help, but he can’t understand. “Why don’t you just do what I did?” he asks them.

“Because we can’t,” they reply.

That hopeful world of 1939 is long dead — although, surprisingly, the actor who played Bud is still quite alive.

Image: Don O’Brien, Flickr, 2.0 Generic (CC BY 2.0), the Middleton Family in the May 1939 Country Gentleman ad for the Westinghouse World’s Fair exhibits.

Wondrous Wednesday: 5 Things that are older than you think

A lot of our current technology seems surprisingly new. The iPhone is only about fourteen years old, for example, although the first Blackberry, a more primitive form of smart phone, came out in 1999. The first actual smart phone, IBM’s Simon Personal Communicator, was introduced in 1992 but not available to consumers until 1994. That was also the year that the internet started to really take off with people outside of universities or the government, although public connections to it had been available as early as 1989 (remember Compuserve, anyone?), and the first experimental internet nodes were connected in 1969.

Of course, to go from room-sized computers communicating via acoustic modems along wires to handheld supercomputers sending their signals wirelessly via satellite took some evolution and development of existing technology. Your microwave oven has a lot more computing power than the system that helped us land on the moon, for example. But the roots of many of our modern inventions go back a lot further than you might think. Here are five examples.

Alarm clock

As a concept, alarm clocks go back to the ancient Greeks, frequently involving water clocks. These were designed to wake people up before dawn, in Plato’s case to make it to class on time, which started at daybreak; later, they woke monks in order to pray before sunrise.

From the late middle ages, church towers became town alarm clocks, with the bells set to strike at one particular hour per day, and personal alarm clocks first appeared in 15th-century Europe. The first American alarm clock was made by Levi Hutchins in 1787, but he only made it for himself since, like Plato, he got up before dawn. Antoine Redier of France was the first to patent a mechanical alarm clock, in 1847. Production stopped during WWII, when metal and machine shops were appropriated for the war effort, but since older clocks kept breaking down in the meantime, alarm clocks became one of the first consumer items to go back into mass production just before the war ended. Atlas Obscura has a fascinating history of alarm clocks that’s worth a look.

Fax machine

The fax machine is pretty much a dead technology now, but it was the height of high tech in offices in the 80s and 90s. Today you’d be hard-pressed to find a fax machine that isn’t just part of the built-in hardware of a multi-purpose networked printer, and that’s only because it’s such a cheap legacy feature to include. But it might surprise you to know that the prototypical fax machine, originally an “Electric Printing Telegraph,” dates back to 1843.

Basically, as soon as humans figured out how to send signals down telegraph wires, they started to figure out how to encode images — and you can bet that the second image ever sent in that way was a dirty picture. Or a cat photo.

Still, it took until 1964 for Xerox to finally figure out how to use this technology over phone lines and create the Xerox LDX. The scanner/printer combo was available to rent for $800 a month — the equivalent of around $6,500 today — and it could transmit pages at a blazing 8 per minute. The second-generation fax machine only weighed 46 lbs and could send a letter-sized document in only six minutes, or ten pages per hour. Whoot — progress!

You can actually see one of the Electric Printing Telegraphs in action in the 1948 movie Call Northside 777, in which it plays a pivotal role in sending a photograph cross-country in order to exonerate an accused man.

In case you’re wondering, the title of the film refers to a telephone number from back in the days before what was originally called “all digit dialing.” Up until then, telephone exchanges (what we now call prefixes) were identified by the first two letters of a word, and then another digit or two or three. (Once upon a time, in some areas of the US, phone numbers only had five digits.) So NOrthside 777 would resolve itself to 667-77, with 667 being the prefix. This system started to end in 1958, and a lot of people didn’t like that.
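Since the letters in an exchange name simply stood in for digits on the dial, the conversion is easy to sketch in code. Here’s a minimal, purely illustrative Python version (it assumes the modern keypad letter groups; vintage dials actually omitted Q and Z, which doesn’t matter for this example):

```python
# Standard keypad letter groups mapped to their digits.
KEYPAD = {
    "2": "ABC", "3": "DEF", "4": "GHI", "5": "JKL",
    "6": "MNO", "7": "PQRS", "8": "TUV", "9": "WXYZ",
}
LETTER_TO_DIGIT = {
    letter: digit for digit, letters in KEYPAD.items() for letter in letters
}

def dial(number: str) -> str:
    """Convert an exchange-name number to the digits actually dialed.

    Only the capitalized letters of the exchange name were significant;
    the rest of the word was just a mnemonic, so we skip lowercase letters.
    """
    digits = []
    for ch in number:
        if ch.isdigit():
            digits.append(ch)
        elif ch.isupper():
            digits.append(LETTER_TO_DIGIT[ch])
    return "".join(digits)

print(dial("NOrthside 777"))  # 66777, i.e. 667-77
```

The same trick resolves the famous PEnnsylvania 6-5000 to 736-5000.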

Of course, with the advent of cell phones, prefixes and even area codes have become pretty meaningless, since people tend to keep the number they had in their home town regardless of where they move to, and a “long distance call” is mostly a dead concept now as well, which is probably a good thing.


CGI

When do you suppose the first computer animation appeared on film? You may have heard that the original 2D computer generated imagery (CGI) used in a movie was in 1973 in the original film Westworld, inspiration for the recent TV series. Using very primitive equipment, the visual effects designers simulated pixelation of actual footage in order to show us the POV of the robotic gunslinger played by Yul Brynner. It turned out to be a revolutionary effort.

The first 3D CGI happened to be in this film’s sequel, Futureworld in 1976, where the effect was used to create the image of a rotating 3D robot head. However, the first ever CGI sequence was actually made in… 1961. Called Rendering of a planned highway, it was created by the Swedish Royal Institute of Technology on the BESK, a vacuum-tube machine that had once been the fastest computer in the world. It’s an interesting effort for the time, but the results are rather disappointing.

Microwave oven

If you’re a Millennial, then microwave ovens have pretty much always been a standard accessory in your kitchen, but home versions don’t predate your birth by much. Sales began in the late 1960s. By 1972 Litton had introduced microwave ovens as kitchen appliances. They cost the equivalent of about $2,400 today. As demand went up, prices fell. Nowadays, you can get a small, basic microwave for under $50.

But would it surprise you to learn that the first microwave ovens were created just after World War II? In fact, they were the direct result of it, due to a sudden lack of demand for magnetrons, the devices used by the military to generate radar in the microwave range. Not wanting to lose the market, their manufacturers began to look for new uses for the tubes. The idea of using radio waves to cook food went back to 1933, but those devices were never developed.

Around 1946, engineers accidentally realized that the microwaves coming from these devices could cook food, and voilà! In 1947, the technology was developed, although only for commercial use, since the devices were taller than an average man, weighed 750 lbs and cost the equivalent of $56,000 today. It took 20 years for the first home model, the Radarange, to be introduced for the mere sum of $12,000 of today’s dollars.

Music video

Conventional wisdom says that the first music video to ever air went out on August 1, 1981 on MTV, and it was “Video Killed the Radio Star” by The Buggles. As is often the case, conventional wisdom is wrong. It was the first to air on MTV, but the concept of putting visuals to rock music as a marketing tool goes back a lot farther than that.

Artists and labels were making promotional films for their songs back at almost the beginning of the 1960s, with the Beatles a prominent example. Before these, though, was the Scopitone, a jukebox that could play films in sync with music popular from the late 1950s to mid-1960s, and their predecessor was the Panoram, a similar concept popular in the 1940s which played short programs called Soundies.

However, these programs played on a continuous loop, so you couldn’t choose your song. Soundies were produced until 1946, which brings us to the real predecessor of music videos: Vitaphone Shorts, produced by Warner Bros. as sound began to come to film. Some of these featured musical acts and were essentially miniature musicals themselves. They weren’t shot on video, but they introduced the concept all the same. Here, you can watch a particularly fun example from 1935 in 3-strip Technicolor that also features cameos by various stars of the era in a very loose story.

Do you know of any things that are actually a lot older than people think? Let us know in the comments!

Photo credit: Jake von Slatt

Talky Tuesday Special: Erin go, bro!

In which I don’t so much write about language as indulge in my Irish gift of gab in the most meta way possible.

Americans may or may not be celebrating St. Patrick’s Day tomorrow, depending upon which state they live in and whether big gatherings have been allowed yet, although I don’t think that’s the case anywhere. Last year, New York City cancelled theirs (one of the biggest in the country), along with Chicago and Boston.

But the salient point is that, like Cinco de Mayo to actual Mexicans, St. Patrick’s Day isn’t all that big a deal over in Ireland. It’s more of a religious holiday than a boozefest. They still celebrate it, just not on the same scale as… oh. Never mind. Ireland has cancelled, too.

In honor of the day, I’m bringing up my mother’s people because the Irish in America are a very good example of a group that was once an identifiable and hated minority that went on to assimilate with a vengeance.

If you trust traditional sources, that might not seem the case. According to the Census, 10.5% consider themselves of Irish descent. But, of course, that’s totally unreliable because it’s a self-reported figure. Some people may have no idea where their grandparents or great-grandparents came from. Others may not care about their Irish ancestry, or may identify more strongly with another group or country in their background.

But when you look at objective sources and include any degree of Irish heritage, the number changes dramatically. DNA tests via Ancestry show that two thirds of all people tested have at least some Irish blood in them.

In other words, in that regard, the Irish are the hidden majority in this country. Nice trick, considering that for so much of their history here, mo mhuintir (my people) were a much hated minority.

A group of refugees, fleeing what is basically an attempt at genocide back home, suddenly floods the country. Many of them speak a foreign language or speak English badly; they practice a different religion than most Americans of the time; they are perceived as a big threat to American jobs (although they only take the ones Americans don’t want); and they are accused of being violent criminals, addicts, or rapists.

Sound familiar? Ripped from recent headlines?

Yes, but all of those attributes were applied to the Irish who came to America in the wake of the potato famine, or the Great Hunger, of the late 1840s. A big part of it was religious discrimination. At the time, America was predominantly Protestant, thanks to its initial British invaders… sorry, settlers, but another big group who came over, the Germans, tended to be Lutheran. The Catholic Germans of the south stayed home.

The English never had any problem with the Germans because, surprise, by the time America was founded, the English royal family was actually… German. They ruled via four Georges, one William, and somebody known as Victoria.

After she died, the name of the house changed twice: first with her successor, who took his father’s German house name of Saxe-Coburg and Gotha, and then again during World War I (then known as the Great War), when it became Windsor, which sounded so much more British than a German name, and the British were fighting the Germans, after all.

That’s right. World War I wasn’t so much a war as a family squabble.

The Germans in America did just fine, though, and I have plenty of them in my background as well. My last name is German, and my great grandfather came from there. He was pretty successful as well, and as far as I know, the only elected official (mayor) who’s my direct ancestor for at least four hundred years.

My Irish ancestors, not so much. They were depicted in the press in completely stereotyped and racist ways — and yes, even though Irish is a nationality, the prejudice they faced was a type of racism because the Irish were not considered to be white by the native-born of the era.

Note that the mention of Germans also being stereotyped in that era refers to the Catholic ones, who finally came over as Germany descended into political turmoil in the mid to late 1800s. That is exactly when my great-grandfather came over with his family.

It was 1883 and he was 18. The village he came from was Michelbach, part of Gaggenau, in Baden in the southwest of the country. That put it deep in Germany’s Catholic south, so it’s easy to assume that the village was very Catholic, but I don’t have to assume.

Thanks to a genealogist who, while studying the village as a whole, found my query online, I know all about my ancestors from there back to the late 17th century, all preserved in the records of the local Catholic Church. So those Bastians were probably Catholic. My dad was definitely not.

I don’t think he practiced any religion except for the Ritual of the Earliest Tee Time via its patron, St. Golf, but I suspect that it was because his mother, who was a combination of French, Welsh, Scottish, maybe Native American, and who knows what-all else, wasn’t at all religious.

But if it was one civil war that brought my German ancestors to America, it was another that really messed with my Irish ancestors. This would be the American Civil War itself, and, ironically (or not) it was a perfect example of the rich pitting one downtrodden class against another.

April 1, 1863 was the date the government in the north set for all men between 20 and 45 to register for the first ever draft, whether they were citizens or immigrants seeking citizenship. On top of this, while you’d think that everyone in the North was against slavery, you’d be wrong. In fact, not only did the business elites in New York support it because they profited off of the cheap labor, too, but so did the lower classes, because they feared the possibility of freed slaves coming to take their jobs.

Hm. That whole mishmash sounds familiar, too.

Oh… there was one other big flaw in the law, and it was this. Anyone could buy their way out of being drafted by either finding a substitute to take their place, or paying $300. Obviously, this meant that buying their way out was impossible for the poor and working class, and these people went apeshit.

This led to the draft riots, the second largest act of civil insurrection in U.S. history, ironically only beaten out by the Civil War itself. Of course, as the riots started, the disgruntled poor, largely Irish, didn’t go after the rich bastards in charge. Nope — they went after the black community instead.

Even then, America used divide and conquer. An object lesson for today. Keep in mind that before the Civil War, the Irish were shoved into the same social circles as blacks who were not slaves, and there was a lot of intermarriage and the like going on. Sadly, the above scare tactics of “they’ll take your jobs” during the Civil War worked, permanently damaging the Irish/Black relationship.

But… it planted a seed, so to speak, and there are plenty of black Americans today who happen to have Irish genes in them.

So how did the Irish manage to climb up the ladder to become respected and considered “white?” Simple… America, never one to back down on xenophobia, simply found new targets. After the whole Irish thing, there were suddenly Italians, Eastern Europeans, Chinese, and Mexicans flooding our shores.

After all that, the Irish didn’t seem all that bad.

By the time that Great War ended, the Irish were totally assimilated. And they saw their first president elected in 1960… wait, right, no. JFK was not the first Irish-American president. That would have been bloody, bloody Andrew Jackson. Too bad he didn’t also claim the title of biggest racist asshole prior to… well, you know who.

But, surprisingly, even as recently as 1960, the big worry was whether an Irish Catholic president would follow the Pope instead of the Constitution. (Hint: It was unfounded.)

So happy St. Patrick’s Day. Although it’s not really celebrated that much in Ireland, it probably is in America because, if all y’all strip down to your genes, you probably do have a little Irish in you. Erin go bragh!

Momentous Monday: Mkay…

That’s MK as in MK-Ultra, and it’s one of the few conspiracies that actually happened although, of course, it didn’t stay secret forever. The program began on April 13, 1953. It was exposed by the Church Committee in 1975, meaning it stayed a secret for 22 years.

That committee was formed by the U.S. Senate precisely to investigate illegal activities by the CIA, like spying on U.S. citizens. MK-Ultra, though, was even darker than that. Its goal was nothing less than mind-control, and it had its roots in Nazi concentration camps and Japan’s infamous Unit 731. The CIA even worked with officers and torturers from both places.

The Nazis had used mescaline on prisoners as a way of trying to make them compliant. Meanwhile, Japan had focused mostly on biological weapons, although they weren’t above using live vivisection as a method of torture.

In case you’re wondering, while the Nazis’ main (but not only) targets were Jews, Japan mostly went after the Chinese, and they’re still not big fans of them. They aren’t so fond of Koreans either, though. But that’s got nothing to do with MK-Ultra.

Oddly enough, it was the Korean War that was the catalyst for the whole project starting, as American POWs returning from there began making claims against the U.S. that were not true. Well, not true according to Allen Dulles, newly-appointed head of the CIA.

But the determination, and the warning, was that the “commies” on the other side of the Cold War had developed mind control and brainwashing, and the U.S. had to do the same to fight back.

Never mind whether that last part was true or not. And, by the way, it only took six years for this idea to leak into literature with the publication in 1959 of The Manchurian Candidate, which was adapted into a chilling film three years later. Here’s the opening scene. You should all go watch this film now.

Again, the program started three days after Dulles gave a speech about the dangers of brainwashing and, taking a cue from the Nazis, the CIA worked with LSD, which happened to be legal at the time and, in fact, was being investigated as a psychiatric medication. Even Cary Grant tripped balls.

Of course, the big difference was that in those studies, the subjects had informed consent. The CIA, on the other hand, was pretty much playing Bill Cosby and slipping the drugs to people without their knowledge or consent.

That’s probably where tips from the Japanese biowarfare programs came in, by the way — how to “infect” somebody with something without their knowledge — although the government was also kind of open about it, at least in secret, if that makes sense.

See, after MK-Ultra got started, a man named Sidney Gottlieb arranged for the CIA to pay almost a quarter million dollars in order to import the world’s entire supply of LSD to the U.S., and then (using front organizations) urged hospitals, clinics, prisons, and other such institutions to start experimenting with it and reporting the results.

There’s a 2019 book called Poisoner in Chief that details all of this. If you’re sitting around the house not doing anything else, you should read it. Basically, the government tricked a bunch of medical and corrections professionals into unknowingly carrying out very unethical experiments for them.

That Gottlieb link above is worth a read, too, because in excerpts from the book, it details how the CIA moved its MK-Ultra program offshore to go beyond clinical abuse of LSD and actually get into abduction, torture, and worse.

The goal of brainwashing was to destroy the existing mind and replace it with a new one, although whether it actually works is up for debate. It’s easy to destroy the existing mind — i.e. “ego” — but very difficult to build a new one, at least without consent.

But if you can get consent, you don’t need to destroy anything. The new mind will build itself for you.

I can attest to this from personal experience. When I was in high school, I fell under the influence of a very evil group called Young Life, which is an evangelical Christian organization that basically invades schools and tries to recruit your kids.

How my school, or any school, let it happen, I’ll never know, but their recruiter, a 28-year-old guy named Sandy, used to somehow regularly get access to campus and come hang out and talk to us during recess and lunch.

It all started innocuously enough, with Monday night meetings that were mostly fun hangouts with skits and singing and whatever, but then at the end there’d be the, “Hey, Jesus is cool” message. And at those meetings, it didn’t come with any of the collateral “But Jesus hates (fill in the blank)” crap.

As an adult, it was clear that they targeted the awkward kids who didn’t fit in with the jocks and cheerleaders and whatnot. Marching band, for example, was lousy with Young Life members. And that was the brainwashing hook: “Hey, you’re cool here!”

I drank that Kool Aid for almost two years. I went to a couple of sleep-away camps and worked (for free) for six weeks at one in Canada, and around the end of high school I started going to a fundie Pentecostal evangelical Foursquare church that openly preached the gospel of hatred against the LGBTQ community, Jews, liberals, and so on.

Thankfully, I was saved from this crap by… (wait for it) actually reading the Bible during my freshman year of college — ironically, at a Jesuit university — and halfway through the Old Testament I realized, “Holy crap, this is complete and utter bullshit.”

But the brainwashing pattern there is clear. Befriend those who think they’re friendless. Make them feel needed and wanted. Reel them in.

Or… follow the government method, and drug or torture them into compliance. Come to think of it, that was the religious method too, until churches discovered marketing.

But not all of the MK-Ultra “experiments” took place in clinics. One incident in particular eventually led to the investigations of the Church Committee. In 1953, a man named Frank Olson died after a fall out of his 10th-floor hotel room window in New York City. He was actually an MK-Ultra insider and he knew all about various things, including the tortures overseas.

Nine days before the fall, he and a group of other members of the team had been dosed with LSD without their knowledge or consent by Gottlieb at a retreat for the CIA’s Technical Services staff. Well, Gottlieb did inform them, but only after they’d finished the spiked bottle of Cointreau.

It was not a great experience for several of the men, including Olson, who started considering resigning the next day. The problem was, as mentioned above, he knew everything about everything. It’s entirely likely that his trip out that hotel window was not a suicide.

Now, I’ve had personal experience with LSD, so I know what it can do. In the right doses and settings, it can be remarkable. But I can also see how somebody being given it without their knowledge and in very high amounts would easily freak out.

Without warning, it would feel like the sudden onset of acute psychosis, with hallucinations and even loss of a sense of self. Another big effect is hyper-awareness of everything, especially all of the minute sounds and smells your body produces. Yes, I’ve heard myself blink.

Your brain’s need to spot patterns in things goes into overdrive, and under the influence it isn’t limited to spotting faces in toast. Any random pattern, like white noise on a TV or a stucco ceiling, will suddenly turn into elaborate geometric patterns of astounding complexity and regularity.

Mine tended to follow the kaleidoscope pattern of six triangles joined in a hexagon, although your mileage may vary. As for the “stained glass windows” I would see when I closed my eyes, those colors would generally be what I can only describe as electric neon shades of pink, purple, and cyan.

Once, while listening to Pink Floyd’s Great Gig in the Sky, those stained glass patterns also included lots and lots of boobs, probably because of the female vocalist, but it was an odd touch considering that I’m mostly on the gay side of the Kinsey scale. Not completely, but close enough for jazz hands.

So do governments contemplate insanely heinous and unethical acts for the sake of national self-preservation? All the time. Do they carry them out often? Not really, because saner heads do prevail and do put the brakes on some of the more batshit insane ideas.

Ideas like Operation Northwoods, which would have used false-flag operations to justify an invasion of Cuba in the early 60s, or the 638 ideas for assassinating Fidel Castro that were considered, most of which were never implemented.

Hm. The CIA seemed to have a boner for getting rid of Castro right before the Cuban Missile Crisis, but we know about all of that again thanks to the Church Committee. And they were so successful at it that the man died at 90 in 2016.

Keep that last part in mind the next time you think that there might be a government conspiracy going on. Governments are no good at them, and people are no good at keeping secrets. Ergo, most conspiracies fall apart quickly, and either never happen or are exposed.

As Ben Franklin said, “Three may keep a secret, if two of them are dead.”

Image source: Voice of America/public domain

Wednesday Wonders: How the world almost ended once

I happen to firmly believe that climate change is real, it is happening, and humans are contributing to and largely responsible for it, but don’t worry — this isn’t going to be a political story. And I’ll admit that I can completely understand some of the deniers’ arguments. No, not the ones that say that “global warming” is a hoax made up so that “evil liberals” in government can tax everyone even more. The understandable arguments are the ones that say, “How could mere humans have such a big effect on the world’s climate?” and “Climate change is cyclic and will happen with or without us.”

That second argument is actually true, but it doesn’t change the fact that our industrialization has had a direct and measurable impact in terms of more greenhouse gases emitted and the planet heating up. Also note: Just because you’re freezing your ass off under the polar vortex doesn’t mean that Earth isn’t getting hotter. Heat just means that there’s more energy in the system and with more energy comes more chaos. Hot places will be hotter. Cold places will be colder. Weather in general will become more violent.

As for the first argument, that a single species, like humans, really can’t have all that great an effect on this big, giant planet, I’d like to tell you a story that will demonstrate how wrong that idea is, and it begins nearly 2.5 billion years ago with the Great Oxygenation Event.

Prior to that point in time, the Earth was mostly populated by anaerobic organisms — that is, organisms that do not use oxygen in their metabolism. In fact, oxygen is toxic to them. The oceans were full of bacteria of this variety. The atmosphere at the time was about 30% carbon dioxide and close to 70% nitrogen, with perhaps a hint of methane, but no oxygen at all. Compare this to the atmosphere of Mars today, which is 95% carbon dioxide, 2.7% nitrogen, and less than 2% other gases. Side note: This makes the movie Mars Attacks! very wrong, because a major plot point was that the Martians could only breathe nitrogen, which is currently 78% of our atmosphere but almost absent in theirs. Oops!

But back to those anaerobic days and what changed them: A species of bacteria called cyanobacteria (often known as blue-green algae) figured out the trick to photosynthesis — that is, producing energy not from food, but from sunlight and a few neat chemical processes. (Incidentally, this was also the first step on the evolutionary path to eyes.) Basically, these microscopic organisms would take in water and carbon dioxide, use the power of photons to break some bonds, and then release the leftover oxygen while keeping the carbon and hydrogen.
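The chemistry those cyanobacteria stumbled onto is the familiar overall equation for photosynthesis: 6 CO2 + 6 H2O, plus light, yields one glucose molecule and 6 O2. As a quick sanity check (a minimal sketch; the helper function and its names are mine, not from any source), the atoms really do balance:

```python
# Overall photosynthesis reaction: 6 CO2 + 6 H2O -> C6H12O6 + 6 O2
# A quick atom-count check that the equation balances.

def count_atoms(molecules):
    """Sum atom counts over (coefficient, {element: count}) pairs."""
    totals = {}
    for coeff, formula in molecules:
        for element, n in formula.items():
            totals[element] = totals.get(element, 0) + coeff * n
    return totals

reactants = [(6, {"C": 1, "O": 2}),          # 6 CO2
             (6, {"H": 2, "O": 1})]          # 6 H2O
products  = [(1, {"C": 6, "H": 12, "O": 6}), # 1 glucose
             (6, {"O": 2})]                  # 6 O2

assert count_atoms(reactants) == count_atoms(products)
print(count_atoms(reactants))  # {'C': 6, 'O': 18, 'H': 12}
```

Every carbon, hydrogen, and oxygen atom that goes in comes out; the trick is that the free O2 ends up in the air instead of safely bound up in water and carbon dioxide.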

At first, things were okay because oxygen tended to be trapped by organic matter (any carbon-containing compound) or iron (this is how rust is made), and there were plenty of both floating around to do the job, so both forms of bacteria got along fine. But there eventually came a point when there were not enough oxygen traps, and so things started to go off the rails. Instead of being safely sequestered, the oxygen started to get out into the atmosphere, with several devastating results.

First, of course, was that this element was toxic to the anaerobic bacteria, and so it started to kill them off big time. They just couldn’t deal with it, so they either died or adapted to a new ecological niche in low-oxygen environments, like the bottom of the sea. Second, though, and more impactful: All of this oxygen wound up taking out whatever atmospheric methane was left and converting it into carbon dioxide. The former is the more powerful greenhouse gas, and so had been keeping the planet warm. The latter was and still is less effective. The end result of the change was a sudden and very long ice age known as the Huronian glaciation, which lasted for 300 million years — the oldest and longest ice age to date. The result of this was that most of the cyanobacteria died off as well.

So there you have it. A microscopic organism, much smaller than any of us and without any kind of technology or even intelligence to speak of, almost managed to wipe out all life forms on the planet and completely alter the climate for tens of millions of years, and they may have tipped the balance in as little as a million years.

We are much, much bigger than bacteria — about a million times, actually — and so our impact on the world is proportionally larger, even though those cyanobacteria vastly outnumbered our current population of around 7.5 billion. These tiny, mindless organisms managed to wipe out most of the life on Earth at the time and change the climate for far longer than humans have even existed.

Don’t kid yourself by thinking that humanity cannot and is not doing the same thing right now. Whether we’ll manage to turn the planet into Venus or Pluto is still up for debate. Maybe we’ll get a little of both. But to try to hand-wave it away by claiming we really can’t have that much of an impact is the road to perdition. If single-celled organisms could destroy the entire ecosystem, imagine how much worse we can do with our roughly 30 to 40 trillion cells, and then do your best to not contribute to that destruction.

Theatre Thursday: Fact and fiction

About six hundred and eight years ago, Henry V was crowned king of England. You probably know him as that king from the movie with Kenneth Branagh, or the BBC series aired under the title The Hollow Crown.

Either way, you know him because of Shakespeare. He was the king who grew up in Shakespeare’s second tetralogy. Yes, that’s a set of four plays, and since it was his second, Shakespeare sort of did the Star Wars thing first: he wrote eight plays on England’s dynastic civil wars, the struggle that culminated in the Wars of the Roses.

And, much like Lucas, he wrote the original tetralogy first, then went back and did the prequels. Richard II, Henry IV, Part 1, Henry IV, Part 2 and Henry V were written after but happened before Henry VI, Part 1, Henry VI, Part 2, Henry VI, Part 3 and Richard III.

Incidentally, Henry VI, Part 1, is famous for having Joan of Arc (aka Joan la Pucelle in the play) as one of the antagonists. Funny thing is, that name wasn’t slander on Shakespeare’s part. That’s what she preferred to call herself.

Meanwhile, Richard III, of course, is the Emperor Palpatine of the series, although we never did get a Richard IV, mainly because he never existed in history. Well, not officially. Richard III’s successor was Henry VII, and Shakespeare never wrote about him, either, although he did gush all over Henry VIII, mainly because he was the father of the Bard’s patron, Elizabeth I. CYA.

If you’ve ever seen the film My Own Private Idaho, directed by Gus Van Sant and starring River Phoenix and Keanu Reeves, then you’ve seen a modern retelling of the two parts of Henry IV.

Now when it comes to adapting true stories to any dramatic medium, you’re going to run into the issue of dramatic license. A documentary shouldn’t have this problem and shouldn’t play with the truth, although it happens. Sometimes, it can even prove fatal.

But when it comes to a dramatic retelling, it is often necessary to fudge things, sometimes a little and sometimes a lot. It’s not at all uncommon for several characters to be combined into a composite just to make for a cleaner plot. After all, is it that big of a difference if, say, King Flagbarp IX in real life was warned about a plot against him in November by his chamberlain Norgelglap, but the person who told him the assassin’s name in February was his chambermaid Hegrezelda?

Maybe, maybe not, but depending on what part either of those characters plays in the rest of the story, as well as the writer’s angle, they may both be combined as Norgelglap or as Hegrezelda, or become a third, completely fictionalized character, Vlanostorf.

Time frames can also change, and a lot of this lands right back in Aristotle’s lap. He codified the rules of drama long before hacks like the late Syd Field tried (and failed) to, and Aristotle put it succinctly: every dramatic work has a beginning, a middle, and an end, and should have unity of place, unity of time, and unity of action.

A summary of the last three is this, although remember that Aristotle was writing about the stage. For film and video, your mileage will vary slightly.

The story takes place in one particular location, although that location can be a bit broad. It can be the king’s castle, or it can be the protagonist’s country.

It should take place over a fairly short period of time. Aristotle liked to keep it to a day, but there’s leeway, and we’ve certainly seen works that have taken place over an entire lifetime — although that is certainly a form of both unity of time and unity of place, if you consider the protagonist to be the location as well.

Unity of action is a little abstract, but in a nutshell it’s this: Your plot is about one thing. There’s a single line that goes from A to Z: What your protagonist wants, and how they get it.

Now, my own twist on the beginning, middle, and end thing is that this is a three act structure that gives us twenty-seven units. (Classical tradition later settled on 5 acts, which Shakespeare used, but that’s long since fallen out of fashion.)

Anyway, to me, we have Act I, II, and III. Beginning, middle, and end. But each of those has its own beginning, middle and end. So now we’re up to nine: I: BME; II: BME; III: BME.

Guess what? Each of those subunits also has a beginning, middle, and end. I’m not going to break that one down further than this. The beginning of the beginning, Act I: B, has its own BME, repeat eight more times.

The end result is 3 x 3 x 3, or twenty-seven.
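That 3 x 3 x 3 breakdown can be enumerated directly. Here’s a minimal sketch (the act and beat labels are my own shorthand, not a standard notation):

```python
# The 3 x 3 x 3 structure: three acts, each with a beginning, middle,
# and end, each of which has its own beginning, middle, and end.
from itertools import product

ACTS = ["I", "II", "III"]
PARTS = ["B", "M", "E"]

# e.g. "I:BM" = Act I, the middle of its beginning
units = [f"{act}:{part}{sub}" for act, part, sub in product(ACTS, PARTS, PARTS)]

print(len(units))  # 27
print(units[:4])   # ['I:BB', 'I:BM', 'I:BE', 'I:MB']
```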

And that’s my entire secret to structure. You’re welcome.

But because of these little constraints, and because history is messy, it’s necessary to switch things up to turn a true story into a “based on true events” work. Real life doesn’t necessarily have neat beginnings, middles, and endings. It also doesn’t necessarily take place in one spot, or in a short period of time.

So it becomes the artist’s job to tell that story in a way that is as true to reality as possible without being married to the facts.

Although it is also possible to go right off the rails with it, and this is one of the reasons I totally soured on Quentin Tarantino films. It’s one thing to fudge facts a little bit, but when he totally rewrites history in Inglourious Basterds, ignores historical reality in Django Unchained, and then curb stomps reality and pisses on its corpse in Once Upon a Time in Hollywood, I’m done.

Inglorious Misspelling is a particularly egregious example because the industry does a great disservice in selling false history to young people who, unfortunately, aren’t getting the best educations right now.

Anecdotal moment: A few years back, an Oscar-winning friend of mine had a play produced that told the story of the 442nd Infantry Regiment, a unit composed almost entirely of second-generation Japanese Americans during WW II. Joining it was the alternative given to going to an internment camp.

Of course, being racists, the U.S. government couldn’t send them to the Pacific Theatre to fight, so they sent them to Europe, and a lot of the play takes place in Italy, where the regiment was stationed. And, at intermission, my playwright friend heard two 20-something audience members talking to each other. One of them asked, “What was the U.S. even doing in Italy in World War II?” and the other just shrugged and said, “Dunno.”

So, yeah. If you’re going to go so far as to claim that Hitler was killed in a burning movie theater before the end of the war, just stop right there before you shoot a frame. Likewise with claiming that the Manson murders never happened because a couple of yahoos ran into the killers first.

Yeah, Quentin, you were old, you were there, you remember. Don’t stuff younger heads with shit.

But I do digress.

In Shakespeare’s case, he was pretty accurate in Henry V, although in both parts of Henry IV, he created a character who was both one of his most memorable and one of his more fictional: Sir John Falstaff. In fact, the character was so popular that, at the Queen’s command, Shakespeare gave him his own spinoff, The Merry Wives of Windsor. Hm. Shades of Solo in the Star Wars universe?

Falstaff never existed in real life, but was used as a way to tell the story of the young and immature Henry (not yet V) of Monmouth, aka Prince Hal.

Where Shakespeare may have played more fast and loose was in Richard III. In fact, the Bard vilified him when it wasn’t really deserved. Why? Simple. He was kissing up to Elizabeth I. She was a Tudor, daughter of Henry VIII who, as mentioned previously, was the son of Henry VII, the king who took over when Richard III lost the Wars of the Roses.

The other time that Shakespeare didn’t treat a king so well in a play? King John — which I personally take umbrage at, because I’m directly descended from him. No, really. But the idea when Willie Shakes did that was to draw a direct contrast to how Good Queen Bess did so much better in dealing with Papal interference. (TL;DR: He said, “Okay,” she said, “Eff off.”)

Since most of my stage plays have been based on true stories, I’ve experienced this directly many times, although one of the more interesting came with the production of my play Bill & Joan, because I actually accidentally got something right.

When I first wrote the play, the names of the cops in Mexico who interrogated him were not included in the official biography, so I made up two fictional characters, Enrique and Tito. And so they stayed like that right into pre-production in 2013.

Lo and behold, a new version of the biography of Burroughs I had originally used for research came out, and I discovered two amazing things.

First… I’d always known that Burroughs’ birthday was the day before mine, but I suddenly found out that his doomed wife actually shared my birthday. And the show happened to run during both dates.

Second… the names of the cops who interrogated him were finally included, and one of them was named… Tito.

Of course, I also compressed time, moved shit around, made up more than a few characters, and so forth. But the ultimate goal was to tell the truth of the story, which was: Troubled couple who probably shouldn’t have ever gotten together deals with their issues in the most violent and tragic way possible, and one of them goes on to become famous. The other one dies.

So yes, if you’re writing fiction it can be necessary to make stuff up, but the fine line is to not make too much stuff up. A little nip or tuck here and there is fine. But, outright lies? Nah. Let’s not do that.

Talky Tuesday: Language is (still) a virus

The following is a repost from the end of March 2020, just eleven days after the lockdown started in Los Angeles. This was the view from the other side of the pandemic, before we knew anything.

I used this Burroughs quote as a post title a couple of years ago in an entirely different context, but the idea has taken on new relevance, as I’m sure the world can now agree.

This post’s title comes from a William S. Burroughs quote which reads in full as, “Language is a virus from outer space.”

What he meant by the first part is that words enter a host, infect it, and cause a change in it. Just as a virus hijacks a host’s cells in order to become little factories to make more virus to spread a disease, words hijack a host’s neurons in order to become little factories to make more words to spread ideas and belief systems.

As for the “outer space” part, I think that Burroughs was being metaphorical, with the idea being that any particular language can appear totally alien to any other. While, say, Mongolian and Diné may both come from humans on Earth, if a speaker of either encountered someone who only spoke the other, they might as well be from different planets because, for all intents and purposes, they are from different worlds, unable to communicate with words.

And the language we use can quite literally create and shape our perceptions of the world, as I discussed in my original Language is a virus post. One of the most striking examples I cited in that link was Guugu Yimithirr, an aboriginal language that has no words for relative direction. Instead, they refer to everything based upon where it is relative to actual cardinal directions.

In other words, if you ask someone who speaks this language where you should sit, they won’t say, “In the chair on your left.” Rather, they’ll say something like, “In the chair to the north.” Or south, or east, or west. And a speaker of the language will know what that means, whether they can see outside or not.
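The mental bookkeeping a speaker has to do can be sketched as a small conversion: given which way you’re facing, an egocentric direction like “left” maps to a fixed compass direction. This is purely an illustration (the function and lookup tables are hypothetical, not drawn from any linguistic source):

```python
# Egocentric ("your left") vs. geocentric ("to the north") reference frames.
# A speaker of a language like Guugu Yimithirr effectively performs this
# conversion constantly, because only the compass direction gets spoken.

CARDINALS = ["north", "east", "south", "west"]
TURNS = {"ahead": 0, "right": 1, "behind": 2, "left": 3}  # quarter-turns clockwise

def to_cardinal(facing, relative):
    """Convert a relative direction to a compass direction, given a heading."""
    i = CARDINALS.index(facing)
    return CARDINALS[(i + TURNS[relative]) % 4]

print(to_cardinal("north", "left"))   # west
print(to_cardinal("south", "right"))  # west
```

The catch, of course, is the `facing` argument: an English speaker gets to be lazy and skip it, while a Guugu Yimithirr speaker has to know their heading at all times.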

Quick — right now, if someone said “Point east,” could you do it without thinking?

And that is how languages can change thoughts and perceptions.

But, sometimes — honestly, far too often — language can change perceptions to create tribalism and hostility, and during this plague year, that has suddenly become a huge topic of debate over a simple change of one C word in a phrase.

I’m writing, of course, about “coronavirus” vs. “Chinese virus.” And the debate is this: Is the latter phrase racist, or just a statement of fact?

One reporter from a rather biased organization did try to start the “it’s not” narrative with the stupidest question ever asked: “Mr. President, do you consider the term ‘Chinese food’ to be racist because it is food that originated from China?”

There are just two problems with this one. The first is that what Americans call “Chinese food” did not, in fact, originate in China. It was the product of Chinese immigrants in America who, being mostly men, didn’t know how to cook, and didn’t have access to a lot of the ingredients from back home. So… they improvised and approximated, and “Chinese food” was created by Chinese immigrants starting in San Francisco in the 19th century.

Initially, it was cuisine meant only for Chinese immigrants because racist Americans wouldn’t touch it, but when Chinatowns had sprung up in other cities, it was New York’s version that finally lured in the hipster foodies of the day to try it, and they were hooked.

In short, “Chinese food” was a positive and voluntary contribution to American culture, and the designation here is merely descriptive, so not racist. “Chinese virus” is a blatant misclassification at best and a definite attempt at a slur at worst, with odds on the latter.

But we’ve seen this with diseases before.

When it comes to simple misidentification of place of origin, let’s look back to almost exactly a century ago, when the Spanish flu went pandemic. From 1918 to 1919, it hit every part of the world, infected 500 million people and killed 50 million.

A little perspective: At the time, the world’s population was only 1.8 billion, so this represents an infection rate of 28% and a death toll equal to 2.8% of everyone alive (a fatality rate of 10% among the infected). If COVID-19 has similar statistics — and it seems to — then that means this pandemic will infect 2.1 billion people and kill 211 million.
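Those projections follow from simple scaling. Here’s the arithmetic spelled out (using the 1918 figures above and a 7.5 billion world population; the exact death toll you get varies slightly with the population figure used):

```python
# Back-of-the-envelope scaling of the 1918 pandemic to a ~7.5 billion world.
# Figures from the text: 500 million infected and 50 million dead,
# out of a world population of 1.8 billion.

world_1918 = 1.8e9
infected_1918 = 500e6
deaths_1918 = 50e6

infection_rate = infected_1918 / world_1918    # ~0.28, i.e. ~28% of everyone
fatality_rate = deaths_1918 / infected_1918    # 0.10, i.e. 10% of the infected
overall_death_rate = deaths_1918 / world_1918  # ~0.028, i.e. ~2.8% of everyone

world_now = 7.5e9
print(round(world_now * infection_rate / 1e9, 1))               # 2.1 (billion infected)
print(round(world_now * infection_rate * fatality_rate / 1e6))  # 208 (million dead)
```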

By the way, while the 1918 pandemic was especially deadly for children under 5 and adults over 65, it also hit one other demographic particularly hard: 20 to 40 year-olds.

So if you’re in that age range and think that COVID-19 won’t kill you, think again — particularly if you smoke or vape or have asthma, and don’t take the quarantine seriously. And remember: the rich and world leaders are not immune either — not now and not then.

The president of the United States, Woodrow Wilson, caught the H1N1 flu in 1919, and while he survived, the assault on his health probably contributed to the stroke he had late that year, an incident that was covered up by his wife with the help of the president’s doctor. The First Lady became de facto president for the remainder of his second term.

In modern times, the first world leader to test positive for coronavirus was Prince Albert II of Monaco, followed not long after by Prince Charles and Boris Johnson. Of course, I’m writing these words a bit ahead of when you’ll read them, so who knows what will have happened by then.

In medical circles, the name “Spanish Flu” has been abandoned, and that particular pandemic is now known as H1N1, which I’m sure looks really familiar to you, because this has been the standard nomenclature for flu viruses for a while: H#N#, sans location, animal, or occupation, more on which in a minute.

But first, let’s get to the reasons behind naming a disease after a place. The H1N1 Pandemic was a simple case of mistaken identity and also contingent upon that whole “Great War” stuff going on in Europe.

See, other countries had been hit by it first, but in the interests of the old dick-waving “Gotta appear strong before our enemies” toxic masculinity, all of them suppressed the news. It wasn’t until Spain started seeing it in their citizens and, because they were neutral, they announced outbreaks, that the world suddenly pointed fingers and said, “Ooh… this came from Spain. Hence, it’s the Spanish flu.”

Except, not. Ironically, it seems now that the Spanish flu originated in… China. Although that’s according to historians. Virologists, on the other hand, have linked it to an H1 strain later identified in pigs in Iowa in the U.S.

Either way, all of the countries involved in WW I, aka “The Great War,” kept mum about it.

So the name “Spanish flu” was a simple mistake. On the other hand, the names of other diseases actually are outright xenophobic or racist, and we only have to look as far as syphilis to see why.

Basically, syphilis is an STI that was the most feared of its kind until… AIDS, because syphilis was not treatable or curable until penicillin was discovered in 1928 — although it was not produced on a mass scale until 1945, thanks to needs created by WW II, and facilitated by the War Production Board.

Hm. Sound familiar?

But the reason it became known as the French disease outside of France was that it began to spread after Charles VIII of France invaded Italy in 1494-95 to reclaim a kingdom he thought should be his. It was eventually so devastating that Charles had to take his troops home, and so it began to spread in France and across Europe.

Since it first showed up in French soldiers, it was quickly dubbed the French disease in Italy and England, although the French preferred to call it the Italian disease. In reality, it most likely originated in the New World, and was brought back to Europe by… Columbus and his Spanish soldiers, who then somehow managed to spread it to the French as they worked for them as mercenaries.

Hm. STI. A bunch of male soldiers. I wonder how that worked, exactly.

And I am totally destroying my future google search suggestions by researching all of this for you, my loyal readers, just so you know! Not to mention that I can’t wait to see what sort of ads I start getting on social media. “Confidential STI testing!” “Get penicillin without a prescription.” “These three weird tricks will cure the STI. Doctors hate them!”

But the naming of diseases all came to a head almost five years ago when the World Health Organization (WHO) finally decided, “You know what? We shouldn’t name diseases after people, places, animals, occupations, or things anymore, because that just leads to all kinds of discrimination and offense, and who needs it?”

This directly resulted from the backlash against the naming of the last disease ever named for a place, despite the attempt to obfuscate that in its official terminology. Remember MERS, anyone? No? That one came about in 2012, was first identified in Saudi Arabia, and was named Middle East respiratory syndrome.

Of course, it didn’t help when things were named swine flu or avian flu, either. A lot of pigs and birds died over those designations. So away went such terminology, especially because of the xenophobic and racist connotations of naming a disease after an entire country or people.

Of course, some antiquated folk don’t understand why it’s at the least racist and at the most dangerous to name diseases the old way, as evinced by the editorial tone of this article from a right-wing publication like The Federalist. But they actually kind of manage to prove the point that yes, such terminology is out of fashion, because the only 21st century example they can list is the aforementioned MERS.

The most recent one before that? Lyme disease, named for Lyme, Connecticut, and designated in… the mid-70s. Not exactly the least racist of times, although this disease was named for a pretty white-bread area.

The only other examples of diseases named for countries on their list: the aforementioned Spanish flu, Japanese encephalitis, named in 1871 (seriously, have you ever heard of that one?); and the German measles, identified in the 18th century, although more commonly known as rubella.

So, again — it’s a list that proves the exact opposite of what it set out to do, and calling coronavirus or COVID-19 the “Chinese virus” or “Chinese disease” is, in fact, racist as hell. And it won’t save a single life.

But calling it that will certainly endanger lives by causing hate crimes — because language is a virus, and when people are infected by malignant viruses, like hate speech, the results can be deadly.

Inoculate yourselves against them with education if possible. Quarantine yourselves against them through critical thinking otherwise. Most of all, through these trying times, stay safe — and stay home!

Image source: Coronavirus Disease 2019 Rotator Graphic for (U.S. Air Force Graphic by Rosario “Charo” Gutierrez)

Look, up in the sky!

Throughout history, humans have been fascinated with the sky, and a lot of our myths were attempts to explain what goes on up there. In many cultures, the five planets visible to the naked eye (Mercury, Venus, Mars, Jupiter, and Saturn) were named after deities or attributes of the planets with surprising consistency.

Mercury was often named for its swiftness in orbiting the Sun; Venus was always associated with beauty because of its brightness; Mars’s red color led to it being named either for a violent deity or that color; Jupiter was always associated with the chief deity even though nobody in those times had any idea it was the largest planet; and Saturn, at least in the west, was named after Jupiter’s father.

This led to Uranus, which wasn’t discovered until the 18th century, being named after Saturn’s father, i.e. Jupiter’s grandfather. Neptune, discovered in the 19th century, and Pluto, discovered in the 20th century before being rightfully demoted from planetary status, were only named for Jupiter’s less cool brothers.

Since the planets were given attributes associated with deities, their relationship to each other must have meant something, and so the bogus art of astrology was invented, although it was obviously not complete prior to Uranus, Neptune, and Pluto being added, but then was clearly incorrect during the entire period of time that Pluto was a planet. (Hint: That was a joke. It was incorrect the entire time.)

Since humans are also hard-wired to see patterns, the stars above led to the definition of constellations, the night-time version of the “What is that cloud shaped like?” game.

It wasn’t really until the Renaissance and the rise of science, including advances in optics (a field Newton helped revolutionize) that gave us telescopes, that we really started to study the skies. But it is still astounding how many laypeople know so little about what’s up there that completely natural phenomena have been freaking us out since forever. Here are five examples of things in the sky that made people lose their stuff.

1. Total eclipse of the heart… er… Sun

Until science openly explained them, eclipses of any kind were scary. For one thing, nobody knew when they were coming until the royal astronomer became a thing — and even then, only the elite were privy to the information. So the Sun would go out, or the Moon would turn blood red, or either one of them would appear to lose a piece, at random and without warning. Generally, the belief was that the Moon or Sun (particularly the latter) was being consumed by some malevolent yet invisible beast that needed to be scared away.

But long after modern science explained that an eclipse was nothing more than the Moon passing in front of the Sun or the Earth passing in front of the Moon, shit went down in 1878, at least in low-information areas.

The thing about this eclipse was that it had been predicted close to a century before, had been well-publicized, and was going to put the path of totality across the entire U.S. for the first time since its founding. There’s even a book about it, American Eclipse. But there’s also a tragic part of the story. While the news had spread across most of the growing nation, it didn’t make it to Texas, and farm workers there, confronted with the sudden loss of the Sun, took it to mean all kinds of things. A lot of them thought that it was a portent of the return of Jesus, and in at least one case, a father killed his children and then himself in order to avoid the apocalypse.

2. Captain Comet!

Ah, comets. They are an incredibly rare sight in the sky and well worth traveling to see if that’s what you need to do. I remember a long trek into the darkness when I was pretty young to go see Comet Hyakutake, and yes it was worth it. It was a glorious blue-green fuzzball planted in space with a huge tail. Of course, I knew what it was. In the past, not so much.

In the ancient world, yet again, they were seen as bad omens because something in the heavens had gone wrong. The heavens, you see, were supposed to be perfect, but there was suddenly this weird… blot on them. Was it a star that grew fuzzy? Was it coming to eat the Earth? What could be done?

That may all sound silly, but as recently as 1910, people still flipped their shit over the return of one of the more predictable and periodic of “fuzzy stars.” That would be Comet Halley. And, by the way, it’s not pronounced “Hay-lee.” It’s “Hall-lee.”

And why did it happen? Simple. A French astronomer who should have known better wrote that the tail of the comet was full of gases, including hydrogen and cyanide, and that if the Earth passed through the tail, we would either be gassed to death or blown up. Unfortunately, another French astronomer of the time actually played “Got your back” with him, and that was all it took.

It was pseudoscience bullshit at its finest, propagated by the unquestioning and uninformed (when it comes to science) media, and it created a panic even though it was completely wrong.

The worst part about Halley’s 1910 appearance? It bore out Mark Twain’s statement (probably paraphrased): “I came into the world with it, I will go out with it.” And he did. Goddammit.

3. Meteoric rise is an oxymoron

And it definitely is, because a meteor only becomes one because it’s falling. And while we’re here, let’s look at three often confused words: Meteor, meteoroid, and meteorite.

The order is this: Before it gets here and is still out in space, it’s a meteoroid. Once it hits our atmosphere and starts to glow and burn up, it has become a meteor. Only the bits that actually survive to slam into the planet get to be called meteorites. -Oid, -or, -ite. I suppose you could think of it as being in the vOID, coming fOR you, and then crash, goodnITE.

So the things that mostly cause panic are meteors, and quite recently, a meteor blowing up over Russia convinced people that they were under attack. It was a fireball that crashed into the atmosphere on February 15, 2013, and it actually did cause damage and injuries on the ground.

The numbers on the Chelyabinsk meteor are truly staggering, especially when you consider that they involved no high explosives, just friction and pure physics (Hello again, Sir Isaac!). The thing was about 66 feet in diameter, which is the length of a cricket pitch, or about four feet longer than a bowling lane. It compares to a lot of things, and you can find some fun examples here.

But there was nothing fun about this asteroid. It came screaming through our atmosphere at about 41,000 miles an hour at a shallow angle. The heat from the friction of our atmosphere quickly turned it into a fireball of the superbolide variety, which is one that is brighter than the sun. It exploded about 18 miles up. That explosion created a fireball of hot gas and dust a little over four miles in diameter. The kinetic energy of the event was about 30 times the force of the atom bomb dropped on Hiroshima.

Over 7,200 buildings were damaged and 1,500 people were injured enough to need medical attention, mostly due to flying glass and other effects of the shockwave. Unlike other items on this list, these events actually can be dangerous, although this was the first time in recorded history that people were known to have been injured by a meteor. The Tunguska event, in 1908, was a little bit bigger and also in Russia, but happened in a remote and sparsely populated area, with no reported human injuries. Local reindeer were not so lucky.
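If you want to sanity-check those numbers yourself, a back-of-the-envelope calculation gets you into the same ballpark. The size and speed are the figures above; the density is an assumed typical value for a stony meteoroid, so treat the output as an order-of-magnitude estimate, not an official figure.

```python
import math

# Rough energy estimate for the Chelyabinsk meteor.
# Diameter and speed are from the figures above; the density is an
# assumed typical value for a stony body, so this is only a rough check.
diameter_m = 20.0        # ~66 feet
density = 3300.0         # kg/m^3 (assumed)
speed = 19_000.0         # m/s, ~41,000 mph

radius = diameter_m / 2
mass = (4 / 3) * math.pi * radius**3 * density    # mass of a sphere, kg
kinetic_energy = 0.5 * mass * speed**2            # joules

kilotons = kinetic_energy / 4.184e12   # 1 kiloton of TNT = 4.184e12 J
hiroshimas = kilotons / 15             # Hiroshima bomb was ~15 kt

print(f"mass: {mass:.2e} kg")
print(f"energy: ~{kilotons:.0f} kilotons of TNT, ~{hiroshimas:.0f} Hiroshimas")
```

That lands within spitting distance of the published estimate of roughly 500 kilotons, which is how you get to “about 30 Hiroshimas” from nothing but rock, speed, and physics.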

4. Conjunction junction, what’s your function?

A conjunction is defined as any two or more objects in space which appear to be close together or overlapping when viewed from the Earth. Every solar eclipse is a conjunction of the Sun and Moon as seen from the Earth. Oddly enough, a lunar eclipse is not a conjunction from our point of view, because it’s our planet that’s casting the shadow on the Moon.

Conjunctions shouldn’t be all that surprising for a few reasons.

First is that most of the planets pretty much orbit in the same plane, defined by the plane in which the Earth orbits because that makes the most sense from an observational standpoint.

The inclination of Earth’s orbit is zero degrees by definition, and the plane we orbit in is called the ecliptic. You can probably figure out where that name came from. Out of the planets, the one with the greatest inclination is Mercury, at 7º. Counting objects in the solar system in general, the dwarf planet Pluto has an inclination of 17.2º — which is just another argument against it being a true planet. None of the other planets have an inclination of more than about 4º, which really isn’t a whole lot.

The second reason conjunctions should not be all that surprising is that each planet has to move at a particular velocity relative to its distance from the Sun to maintain its orbit. The farther out a planet is, the slower it moves. Although this is a function of gravity, the airplane analogy will show you why speed and altitude are linked.

As an airplane gains speed, the velocity of air over the wings increases, generating more lift, bringing the airplane higher. In space, there’s no air to deal with, but remember that any object in orbit is essentially falling around the body it orbits, but doing it fast enough to keep missing.

If it slows down too much, it will start to fall inward, but if it speeds up, its orbit will get bigger — although once it settles into that bigger orbit, its average speed is lower. This is directly analogous to ballistics, which describes the arc of a flying projectile: the faster it launches, the farther it goes and the bigger the arc it makes. An arc in orbit becomes an ellipse.
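The speed-versus-distance relationship can be sketched with the standard circular-orbit formula, v = √(GM/r). The constants are standard values and the planetary distances are rounded, so the results are approximate:

```python
import math

# Circular-orbit speed: v = sqrt(G * M_sun / r).
# The farther from the Sun, the slower the orbit.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30     # mass of the Sun, kg
AU = 1.496e11        # one astronomical unit, m

for name, r_au in [("Mercury", 0.39), ("Earth", 1.0), ("Neptune", 30.1)]:
    v_kms = math.sqrt(G * M_SUN / (r_au * AU)) / 1000
    print(f"{name}: ~{v_kms:.1f} km/s")
```

Those come out around 48, 30, and 5 km/s respectively, which matches the real orbital speeds: innermost Mercury races along while distant Neptune crawls.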

Since every planet is moving at the speed required to keep it at its particular distance, things are likely to sync up occasionally. Sometimes it will only be one or two planets, but on occasion it will be most or all of them. This video is a perfect example. Each one of the balls is on a string of a different length, so its natural period is different. Sometimes the pattern becomes quite chaotic, but every so often it syncs up perfectly. Note that all of them did start in sync, so it is mathematically inevitable that they will sync up again at the point where all of the different periods reach a common multiple. Our solar system is no different, since the planets all started as rings of gas and dust swirling around the Sun. There was a sync point somewhen.
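You can even compute how often any two planets sync up, using the standard synodic-period formula; the orbital periods below are rounded sidereal periods in years:

```python
# Synodic period: time between successive "sync-ups" of two planets,
# given their orbital periods: 1/S = |1/P1 - 1/P2|.
def synodic_period(p1_years, p2_years):
    return 1 / abs(1 / p1_years - 1 / p2_years)

# Earth and Mars line up roughly every 2.1 years...
print(f"Earth-Mars: {synodic_period(1.0, 1.881):.2f} years")
# ...while Jupiter-Saturn "great conjunctions" come every ~20 years.
print(f"Jupiter-Saturn: {synodic_period(11.86, 29.46):.1f} years")
```

Which is why Mars oppositions are a roughly-every-two-years event and great conjunctions of Jupiter and Saturn only happen about once a generation.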

So conjunctions are a completely normal phenomenon, but that doesn’t mean that people haven’t gone completely stupid with them. The first way is via astrology, which isn’t even worth debunking because it’s such a load. The Sun contains 99.8% of the mass of the solar system, so it hands-down has more influence in every possible way over everything else. What influence did the planets have upon your personality at birth? Effectively zero. The only relevant factor, really, is that people’s personalities are shaped by their relative age when they started school, which is influenced by the season they were born in — but that’s about it.

As for the modern version of people going completely stupid over conjunctions, it happened in the early 1980s, after the 1974 publication of the book The Jupiter Effect by John Gribbin and Stephen Plagemann. In short, they predicted that a conjunction of the planets on March 10, 1982 would cause a series of earthquakes that would wipe out Los Angeles.

Since you’re reading this in at least the year 2020 and I’m quite safely in Los Angeles, you know how their prediction turned out. This didn’t stop them from backtracking a month later and releasing a follow-up book called The Jupiter Effect Reconsidered (aka We Want More Money from the Gullible) in which they claimed, “Oh… we miscalculated. The date was actually in 1980, and the conjunction (that hadn’t happened yet) caused Mount St. Helens to erupt.”

Still, just like with the whole end of the world 2012 predictions, at least some people bought into it.

5. The original star wars

The last item on our list is possibly a one-off, occurring on April 14, 1561 in Nuremberg, Germany. Whether it actually even happened is debatable since only a single account of it survives in the form of a broadsheet — basically the blog post of its day. If it had been as widespread as the story makes it seem, after all, there should have been reports from all across Europe unless, of course, the point of view from Nuremberg created the exceptional event in the first place.

It was described as an aerial battle that began between 4 and 5 a.m. when “a dreadful apparition occurred on the sun.” I’ll quote the rest of the paragraph in translation in full from the article linked above: “At first there appeared in the middle of the sun two blood-red semicircular arcs, just like the moon in its last quarter. And in the sun, above and below and on both sides, the color was blood, there stood a round ball of partly dull, partly black ferrous color. Likewise there stood on both sides and as a torus about the sun such blood-red ones and other balls in large numbers, about three in a line and four in a square, also some alone.”

The unknown author goes on to describe the objects — spheres, rods, and crosses — as battling with each other for about an hour, swirling back and forth. Eventually, the objects seemed to become fatigued and fell to the Earth, where they “wasted away… with immense smoke.”

Now, what could have caused this phenomenon? The obvious answers are that it was a slow news day or that it was religious propaganda or some other wind-up. But if it were an actual phenomenon and really only remarked on in one village, then it’s quite possible that it was a meteor shower with an apparent radiant, or source, that happened to line up with the Sun.

It was a Monday, with a new Moon. The Sun rose in the east at 5:05 a.m., so the invisible Moon was somewhere around that part of the sky, too. But this also immediately calls the story into question, since the phenomenon seen coming from the Sun happened before sunrise according to the account. But if we consider that to just be human error, what we have is the Pearl Harbor effect. The attackers come in with the rising Sun behind them, making them hard to see or understand.

On top of that, if they’re coming in from that direction, they’re coming in at a very shallow angle. See the notes on the Russian meteor above. This can lead to some super-heated objects, which would glow red as reported, and anything not red hot against the Sun would appear black. If it happened to be a swarm of objects, like a bunch of small rocks and dust or a bigger piece that broke up, all flying in at once, the chaotic motion could certainly make it seem like a battle.

There is a meteor shower that happens around that time of year called the Lyrids, which is very short-lived, although I haven’t yet been able to find out whether its radiant was near the Sun in 1561. But a particularly heavy shower coming in at just the right angle could have an unusual effect in a limited area.

Or… the author just pulled it out of his ass for his own reasons. We can never know.

Photo credit: Staff Sgt. Christopher Ruano, F.E. Warren Air Force Base.

Momentous Monday: Backwards and in high heels

The famous astronomer Herschel was responsible for a lot of accomplishments, including expanding and organizing the catalog of nebulae and star clusters, discovering eight comets, polishing and mounting mirrors and telescopes to optimize their light-gathering powers, and keeping meticulous notes on everything.

Herschel was awarded a Gold Medal of the Royal Astronomical Society and named an honorary member thereof, held a government position, and received a salary as a scientist — the first woman ever to do any of those things.

What? Did you think I was talking about the other one? You know — the only one most of you had heard of previously, because he discovered Uranus. Oh, and he had that whole penis thing going on.

Caroline Lucretia Herschel, who was William’s younger sister by eleven years and was born in 1750, did not have a penis, and so was ignored by history. Despite the honors she received, one of her great works, the aforementioned expansion of the catalog that became the New General Catalogue (NGC), was published with her brother’s name on it.

If you’re into astronomy at all, you know that the NGC is a big deal and has been constantly updated ever since.

While she lacked William’s junk, she shared his intellectual curiosity, especially when it came to space and studying the skies. It must have been genetic — William’s son John Herschel was also an astronomer of some repute — and it was his Aunt Caroline, not Dad, who gave him a huge boost.

She arranged all of the objects then in the NGC so they were grouped by similar polar coordinates — that is, at around the same number of degrees away from the celestial poles. This enabled her nephew to systematically resurvey them, add more data about them, and discover new objects.

Caroline was not the first woman in science to be swept under history’s rug by the men. The neverending story of the erasure of women told in Hidden Figures was ancient by the time the movie came out, never mind the time that it actually happened. Caroline was in good company.

Maria Winckelmann Kirch, for example, was also an astronomer, born 80 years before Caroline and most likely the first woman to actually discover a comet. But of course history gave that honor to her husband, Gottfried Kirch, who was thirty years her senior. However, Herr Kirch himself confirms in his own notes that she was the one who found it:

“Early in the morning (about 2:00 AM) the sky was clear and starry. Some nights before, I had observed a variable star and my wife (as I slept) wanted to find and see it for herself. In so doing, she found a comet in the sky. At which time she woke me, and I found that it was indeed a comet… I was surprised that I had not seen it the night before”. [Source]

Maria’s interest and abilities in science came from a person we might think of as unlikely nowadays: a Lutheran minister, who happened to be her father. Why did he teach her? Because he believed that his daughter deserved the same education any boy at the time did, so he home-schooled her. This ended when Maria lost both of her parents when she was 13, but a neighbor and self-taught astronomer, Christoph Arnold, took her on as an apprentice and took her in as part of the family.

Getting back to Hidden Figures, though, one of the earliest “computers,” as these women of astronomy were known, was Henrietta Leavitt. Given what was considered the boring and onerous task of studying a class of stars known as Cepheid variables, she actually discovered something very important.

The length of time it takes a Cepheid to go through its brightest-to-darkest sequence is directly related to its luminosity. This means that if you know the timing of that sequence, you know how bright the star actually is. Once you know that, you can look at how bright it appears to be from Earth and, ta-da! Using very basic laws of optics, you can then determine how far away the star is.
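Leavitt’s method can be sketched in a few lines of code. The period-luminosity coefficients below are one approximate modern calibration (the exact numbers vary by survey, so take them as illustrative, not authoritative):

```python
import math

# Leavitt's method, sketched: pulsation period -> absolute magnitude
# (intrinsic brightness) -> distance, via the distance modulus.
# The P-L coefficients here are one rough modern calibration (assumed).
def cepheid_distance_pc(period_days, apparent_mag):
    absolute_mag = -2.43 * (math.log10(period_days) - 1.0) - 4.05
    # Distance modulus: m - M = 5*log10(d) - 5, with d in parsecs
    return 10 ** ((apparent_mag - absolute_mag + 5) / 5)

# Delta Cephei itself: period ~5.37 days, mean apparent magnitude ~3.95
print(f"~{cepheid_distance_pc(5.37, 3.95):.0f} parsecs")
```

That comes out to roughly 300 parsecs, in the neighborhood of the star’s measured distance of about 270 — not bad for a yardstick made of nothing but timing and brightness.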

It’s for this reason that Cepheids are known as a “standard candle.” They are the yardsticks of the universe that allow us to measure the unmeasurable. And her boss at the time took all the credit, so I’m not even going to mention his name.

And this is why we have The Leavitt Constant and the Leavitt Telescope today.

No, just kidding. Her (male) boss, who shall still remain nameless here because, “Shame, shame,” took all of the credit for work he didn’t do, and then some dude named Edwin Hubble took that work and used it to figure out how far away various stars actually were, and so determined that the universe was A) oh so very big, and B) expanding. He got a constant and a telescope named after him. Ms. Leavitt… not so much.

There are way too many examples of women as scientific discoverers being erased, with the credit given to men, in every scientific field. You probably wouldn’t be on the internet reading this now if no one had ever come up with the founding concepts of computer programming, aka “how to teach machines to calculate stuff for us.”

For that, you’d have to look to a woman who was basically the daughter of the David Bowie of her era, although he wasn’t a very dutiful dad. He would be Lord Byron. She would be Ada Lovelace, who was pretty much the first coder ever — and this was back in the days when computers were strictly analog, in the form of Charles Babbage’s difference and analytical engines.

The former was pretty much just an adding machine, and literally one that could only do that. So, for example, if you gave it the problem “What is two times 27,” it would find the solution by just starting with two, and then adding two to it 26 times.
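In code, that repeated-addition trick looks like this (a toy sketch, for positive whole numbers only):

```python
# Difference-engine-style multiplication: no multiply operator,
# just repeated addition (positive integers only).
def multiply_by_addition(a, b):
    total = a
    for _ in range(b - 1):   # add a to the total b-1 more times
        total += a
    return total

print(multiply_by_addition(2, 27))  # prints 54
```

Tedious for a person, but exactly the kind of mindless, repeatable grinding that gears and cranks are good at.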

The latter, the analytical engine, was much more like a computer, with complex programming. Based on the French Jacquard loom concept, which used punched cards to control weaving, it truly mimicked all of the common parts of a modern computer, as well as programming logic.

Basically, a computer does what it does by working with data in various places. There’s the slot through which you enter the data; the spot that holds the working data; the one that will pull bits out of that info, do operations on it, and put it back in other slots with the working data; and the place where it winds up, which is the user-readable output.

The analytical engine could also do all four math operations: addition, subtraction, multiplication, and division.

An analog version of this would be a clerk in a hotel lobby with a bunch of pigeonhole mailboxes behind them, some with mail, some not. Guests come to the desk and ask (input), “Any mail for me?” The clerk goes to the boxes, finds the right one based on the input (guest room number, most likely), then looks at the box (quaintly called PEEK in programming terms).

If the box is empty (IF(MAIL)=FALSE), the clerk returns the answer “No.” But if it’s not empty (IF(MAIL)=TRUE), the clerk retrieves that data and gives it to the guest. Of course, the guest is picky, and tells the clerk, “No junk mail and no bills.”

So, before handing it over, the clerk goes through every piece, rejecting junk and bills (IF(OR(“Junk”,“Bill”),FALSE,TRUE)), while everything else is kept by the same formula. The rejected data is tossed in the recycle bin, while the rest is given to the guest — output.

Repeat the process for every guest who comes to ask.
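Here is the whole clerk routine as a little program. The room numbers and mail items are made up, of course; the point is just to show the PEEK, the IF, and the filter as actual code:

```python
# The hotel clerk as a program: the mailboxes are storage, the room
# number is input, PEEK-ing at a box is a lookup, and the picky guest's
# rule is a filter. All names and data here are illustrative.
mailboxes = {
    101: ["postcard from Mom", "junk mail flyer"],
    102: [],
    103: ["bill from the gas company", "letter from Jim"],
}

def check_mail(room):
    box = mailboxes.get(room, [])   # PEEK at the pigeonhole
    if not box:                     # IF(MAIL) = FALSE
        return "No mail."
    # Reject junk and bills; keep everything else (the guest's rule)
    keepers = [item for item in box
               if "junk" not in item and "bill" not in item]
    return keepers if keepers else "Nothing but junk and bills."

print(check_mail(102))   # prints: No mail.
print(check_mail(103))   # prints: ['letter from Jim']
```

Input, storage, a conditional, a filter, and output: the same five moving parts Babbage built out of brass, and the same ones in the machine you’re reading this on.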

Now, Babbage was great at creating the hardware and figuring out all of that stuff. But when it came to the software, he couldn’t quite get it, and this is where Ada Lovelace came in. She created the algorithms that made the magic happen — and then was forgotten.

By the way, Bruce Sterling and William Gibson have a wonderfully steampunk alternate history novel, The Difference Engine, which revolves around the idea that Babbage and Lovelace launched the home computer revolution over a century early, with the British computer industry basically becoming the PC to France’s Mac. It’s worth a read.

Three final quick examples: Nettie Maria Stevens discovered the concept of biological sex being passed through chromosomes long before anyone else; it was Lise Meitner, not Otto Hahn, who discovered nuclear fission; and, in the ultimate erasure, it was Rosalind Franklin, and neither Watson nor Crick, who determined the double helix structure of DNA.

This erasure is so pronounced and obvious throughout history that it even has a name: The Matilda Effect, named by the historian Margaret Rossiter for the suffragist Matilda Joslyn Gage.

Finally, a note on the title of this piece. It comes from a 1982 comic strip called Frank and Ernest, and it pretty much sums up the plight of women trying to compete in any male-dominated field. They have to work harder at it and are constantly getting pushed away from advancement anyway.

So to all of the women in this article, and all women who are shattering glass ceilings, I salute you. I can’t help but think that the planet would be a better place with a matriarchy.

For all of the above histories and more, it’s plain to see why finally having a female Vice President of the United States (and a person of color at that) is a truly momentous and significant moment in the history of the country and the world.

Momentous Monday: Relativity

One hundred ninety years ago, somewhere in Massachusetts, a child was born. He grew up to become a man, and he moved west. It was the era of Manifest Destiny in America, a dark time in our history.

At least we weren’t using the term “Great White Hope.” Yet. To be honest, we should have used the term “Great White Death.” But, among the death there was still hope, and that child born in Massachusetts who grew up to be a man put his ideals into action.

Along with a great wave of German immigrants to America, all of whom despised slavery, this man went west, crossed the Missouri river and landed in Kansas. For me, the movie How the West Was Won is a family documentary.

When he arrived in Kansas, he helped found the town of Burlington, was one of two attorneys in the town (and also one of two blacksmiths, the other of whom was the other attorney), served as mayor at one point, and was a proud member of the Republican Party.

Yeah… quite the opposite of my politics now, or so you’d think. Except that, before the Civil War and up until FDR, the Republicans were the liberal party in America, and the Democrats were regressive.

That child who grew up to be a great man moved west in order to help bring Kansas into the union as a free (i.e., non-slave) state. And that child, who grew up to be a great man, was my great-great-grandfather, Silas Fearl.

Fast-forward to nearly two-hundred years after his birth, and the evolution of the internet, and I am in touch with people who share my ancestry with him. It makes us very distant relatives, to be sure, but it means that we have a very definite connection, some by blood and some by marriage.

And this is the reason for this post. One of those third or fourth cousins, via Silas Fearl by blood, posted some pictures of her kids, and when I looked at them the thing that most struck me was this. “Wow. This person and I have an ancestor in common.” And, in fact, looking at these faces, I could see certain elements of my own face, of my dad’s, and of my grandpa’s, and of the great uncles I managed to meet, and of the people in a family portrait taken when my father’s father was an infant.

Even so many steps apart on the branches of humanity’s family tree, I could see some of me and my immediate family in them… and across the distance of never having met and Facebook, my first reaction was an enormous empathy. “This is a bit of me, and I want to protect it from everything bad forever.”

And, in a lot of ways, I have to suspect that this is just an illusion, an effect created by the empirical proof I have seen that means “You and I are related to each other.” That, and the evolutionary and biological forces that make us most protective of those who share our DNA.

Except that… I’ve felt this same way toward people who are absolutely not related, but I’ve still seen myself in them… and this is when I realize the harm that intellect can do to our species.

Intellect relies on so-called facts that it has been told. So, “Hey, you and this person are related” is a fact that ropes emotions into relating to the news. So… subject, object, emotion, bond.

In reality, anybody whose picture I see online is related, it’s just not as straightforward as “You and this person have the same great-great-grandfather.” I can trace part of my ancestry back to King Henry II of England and his wife, Eleanor of Aquitaine — The Lion in Winter is, for me, another unintended family documentary.

By that connection, I’m related to most of the population of England and the eastern US. Now, go back through them to another common ancestor, Charlemagne, and I’m related to most western Europeans and Americans — if you expand the definition of “America” to include all countries on both continents, north and south.

And, if you go back far enough to the last point in humanity’s evolutionary history at which the family tree’s branches split, then you could honestly say that everybody you have ever met is related to you and shares your DNA and your blood to some degree.

You should be able to recognize your features in them no matter their race, gender identity, sexual orientation, or religion. You should be able to see their humanity, and yours, in their faces.

And, if you go back far enough, we are related to all animal life on this planet. Go back a little farther, and we are related to all life not only on this planet, but in the universe. Go back far enough and follow the laws of physics, and all of us, everyone, everywhere, were once the exact same bit of incredibly condensed matter.

The universe is the mother of us all, and all divisions are illusory.

I’m reminded of some old Beatles lyrics at the moment. “I am he as you are he as you are me and we are all together.” (And I had to look that up. It’s from I Am the Walrus and not Come Together.) Anyway, that’s a pretty good summation of my realization.

Once we put human history on a cosmic scale, our differences and squabbles become absolutely meaningless. All of us were born from the stars. All of us are in this together. Let’s act like it…

Image: The author’s great-grandparents and their four sons, including the author’s paternal grandfather.