New Horizons

I’ve always been a giant nerd for three things: history, language, and science. History fascinates me because it shows how humanity has progressed over the years and centuries. We were wandering tribes reliant on whatever we could kill or scavenge, but then we discovered the secrets of agriculture (oddly enough, hidden in the stars), and so we created cities, where we were much safer from the elements.

Freed from a wandering existence, we started to develop culture — arts and sciences — because we didn’t have to spend all of our time picking berries or hunting wild boar. Of course, at the same time, we also created things like war and slavery and monarchs, which are really the ultimate evil triumvirate of all of humanity, and three things we really haven’t shaken off yet, even if we sometimes call them by different names. All the while, though, humanity also strove for peace and freedom and equality.

It’s a back-and-forth struggle as old as man, sometimes forward and sometimes back, and it’s referred to as the cyclical theory of history. Arthur Schlesinger, Jr. developed the theory with specific reference to American history, although it can apply much farther back than that. Anthony Burgess, author of A Clockwork Orange, explored it specifically in his earlier novel The Wanting Seed, although it could be argued that both books cover two different aspects of the cycle. The short version of the cycle: A) Society (i.e. government) sees people as good, things progress, and laws become more liberal. B) Society (see above) sees people as evil, things regress, and laws become harsher and more draconian. C) Society (you know who) finally wakes up and realizes, “Oh. We’ve become evil…” Return to A. Repeat.

This is similar to Hegel’s dialectic of thesis, antithesis, and synthesis, which itself was parodied in Robert Anton Wilson and Robert Shea’s Illuminatus! Trilogy, which posited a five-stage view of history instead of three, adding parenthesis and paralysis to the mix.

I’m not entirely sure that they were wrong.

But enough of history, although I could go on about it for days. Regular readers already know about my major nerdom for language, which is partly related to history as well, so let’s get to the science.

The two areas of science I’ve always been most interested in also happen to be at completely opposite ends of the scale. On the large end are astronomy and cosmology, which deal with things on scales way bigger than what we see in everyday life. I’m talking the size of solar systems, galaxies, local clusters, and the universe itself. Hey, when I was a kid, humans had already been in space for a while, so it seemed like a totally normal place to be. The first space disaster I remember was the Challenger shuttle, and that was clearly human error.

At the other end of the size scale: chemistry and quantum physics. Chemistry deals with interactions among elements and molecules which, while too small for us to see individually, still produce results we can see. Ever make a vinegar and baking soda volcano? Boom! Chemistry. And then there’s quantum physics, which deals with things so small that we can never actually see them, and we can’t even really be quite sure about our measurements of them, except that the models we have also seem to give an accurate view of how the universe works.

Without understanding quantum physics, we would not have any of our sophisticated computer devices, nor would we have GPS (which also relies on Einstein’s relativity, which does not like quantum physics, nor vice versa). We probably wouldn’t even have television or any of its successors, although we really didn’t know that at the time TV was invented, way before the atomic bomb. Not that TV relies on quantum mechanics, per se, but its very nature does depend on the understanding that light can behave as either a particle or a wave, and on figuring out how to force it to be a particle.

But, again, I’m nerding out and missing the real point. Right around the end of 2018, NASA did the amazing, and slung their New Horizons probe within photo-op range of the most distant object we’ve yet visited in our solar system. Called Ultima Thule, it is a Kuiper Belt object about four billion miles away from Earth and only about 19 miles long, and yet we still managed to get close enough to it to get some amazing photos.

And this really is the most amazing human exploration of all. New Horizons was launched a generation after both Voyager probes, and yet got almost as far in under half the time — and then, after rendezvousing with disgraced dwarf planet Pluto, went on to absolutely nail a meeting with a tiny rock so far from the sun that it probably isn’t even really all that bright. And all of this was done with plain old physics, based on rules worked out by some dude in the 17th century. I think they named some sort of cookie after him, but I could be wrong. Although those original rules, over such great distances, wouldn’t have really worked out without the tweaking that the quantum rules gave us.

Exploring distant space is really a matter of combining our knowledge of the very, very big with the very, very small — and this should really reflect back on our understanding of history. You cannot begin to comprehend the macro if you do not understand the micro.

Monarchs cannot do shit without understanding the people beneath them. This isn’t just a fact of history. For the scientifically inclined, the one great failing of Einstein’s theories — which have been proven experimentally multiple times — is that they fall entirely apart on the quantum level. This doesn’t mean that Einstein was wrong. Just that he couldn’t or didn’t account for the power of the very, very tiny.

And, call back to the beginning: Agriculture, as in the domestication of plants and animals, did not happen until humans understood the cycle of seasons and the concept of time. Before we built clocks, the only way to do that was to watch the sun, the moon, and the stars and find the patterns. In this case, we had to learn to pay attention to the very, very slow, and to keep very accurate records. Once we were able to predict things like changes in the weather, or reproductive cycles, or when to plant and when to harvest, all based on when the sun or moon rose or set, ta-da. We had used science to master nature and evolve.

And I’ve come full circle myself. I tried to separate history from science, but it’s impossible. You see, the truth that humanity learns by objectively pursuing science is the pathway to free us from the constant cycle of good to bad to oops and back to good. Repeat.

Hey, let’s not repeat. Let’s make a concerted effort to agree when humanity achieves something good, then not flip our shit and call it bad. Instead, let’s just keep going ever upward and onward. Change is the human condition. If you want to restore the world of your childhood, then there’s something wrong with you, not the rest of us. After all, if the negative side of humanity had won when we first learned how to domesticate plants and animals and create cities, we might all still be wandering, homeless and nearly naked, through an inhospitable world, with our greatest advancements in technology being the wheel and fire — and the former not used for transportation, only for grinding whatever grains we’d gathered that day into flour. Or, in other words, moderately intelligent apes with no hope whatsoever of ever learning anything or advancing toward being human.

Not a good look, is it? To quote Stan Lee: “Excelsior!”

Onward. Adelante. Let’s keep seeking those new and broader horizons.

Bye bye bunny

You’ve probably heard of Coney Island, which is a beachfront amusement park located on Long Island, New York, in the borough of Brooklyn. If you’re from Southern California, it’s somewhat analogous to the Santa Monica Pier, and the now defunct Ocean Park, which closed in 1967. But… have you ever wondered how Coney Island got its name?

It wouldn’t be unreasonable to assume that it was named after a member of the Coney family. After all, a lot of places are. The name New York itself refers back to the famous Yorks of England. Perhaps Coney Island was named after the famous Nathan Coney, who founded Nathan’s Famous Hot Dogs, a world-renowned place. Oh… except that it was founded by Nathan Handwerker in 1916, long after Coney Island had been named. And, to be fair, “Handwerker” is a really great name for somebody who makes their living crafting foodstuffs by hand.

When was Coney Island named, exactly? Well, most likely when the place had been settled by the Dutch and what we now call New York was known as New Amsterdam. They decided to name this stretch of Long Island Konijn Eiland. You don’t really need to speak Dutch to realize that those words sound a lot like the final name. In fact, “konine eyelant” is pretty much how it’s pronounced. So where did the Konijn/Coney come from?

Let’s jump back just a moment to my childhood, when we used to visit my paternal grandmother, who lived in a town called Atascadero, at the end of a street called Conejo Road. And what does Conejo mean? Well, if you grow up in a place with a big Spanish influence, like Southern California, you’ll learn very quickly that “conejo” means “hare.” So grandma lived on Hare Road. And that’s exactly how Coney Island wound up with its name. The Dutch knew it and later settlers just followed…

The place was hare island, originally because it was covered with them, and later from linguistic inertia. But, at the same time, it was a misnomer to name the entire place “Hare Island,” because the hares weren’t everywhere, just in certain places. Like where later New Yorkers built their amusement park.

Note that I’m not using the word rabbit, because there is still no agreement on how this word wound up in English. It may have come from generic Franco-Germanic terms for “little animal,” but who knows? Ultimately, the sounds that led to the name for this creature are most likely Germanic.

As for bunny, again, no one knows. It may have come from a term for a squirrel or a tail, or could have somehow been derived from “cunny,” a diminutive for the aforementioned coney, although with rather unfortunate connotations in the modern era, at least in English.

Then there’s hare, which gives us the root of “harrier,” meaning either dogs bred for running down rabbits, aka hares, or the military airplanes that can jump and shoot the shit out of other planes.

None of which would have flown over Coney Island. And the real answer to all of this, if I may abandon my linguistic purist roots, is this: In the great long run — as in centuries away from now — folk etymologies are as good as reality. If I say now that Coney Island was named that because the Dutch thought the place was overrun with hares, then so be it… the Dutch win. If, however, my version — or the version in my links — wins, and someday the place is renamed Bunny Brooklyn, or whatever… that will be our future history. And that’s just the thing. History is fleeting and, while I like to try to teach what I can learn from what we know now, I also know that in a century or two or three everything we think we know now will be proven wrong.

All I can really say for now is that my grandma lived in a place named for lots of rabbits, and they were definitely there. An amusement park on Long Island was named for the same, although what they called rabbits probably were not. As a kid, I owned and took care of a lot of bunnies, and they were amazing. As an adult, I do improv, a lot of which involves a game called “Bunny, Bunny.” But forget bunnies and rabbits. If you’re really keeping track, it’s coneys and hares.

Same thing as bunnies and rabbits, except not as cute and more durable, and with different words. Really…

Friday Free-for-all #24

In which I answer a random question generated by a website. Here’s this week’s question. Feel free to give your own answers in the comments.

What could you give a 40-minute presentation on with absolutely no preparation?

Wow. This is an interesting question because, honestly, there are so many possibilities. My strong points are musical theory, film history, English language and grammar, history in general, and astronomy. I could also include theatre history, playwriting, character development, improv, and dog training.

Hell, I could probably also talk my way through forty minutes on Medicare, but I also know enough about the industry to know that I shouldn’t. Well, technically, can’t. So we’ll leave that one off of the list.

The strongest and easiest one for me? Musical theory, I suppose, because as I’ve mentioned elsewhere, I really consider music to be my second language after English. So I could easily go on for forty minutes or more on the 12-tone system, the Circle of Fifths, how chords are related to each other and so on, and how everything is really based on a series of combinations of two tones forming either major or minor intervals.

And if I happen to rip through that before the forty minutes are over, don’t worry. You’ll get an entire course in musical history from the Baroque Era right up to the modern day.

Or, if you prefer, a history of film, decade by decade, from the late 19th century to the early 21st, with plenty of juicy details about scandals galore. Gasp!

Then there’s my quick course in “How Not to Make Common Mistakes in English,” which will walk you through how to remember differences between similar words, e.g. “To connects two things, so it isn’t too long,” or how to use pronouns properly, which is an exercise in omission. That is, Rule Number One: you always come last. Rule Number Two: get rid of all of the other pronouns and see if the sentence still makes sense.

Ergo, if you write, “Myself and him went to the store,” step one is to put yourself last: “Him and myself went to the store.” Now, get rid of the other pronouns in turn and see what you get.

“Him went to the store.” Wrong.

“Myself went to the store.” Also wrong.

So the sentence you’re looking for is, “He and I went to the store.” Simple and straightforward. I don’t know why so many people make this mistake. It’s just lazy speaking, period.

And my five-dollar lesson on who and whom: Rephrase what you’re saying as a statement with the singular masculine pronoun, and see what happens. That is, do this:

Original: To (who/whom) did you give the book?

Rephrased: I gave the book to (he/him).

Correct: I gave the book to him/To whom did you give the book?

Original: (Who/whom) lives here?

Rephrased: (He/him) lives here.

Correct: He lives here/Who lives here?

And when the pronoun is “he,” then the other one is “who”; when it’s “him,” it’s “whom.” The big tell on this is that “m” ending, which is the only reason I don’t teach it with “she” and “her,” because it’s just easier to remember that letter.

Generally, “whom” will be the object of the sentence, the person who receives something. “Who” will be the subject, the person who does something.
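And because I’m a nerd, here’s the whole lesson reduced to a lookup table, in a minimal Python sketch (the names here are mine, nothing standard); the rephrasing step is still up to the human:

    # The he/him test as a lookup: that final "m" maps "him" to "whom".
    PRONOUN_MAP = {"he": "who", "him": "whom"}

    def who_or_whom(pronoun):
        # Pass in whichever pronoun fit your rephrased statement.
        return PRONOUN_MAP[pronoun.lower()]

    print(who_or_whom("him"))  # whom -- "I gave the book to him."
    print(who_or_whom("He"))   # who  -- "He lives here."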

Regarding history, I can give a good amateur spiel on all things Roman through about Constantine, but especially during the era of the so-called Twelve Caesars, and cover American history and politics from the Civil War on.

When it comes to astronomy, I am a total cosmology geek, and I could nerd out on your asses with anything from the history of the universe to how stars work, how planets form, what black holes and neutron stars are, how astronomy relates to chemistry, why time travel or faster-than-light speeds are not possible, and even a bit of quantum physics.

If that’s too much, then strap in for some theatre history, from its origins (which probably pre-date the Greeks, although that’s where we start dating it in the West), and just stay prepared for a really wild ride.

Playwriting and character development? Yeah, that comes right after music for my personal fluency, but it’s also harder for me to teach, only because it’s become so intuitive.

I can ultimately pull apart my musical talents and explain to you why, for example, a C Major chord followed by an E7 chord is so satisfying, even though the latter contains the augmented fifth and major seventh of the former, but that’s all because it leads back into the relative minor, which shares a key signature with your starting place.
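To put numbers on that, here’s a rough Python sketch (note names simplified to sharps only) that spells out the chords in question by stacking semitone intervals on a root:

    NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

    def spell(root, intervals):
        r = NOTES.index(root)
        return [NOTES[(r + i) % 12] for i in intervals]

    print(spell("C", (0, 4, 7)))      # ['C', 'E', 'G']       C major
    print(spell("E", (0, 4, 7, 10)))  # ['E', 'G#', 'B', 'D'] E7
    print(spell("A", (0, 3, 7)))      # ['A', 'C', 'E']       A minor

The G# and B in that E7 are the augmented fifth and major seventh of C, and E7 is the dominant seventh of A minor, which is why the progression pulls so hard toward C’s relative minor.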

But, when it comes to me trying to explain how to structure a story, the only thing I can say is that Aristotle’s “beginning, middle, and end” thing was sort of right, except that each of those also has its own beginning, middle, and end (we’re up to nine), which leaves us with the question of how to structure each of the three acts.

But, oops… Each of those nine sections, in turn, has its own beginning, middle, and end. So now, we multiply nine by three, get twenty-seven, and boom.

Those are the blocks you build any dramatic story with.
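Or, for fellow nerds, the same arithmetic in a few lines of Python:

    # Every block is an (act, section, sub-section) triple, each part
    # drawn from beginning / middle / end: 3 * 3 * 3 = 27.
    from itertools import product

    PARTS = ("beginning", "middle", "end")

    blocks = list(product(PARTS, repeat=3))
    print(len(blocks))  # 27
    print(blocks[0])    # ('beginning', 'beginning', 'beginning')
    print(blocks[-1])   # ('end', 'end', 'end')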

Funny story: Music tends to work in blocks of four. Put two blocks together and you get eight; repeat over and over and you get a song. Three and four only play together well in units of twelve, and one of the most ubiquitous forms of American music is the song based on the twelve-bar blues pattern.

Basically, it involves three “acts.” The first is four bars of the tonic chord, generally referred to as I. The next four shift up for the first two bars, then come back down. Those first two bars are built on the fourth degree of the scale, so the chord is referred to as IV.

In the key of C, the IV is F, which is straightforward: C, D, E, F. Boom.

So the pattern, in the key of C, so far is:

     C Maj | C Maj | C Maj | C Maj |

     F Maj | F Maj | C Maj | C Maj |

Finally, the last four bars follow the pattern V, IV, I, V. That’s because the V is a natural bridge between the I and IV for various complicated reasons.

This gives us, BTW, the landing pattern of:

     G7    | F Maj | C Maj | G7    |

Oh yeah… jumping back a bit… the V is built on the fifth note of the scale, counting up from the one, or tonic, which gives us C, D, E, F, G. And why does G work so hard in leading back to C?

Because reasons. But here are two big easy ones, even though this might sail over heads for the moment. F and C get along because the only accidental in F’s key — Bb — also happens to turn C’s dominant, i.e. V, aka G, into a minor chord. Long story, don’t ask.

Meanwhile… in a major scale, G hates F, because her seventh is an F#. However, drop that to a regular F, and she suddenly becomes a G7, and 7th chords are just hardwired in our brains to lead right back into the tonic chord.

And that’s the funny thing hiding in the progression above. Yeah, sure. It starts out I, IV, V, but that final V chord happens to have both the IV and V notes in it, without any of those messy annoying sharps and flats, and, yeah…

We wind up landing so damn hard back home that it should be obvious.
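And since the whole pattern is just counting scale degrees, here’s a rough Python sketch (sharps-only note names, so a key like F will print its IV as “A#” rather than the proper “Bb”) that generates the twelve bars in any major key:

    NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

    def twelve_bar_blues(key):
        root = NOTES.index(key)
        i = NOTES[root]                    # tonic (I)
        iv = NOTES[(root + 5) % 12]        # fourth degree (IV), 5 semitones up
        v7 = NOTES[(root + 7) % 12] + "7"  # fifth degree (V), as a 7th chord
        return [i, i, i, i,                # bars 1-4:  I
                iv, iv, i, i,              # bars 5-8:  IV IV I I
                v7, iv, i, v7]             # bars 9-12: V7 IV I V7

    print(twelve_bar_blues("C"))
    # ['C', 'C', 'C', 'C', 'F', 'F', 'C', 'C', 'G7', 'F', 'C', 'G7']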

This is also the secret of doing musical improv. Follow the rules, and you can make anyone seem like a genius, because they have nowhere else to go.

And then… where was I?

Great Caesar’s ghost! Or not…

Here’s a flashback to March of 2019, back when theatre was still a thing and the world was (relatively) normal. Ironically, I originally posted this story a day short of exactly one year before ComedySportz was scheduled to leave its space at the El Portal in order to perform as a touring company before finding a new space in the fall. As it turned out, that was the best decision possible, as it kept the company together while freeing it of the financial burden of the space. A lot of other small theater companies were not as lucky.

As my regular readers know, I do improv comedy for the ComedySportz L.A. Rec League on Monday nights, as well as work box office for the company, which performs in the smaller space of the historic El Portal Theater, a building with quite a history.

It was built in 1926 and housed both vaudeville shows and movies. It was badly damaged in the 1994 Northridge Earthquake, but was fortunately restored to become a live theater with three performance spaces. The smaller one, where ComedySportz is now resident, was originally occupied by Actors Alley and then later, briefly, by The Company Rep before they moved.

In an ironic full circle, I joined that company as a playwright while they were at the El Portal, then continued on to act with them as they moved to the NoHo Arts Center and the former location of the Deaf West Theater, where I received a glowing review for my turn as a depressed, unicycle-riding bear.

So that’s the background on the building. There are two other things to keep in mind. One is that both Debbie Reynolds and Marilyn Monroe used to come to the place to watch movies when they were kids, and the main space and our theater are named after them, respectively. The other is that it is an ancient tradition to believe that all theaters are haunted by ghosts.

Note: I don’t believe in ghosts at all, but I do believe that there are certain psychological and physical factors that can make people think they’ve seen them.

Now to the real start of the story. Recently, I had to pull double-duty running the box office and working as house manager on a night when we had shows at eight and ten in the evening. This meant that I had to come open up at six and stick around until the last show and the notes afterward were over, so I was there until midnight.

As part of the closing-up procedure, I have to go up to our booth to shut down the light and sound boards and the computer, and then make sure that there’s no one still working on the main stage. This means I get to go into the main theater lobby, which is deserted, and then into the main stage itself.

That night, I walked into the space, which was dark except for the so-called ghost light, and called out asking if anyone was there, and for some reason, I got a sudden chill. You know the feeling, right? It’s like every hair on your body suddenly stands up and you feel that electricity travel from your feet to your head. It’s an ancient reaction common to mammals, and if you’ve ever seen a cat puff up or a dog raise its hackles, then you’ve seen it. It’s a defense mechanism designed to make us look bigger when we’re feeling unsure, although it doesn’t really work as well for humans, mainly because it doesn’t affect the hair on our heads and the hair on our bodies (for most of us) isn’t thick enough to make us really puff up much.

I wrote it off as the psychological weirdness of walking into a dark, cavernous space all alone late at night, then jokingly waved at the stage and said, “Hi, Debbie!” before heading back out to close up.

The next evening, I was talking to Pegge, the Managing Director, and Steve, the House Manager, of the theater and told them about this, and Pegge immediately told me with complete sincerity, “Oh, no. The ghost’s name is Robert. Don’t worry, he won’t hurt you.” She went on to explain that he was the theater’s original accountant back in the 1920s, and people always saw him dressed very formally, with a high white collar. According to her, there’s also a female ghost who would escort patrons to their seats and then vanish.

Steve explicitly stated that he doesn’t believe in ghosts either, but that he has had a number of people over the years independently mention seeing both of them and give identical descriptions of each, generally wondering, “Who was that person I thought I saw before they just disappeared?”

It’s all rather intriguing, and now I want to experience these phenomena just to try to figure out what could be creating these illusions in people’s minds. It is a very old building, and it tends to be preternaturally quiet late at night, because the really high ceilings and the carpeted and padded interiors like to eat sound.

Also, the single-source ghost light on stage tends to create deep shadows and bright highlights, and high-contrast lighting like that can create all kinds of visual tricks. Finally, the place does sit right above the L.A. Metro Red Line subway tunnel and has for 20 years. I can often hear the rumble of trains passing beneath the lobby, and a connection between low-frequency infrasound and reported hauntings has long been suggested. That’s exactly the kind of sound a rushing subway train might create toward the back of a large space.

Back to that ghost light, though. It’s a romantic name, but is also known as the Equity light, after the actors’ union. Its real reason for being there is to keep people passing through the space after hours from walking into things or falling into the orchestra pit.

As for why there’s such a belief in ghosts in theaters? I’m not sure, but maybe we can blame Shakespeare, because he certainly loved the trope. Hamlet Sr.? Banquo? Richard III’s nightmare before Bosworth Field? Both parts of Henry VI and the only part of Henry VIII? A whole family of ghosts who visited Cymbeline? (A rarely performed and underrated play, by the way, that manages to be both gross and funny at the same time.)

And, of course, there’s the titular ghost for this post, who also gave Perry White of Superman fame his famous catchphrase.

So I’ll be keeping an eye out for Robert and the nameless female usher in future days, and will report back on anything unusual I experience. This is definitely going to be interesting.

Have you seen or experienced anything you’d call “ghost-like”? If so, how do you explain it? Let us know in the comments!

Image: Painting, La morte di Giulio Cesare, by Vincenzo Camuccini, c. 1806. Public domain in the United States.

The odd origins of 7 city names in Los Angeles County

A lot of place names for cities and streets are pretty straightforward. They come from famous people, frequently those involved in their founding: Burbank, Lankershim, Van Nuys; or from physical features: La Mirada, La Puente, Eagle Rock. But some place names have slightly weirder origins. Here are a few from my home county of Los Angeles.

  1. Agoura Hills: This somewhat rustic and suburban enclave is located on the extreme western edge of the county, a bit beyond the West Valley made famous as the birthplace of the Valley Girl archetype. Originally, most of it belonged to a sheep ranch owned in the 19th century by a man named Pierre Agoure. By the 20th century, the place was called “Picture City,” because Paramount owned a ranch out there and various film companies used the area to shoot their own pictures. When the residents needed to establish a post office, they had to come up with a name, and they voted in 1927. In a very pre-internet version of Boaty McBoatface, the winning entry named the place after that sheep rancher, but whether the person who made the nomination goofed up the spelling or the government worker who tallied the entries did is anyone’s guess. Nonetheless, Pierre Agoure became the namesake of Agoura, later Agoura Hills.
  2. Azusa: This is a city out beyond Pasadena. Marketers would love to have you think that it was named because it has “Everything from A to Z in the USA,” but that’s just a bunch of bunk. In reality, like many place names in California, this one was stolen from the natives, in this case the Tongva, who called the area Asuksagna, their word for “skunk place.” As someone who’s driven on the freeway through the area multiple times on evenings when, as we like to put it, “a skunk went off in the hills,” it’s a rather apt description. You can smell those cute but dank little buggers for miles, whether your windows are up or down. Other place names the Tongva have given us are Canoga Park and Tujunga.
  3. Echo Park: Located not too far from downtown, you’ve probably seen this lake and its fountains in many a film and music video. This is one of those place-name origins that will sound like a total urban legend until you get to the explanation. When this artificial lake was built in 1892, Superintendent of City Parks Joseph Henry Tomlinson picked the name Echo Park because, well, that’s what he heard when people shouted at the construction site — but those echoes went away as soon as the project was finished. It sounds weird until you realize that, in order to create an artificial lake, engineers had to build a gigantic concrete basin first, so while that thing was set up and empty, of course it was echo city. But as soon as it was filled with water, ta-da: Echoes no more. Doesn’t seem so weird now, does it?
  4. Los Feliz: Directly south of Griffith Park and probably most famous because Swingers was shot in the Dresden Room right in the middle of town, Los Feliz is one of those interesting places in L.A. that not only seems to be named wrong, but which everyone pronounces wrong. On its face, “los” is a plural article but “feliz” is singular. It’s a Spanish thing, but the expression should be either el or la feliz, for “the happy one” (feliz doesn’t change regarding gender), or los or las felices, for “the happy ones.” On top of that, people in L.A. tend to pronounce it as “Las FEELis,” rather than the correct way, “Los FayLEASE.” (If you know the song “Feliz Navidad,” then you know how the word is supposed to be pronounced.) Now here’s where it gets more interesting. “Los Feliz” is actually correct, but for only one reason. It doesn’t refer to a happy person or persons. Rather, it refers to an entire family with the surname Feliz, founded by a Spanish explorer named José Feliz. They owned land in the area for years, had a very colorful history, and, in this case, Los Feliz correctly refers to the Feliz family. Unlike English, where you might refer to “The Smiths” to mean the entire Smith family, Spanish only changes the article, so “Los Feliz” really means “the Feliz family.”
  5. Sylmar: This is way up on the north central part of the San Fernando Valley, and a place that is more known by name than by anyone actually ever going there. This one is short and sweet. The name was created by cobbling together words for forest and sea: the Latin sylva and the Spanish mar. (The same sylvan root is also part of the name of the state of Pennsylvania — Penn’s forest.) At the time it was named, the place had a lot of olive trees and was the location of Olive View Hospital, which was destroyed in an event that will be forever associated with the city, the Sylmar Earthquake of 1971.
  6. Tarzana: Mostly known as that bedroom community stuck up in the hills that tries to keep Woodland Hills and Reseda from banging into each other, Tarzana has a simple etymology that looks like it’s made up, but it’s not. It’s where Edgar Rice Burroughs eventually retired to. Ol’ Edgar was most famous for creating the character called Tarzan. The place needed a name, and he was a famous resident. Ta-da: Tarzana.
  7. Venice: It’s not a huge leap to realize that Venice, California, was named after Venice, Italy. But if you only know this hippie/hipster hangout stuck between Santa Monica and LAX for its boardwalk and colorful people and street vendors, it’s easy to forget that it was originally absolutely intended to recreate the original Venice, right down to the canals — some of which are still there, although you do have to travel a bit inland to find them. The main plaza leading to the beach was also designed to resemble Piazza San Marco in the original Venice, although on a much smaller scale. Its founder, Abbot Kinney, was a polyglot who spoke six languages and eventually made his fortune from tobacco. Originally called Venice of America, it opened in 1905 and was an immediate success. Kinney died of lung cancer as karma took its revenge in 1920. Nearly a century later, Venice is still a success as one of the more recognizable and unique parts of L.A., and well worth the visit for tourists and locals alike.

What are some interesting place names with weird origins where you live? Share in the comments!

The art of war

Ending just over a century ago, World War I, originally known as The Great War or the War to End All Wars, turned out to be neither of the above, since it was eclipsed by its sequel, World War II — to date, the planet’s only nuclear war — which also outdid the first World War in terms of “greatness,” if you take “great” to mean number of deaths. Also, obviously, the fact that there was a II to follow the I — and many other wars thereafter to the present day — means that World War I didn’t end any wars at all.

What’s often forgotten about the aftermath of that war is the effect it had on the people who lived through it — sometimes barely — and especially the effect it had on the arts and culture, as well as on the politics of the rest of the first half of the 20th century. It left a generation that was as stunned as the post-Vietnam generation. In fact, it gave us the original term for what we now call PTSD: shell-shock.

In the arts, it gave us things like Dada, which led to Surrealism, both of which were efforts to deal with the absolute horror of what really was the first modern war. After all, WWI gave us the first aerial warfare with planes (after a brief prelude in Mexico), trench warfare on an unprecedented scale, and the first large-scale chemical warfare. It also led to the development of new techniques in plastic surgery. Hey, gotta figure out how to rebuild all those faces that got blown off, right?

But it was the art connection that really hit home, because I can think of three films that dealt with World War I that have really stuck with me — the first because of the way it manages to demonstrate the pure horror of that war and all wars, and the other two because they show, brilliantly, how that war went on to influence the arts and artists of that generation as they grew up after it.

The oldest film and oldest source is Johnny Got His Gun, based on a book written by Dalton Trumbo in 1938 — or, in other words, right before the sequel to the Great War was released. Ironically, Trumbo was later blacklisted as a communist in the 1950s. The movie came out in 1971, at the height of the anti-Vietnam War protest movement. Both it and the book tell a first-person story about a young veteran of World War I who comes home having lost all of his limbs and his face. He basically has no way to communicate with the world, and keeps reliving the war while telling us what he can sense — which is mostly the sounds and touches from the nurses around him.

It’s a very dark and hopeless story. This man has basically been condemned to be trapped in his own practically useless body which is just being kept alive because, well, it’s what you do for the wounded, right? He is denied euthanasia and can’t even commit suicide. Even though he finally manages to try to communicate in Morse code by banging his head on his pillow, he’s ignored — just like so many veterans of that (and other) wars have been.

The second film, Savage Messiah, is one of Ken Russell’s earlier biopics. Released in 1972, it tells the story of artist Henri Gaudier-Brzeska. Gaudier was his birth name, but he had a rather unconventional relationship with a much older woman and took her name as a hyphenate way before it was even a thing, even though they never married.

Eventually, he marches off voluntarily to fight in World War I, and one of the scenes near the end of the film is one that has stuck with me since I first saw it in an art-house revival years ago. One character is reading a letter from Henri at the front, glorifying the war and talking about killing the enemy. Another character, pitched as somewhat of an antagonist, says, “Whoever wrote that should be shot,” and the man reading the letter replies, “He was. This morning.”

And that is how we find out that this artist and sculptor is dead. It’s one of those rug-yank moments that works so well.

The final film, Max, came out thirty years after Savage Messiah, but is perhaps the strongest synthesis of “how this war affected the arts” with “how this war got a sequel.” In it, John Cusack plays the titular character, a would-be artist who lost his painting arm in the trenches and is now just an art dealer and agent. He meets a young Hitler, portrayed by the brilliant Noah Taylor, and tries to mentor him, but it does not go well, because Hitler cannot understand the human side of art, while Max cannot see the nascent fascism in Hitler’s works.

One of the highlights is a Dadaist performance piece by Max in which he is lowered, apparently nude and with lost arm in full view sans prosthetic, into a giant meat-grinder while he talks about the war, tons of ground beef pouring out the business end. While the character of Max Rothman in 1918 may have been fictional, the film is still a very effective take on the emotional scars that this war left on everyone who had to live through the battlefield. Only the dead were left with just physical scars, and not emotional ones, although that’s probably not better.

Of course, there are a bunch of top-rated World War I movies, some made before, a lot made after; some of which I’ve seen, a lot of which I haven’t, along with the long list of all World War I movies. Also, I can’t forget Blackadder Goes Forth, which basically ended a beloved series with (SPOILER ALERT) all of the characters rushing out of the trench to their certain deaths. But, c’mon. It’s a Blackadder series. That shouldn’t be a surprise at all, considering how the first one ended.

Finally, to really bring it full circle, Rajiv Joseph wrote a play about the start of World War I called Archduke, which was pretty amazing, and which played in Los Angeles at the Mark Taper Forum in 2017, exactly a century after the U.S. finally entered WWI.

Oh yeah. The other big effect of that war? It’s the one that solidified the U.S. as a world super-power after we fired the first shot in the Spanish-American War but before we stole the thunder from Britain and France by finally jumping in to end the First World War. That part is not necessarily good, though, either.

What films about war particularly move you? Tell us in the comments! And Happy Bastille Day!

Momentous Monday: Marbury v. Madison

Two hundred and seventeen years ago today, the U.S. Supreme Court made a very important decision, one that has resonated on down through the years, and one that is more important now than ever.

Basically, a little incoming executive fuckery attempted to block an approved appointment by the outgoing administration… or did it? Because the outgoing administration wasn’t so innocent either, and to top it off, the Chief Justice who ruled in the case, John Marshall, had been Secretary of State to the President who was trying to pack the courts with justices favorable to his side in the last days before he had to turn over the reins to Thomas Jefferson.

Side note: For all of you Founders fans, read up a bit, and you’ll realize that if you’re progressive, then you’re on the side of Adams, not Jefferson.

Anyway, beyond the politics of all of the above, two things are notable. One is that Marshall actually ignored the fact that he was ruling against his guy (Adams) in this case and voted for what was right. The other is that this case forever enshrined the idea that the Supreme Court could absolutely decide whether a law passed by Congress was Constitutional.

Hello, checks and balances, everyone.

But it seems to have passed out of fashion to understand this simple fact. Our Constitution set up three branches of government for one simple purpose: So that no one of them would become too powerful. That’s what checks and balances means.

The three branches are as follows:

Legislative, meaning both houses of Congress, whose job is to make laws.

Executive, meaning the President and Cabinet, and their job is to figure out how to enact the laws passed by the Legislature, or to say, “Nope. We’re not signing that law.” (To which the Legislature, with a two-thirds majority, can say, “Nu-uh, it’s passed. Suck it!”)

Judicial, meaning the Supreme Court, and they get to decide whether a law follows the Constitution or not.

Oh yeah. All three branches are constrained by the Constitution. At least in theory. And getting back to the Legislative branch, there are two houses of Congress, which makes it wonky: the Senate and the House of Representatives.

These came out of what you can basically call White Privilege, i.e., “We really can only trust rich, old, white, land-owning dudes over 21 to do what’s best (for rich, old, white, land-owning dudes over 21),” so the system was stacked from the top. The Reps in the House are based on the population of states, meaning that in the modern day places like California, Texas, and New York have the most Reps. However, the Senate is based on state, as in every state gets the same two Senators, so that California, with nearly 40 million people, gets the same number of Senators as Wyoming, with just over half a million people, and that is utter bullshit. Of course, this is the same nonsense that gave us the Electoral College, which really needs to be banished as well.

To keep it fair, we really need to banish the Senate, reapportion Congress based on an honest 2020 Census, and pack the Supreme Court to at least 17 Justices. Maybe even consider the concept of having two or more of those positions elected by the people instead of appointed by the President, and with the power of recall endowed, again, in the people.

Oh yeah. Because that’s the really big part that the whole “System of Checks and Balances” thing ignores. The fourth branch of government.

Who is it, you may ask? Simple. It’s us. We the people, and our power to vote. We can’t do shit about the Supreme Court (yet, but see above); otherwise, the President and Congress are in our hands.

The Supreme Court Justice you may or may not have heard of, and who was equal parts hero and dick. 

Momentous Monday: Meet the Middletons

Thanks to boredom and Amazon Prime, I watched a rather weird movie from the 1930s tonight. While it was only 55 minutes long, it somehow seemed much longer because it was so packed with… all kinds of levels of stuff.

The title is The Middleton Family at the New York World’s Fair, and while the content is exactly what it says on the tin, there are so goddamn many moving parts in that tin that this is one worth watching in depth, mainly because it’s a case study in how propaganda can be sometimes wrong, sometimes right, and, really, only hindsight can excavate the truth from the bullshit.

While it seems like a feature film telling the fictional story of the (snow-white, but they have a black maid!) Middleton Family from Indiana, who go back east ostensibly to visit grandma in New York but really in order to attend the New York World’s Fair of 1939, this was actually nothing more than a piece of marketing and/or propaganda created by the Westinghouse Corporation, a major sponsor of the fair, poised on the cusp of selling all kinds of new and modern shit to the general public.

Think of them as the Apple or Microsoft of their day, with solutions to everything, and the World’s Fair as the biggest ThingCon in the world.

Plus ça change, right?

But there’s also a second, and very political, vein running through the family story. See, Dad decided to bring the family to the fair specifically to convince 16-year-old son Bud that, despite the bad economic news he and his older friends have been hearing about there being no job market (it is the Great Depression, after all), there are, in fact, glorious new careers waiting out there.

Meanwhile, Mom is hoping that older daughter Babs will re-connect with high school sweetheart Jim, who had previously moved to New York to work for (wait for it) Westinghouse. Babs is having none of it, though, insisting that she doesn’t love him but, instead, is in love with her art teacher, Nick.

1939: No reaction.

2020: RECORD SCRATCH. WTF? Yeah, this is one of the first of many disconnect moments that are nice reminders of how much things have changed in the 81 years since this film was made.

Girl, you think you want to date your teacher, and anyone should be cool with that? Sorry, but listen to your mama. Note: in the world of the film, this relationship will become problematic for other reasons but, surprise, the reason it becomes problematic then is actually problematic in turn now. More on which later.

Anyway, the obviously richer-than-fuck white family travels from Indiana to New York (they’re rich because Dad owns hardware stores, and they brought their black maid with them) but are too cheap to spring for a hotel, instead jamming themselves into Grandma’s house, which is pretty ritzy as well and says grandma has money too, since her place is clearly close enough to Flushing Meadows in Queens to make the World’s Fair a day trip over the course of a weekend.

But it’s okay — everyone owned houses then! (Cough.)

And then it’s off to the fair, and this is where the real value of the film comes in because when we aren’t being propagandized by Westinghouse, we’re actually seeing the fair, and what’s really surprising is how modern and familiar everything looks. Sure, there’s nothing high tech about it in modern terms, but if you dropped any random person from 2020 onto those fairgrounds, they would not feel out of place.

Well, okay, you’d need to put them in period costume first and probably make sure that if they weren’t completely white they could pass for Italian or Greek.

Okay, shit. Ignore that part, let’s move along — as Jimmy, Babs’ high school sweetheart and Westinghouse Shill character, brings us into the pavilion. And there are two really weird dynamics here.

First is that Jimmy is an absolute cheerleader for capitalism, which is jarring without context — get back to that in a moment.

The other weird bit is that Bud seems to be more into Jimmy than Babs ever was, and if you read too much gay subtext into their relationship… well, you can’t read too much, really. Watch it through that filter, and this film takes on a very different and subversive subplot. Sure, it’s clear that the family really wishes Jimmy was the guy Babs stuck with, but it sure feels like Bud wouldn’t mind calling him “Daddy.”

But back to Jimmy shilling for Westinghouse. Here’s the thing: Yeah, sure, he’s all “Rah-Rah capitalism!” and this comes into direct conflict with Nicholas, who is a self-avowed communist. But… the problem is that in America, in 1939, capitalism was the only tool that socialism could use to lift us out of depression and, ultimately, create the middle class.

There’s even a nod to socialism in the opening scene, when Bud tells his dad that the class motto for the guys who graduated the year before was, “WPA, here we come!” The WPA was the government works program designed to create jobs with no particular aim beyond putting people to work.

But once the WPA partnered with those corporations, boom. Jobs. And this was the beginning of the creation of the American Middle Class, which led to the ridiculous prosperity for (white) people from the end of WW II until the 1980s.

More on that later, back to the movie now. As a story with relationships, the film actually works, because we do find ourselves invested in the question, “Who will Babs pick?” It doesn’t help, though, that the pros and cons are dealt with in such a heavy-handed manner.

Jimmy is amazing in every possible way — young, tall, intelligent, handsome, and very knowledgeable at what he does. Meanwhile, Nicholas is short, not as good-looking (clearly cast to be more Southern European), obviously a bit older than Babs, and has a very unpleasant personality.

They even give him a “kick the puppy” moment when Babs introduces brother Bud, and Nicholas pointedly ignores the kid. But there’s that other huge issue I already mentioned that just jumps out to a modern audience and yet never gets any mention by the other characters. The guy Babs is dating is her art teacher. And not as in past art teacher, either. As in currently the guy teaching her art.

And she’s dating him and considering marriage.

That wouldn’t fly more than a foot nowadays, and yet in the world of 1939 it seems absolutely normal, at least to the family. Nowadays, it would be the main reason to object to the relationship. Back then, it isn’t even considered.

Wow.

The flip side of the heavy-handed comes in some of Jimmy’s rebukes of Nicholas’ claims that all of this technology and automation will destroy jobs. While the information Jimmy provides is factual, the way his dialogue here is written and delivered comes across as condescending and patronizing to both Nicholas and the audience, and these are the moments when Jimmy’s character seems petty and bitchy.

But he’s also not wrong, and history bore that out.

Now this was ultimately a film made to make Westinghouse look good, and a major set piece involved an exhibit at the fair that I actually had to look up because at first it was very easy to assume that it was just a bit of remote-controlled special effects set up to pitch an idea that didn’t really exist yet — the 1930s version of vaporware.

Behold Elektro! Here’s the sequence from the movie and as he was presented at the fair. Watch this first and tell me how you think they did it.

Well, if you thought remote operator controlling movement and speaking lines into a microphone like I did at first, that’s understandable. But the true answer is even more amazing: Elektro was completely real.

The thing was using sensors to actually interpret the spoken commands and turn them into actions, which it did by sending light signals to its “brain,” located at the back of the room. You can see the lights flashing in the circular window in the robot’s chest at around 2:30.

Of course, this wouldn’t be the 1930s if the robot didn’t engage in a little bit of sexist banter — or smoke a cigarette. Oh, such different times.

And yet, in a lot of ways, the same. Our toys have just gotten a lot more powerful and much smaller.

You can probably guess which side of the argument wins, and while I can’t disagree with what Westinghouse was boosting at the time, I do have to take issue with one explicit statement. Nicholas believes in the value of art, but Jimmy dismisses it completely, which is a shame.

Sure, it’s coming right out of the Westinghouse corporate playbook, but that part makes no sense, considering how much of the world’s fair and their exhibit hall itself relied on art, design, and architecture. Even if it’s just sizzle, it still sells the steak.

So no points to Westinghouse there but, again, knowing what was about to come by September of 1939 and what a big part industry would have in ensuring that the anti-fascists won, I can sort of ignore the tone-deafness of the statement.

But, like the time-capsule shown in the film, there was a limited shelf-life for the ideas Westinghouse was pushing, and they definitely expired by the dawn of the information age, if not a bit before that.

Here’s the thing: capitalism as a system worked in America when… well, when it worked… and didn’t when it didn’t. Prior to about the early 1930s, when it ran unfettered, it didn’t work at all — except for the super-wealthy robber barons.

Workers had no rights or protections. There were no unions, no child-labor laws, no minimum wages, no standard working hours, no safety rules, no… anything to protect you if you didn’t happen to own a big chunk of shit.

In other words, you were management, or you were fucked.

Then the whole system collapsed in the Great Depression and, ironically, it took a member of the 1% Patrician Class (FDR) being elected president to then turn his back on his entire class and dig in hard for protecting the workers, enacting all kinds of jobs programs, safety nets, union protections, and so on.

Or, in other words, capitalism in America didn’t work until it was linked to and reined-in by socialism. So we never really had pure capitalism, just a hybrid.

And, more irony: this socio-capitalist model was reinforced after Pearl Harbor Day, when everyone was forced to share and work together and, suddenly, the biggest workforce around was the U.S. military. It sucked in able-bodied men between 17 and 38, and the weird side-effect of the draft stateside was that suddenly women and POC were able to get jobs because there was no one else to do them.

Manufacturing, factory jobs, support work and the like boomed, and so did the beginnings of the middle class. When those soldiers came home, many of them returned to benefits that gave them cheap or free educations, and the ability to buy homes.

They married, they had kids, and they created the Boomers, who grew up in the single most affluent time period in America ever.

Side note: There were also people who returned from the military who realized that they weren’t like the other kids. They liked their own sex, and couldn’t ever face returning home. And so major port towns — San Francisco, Los Angeles, Long Beach, San Diego, Boston, New York, Miami, New Orleans — were flooded with the seeds of future LGB communities. (T and Q+ hadn’t been brought into the fold yet.)

In the 60s, because the Boomers had grown up with affluence, privilege, and easy access to education, they were also perfectly positioned to rebel their asses off because they could afford to, hence all of the protests and whatnot of that era.

And this sowed the seeds of the end of this era, ironically.

The socio-capitalist model was murdered, quite intentionally, beginning in 1980, when Ronald fucking Reagan became President, and he and his cronies slowly began dismantling everything created by every president from FDR through, believe it or not, Richard Nixon. (Hint: EPA.)

The mantra of these assholes was “Deregulate Everything,” which was exactly what the world was like in the era before FDR.

Just one problem, though. Deregulating any business is no different from getting an alligator to not bite you by removing its muzzle and then saying to it, “You’re not going to bite me, right?”

And then believing it when it swears it won’t, before wondering why you and everyone you know has only one arm.

Still, while it supports an economic system that just isn’t possible today without a lot of major changes, The Middletons provides a nice look at an America that did work, because it focused on invention, industry, and manufacturing not as a way to enrich a few shareholders, but as a way to enrich everyone by creating jobs, enabling people to actually buy things, and creating a rising tide to lift all boats.

As for Bud, he probably would have wound up in the military, learned a couple of skills, finished college quickly upon getting out, and then would have gone to work for a major company, possibly Westinghouse, in around 1946, starting in an entry-level engineering job, since that’s the skill and interest he picked up during the War.

Along the way, he finds a wife, gets married and starts a family, and thanks to his job, he has full benefits — for the entire family, medical, dental, and vision; for himself, life insurance to benefit his family; a pension that will be fully vested after ten years; generous vacation and sick days (with unused sick days paid back every year); annual bonuses; profit sharing; and union membership after ninety days on the job.

He and the wife find a nice house on Long Island — big, with a lot of land, in a neighborhood with great schools, and easy access to groceries and other stores. They’re able to save long-term for retirement, as well as for shorter-term things, like trips to visit his folks in Indiana or hers in Miami or, once the kids are old enough, all the way to that new Disneyland place in California, which reminds Bud a lot of the World’s Fair, especially Tomorrowland.

If he’s typical for the era, he will either work for Westinghouse for his entire career, or make the move to one other company. Either way, he’ll retire from an executive level position in about 1988, having been in upper management since about 1964.

With savings, pensions, and Social Security, he and his wife decide to travel the world. Meanwhile, their kids, now around 40 and with kids about to graduate high school, aren’t doing so well, and aren’t sure how they’re going to pay for their kids’ college.

They approach Dad and ask for help, but he can’t understand. “Why don’t you just do what I did?” he asks them.

“Because we can’t,” they reply.

That hopeful world of 1939 is long dead — although, surprisingly, the actor who played Bud is still quite alive.

Image: Don O’Brien, Flickr, 2.0 Generic (CC BY 2.0), the Middleton Family in the May 1939 Country Gentleman ad for the Westinghouse World’s Fair exhibits.

Sunday Nibble #21: Some Flag Day Birthdays of Important People

In the United States, June 14 is Flag Day, which commemorates the adoption, by the Second Continental Congress on that date in 1777, of the official flag of the British colonies. This is the familiar banner of 13 alternating red and white stripes, and a blue field with a circle of 13 white stars in it.

However, it’s important to remember that while it came after the Declaration of Independence, it also came before the country won its independence, so it started out as the battle flag of a rebellious territory. The only reason it finally became the first flag of the U.S. was because we won that war.

That’s an important distinction to make when it comes to flags, even if some people forget and have to be reminded. It’s also probably not true that Betsy Ross created that first flag. Rather, this was propaganda created nearly a century later to benefit the guy who created the famous painting of… Betsy Ross creating the first flag.

Hm. I wonder if Bob Ross is related? “And let’s paint a happy little rebel right here…”

But Flag Day as an official holiday was not declared until 1916, by Woodrow Wilson, U.S. President and noted racist dick. This was the year before the U.S. entered World War I, by the way, although it was still called The Great War at the time, because Germany hadn’t yet come back to release the sequel and the special edition of the first one, which involved a lot of retconning.

Now, it’s probably just a coincidence, but quite a lot of babies born on Flag Day would have been conceived because their parents fucked on Labor Day weekend — no, really, they’re about 280 days apart — and although that’s just the average, it still gives us the image of Labor Day turning into labor day on Flag Day.
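If you want to sanity-check that 280-day figure, here’s a quick and dirty sketch (years picked purely for illustration; Labor Day is the first Monday of September):

```python
from datetime import date, timedelta

def labor_day(year: int) -> date:
    """Return Labor Day: the first Monday of September."""
    first = date(year, 9, 1)
    # weekday() counts Monday as 0, so advance to the next Monday.
    return first + timedelta(days=(0 - first.weekday()) % 7)

# Days from Labor Day to the following Flag Day (June 14)
for year in (2019, 2020, 2021, 2022):
    gap = (date(year + 1, 6, 14) - labor_day(year)).days
    print(year, gap)  # lands within a week of 280 every time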

Which brings us to the topic at hand: People born on Flag Day who have made significant contributions to the world, ordered by date of birth.

  1. Harriet Beecher Stowe (1811): Author of Uncle Tom’s Cabin (1852), a somewhat heavy-handed and patronizing work that was sympathetic toward the plight of American slaves when it was written. By today’s standards it’s a terrible example of YT people missing the target, but it was incredibly progressive for its time.
  2. Pierre Salinger (1925): American journalist, author, and politician; press secretary for JFK and LBJ, briefly an interim appointed senator for California, and campaign manager for RFK in 1968. Later, a reporter for ABC News. Notably, he never lied while he was press secretary.
  3. Ernesto “Che” Guevara (1928): Argentine Marxist revolutionary, poster child for generations of college students who think they’re Marxists and don’t read his story — he did a lot of good, but was not as good as his fans think. Basically, kind of like everyone else.
  4. Marla Gibbs (1931): African-American actress, made famous by her role as George Jefferson’s maid Florence in the 1975 TV series The Jeffersons. She was one of many actors in the 70s and 80s who elevated black people in American mass media, presenting them as people who were not just pimps and junkies but, rather, who were just like everyone else.
  5. Jerzy Kosiński (1933): Polish-born immigrant to America and writer. Best known for the novel Being There and the movie based on it, about a man who is so simple and who grew up so isolated from the real world that he becomes an everyman, a blank slate that people project their hopes and fears onto. While he has absolutely no real personality, empathy, education, or people skills, his fans still think he’s the greatest thing to ever happen. Hm. Sound familiar? The only difference is that Kosiński’s Chance the Gardener character was totally benign and harmless.
  6. Steny Hoyer (1939): A Democratic congressman from Maryland, former House Minority Whip and current House Majority Leader. In his last election in 2018, he defeated his Republican opponent, William Devine III, 70.3% to 27.1%.
  7. Boy George (1961): English singer, songwriter, DJ, and fashion designer who became famous for bringing gender-bending and sexual ambiguity to pop music in the early 1980s. He was largely responsible for making Boomers clutch their pearls as their Gen-X kids latched onto the music and style. OMG, Boy George wore make-up and flowing outfits that could have been gowns or muumuus and, most importantly, pissed off old people by his mere ambiguous existence.

So there are seven significant people I could think of who were born on this day. There are certainly a lot of others who may be lesser known or have done less, but I can’t think of any more important, at least not in the modern age.

Happy birthday to these seven, and happy Flag Day to my American readers.

Wednesday Wonders: How the world almost ended once

Quarantine is hard, so rather than post nothing, here’s a blast from the past: an article posted in February 2019 that is still relevant today.

I happen to firmly believe that climate change is real, it is happening, and humans are contributing to and largely responsible for it, but don’t worry — this isn’t going to be a political story. And I’ll admit that I can completely understand some of the deniers’ arguments. No, not the ones that say that “global warming” is a hoax made up so that “evil liberals” in government can tax everyone even more. The understandable arguments are the ones that say, “How could mere humans have such a big effect on the world’s climate?” and “Climate change is cyclic and will happen with or without us.”

That second argument is actually true, but it doesn’t change the fact that our industrialization has had a direct and measurable impact in terms of more greenhouse gases emitted and the planet heating up. Also note: Just because you’re freezing your ass off under the polar vortex doesn’t mean that Earth isn’t getting hotter. Heat just means that there’s more energy in the system and with more energy comes more chaos. Hot places will be hotter. Cold places will be colder. Weather in general will become more violent.

As for the first argument, that a single species, like humans, really can’t have all that great an effect on this big, giant planet, I’d like to tell you a story that will demonstrate how wrong that idea is, and it begins nearly 2.5 billion years ago with the Great Oxygenation Event.

Prior to that point in time, the Earth was mostly populated by anaerobic organisms — that is, organisms that do not use oxygen in their metabolism. In fact, oxygen is toxic to them. The oceans were full of bacteria of this variety. The atmosphere at the time was about 30% carbon dioxide and close to 70% nitrogen, with perhaps a hint of methane, but no oxygen at all. Compare this to the atmosphere of Mars today, which is 95% carbon dioxide, 2.7% nitrogen, and less than 2% other gases. Side note: This makes the movie Mars Attacks! very wrong, because a major plot point was that the Martians could only breathe nitrogen, which is currently 78% of our atmosphere but almost absent in theirs. Oops!

But back to those anaerobic days and what changed them: Cyanobacteria, often misleadingly called blue-green algae, figured out the trick to photosynthesis — that is, producing energy not from food, but from sunlight and a few neat chemical processes. (Incidentally, this was also the first step on the evolutionary path to eyes.) Basically, these microbes would take in water and carbon dioxide, use the power of photons to break some bonds, and then release oxygen gas while keeping the carbon and hydrogen for themselves.
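For the bookkeeping-minded, the net reaction of oxygenic photosynthesis balances out like this (one detail the summary glosses over: isotope studies later showed that the released oxygen actually comes from the water, not the carbon dioxide):

6 CO2 + 6 H2O + light → C6H12O6 (glucose) + 6 O2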

At first, things were okay, because oxygen tended to be trapped by organic matter (any carbon-containing compound) or iron (this is how rust is made), and there were plenty of both floating around to do the job, so both kinds of bacteria got along fine. But eventually there came a point when there were not enough oxygen traps left, and things started to go off the rails. Instead of being safely sequestered, the oxygen started to get out into the atmosphere, with several devastating results.

First, of course, was that this element was toxic to the anaerobic bacteria, and so it started to kill them off big time. They just couldn’t deal with it, so they either died or retreated to new ecological niches in low-oxygen environments, like the bottom of the sea. Second, though, and more impactful: All of this oxygen wound up taking out whatever atmospheric methane was left and converting it into carbon dioxide. The former is the more powerful greenhouse gas, and so had been keeping the planet warm. The latter was, and still is, less effective at it. The end result of the change was a sudden and very long ice age known as the Huronian glaciation, which lasted for 300 million years — the oldest and longest ice age to date. The result of this was that most of the cyanobacteria died off as well.
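The methane scrubbing itself is plain oxidation, the same net reaction as burning natural gas:

CH4 + 2 O2 → CO2 + 2 H2O

One molecule of a potent greenhouse gas goes out; one molecule of a much weaker one comes back in.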

So there you have it. Microscopic organisms, much smaller than any of us and without any kind of technology or even intelligence to speak of, almost managed to wipe out all life forms on the planet and completely alter the climate for hundreds of millions of years, and they may have tipped the balance in as little as a million years.

We are much, much bigger than bacteria — about a million times, actually — and so our impact on the world is proportionally larger, even if they vastly outnumber our current population of around 7.5 billion. But these tiny, mindless organisms managed to wipe out most of the life on Earth at the time and change the climate for far longer than humans have even existed.
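That “million times” figure is just linear scale, and easy to check (assuming a typical bacterium of about two microns, which is in the normal range):

```python
bacterium_m = 2e-6  # a typical bacterium is a micron or two long, in meters
human_m = 1.7       # an average-ish human height, in meters

print(human_m / bacterium_m)  # 850000.0, i.e. roughly a million times bigger
```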

Don’t kid yourself by thinking that humanity cannot and is not doing the same thing right now. Whether we’ll manage to turn the planet into Venus or Pluto is still up for debate. Maybe we’ll get a little of both. But to try to hand-wave it away by claiming we really can’t have that much of an impact is the road to perdition. If single-celled organisms could destroy the entire ecosystem, imagine how much worse we can do with our roughly 30 to 40 trillion cells, and then do your best to not contribute to that destruction.