23 and me (and thee)

Warning: after you read this, you’re going to start seeing the numbers 23 and 5 everywhere. Sorry.

When I was 23 years old, I first encountered and read the very trippy book The Illuminatus! Trilogy by Robert Shea and Robert Anton Wilson. I’ve mentioned the latter several times here, and probably will again. Along with several others, he became one of my major writing influences early on.

Now, the thing about my coming to read the book for the first time at 23 is that it happened completely by happenstance. I mentioned to a coworker, who was a Wiccan, that I’d just turned 23, and she said, “Oh, you need to read this book.” I did a little research into it, thought it looked interesting, and headed down to the Bodhi Tree, the now-defunct Melrose Avenue bookshop that specialized in all things new age and esoteric.

The thing is massive — something like 800 pages, I think. It was published in trade paperback format, the larger size compared to mass-market paperback; trade paperbacks are close to the dimensions of standard hardcover books.

Anyway, I started to read it, and the book hooked me immediately. Why not? I was 23, and it was full of sex, drugs, and rock and roll. It also affectionately mimicked the styles and structures of things like Joyce’s Ulysses and the cut-up technique favored by William S. Burroughs. Threads of the story weave in and out of each other in constant interruptions, the narrator’s identity keeps changing as the voice passes from omniscient third person to first-person accounts by the characters — some of whom seem aware that they are characters in a novel, maybe — and the whole thing plays out as a neo-noir detective mystery wrapped around a psychedelic conflation of every far-right and far-left conspiracy theory of the time, with a healthy dose of science fiction, fantasy, and eldritch horror.

Besides Joyce and Burroughs, H.P. Lovecraft and his universe receive various nods, and one of the protagonists (?) travels about in a golden submarine that evokes both the Beatles and Captain Nemo at the same time.

One of the running ideas in the book is the mystical importance of the number 23, which pops up constantly in the narrative. This also implies the importance of the number 5, which is the sum of 2 and 3. This is also why, in later years, it was tradition for Wilson to always publish his newest book on May 23rd.

There are some very interesting facts about the number, actually — and it shouldn’t escape notice that Wilson’s last initial, W, is the 23rd letter of the Latin alphabet. Those facts do go on and on, too. Here’s another list that has surprisingly little overlap with the first.

William S. Burroughs was obsessed with the number 23, which is mentioned in the novel, and many works created post-Illuminatus! capitalize on the concept by using it. You’ll find 23s in things like the TV show Lost, various films including Star Wars Episode IV, and two films that specifically deal with it, the American film The Number 23 and the German film 23, although the latter would be more properly called Dreiundzwanzig.

There are, of course, also plenty of examples of the number 5 doing interesting things as well.

So here I was, reading this amazing brain-bender of a book at the young age of 23, and I started to wonder whether there was any truth to this idea. You know what happened? I started seeing the number 23 everywhere. It would be on the side of taxis and buses — bonus points, sometimes I’d see 523, 235, 2355 or similar combinations. It would show up on receipts — “You’re order number 23!” It would be one of the winning numbers or the mega number for the current lottery winner. The total when shopping would end in 23 cents, or else 77 cents, meaning that I’d get 23 cents in change.
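If you want to see how little conspiracy that actually requires, here’s a toy simulation (mine, not Wilson’s, so take it as a sketch) of how often a 23 turns up in random numbers purely by chance:

    import random

    # Toy check: how often does "23" appear in a random 4-digit number
    # (bus numbers, receipt totals, lottery picks), purely by chance?
    random.seed(23)  # of course
    trials = 100_000
    hits = sum("23" in f"{random.randrange(10_000):04d}" for _ in range(trials))
    print(f"{hits / trials:.1%} of random 4-digit numbers contain a 23")
    # Prints roughly 3%. Glance at a few dozen numbers in a day and
    # you'll hit a "meaningful" 23 most days, no Illuminati required.

Three percent sounds small until you remember how many numbers you look at in a day, and that the brain only files away the hits.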

Wilson eventually gives up the secret to the secret, although not in this book. He does offer another interesting exercise that worked for me at the time, although probably not so much anymore since people don’t tend to carry change around any longer. He referred to it as The Quarter Experiment, although I think of it as “Find the Quarter,” and it’s exactly what it sounds like. When you’re out and about walking around, visualize a quarter (or local coin in your currency of similar size, about 25mm) and then look for one that’s been dropped on the ground.

Back in the day, Wilson claimed success with this and, sure enough, so did I. It’s worth it to click the link above and read the explanation, as well as the several ways to interpret it. (It’s also worthwhile to check out and do the other exercises listed, but especially number four. Too bad the list didn’t make it to five.)

But, again, people just aren’t as likely to drop quarters because they probably only trot them out to do laundry, especially with most parking meters accepting debit and credit cards now. A lot of public washers and dryers are also doing the same, so we may be swiftly approaching a day when the only likely place someone might drop some coins is in front of one of those grocery store change converter machines.

Still, you can probably do this experiment fnord with any other object likely to be dropped, like a pen, or a receipt, or keys.

After I finished my first read of Illuminatus!, I went on to read almost all of the rest of Wilson’s oeuvre, both fiction and non. He wrote a number of books outlining his philosophy, like Prometheus Rising and Right Where You Are Sitting Now, as well as his Cosmic Trigger series, which is a cross between autobiography and philosophy, and the amazing novel Masks of the Illuminati, in which James Joyce, Albert Einstein, and Aleister Crowley walk into a bar in Geneva and things get trippy. I’ve always wanted to adapt this one into a play or film and, in fact, it was influential in the creation of my own play Three Lions, which involved Crowley, Ian Fleming, and Hermann Hesse. (Available for production, if you’re interested — here’s the first scene.)

Okay, Wilson has got too many works to cite individually, so just go check out his website for the full list. Meanwhile, this is where we’re going to go meta and full circle.

I’ve re-read Illuminatus! multiple times, and in fact started another read-through about (d’oh!) five weeks ago. Every time through, it’s a completely different work, and I get different things out of it. When I was 23, it was one story. Each of the three times after that, it was another. Now, it’s yet again completely different, and I just realized that this is, in fact, my fifth pass through the text.

So it was weirdly appropriate when I found out that a friend of mine from our improv company was going to turn 23 on April 30. That date itself is significant because a large part of the book’s present-day story takes place in April and May, but on top of that, I suddenly had the chance to return the favor that my coworker had done for me oh so long ago, so I gifted my young friend a dead-tree copy of the anthology version.

Hey, I survived that journey and I think it made me a better person. Might as well share the love, right? My only hope is that somewhere down the line, after he’s read it a bunch of times, he’s in the position to pass the torch to another 23-year-old.

Pictured: My photo of the covers of my original U.S. paperback versions of the books, which I was lucky enough to find in a used bookstore for cheap a few years back. Interestingly enough, that bookstore is called Iliad Books, and it used to be next door to a video rental place called Odyssey. Both of those also figure into the fnord book. Yes, it’s quite the rabbit hole.

Not pictured: my original edition, autographed by RAW himself, and my later “checkerboard” cover version from the 90s.

The voice

Recently, I was working at what’s called the Small Business Marketing Plan Bootcamp, run by two old friends of mine, Hank and Sharyn Yuloff. Well, I’ve known Hank longer: I lost touch with him for a while, then re-encountered him at random, partly because we had a friend in common we’d both met long after, and partly because Hank absolutely hated the movie The Blair Witch Project. Long story, but it was another one of those weird moments in which the most random of events somehow led to big things later on.

If you come to their bootcamp and I’m working it, he’ll probably tell you the whole story. Short version: he sent an email rant about the film to one of my friends, A, who’d co-founded Filmmonthly.com with me and D (all three of us had been in a band together way the hell back in my “stupid enough to be in a band” days), and A told him he should write a review for the site. When the review popped up, I saw his name and, since it’s an unusual one, I contacted him to say, “Hey… didn’t I know you once?”

As for the Filmmonthly website, it’s still there, although A, D, and I passed it on to other people a long time ago. Since all three of us were the publishers for years, our names are pretty much embedded in every page, which unfortunately makes it hard to search for any of our reviews specifically. I can at least lead you to my deep analysis of the movie A.I. and my review of Stanley Kubrick’s last film, Eyes Wide Shut. And, to top that all off, my other in-depth analysis, of The Big Lebowski, wound up enshrined forever in that mythos in the book Lebowski 101.

But I do digress… All of that intro was by way of saying that I’ve known Hank and Sharyn forever, they are amazing people, they have certainly plugged me a lot to their clients, and in this latest seminar, Hank said something that initially really pissed me off.

It was a day dedicated to the importance of social media, and this came up during the portion about blogging. (Side note: This blog itself only exists because they gave me a freebie bootcamp a couple of years ago, although Hank told me that it wasn’t me getting a freebie from them. Rather, it was them investing in me, and he was right.) Anyway, after they’d talked about the importance of creating content and so on, somebody asked, “What if you can’t write? Should you hire a ghostwriter?”

Hank’s immediate answer was, “No. You have to write it because it has to be in your own voice.”

And, honestly, my sudden instinct was to jump up and yell, “Oh, that’s bullshit!” I mean, one of the words on my business card is “ghostwriter,” and it’s basically what I did for a certain cable TV star for five years, creating a weekly column for his readers, along with maintaining the marketing and corporate voice for his website and magazine that entire time. Hell, my titles were Senior Editor and Head Writer.

On top of that, as an experienced and award-winning writer of plays, TV, film, short stories, and long-form fiction, I’ve got a lot of experience in writing in other voices. That’s what writers of fiction do — we speak as other people. And so one of the biggest talents I think that I bring to the corporate world is exactly that: the ability to write as someone else. Give me your voice, I’ll imitate the hell out of it.

But I refrained from saying anything during the bootcamp because, after all, it’s his and Sharyn’s show, and I’ve got no business rocking the boat (or, as we say in improv, not “Yes, Anding” them). But after he said it, I started to think a bit more on the concept and realized that we’re sort of both right in different ways, especially as he explained his reasoning.

See, most of the people at this seminar were entrepreneurs — small business people, either running their own show or with a very small staff. And that does make a difference in establishing a corporate voice, because they are most directly the voice of their own corporation or company. Why? Because when they go out to recruit or meet potential clients, it’s just them. It’s not their CFO, or CEO, or marketing team, or social media mavens, or copywriter, because those people do not exist in their organizations. And so, if all of those blog posts sound one way but, in person, they sound another, clients are going to rightfully sense the difference and nope right outta there. The person they met online and the person they meet IRL don’t mesh, and the in-person version winds up sounding inauthentic.

Brand killer.

That was my own a-ha moment. Keep in mind that I can get tetchy when anyone says, “Hey… anyone can write!” My knee-jerk reaction is, “No. False.” But, you know what? It’s partly true, but let’s go through all the steps.

We all grow up using language. It’s what humans do. And, honestly, it’s what a ton of animals and birds do. Most primates, most cetaceans, pretty much every mammal, parrots, crows, octopuses, even some trees and fungi, whatever. Linking together a bunch of signals — whether words, sounds, images, smells, or chemicals — and having those linked signals relay a message from one entity to the other… that’s pretty much what all intelligent life does.

Boom. Communication. That is what language is. If you can successfully tell that driver, “Hey, hit the damn brakes so you don’t run over my baby,” whether you do it with words, screams, frantic hand waves, a sudden bouquet of smells or hormones, or a well-timed text, then you have communicated very effectively.

But… there’s a huge difference between “effective” and “well,” and I think this is where my feelings and Hank’s feelings on it both part and converge again.

Yes, everybody has their own unique voice, and that has to do with the words they use, their patterns of speech, and so on. But… the really important part is how all of those separate phrases and sentences and whatnot add up into a coherent story. And this is where what I do comes in.

If you’re an entrepreneur, should you write your own blogs? Oh, absolutely, but only sort of. Absolutely because, honestly, if you can talk, you can put words down in a written medium. Even if you can’t talk — most humans learn how to communicate with words, whether it’s in spoken language, sign language, or even just written down.

What most humans don’t learn is how to structure the mass of those words into an interesting and compelling story. This is where I come in, and where Hank and I came back into agreement not long after.

He phrased it best, although I paraphrase now, in terms of attorneys: “The man who represents himself has a fool for a client.” He followed that up with, “The person who edits their own writing, likewise,” and I could not agree more.

And that’s really what I do — I’m the third eye on your manuscript, I’m the midwife who makes sure to clean up and swaddle your baby before we dump it in your lap. I’m the guy who jumps in the way before you step out into traffic and shoves you back onto the curb, and I’m also a pretty big history and science nerd, so I will stop you from looking silly by knocking the anachronisms out of whatever you’re writing and polishing up the science. Final bonus points: I was raised by an amazing grammar-Nazi English teacher, so I’ll give you the same.

I’m not cheap, but I’m worth it. Trust me. If you want to raise your marketing antlers above the herd of crap that’s all over the place out there, then drop me a line. Rates are negotiable, and depend a lot on subject and page count. Hint: If you’re doing history or sci-fi, or your word count is under 40,000, let’s talk discounts. Scripts, plays, and screenplays also considered. But if you want to invest in your future and get some returns, then invest in me first, because I will definitely steer you there.

“War is not healthy for children and other living things.” Except…

The title of this article comes from an incredibly iconic poster created during the Vietnam War in the 1960s. Specifically, it was created by printmaker Lorraine Schneider in 1967, and was inspired by her concern that her oldest son would be drafted and die in a war that many Americans considered unnecessary.

However, the Vietnam War is a strange exception, and the beginning of a sea change in American wars. Post-Vietnam, the only benefits wars seem to have given us are more efficient (although not cheaper) ways to kill people, and that sucks. (Incidentally, the Korean War is technically not a war. It also technically never ended.)

But… as weird as it may sound, a lot of the major wars prior to Vietnam actually gave American society weird and unexpected benefits. Yeah, all of that death and killing and violence were terrible, but like dandelions breaking through urban sidewalks to bloom and thrive, sometimes, good stuff does come in the aftermath of nasty wars. Here are five examples.

The American Revolution, 1775-1783

The Benefit: The First Amendment (and the rest of the Constitution)

By the beginning of the 18th century, Europe was having big problems because monarchs and the Church were all tied up together, the state dictated religion, and so on. It came to an extreme with Britain’s Act of Settlement in 1701, which barred any Catholic from ever taking the throne. The end result of this was that the next in line turned out to be the future George I, son of Sophia, Electress of Hanover or, in other words, German. Queen Victoria was a direct descendant of George I, and spoke both English and German. In fact, her husband, Prince Albert, was German.

But the net result of all the tsuris over the whole Catholic vs. Protestant thing in Europe, on top of suppression of the press by governments, was that the Founders made sure to enshrine freedom of speech and the wall between church and state in the very first Amendment to the Constitution, before anything else. To be fair, though, England did start to push for freedom of the press and an end to censorship in the 17th century, so that’s probably where the Founders got the idea. But the British monarch was (and still is) the head of the Church of England, so the score is one up, one down.

The War of 1812, 1812-1815

The Benefit: Permanent alliance between the U.S. and Britain

This was basically the sequel to the American Revolution, and came about because of continued tensions between the two nations. Britain had a habit of capturing American sailors and forcing them into military duty against the French, for example, via what were vernacularly called “press gangs.” They also supported Native Americans in their war against the fairly new country that had been created by invading their land. So again, one up, one down. And the second one, which is the down vote to America, is rather ironic, considering that the Brits were basically now helping out the people whose land had been stolen by… the first English settlers to get there.

And, honestly, if we’re really keeping score, the U.S. has two extra dings against it in this one: We started it by declaring war — even if there were legitimate provocations from Britain — and then we invaded Canada.

But then a funny thing happened. The U.S. won the war. By all rights it shouldn’t have. It was a new country. It really didn’t have the military to do it. It was going up against the dominant world power of the time, and one that would soon become an empire to boot.

The war technically ended with the Treaty of Ghent in 1814, but there was still the Battle of New Orleans to come after that, and it happened because news of the end of the war hadn’t gotten there yet. In that one, the U.S. kicked Britain’s ass so hard that the British gave up any thought of saying, “Remember all the concessions we made in that treaty? Yeah, not. LOL.”

In a lot of ways, the war was really a draw, but it did get the British to remove any military presence from the parts of North America that were not Canada, and opened the door to American expansionism across the continent. It also helped to establish the boundary between the U.S. and Canada, which is to this day the world’s longest undefended border. Finally, it cemented the relationship of the U.S. and Britain as allies and BFFs, which definitely came in handy in the 20th century during a couple of little European dust-ups that I’ll be getting to shortly.

The American Civil War, 1861-1865

The Benefit: Mass-manufactured bar soap

Now in comparison to the first two, this one may seem trivial and silly, but it actually does have ramifications that go far beyond the original product itself. And it doesn’t matter whether you’re a fan of bar soap now or go for the liquid kind (my preference), because both were really born out of the same need and process.

Once upon a time, soap-making was one of the many onerous tasks that the women of the house were expected to do, along with cleaning, cooking, sewing, canning, laundry, ironing, taking care of the menfolk (husbands and sons, or fathers and brothers), and generally being the literal embodiment of the term “drudge.” But soap-making was so arduous and generally nasty a task that it was typically done only once or twice a year, making enough to last six or twelve months.

To make soap involved combining rendered fat and lye. (Remember Fight Club?) The fat came easy, since people at the time slaughtered their own animals for food, so they just ripped it off of the cow or pig or whatever meat they’d eaten. The lye came from leaching water through ashes from a fire made from hardwood, believe it or not, and since wood was pretty much all they had to make fires for cooking, ashes were abundant. Yes, I know, it’s really counter-intuitive that something so caustic could be made that way, but there you go. The secret is in the potassium content of the wood. Fun fact: the terms hard- and softwood have nothing to do with the actual wood itself, but rather with how the trees reproduce. (And I’ll let your brain make the joke so I don’t have to.)
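For the nerds in the room, the underlying process is plain old saponification. Schematically, with potassium hydroxide standing in for whatever the ash water actually yielded:

    fat (a triglyceride) + 3 KOH → glycerol + 3 potassium salts of fatty acids (i.e., soap)

The potassium matters, by the way: potash lye yields soft soap, while sodium lye (NaOH) yields the hard bars that manufacturers could wrap and ship.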

So soap was a household necessity, but difficult to make. Now, while William Procter and James Gamble started to manufacture soap in 1838, it was still a luxury product at the time. It wasn’t until a lot of men went to war in 1861 that women had to run homesteads and farms on top of all of their other duties, and so suddenly manufactured soap started to come into demand. Especially helpful was Procter and Gamble providing soap to the Union Army, so that soldiers got used to it and wanted it once they came home.

Obviously, easier access to soap helped with hygiene but, more importantly, the industry advertised like hell, and from about the 1850s onward, selling soap was big business. There’s a reason we call certain shows “soap operas,” after all: the soap companies were the ones sponsoring the original radio and television serials.

World War I, 1914-1918 (U.S. involvement, 1917-1918)

The Benefit: Women’s suffrage and the right to vote

It’s probably common knowledge — or maybe not — that two big things that happened because of World War I were an abundance of prosthetic limbs and advances in reconstructive and plastic surgery. However, neither of these were really invented because of this conflict, which “only” led to improved surgical techniques or better replacement limbs.

The real advance is sort of an echo of the rise of soap via the Civil War, in the sense that this conflict freed women from one nasty restriction: having no say in government. And, as usually happens when the boys march off to do something stupid, the women have to take up the reins at home, and sometimes this gets noticed. It certainly did in the case of WWI, and suffragists wisely exploited the connection between women and the homefront war effort. Less than two years after the conflict officially ended, women won the right to vote with the 19th Amendment, which took effect on August 26, 1920.

Hey! Only 144 years too late. Woohoo!

World War II, 1939-1945 (U.S. involvement, 1941-1945)

The Benefit: The rise of the American middle class

As World War II was starting to move to an end, the Servicemen’s Readjustment Act of 1944 was passed into law. It was designed to assist returning service members via things like creating the VA hospital system, providing subsidized mortgages, assisting with educational expenses, and providing unemployment benefits. It was also a direct reaction to the less-than-fantastic reception returning veterans of World War I had received.

In fact, one of FDR’s goals in creating what is commonly known as the G.I. Bill was to expand the middle class, and it succeeded. Suddenly, home ownership was within reach of people who hadn’t been able to obtain it before and, as a result, new housing construction exploded and, with it, the emergence of suburbs all across the country. With education, these veterans found better jobs and higher incomes, and that money went right back into the economy to buy things like cars, TVs, and all the other accoutrements of suburban living. They also started having children — it’s not called the Baby Boom for nothing — and those children benefited with higher education themselves. The rates of people getting at least a Bachelor’s Degree began a steady climb in the 1960s, right when this generation was starting to graduate high school. At the same time, the percentage of people who hadn’t even graduated from high school plunged.

The highest top marginal tax rates of all time in the U.S. came in 1944 and 1945, when they hit 94%. They remained high — 91% or more for most of the 1950s. Oddly, despite the 1940s having the higher peak, the median and average top rates in the 1950s were higher: about 86% for both in the ’40s versus 91% for both in the ’50s. The economy was booming, and in addition to paying for the war, those taxes provided a lot of things for U.S. citizens.
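If you want to check that claim yourself, here’s a quick sketch. The year-by-year figures are approximate top-bracket rates rounded from the standard historical tax tables, so treat the output as ballpark, not gospel:

    from statistics import mean, median

    # Approximate top marginal tax rate for each year of the decade,
    # rounded from the usual historical tables (treat as ballpark).
    top_rates = {
        "1940s": [81.1, 81.0, 88.0, 88.0, 94.0, 94.0, 86.45, 86.45, 82.13, 82.13],
        "1950s": [84.36, 91.0, 92.0, 92.0, 91.0, 91.0, 91.0, 91.0, 91.0, 91.0],
    }
    for decade, rates in top_rates.items():
        print(f"{decade}: mean {mean(rates):.0f}%, median {median(rates):.0f}%")
    # 1940s: mean 86%, median 86%
    # 1950s: mean 91%, median 91%

The 1940s had the single highest peak, but the 1950s held the higher plateau.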

Even as his own party wanted to dismantle a lot of FDR’s New Deal policies, President Eisenhower forged ahead with what he called “Modern Republicanism.” He signed legislation and started programs that did things like provide government assistance to people who were unemployed, whether simply for lack of work or due to age or illness. Other programs raised the minimum wage, increased the scope of Social Security, and founded the Department of Health, Education and Welfare. In a lot of ways, it was like the G.I. Bill had been extended to everyone.

New Horizons

I’ve always been a giant nerd for three things: History, language, and science. History fascinates me because it shows how humanity has progressed over the years and centuries. We were wandering tribes reliant on whatever we could kill or scavenge, but then we discovered the secrets of agriculture (oddly enough, hidden in the stars), so then we created cities, where we were much safer from the elements.

Freed from a wandering existence, we started to develop culture — arts and sciences — because we didn’t have to spend all of our time picking berries or hunting wild boar. Of course, at the same time, we also created things like war and slavery and monarchs, which are really the ultimate evil triumvirate of humanity, and three things we really haven’t shaken off yet, even if we sometimes call them by different names. All the while, though, humanity also strove for peace and freedom and equality.

It’s a back and forth struggle as old as man, sometimes forward and sometimes back, and it’s referred to as the cyclical theory of history. Arthur Schlesinger, Jr. developed the theory with specific reference to American history, although it can apply much farther back than that. Anthony Burgess, author of A Clockwork Orange, explored it specifically in his earlier novel The Wanting Seed, although it could be argued that both books cover two different aspects of the cycle. The short version of the cycle: A) Society (i.e. government) sees people as good, and things progress as laws become more liberal. B) Society (see above) sees people as evil, and things regress as laws become harsher and more draconian. C) Society (you know who) finally wakes up and realizes, “Oh. We’ve become evil…” Return to A. Repeat.

This is similar to Hegel’s Dialectic (thesis, antithesis, synthesis), which itself was parodied in Robert Anton Wilson and Robert Shea’s Illuminatus! Trilogy, which posited a five-stage view of history instead of three, adding parenthesis and paralysis to the mix.

I’m not entirely sure that they were wrong.

But enough of history, although I could go on about it for days. Regular readers already know about my major nerdom for language, which is partly related to history as well, so let’s get to the science.

The two areas of science I’ve always been most interested in also happen to be at completely opposite ends of the scale. On the large end are astronomy and cosmology, which deal with things on scales way bigger than what we see in everyday life. I’m talking the size of solar systems, galaxies, local clusters, and the universe itself. Hey, when I was a kid, humans had already been in space for a while, so it seemed like a totally normal place to be. The first space disaster I remember was the Challenger shuttle, and that was clearly human error.

At the other end of the size scale: chemistry and quantum physics. Chemistry deals with interactions among elements and molecules; while those are too small for us to see individually, we can still see the results. Ever make a vinegar and baking soda volcano? Boom! Chemistry. And then there’s quantum physics, which deals with things so small that we can never actually see them, and we can’t even really be quite sure about our measurements of them, except that the models we have also seem to give an accurate view of how the universe works.
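For the record, that volcano is a simple acid-base reaction, and the “lava” is just escaping carbon dioxide:

    NaHCO3 (baking soda) + CH3COOH (vinegar) → CH3COONa (sodium acetate) + H2O + CO2 (the fizz)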

Without understanding quantum physics, we would not have any of our sophisticated computer devices, nor would we have GPS (which also relies on Einstein’s Relativity, which does not like quantum physics, nor vice versa.) We probably wouldn’t even have television or any of its successors, although we really didn’t know that at the time TV was invented, way before the atomic bomb. Not that TV relies on quantum mechanics, per se, but its very nature does depend on the understanding that light can behave as either a particle or a wave and figuring out how to force it to be a particle.

But, again, I’m nerding out and missing the real point. Right around the end of last year, NASA did the amazing, and slung their New Horizons probe within photo-op range of the most distant object we’ve yet visited in our solar system. Called Ultima Thule, it is a Kuiper Belt object about four billion miles away from Earth and only about 19 miles long, and yet we still managed to get close enough to it to get some amazing photos.
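Four billion miles is hard to picture, so here’s the back-of-the-envelope arithmetic on what that distance means for even a radio signal traveling at the speed of light:

    # One-way light time from Ultima Thule, back of the envelope.
    SPEED_OF_LIGHT_MPS = 186_282      # miles per second, in a vacuum
    distance_miles = 4_000_000_000    # rough distance at flyby
    hours = distance_miles / SPEED_OF_LIGHT_MPS / 3600
    print(f"One-way signal time: {hours:.1f} hours")  # about 6 hours

Every command sent to the probe, and every photo it sent back, spent roughly six hours in transit each way.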

And this really is the most amazing human exploration of all. New Horizons was launched a generation after the Voyager probes, yet it covered the distance to Pluto in under ten years — and then, after rendezvousing with that disgraced dwarf planet, went on to absolutely nail a meeting with a tiny rock so far from the sun that it probably isn’t even really all that bright. And all of this was done with plain old physics, based on rules worked out by some dude in the 17th century. I think they named some sort of cookie after him, but I could be wrong. Although those original rules, over such great distances, wouldn’t have really worked out without the tweaking that Einstein’s rules gave us.

Exploring distant space is really a matter of combining our knowledge of the very, very big with the very, very small — and this should really reflect back on our understanding of history. You cannot begin to comprehend the macro if you do not understand the micro.

Monarchs cannot do shit without understanding the people beneath them. This isn’t just a fact of history. For the scientifically inclined, the one great failing of Einstein’s theories — which have been proven experimentally multiple times — is that they fall entirely apart on the quantum level. This doesn’t mean that Einstein was wrong. Just that he couldn’t or didn’t account for the power of the very, very tiny.

And, call back to the beginning: Agriculture, as in the domestication of plants and animals, did not happen until humans understood the cycle of seasons and the concept of time. Before we built clocks, the only way to do that was to watch the sun, the moon, and the stars and find the patterns. In this case, we had to learn to pay attention to the very, very slow, and to keep very accurate records. Once we were able to predict things like changes in the weather, or reproductive cycles, or when to plant and when to harvest, all based on when the sun or moon rose or set, ta-da. We had used science to master nature and evolve.

And I’ve come full circle myself. I tried to separate history from science, but it’s impossible. You see, the truth that humanity learns by objectively pursuing science is the pathway to free us from the constant cycle of good to bad to oops and back to good. Repeat.

Hey, let’s not repeat. Let’s make a concerted effort to agree when humanity achieves something good, then not flip our shit and call it bad. Instead, let’s just keep going ever upward and onward. Change is the human condition. If you want to restore the world of your childhood, then there’s something wrong with you, not the rest of us. After all, if the negative side of humanity had won when we first learned how to domesticate plants and animals and create cities, we might all still be wandering, homeless and nearly naked, through an inhospitable world, with our greatest advancements in technology being the wheel and fire — and the former not used for transportation, only for grinding whatever grain we’d gathered that day into flour. Or, in other words, moderately intelligent apes with no hope whatsoever of ever learning anything or advancing toward being human.

Not a good look, is it? To quote Stan Lee: “Excelsior!”

Onward. Adelante. Let’s keep seeking those new and broader horizons.

Nothing changes until we change it

You’ve probably never heard of Milton Slocum Latham unless you’re a serious California history nerd. I’d never heard of him until today, but I discovered him because I looked up a list of California governors. I did this because the Chief Justice of the California Supreme Court, Tani Cantil-Sakauye, announced that she was giving up her current party affiliation in order to become independent. I was curious as to which governor had put her on the court and who made her Chief Justice.

Note that I don’t really want to discuss partisan politics here. You can look up the particulars yourself. Suffice it to say that Cantil-Sakauye was appointed by a governor of her own party, then made chief justice by a governor from the other party. But what really caught my eye was going down that list of California governors and realizing that there have been a lot of tumultuous changes.

For one thing, a lot of governors served very short terms, and either resigned or were not re-elected or even nominated. This seemed particularly common in the 19th century, which makes sense considering that California came into the union in 1850 as a free state (i.e., slavery was illegal), but seemed to have a lot of Democratic governors around the time of the Civil War. And, if you’re not ignorant of history, you know that, at that time, the Democratic Party was mostly on the pro-slavery side while the Republicans were anti-slavery. This was before the great reversal of sides begun under FDR and completed while LBJ was president.

The first Republican governor of California was Leland Stanford — you might recognize his name from that little university in the northern part of the state. Elected in 1861, he only served one term at a time when the governor’s term was only two years. The law changed to double that term as soon as he left office, of course, although he did go on to serve as a U.S. Senator for California for eight years, until his death in 1893.

Stanford isn’t the only governor to have namesake places in the state. The city of Downey was named for the seventh governor, John Gately Downey, who, until Arnold Schwarzenegger, was the only governor of the state not born in the U.S. (he was Irish). On the other hand, while it’s been claimed that Haight Street in San Francisco was named for Henry Huntly Haight, the tenth governor, that’s probably not true. That claim was first made in 1989, but the oldest mention of the street’s namesake, from 1916, says that it was probably named for Fletcher Haight, a local lawyer and district judge who, coincidentally perhaps, died the year before the other Haight became governor. And it does make sense; governors tend to get things bigger than streets named after them.

But let’s get back to Milton Slocum Latham, the sixth governor of California, and the person to hold the singular distinction of having served the shortest term to date in that position: five days, from January 9, 1860 to January 14, 1860. He immediately preceded the aforementioned Governor Downey, by the way.

Now, why was Latham’s term so short? Did a scandal throw him out of office? Was his election invalidated, or did he pull a William Henry Harrison and drop dead? Perhaps he changed his mind and quit? Nope. None of the above, but definite proof that some things in politics never change.

See, just after Latham’s election, one of California’s Senators, David Colbreth Broderick, went and got himself killed in a duel that was most definitely related to the contentious issue of slavery, although Broderick was also apparently quite corrupt, and had made a fortune running San Francisco the same way that Tammany Hall (a thing, not a person) had run New York. All this makes me rather ashamed to admit that ol’ Brod and I have the same birthday. Dammit.

On the other hand, he was part of an attempted offshoot of the Democratic Party at the time, the Free Soil Democrats. They were the ones opposed to slavery expanding into the west. (Note: They were not necessarily anti-slavery. They just didn’t want it moving to other states.) After a little insult battle between Broderick and David S. Terry, a former California Chief Justice no less, the two met to duel. Broderick’s pistol anti-Hamiltoned and threw away its shot by firing as he drew and putting the bullet into the ground. Terry then nailed him in the right lung.

The duel happened six days after the general election that Latham won with 60% of the vote. That election was on September 7, 1859, the duel was on September 13, and Broderick died on the 16th. So at least we can say Latham did not run with the intent of taking that senate seat, right?

That didn’t stop him once he was in office and, since this was back in the days when Senators were still appointed by their states instead of elected, Latham did a little wheeling and dealing, and the rest was rather dubious history.

He was not re-elected to a second senate term and died in 1882, in New York, at the age of 54.

But now to the point of this history lesson. There’s really nothing new in politics. Only the names of the people and parties and the methods through which information is exchanged evolve. I’m sure that Broderick’s duel and Latham’s gambit were covered in the newspapers and periodicals at the time, discussed in the private clubs, and propagated by telegraph.

And regardless of the parties involved, I think we can all agree that somebody being elected to one office only to lobby for a sudden vacancy in a higher office after less than a week shows heinous disregard for the people who elected them — especially when that election came with a 60% majority.

Yet we see this sort of thing all the time, as an elected official will suddenly start campaigning for an office higher up, sometimes right after they’ve been sworn in. It seems particularly bad with governors who want to run for senator or president, and senators who want to run for president, but it happens at all levels. I’ve seen city council members start to stump to become the next mayor less than halfway through their first term, mayors campaigning for governor once they’ve moved into city hall, and so on.

Now I have no objection whatsoever to an elected official wanting to work their way up the food chain. That’s how it should be. I just think that we need to make them take some time to do it, which is why I think we need a little adjustment to the law. Well, two adjustments.

First, does anyone else think that it’s insane, in this day and age, that people elected to the U.S. House of Representatives serve only two years? In effect, this really means that any Rep is basically spending all of their term campaigning for their next election. The California gubernatorial term doubled from two to four years well over a century ago. We need to update the House of Reps to at least four years as well.

And, for that matter, why does the Senate get six? I can understand the idea of staggering those elections into three classes, like they are now, but why not four year terms for everyone, staggered into two classes, half elected every two years? Although, given recent behavior, it really should be flipped: House term of six years, Senate term of two. Just a thought.

But the real proposal is this one:

  1. No person elected for the first time to any position in the government of the United States or any of its states, counties, cities, or other political jurisdictions, shall seek, campaign for, file for, raise funds for the purpose of, or otherwise pursue, election to a different position within the aforementioned governmental jurisdictions prior to the completion of one (1) complete term to which they have been elected.
  2. Any incumbent elected official in any of the jurisdictions mentioned in §1 shall not seek, campaign for, file for, raise funds for the purpose of, or otherwise pursue, election to any different position within the government unless the term for which they would be newly elected begins on or after the date that their current term would normally expire according to applicable law. This exception does not apply to a first-term official in any capacity.
  3. None of the above restrictions shall apply to an elected official seeking to be re-elected to the same position they already occupy; nor to previously elected officials who are not currently in office for reasons other than impeachment, censure, or conviction of felonies; nor to an elected official who is not eligible to run for the same office again due to term limits.

I think those rules are fair all the way around. If you want the job, at least do it for the term you contracted for. If you want to apply for another job, make sure it starts after this one ends. If you want to keep your job by reapplying, or go back to work after leaving, or are going to get laid off — then do what you want.

If your only purpose in running for office is to leap-frog your way to the top of the pile as quickly as possible for the sake of power, then we don’t need you in office. Milton Slocum Latham learned that lesson first hand. There’s also a very local and specific example from Los Angeles, but I won’t mention any names here. The important part is that, as with Latham, the voters figured it out and soon said “No.” But we really need to enshrine that automatic no into the law.

And that’s not really a political position one way or the other, since this really is a case of “both sides do it.” It’s just common sense, and another way to try to restore some sanity to our political system.