Talky Tuesday: Punctuation

One of the side-effects of people texting and posting online — particularly if they do the latter with their phones — is that punctuation and, often, capitalization go by the wayside. I can understand this if you are using a phone, because the keyboard can be tiny, even on our modern oversized smart phones.

Generally, messages and posts done this way are short enough that missing punctuation, as well as regular paragraphing to indicate changes in thought, doesn’t hinder the meaning from getting through, at least not that much. Everyone is going to know what you mean in a short text, right?

But the longer you go and the more you write, the more you really do need to punctuate and paragraph your text. For example:

one of the side effects of people texting and posting online particularly if they do the latter with their phones is that punctuation and often capitalization go by the wayside i can understand this if you are using a phone because the keyboard can be tiny even on our modern oversized smart phones generally messages and posts done this way are short enough that missing punctuation as well as regular paragraphing to indicate changes in thought cant hinder the meaning from getting through at least not that much everyone is going to know what you mean in a short text right

How much harder was that paragraph to read than the two that opened the article? Same text exactly, just without any punctuation marks, so no road map. Which one would you rather be handed to read out loud with no preparation?

That’s pretty much the raison d’être of punctuation in any language — to clarify meaning, and especially to facilitate reading the words, whether out loud or in one’s head. But did you ever wonder where those punctuation marks came from?

Today, I’m going to focus on English, so we won’t be dealing with things like the cedilla, which you see in the word façade, or the tilde, which is common in Spanish words like mañana. I’ll even pass on the French punctuation seen above in the italicized expression, which just means “purpose” — literally, reason for being.

Depending upon the source, there are either fourteen or fifteen punctuation marks in English, but I’ll be focusing on fewer. I don’t agree with the fifteenth item on the longer list, which is the bullet point. I consider it more of a formatting tool than a punctuation mark. In a numbered list, while the numbers may or may not have a period after them, nobody thinks of the numbers as punctuation, right?

I’ll also be skipping brackets and curly braces because they really aren’t in common use. And, finally, lists of more than five items tend to get cumbersome, so I’m going to stick with the most common ones and take a look at where they came from.

By the way, missing from both of the above lists: our friend the ampersand (&), which I definitely consider a punctuation mark, but which actually used to be the 27th letter of the alphabet. In fact, under its original name, et (Latin for “and”), you can’t spell alphabet without it, but those two letters eventually morphed into the pretzel or, as I see it, panda sitting down to eat bamboo, that we all know and love today. And yes, you’ll never un-see that one.

Here are the origin stories of five heroic punctuation marks.

  1. Period: While the period, known in British English as the “full stop,” is probably the most common punctuation mark in European languages, it came from the same forge as all of the other “dot” punctuations, including the comma, colon, semicolon, and ellipsis. The concept of the period was originally created by a Greek scholar, Aristophanes of Byzantium, head librarian at Alexandria, who had grown tired of the published works of the time having no breaks between words, making the scrolls very hard to read.

Originally, his system involved placing dots either low, in the middle or high relative to the heights of the letters, and the position indicated the length of the pause, much as a period, comma, and colon indicate different lengths of pauses nowadays. However, his system did not pass directly to us. The Romans were not big fans of punctuation, and a lot of their works were copied down in so-called scriptio continua, or continuous writing.

Ironically, punctuation didn’t come back into it until Christianity began to take hold in the crumbling Roman Empire. Monks tasked with copying manuscripts by hand brought back the marks they knew from the classical Greek of Aristophanes’ era, largely to preserve the meaning of the frequently biblical texts they were copying.

And, again, if they were working to translate the Old Testament, which was largely written in Hebrew, they were going from a language that lacked punctuation, word spacing, and vowels, with the added bonus of only being written in the present tense. Yeah, that must have been a hair-puller. And, no doubt, the New Testament stuff they were working with probably had many of the same issues, since it was written in the Greek, Latin, Hebrew, and Aramaic of the late 1st century.

These were the people instrumental in writing down the first official version of that bible in the early 4th century, starting with the Council of Nicaea, and over the next 1,100 years, they also kind of invented emojis of a sort. What? They were bored college-aged dudes who weren’t allowed to get laid. What else could they do?

So things proceeded on the punctuation front without a lot happening until that dude Gutenberg got to printing in the 15th century. And that was when all of the existing punctuation got locked down because it had to be. That’s what standardization via mass manufacturing does, after all. Not necessarily a bad thing by any means.

  2. Question mark: This was another punctuation mark created by an identifiable person: Alcuin of York, an English poet and scholar who was invited to join the court of Charlemagne, who was first King of the Franks, then King of the Lombards, and finally Emperor of the Romans from the late 8th to early 9th centuries. If you have any western European blood in you, he is probably an ancestor.

Alcuin was a prolific author and very familiar with the old dot system of the Greeks, but he sought to improve it, so he created the punctus interrogativus, which is pretty much the Latin version of what we call it now, although his probably looked more like this: .~.

And while you may think that the question and exclamation marks are connected, with the latter just being the unsquiggled version of the former, you’d be wrong. In fact, no one is really sure where the exclamation mark came from, and it didn’t even appear on typewriter keyboards until the relatively late date of 1970.

  3. Hyphen: In the present day, hyphens pretty much exist only to join words that haven’t quite become full-on compounds. But once upon a time, before computers had this wonderful ability to justify text and avoid breaking one word across two lines, hyphens did exactly that. They told you whether a word had been broken and to look for more of it on the next line. In practice, it would look something like this:

 He contemplated the scene, not sure what he was going to find, but fully ex-

pecting it to be something dangerous; something he’d rather not have to con-

front on his own.

Yeah. Messy and awkward, isn’t it? And yet, if you read any published material from earlier than about the late 80s, this is what you get and, honestly, it’s as annoying as hell.

The hyphen itself goes back, again, to ancient Greece, where it was a sort of arc drawn below the letters of the words to be joined. It was still common enough when Gutenberg got around to creating his moveable type that it was adopted. However, since he couldn’t figure out how to include punctuation below the baselines of his letters, he moved the hyphen to the medial position we all know today.

  4. Parentheses: These most useful of marks were a product of the 14th century, and also brought to us by the creativity of monks copying manuscripts. And, again, I’ll remind you that these geniuses happened to be a part of their era’s version of what we’re currently calling Gen Z. You know. The ones after the Millennials that you should be paying attention to.

Anyway… in their wisdom, these monks decided to draw half circles around certain parts of the text (mostly to indicate that it was connected to but not part of the main idea) in order to set it off from the rest. In a lot of ways, parentheticals became a mental aside for the reader — hear this in a different voice.

And, like tits and testicles, parentheses are intended to always travel in pairs. (Yes, I know that not everyone has two of either, but note the “intended” part. Nature tries. Sometimes, she fucks up.)

  5. Quotation marks: These are yet another thing that the Greeks created, the Romans ignored, and medieval monks brought back. Originally, Greeks in the second century B.C. used sort of arrows to indicate that a line was a quote, and they stuck them in the margins. This form of quotation mark is still visible in modern languages, for example in the Spanish «quotation marks», which are pairs of little arrows.

When we got to the sixteenth century, they became a pair of commas before a line and outside of the margins, and indeed to this day, you’ll see something similar in „German quotes“, which open with two low commas and close with two raised marks. Nowadays, you can’t say he said, she said without quotation marks.

So there you go. The origins of five-ish common punctuation marks. Which one is your favorite, and why? Tell us in the comments!


Marbury v. Madison

If you haven’t already done it, then today is your last chance. VOTE, VOTE, VOTE!

Just over two hundred and seventeen years ago, the U.S. Supreme Court made a very important decision, one that has resonated on down through the years, and one that is more important now than ever.

Basically, a little incoming executive fuckery attempted to block an approved appointment by the outgoing administration… or did it? Because the outgoing administration wasn’t so innocent either, and to top it off, the Supreme Court Justice who ruled in the case, John Marshall, had been Secretary of State to the President who was trying to pack the courts with justices favorable to his side in the last days before he had to turn over the reins to Thomas Jefferson.

Side note: For all of you Founders fans, read up a bit, and you’ll realize that if you’re progressive, then you’re on the side of Adams, not Jefferson.

Anyway, beyond the politics of all of the above, two things are notable. One is that Marshall actually ignored the fact that he was voting against his guy (Adams) in this case and voted for what was right. Second is that this case forever enshrined the idea that the Supreme Court could absolutely decide whether a law passed by Congress was Constitutional.

Hello, checks and balances, everyone.

But it seems to have passed out of fashion to understand this simple fact. Our Constitution set up three branches of government for one simple purpose: So that no one of them would become too powerful. That’s what checks and balances means.

The three branches are as follows:

Legislative, meaning both houses of Congress, whose job is to make laws.

Executive, meaning the President and Cabinet, and their job is to figure out how to enact the laws passed by the Legislature, or to say “Nope. We’re not passing that law.” (To which the Legislature, with a two-thirds majority, can say, “Nu-uh, it’s passed. Suck it!”)

Judicial, meaning the Supreme Court, and they get to decide whether a law follows the Constitution or not.

Oh yeah. All three branches are constrained by the Constitution. At least in theory. And getting back to the Legislative, there are two houses of Congress, which makes it wonky: The Senate and the House of Representatives.

These came out of what you can basically call White Privilege, i.e., “We really can only trust rich, old, white, land-owning dudes over 21 to do what’s best (for rich, old, white, land-owning dudes over 21),” so the system was stacked from the top. The Reps in the House are based on the population of states, meaning that in the modern day, places like California, Texas, and New York have the most Reps. However, the Senate is based on state, as in every state gets the same two Senators, so that California, with nearly 40 million people, gets the same number of Senators as Wyoming, with just over half a million people, and that is utter bullshit. Of course, this is the same nonsense that gave us the Electoral College, which really needs to be banished as well.
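Just to put rough numbers on that disparity, here’s a quick Python sketch. The population figures below are ballpark 2020-era estimates I’m plugging in for illustration, not exact census counts:

```python
# Ballpark Senate-representation math; the population figures are rough
# 2020-era estimates used only for illustration.
populations = {
    "California": 39_500_000,
    "Wyoming": 580_000,
}
SENATORS_PER_STATE = 2  # every state gets the same two, big or small

for state, pop in populations.items():
    print(f"{state}: about {pop // SENATORS_PER_STATE:,} people per Senator")

# Roughly how many times more Senate "weight" a Wyoming resident has
# than a Californian:
print(round(populations["California"] / populations["Wyoming"]))
```

By that rough math, a Wyoming voter gets something like 68 times the Senate representation of a Californian.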

To keep it fair, we really need to banish the Senate, reapportion Congress based on an honest 2020 Census, and pack the Supreme Court to at least 17 Justices. Maybe even consider the concept of having two or more of those positions elected by the People instead of appointed by the President, and with the power of recall vested, again, in the People.

Oh yeah. Because that’s the really big part that the whole “System of Checks and Balances” thing ignores. The fourth branch of government.

Who is it, you may ask? Simple. It’s us. We the People, and our power to vote. We can’t do shit about the Supreme Court (yet, but see above); otherwise, the President and Congress are in our hands.

Momentous Monday: I’m not really who I think I am

The surname Bastian is the 11,616th most common in the world — meaning it’s not all that high on the list — and is most common in Germany, which should be a no-brainer, since it is in fact a German name.

Thirty-five percent of Bastians reside in Germany, and the name has been documented in 86 other countries. Surprisingly, it is more popular in Indonesia (21% of Bastians) than in the U.S. (19% of Bastians).

And yet, a few years back, I had a little existential shock when I found out that I was not a Bastian at all. It all happened because I’d started doing genealogy years ago and lucked out a long time after that when somebody researching the German village my ancestors came from saw a query I’d posted about my great-grandfather, so he sent me all the info.

But, because of that, I don’t know what the family name is really supposed to be, because Bastian only goes back to my great-great-great-grandmother, Barbara Bastian, who was born in 1801. But… that was her maiden name, and her husband’s name wasn’t recorded, so her sons Peter and Titus assumed the name Bastian. (I’m descended from Titus.)

I have the info on her Bastian ancestors going back four more generations to the 1670s, but no idea who my great-great-great-grandfather in that slot really was. The genealogist said that it could either have been a passing soldier who didn’t stick around (common at the time) or that the husband wasn’t Catholic and the family apparently was, so his info wasn’t recorded in the church records and/or the marriage (if it happened) was never recognized.

Of course, there’s a possibility that Barbara was actually the father’s name, since there is precedent for it being a man’s name, and it may just have gotten flipped at some point. After all, Marian is still a common German name for boys. But I’m not counting on that.

So the Bastian line I know of goes: Johannes Georg and Ursula Rieger begat Johannes Lorenz Bastian; he and Catharina Melchior begat Johannes Georg Bastian; he and Anna Barbara Riger begat Matthias Bastian; he and Dorothea Bittman begat Barbara Bastian; she and some dude begat Titus Bastian; he and Catharina Seiser begat Gustav Bastian; he and Mary Fearl begat Theodore James Bastian; and he and Neva Belle Jones begat my father, who knocked up my mother and begat me.

That’s ten generations, but the last six of them aren’t really Bastians at all.

If any of those surnames sound familiar and you have family in or ancestors from Gaggenau-Michelbach in Baden, Germany, by all means say hello in the comments — we probably are related. That was another thing the genealogist told me — that there were only about nine families in the village, which was isolated, so yes, there were a lot of cousins getting married.

And before you roll your eyes over incest, cousins marrying was the norm throughout most of human history, because those were often the only people one knew who were distant enough genetically to safely marry but close enough in distance to actually meet. Also, marriages between second cousins and beyond were much more common than between first cousins.

I am fortunate, though, in that the German obsession with detail and the Catholic penchant for keeping meticulous records combined to preserve this history so that a researcher could find it centuries later.

I’m less fortunate on my mom’s side of the family, which is all Irish, because we have the same genealogical problem that a lot of European Jews do: an attempted genocide intervened to wipe out most of the records.

In my case, it happened over a century earlier, and in a much more passive-aggressive way as England basically did nothing about a potato blight that created a potato famine that decimated the population. So… not an active genocide, I… guess…?

But they also went in and stamped out Irish culture, forcing everyone to speak English, almost killing off Gaelic, and paying no regard to any records.

So… while I can trace that one line through my father back ten generations (and another line on his side that lucks out and hits England back thirty or so), on my mother’s side, the farthest I can get back is… four generations through every branch. It all stops in the mid-19th century, which is also about the same time that most of them arrive in the U.S.

In fact, up each branch, the trail ends with no information on the parents of each one who was the first immigrant to come here. The pattern is “Born in Ireland, died in America, parents unknown.”

It’s kind of ironic, then, that I know more about my English and Welsh ancestry through just one of my father’s 7th great grandparents than I do through my mother, especially considering that genetically I am 50% Irish.

Oh, by the way, not accounting for pedigree collapse, a person has 512 7th great grandparents. That makes sense: 7th great grandparents are nine generations back (don’t forget the parents and grandparents that come before all the greats), and since the number of ancestor slots doubles with each generation, that’s two to the ninth power.

And, to put the degree of DNA in perspective — 50% came from my mom directly and, while the percentage that came from my dad is the same, the bit that came from that one ancestor of his is, on average, only about 0.1%, since he sits ten generations back from me.

Or, in other words, out of the roughly 30,000 genes in my genome, only about 29 came from that ancestor, on average — mixed and matched with the genes that came from every other person swimming in the gene pool that eventually became me at that point in the timeline.
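The ancestor arithmetic above can be sketched in a few lines of Python. This is a simple halving model that assumes no pedigree collapse, and the ~30,000 gene count is just the rough figure I’ve been using:

```python
# Simple ancestor math: each generation back doubles the number of
# ancestor slots and (on average) halves the expected DNA share.

def ancestor_slots(generations_back: int) -> int:
    """Number of ancestor slots n generations back (no pedigree collapse)."""
    return 2 ** generations_back

def expected_share(generations_back: int) -> float:
    """Average fraction of your DNA from one ancestor that far back."""
    return 1 / 2 ** generations_back

# 7th great grandparents are 9 generations back (parents = generation 1).
print(ancestor_slots(9))               # 512 slots
print(f"{expected_share(9):.3%}")      # ~0.195% each, on average

# A parent's 7th great grandparent is 10 generations back from you:
print(round(30_000 * expected_share(10)))  # ~29 of ~30,000 genes, on average
```

Averages, of course — actual inherited chunks vary, and real family trees collapse on themselves, as mine clearly did.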

In case you’re wondering, it wouldn’t take anything nearly as big as a swimming pool. In fact, a one liter bottle would hold all of the quarter gram of human eggs and approximately 800 ccs of semen contributed by all of those 7th great grandparents, with room to spare.

But you’re going to need a two liter if you want to go to the next generation, and a gallon jug to hold the ingredients for the one after that. At that point, just forget it, because you’re just going to be exponentially adding gallon jugs from that point on.

Ah. Isn’t genealogy wonderful?

Image by Calips, used unaltered via (CC) BY-SA 3.0.

“War is not healthy for children and other living things.” Except…

The title of this article comes from an incredibly iconic poster that was created during the Vietnam War in the 1960s. Specifically, it was created by printmaker Lorraine Schneider in 1967, and was inspired by her concern that her oldest son would be drafted and die in a war that many Americans considered unnecessary.

However, the Vietnam War is a strange exception and beginning point for a tidal change in American wars. Post-Vietnam, the only benefits wars seem to have given us are more efficient (although not cheaper) ways to kill people, and that sucks. (Incidentally, the Korean War is technically not a war. It also technically never ended.)

But… as weird as it may sound, a lot of the major wars prior to Vietnam actually gave American society weird and unexpected benefits. Yeah, all of that death and killing and violence were terrible, but like dandelions breaking through urban sidewalks to bloom and thrive, sometimes, good stuff does come in the aftermath of nasty wars. Here are five examples.

The American Revolution, 1775-1783

The Benefit: The First Amendment (and the rest of the Constitution)

By the beginning of the 18th century, Europe was having big problems because monarchs and the Church were all tied up together, the state dictated religion, and so on. It came to an extreme with Britain’s Act of Settlement in 1701, which barred any Catholic from ever taking the throne. The end result of this was that the next in line turned out to be the future George I, son of Sophia. Sophia, however, was Electress of Hanover or, in other words, German. Queen Victoria was a direct descendant of George I, and spoke both English and German. In fact, her husband, Prince Albert, was German.

But the net result of all the tsuris over the whole Catholic vs. Protestant thing in Europe, on top of suppression of the press by governments, led to the Founders making sure to enshrine freedom of speech and the wall between church and state in the very first Amendment to the Constitution, before anything else. To be fair, though, England did start to push for freedom of the press and an end to censorship in the 17th century, so that’s probably where the Founders got that idea. But the British monarch was (and still is) the head of the Church of England, so the score is one up, one down.

The War of 1812, 1812-1815

The Benefit: Permanent alliance between the U.S. and Britain

This was basically the sequel to the American Revolution, and came about because of continued tensions between the two nations. Britain had a habit of capturing American sailors and forcing them into military duty against the French, for example, via what were vernacularly called “press gangs.” They also supported Native Americans in their war against the fairly new country that had been created by invading their land. So again, one up, one down. And the second one, which is the down vote to America, is rather ironic, considering that the Brits were basically now helping out the people whose land had been stolen by… the first English settlers to get there.

And, honestly, if we’re really keeping score, the U.S. has two extra dings against it in this one: We started it by declaring war — even if there were legitimate provocations from Britain — and then we invaded Canada.

But then a funny thing happened. The U.S. won the war. By all rights it shouldn’t have. It was a new country. It really didn’t have the military to do it. It was going up against the dominant world power of the time, and one that would soon become an empire to boot.

The war technically ended with the Treaty of Ghent in 1814, but there was still the Battle of New Orleans to come after that, and it happened because news of the end of the war hadn’t gotten there yet. In that one, the U.S. kicked Britain’s ass so hard that they then basically said, “Remember all the concessions we made in that treaty? Yeah, not. LOL.”

In a lot of ways, the war was really a draw, but it did get the British to remove any military presence from the parts of North America that were not Canada, and opened the door to American expansionism across the continent. It also helped to establish the boundary between the U.S. and Canada, which is to this day the world’s longest undefended border. Finally, it cemented the relationship of the U.S. and Britain as allies and BFFs, which definitely came in handy in the 20th century during a couple of little European dust-ups that I’ll be getting to shortly.

The American Civil War, 1861-1865

The Benefit: Mass-manufactured bar soap

Now in comparison to the first two, this one may seem trivial and silly, but it actually does have ramifications that go far beyond the original product itself. And it doesn’t matter whether you’re a fan of bar soap now or go for the liquid kind (my preference), because both were really born out of the same need and process.

Once upon a time, soap-making was one of the many onerous tasks that the women of the house were expected to do, along with cleaning, cooking, sewing, canning, laundry, ironing, taking care of the menfolk (husbands and sons, or fathers and brothers), and generally being the literal embodiment of the term “drudge.” But soap-making was so arduous a task in terms of difficulty and general nastiness that it was something generally done only once or twice a year, basically making enough to last six or twelve months.

To make soap involved combining rendered fat and lye. (Remember Fight Club?) The fat came easy, since people at the time slaughtered their own animals for food, so they just ripped it off of the cow or pig or whatever meat they’d eaten. The lye came from leaching water through ashes from a fire made from hardwood, believe it or not, and since wood was pretty much all they had to make fires for cooking, ashes were abundant. Yes, I know, it’s really counter-intuitive that something so caustic could be made that way, but there you go. The secret is in the potassium content of the wood. Fun fact: the terms hard- and softwood have nothing to do with the actual wood itself, but rather with how the trees reproduce. (And I’ll let your brain make the joke so I don’t have to.)

So soap was a household necessity, but difficult to make. Now, while William Procter and James Gamble started to manufacture soap in 1838, it was still a luxury product at the time. It wasn’t until a lot of men went to war in 1861 that women had to run homesteads and farms on top of all of their other duties, and so suddenly manufactured soap started to come into demand. Especially helpful was Procter and Gamble providing soap to the Union Army, so that soldiers got used to it and wanted it once they came home.

Obviously, easier access to soap helped with hygiene but, more importantly, the industry advertised like hell, and from about the 1850s onward, selling soap was big business. There’s a reason that we call certain TV shows “soap operas,” after all, and that’s because those were the companies that sponsored the shows.

World War I, 1914-1918 (U.S. involvement, 1917-1918)

The Benefit: Women’s suffrage and the right to vote

It’s probably common knowledge — or maybe not — that two big things that happened because of World War I were an abundance of prosthetic limbs and advances in reconstructive and plastic surgery. However, neither of these were really invented because of this conflict, which “only” led to improved surgical techniques or better replacement limbs.

The real advance is sort of an echo of the rise of soap via the Civil War, in the sense that this conflict freed women from one nasty restriction: having no say in government. And, as usually happens when the boys march off to do something stupid, the women have to take up the reins at home, and sometimes this gets noticed. It certainly did in the case of WW I, and suffragettes wisely exploited the connection between women and the homefront war effort. Less than two years after the conflict officially ended, women were given the right to vote on August 26, 1920, with the passage of the 19th Amendment.

Hey! Only 144 years too late. Woohoo!

World War II, 1939-1945 (U.S. involvement, 1941-1945)

The Benefit: The rise of the American middle class

As World War II was starting to move to an end, the Servicemen’s Readjustment Act of 1944 was passed into law. It was designed to assist returning service members via things like creating the VA hospital system, providing subsidized mortgages, assisting with educational expenses, and providing unemployment benefits. It was also a direct reaction to the less-than-fantastic reception returning veterans of World War I had received.

In fact, one of FDR’s goals in creating what is commonly known as the G.I. Bill was to expand the middle class, and it succeeded. Suddenly, home ownership was within reach of people who hadn’t been able to obtain it before and, as a result, new housing construction exploded and, with it, the emergence of suburbs all across the country. With education, these veterans found better jobs and higher incomes, and that money went right back into the economy to buy things like cars, TVs, and all the other accoutrements of suburban living. They also started having children — it’s not called the Baby Boom for nothing — and those children benefited with higher education themselves. The rates of people getting at least a Bachelor’s Degree began a steady climb in the 1960s, right when this generation was starting to graduate high school. At the same time, the percentage of people who hadn’t even graduated from high school plunged.

The top marginal tax rates of all time in the U.S. happened in 1944 and 1945, when they were at 94%. They remained high — at least 91% — throughout the 1950s. Oddly, despite the peak rate in the 1940s being higher, the median and average top rates were higher in the 1950s — about 86% for both in the ’40s versus 91% for both in the ’50s. The economy was booming, and in addition to paying for the war, those taxes provided a lot of things for U.S. citizens.

Even as his own party wanted to dismantle a lot of FDR’s New Deal policies, President Eisenhower forged ahead with what he called “Modern Republicanism.” He signed legislation and started programs that did things like provide government assistance to people who were unemployed, whether simply for lack of work or due to age or illness. Other programs raised the minimum wage, increased the scope of Social Security, and founded the Department of Health, Education and Welfare. In a lot of ways, it was like the G.I. Bill had been extended to everyone.

Across the multiverse

It can be daunting, sometimes, to think about the precarious pathways that led to each of our lives, and then led to the lives we have led. In my case, answering a want ad in Variety two years out of college led to an office job that changed everything — not because of the job, but because of the people I met, and connections that led directly to me pursuing a career as a playwright with some success and also to working in television and eventually doing improv.

But I never would have wound up there if my parents hadn’t met and married, and that only happened because my mother had one bad first marriage that led to her moving across the country and winding up working as a waitress in a restaurant across from the office where my father, who was also ending his bad first marriage, worked. He wound up there because he had taken advantage of the G.I. Bill to study architecture and so was a structural engineer for one of the more prestigious firms in Los Angeles. In another case of amazing coincidence, I wound up working about a block from where his office and her restaurant had been when I went into the TV biz twenty-ish years after he worked there.

So my father wound up doing the G.I. Bill thing because he was a veteran and that happened because there had been a war. But he was only in America to fight on our side because his grandfather had come here in the first place, and my father’s own father and mother wound up in California. That happened because my grandfather worked for the railroads. I also think it was because my grandmother got knocked up with my dad’s older brother at about eighteen and before they married, but that’s beside the point. Or maybe not.

If my mother had stayed where she’d been born, she never would have met my father. If my great-grandfather had never left Germany, then one of my ancestors might have died on the wrong side of WW II. And if that had happened and my mother came to Los Angeles anyway, there’s no telling whom she might have met and married. It could have been a big power player in Hollywood. It could have been a dishwasher in the restaurant. The unanswered question, really, is whether who I am came only from her egg or from dad’s sperm, or whether I would never have existed had the two never met. Impossible to say.

What’s really fascinating are the long-term effects of random choices. I do improv now because of one particular actor I met about six years ago. I met him because he was involved with a play of mine that was produced in 2014. That play happened because an actor who had done a reading of it when I first wrote it, twenty years previously, remembered it when he was at a point to play the lead and bring it to a company. That reading happened because it was set up by a woman who produced my second full-length play — and who is still one of my best friends — and that happened because of all the attention received by my first produced full-length play, which happened because of a woman I met at that first office job out of college I mentioned before. She was in a writing group, heard I was interested in being a writer and invited me to join. Ta-da… a link in a damn long chain of consequence happened.

And that third play, about William S. Burroughs, only happened because I somehow heard about his works when I was probably in middle school, and only because the title “Naked Lunch” made a bunch of twelve-year-olds giggle. But reading that book when I was about fourteen, and realizing it was about so much more, and then discovering the rest of his works along with Vonnegut and Joyce and Robert Anton Wilson and so many others set my sails for being a writer, and out of all of them, Burroughs had the most fascinating life story, as well as the personal struggle I most related to, since he was a gay man, after all.

And, I suppose, I can attribute my interest in the salacious and interesting to the fact that my mother had such an aversion to them. She could watch people on cable TV get their heads blown off for days, but show one tit or one ass — or god forbid a dick — and she would lose it. It was good-old Catholic body shame, and I never understood it, mainly since I’ve been a naturist since, like, forever. Of course, the extent of my exposure to that church was to be baptized as a preemie “just in case,” and then not a lot else beyond the scary crucifix that always hung in my bedroom and the scarier icons and statues I’d see when we visited my mom’s mom.

Ironically, I’ve actually come to relate to Catholicism, although not so much as a religion, but more as a cultural touchstone and anchor for my Irish roots. Yeah, we bog-cutters love the ceremony, but piss on the bullshit, so that’s probably why it works. Give me the theater, spare me the crap. Sing all you want, you middle-aged men in dresses, but touch the kids, and we will end you.

But I do digress… because if we’re going to go down the Irish rabbit hole, that is an entirely different path by which I could have not wound up here today. At any point, one of my direct ancestors on my mother’s side could have taken vows, and then boom. No more descendants to lead to me.

Or any of my grandparents or parents or I could have walked in front of a speeding bus before their descendants were born or before I had my first play produced, and game over. History changed. I could have signed up with a temp agency on a different day and never wound up having met my best friend.

Then again… I have no idea who I would be if any of these different paths had been taken at any point in history all the way back to the beginning. It’s really daunting to consider how many ancestors actually had to come together to lead to the genetic knot that is you or me. But you and I exist as who we are. Rather than worry about how easily that could not have happened, I suppose, the better approach is to just revel in the miracle that it did. Here we are. It happened because other things happened. And thinking too hard about why those other things happened might actually be a bad thing to do.


If you could go back in time to your younger self — say right out of high school or college — what one bit of advice would you give? I think, in my case, it would be this: “Dude, you only think you’re an introvert, but you’re really not. You just need to learn now what it took me years to understand. No one else is really judging you because they’re too busy worrying about how they come off.”

But that worry about what other people thought turned me into a shy introvert for way too long a time. At parties, I wouldn’t talk to strangers. I’d hang in the corners and observe, or hope that I knew one or two people there already, so I’d stick to them like your insurance agent’s calendar magnet on your fridge. Sneak in late, leave early, not really have any fun.

It certainly didn’t help on dates, especially of the first kind. “Hi, (your name). How’s it going?” Talk talk talk, question to me… awkward silence, stare at menu, or plate if order already placed.

Now this is not to imply that I had any problem going straight to close encounters of the third kind way too often, but those only happened when someone else hit on me first. Also, I had a really bad habit of not being able to say “No” when someone did show interest. I guess I should have noticed the contradiction: Can someone really be an introvert and a slut at the same time?

What I also didn’t notice was that the times I was a total extrovert all happened via art. When I wrote or acted, all the inhibitions went away. Why? Because I was plausibly not being myself. The characters I created or the characters I played were other people. They were insulation. They gave me permission to just go out there without excuse. (Okay, the same thing happened during sex, but at that point, I don’t think introversion is even possible.)

However… the characters did not cross over into my real life. I was awkward with strangers. I was okay with friends, but only after ample time to get to know them.

And so it went until I wound up in the hospital, almost died, came out the other side alive — and then a funny thing happened. I suddenly started initiating conversations with strangers. And enjoying them. And realized that I could play myself as a character in real life and have a lot of fun doing it. And started to not really care what anyone else thought about me because I was more interested in just connecting with people and having fun.

The most important realization, though, was that I had been lying to myself about what I was for years. The “being an introvert” shtick was just an excuse. What I’d never really admitted was that I was extroverted as hell. The “almost dying” part gave the big nudge, but the “doing improv” part sealed it. Here’s the thing. Our lives, day to day and moment to moment, are performance. Most muggles never realize that. So they get stage fright, don’t know what to do or say or how to react.

But, honestly, every conversation you’ll ever have with someone else is just something you both make up on the spot, which is what improv is. The only difference is that with improv you’re making up the who, what (or want) and where, whereas in real life, you’re playing it live, so those things are already there.

Ooh, what’s that? Real life is easier than performing on stage?

One other thing that yanked me out of my “I’m an introvert” mindset, though, was an indirect result of doing improv. I’ve been working box office for ComedySportz for almost a year now — long story on how and why that happened — but I’m basically the first public face that patrons see, I’ve gotten to know a lot of our regulars, and I honestly enjoy interacting with the public, whether via walk-ups to the ticket counter or phone calls. Young me would have absolutely hated doing this, which is another reason for my intended message to that callow twat.

And so… if you’re reading this and think that you’re an introvert, do me a favor. Find something that drags you out of your comfort zone. Remind yourself that no one else is really judging you because they’re too busy worrying about themselves, then smile and tell way too much to the wait-staff or checker or usher or whomever — and then don’t give a squishy nickel over what they might think about it.

(Note: “squishy nickel” was a fifth level choice on the improv game of “New Choice” in my head just now. Which is how we do…)

Four expressions that are older than you think

One of the things I do when I edit and fact-check other people’s books and scripts is to check for anachronisms, which are things that are out of their proper time. For example, let’s say that a major plot element in a thriller is a stolen thumb drive with the names of every undercover agent on it. That’s a great MacGuffin… unless you set your script before 2000, when USB thumb drives were not commercially available. (At a stretch, I’d give you 1999, since we’d be dealing with governmental agencies and all that.)

A very common one, so common that it’s one of my first searches on period pieces, is use of the term “Ms.” Well, not all period pieces, since any story set before 2009 is now considered a period piece, but definitely those set before about 1972, which is when the term became part of the mainstream vernacular. Oddly enough, though, it was first proposed as a neutral alternative to Mrs. and Miss as early as 1901, and it was used as a written abbreviation of “mistress” as far back as the 17th century. Keep in mind, though, that this usage had nothing to do with treating women as equals and everything to do with male scribes figuring out how to spare themselves writing six letters by hand every time they made a record concerning a single woman.

But this brings up an interesting point. Technically, yes, the term “Ms.” is a lot older than you’d think. On the other hand, its usage in its modern sense pretty much began as noted above, in the early 1970s. There are other expressions, though, that really are a lot older than you think, so in the spirit of my story about inventions that are older than you think, here we go.


Robot

We haven’t quite perfected the fully autonomous humanoid robot, although Honda’s ASIMO has come close. Keep in mind, though, that they’ve been working on it for over thirty years now. And, surprisingly, while there’s a certain resemblance to the name of a famous science fiction author, the name ASIMO really refers to “Advanced Step in Innovative Mobility.”

The author in question, Isaac Asimov, is famous for writing a lot of both science fiction and science fact, but one of his series, I, Robot, is famous for establishing the Laws of Robotics. However, while they’ve always been popular with science fiction fans, they really didn’t explode onto the scene until a kind of lame 2004 film adaptation, although if you’ve ever owned a Roomba, Scooba, Braava, or Mirra, then you’ve done business with iRobot. But either of these would probably make you think that robots are a fairly recent invention.

Of course, if you owned any kind of modem between the 1970s and mid-1990s, it might have come from the company US Robotics. Guess where they got their name… That’s right. Also Asimov.

But if you’re only a film fan and not a tech or science fiction nerd, you might think that robots were created in the 1950s, with the appearance of Robby the Robot in the film Forbidden Planet. Never mind that, at least in literature, Asimov got to robots by 1940, because even that’s not early enough.

The actual origin of the word “robot” is in a 1920 play by Karel Čapek called R.U.R., or Rossum’s Universal Robots. He adapted that word from an old Church Slavonic term rabota, which meant slave or serf. And if you’d like to, you can listen to a reading of the play itself.

To do someone

If someone were to say to you, “Hey, do me,” you’d probably take it in a sexual sense, right? And that also seems like a really modern usage of the phrase. Just thinking back through pop culture, I have it in my head that Austin Powers said something like, “Oh, do me, baby” (he didn’t), so the slang must have begun with the Beatles in 1968 on the White Album, with the song “Why Don’t We Do It in the Road.”

I really couldn’t find any clear sources for “do it” or “to do” in a sexual sense earlier than 1968, but I did find one from 1588, in Shakespeare’s Titus Andronicus, which reads as follows:

     Villain, what hast thou done?

     That which thou canst not undo.

     Thou hast undone our mother.

     Villain, I have done thy mother.

If you doubt this reading, then just take a look at this scene from Julie Taymor’s brilliant adaptation, and you’ll see that it’s exactly how Willie Shakes intended it to read.


Motherfucker

You might think that this one was invented by Samuel L. Jackson, who uses it so eloquently, or that maybe it was a product of the 1960s. While the movie M*A*S*H was infamously the first major motion picture to use the back half of the word, it was Myra Breckinridge that turned things on their head by using the word in full, but bleeping “mother” instead of “fucker.”

Prior to the 1960s, this term is alleged to have been used by slaves in America before the Civil War to describe owners who would rape the slaves’ mothers as a psychological breaking tactic, but this probably isn’t true. The earliest attestations come from a court case in 1889, so its origin probably dates back a bit earlier than that, although in the case documents it’s an adjective, motherfucking, instead of the noun, motherfucker. The noun form didn’t pop up until 1917, when a black soldier referred to the draft board as “low-down motherfuckers.”

Seeing pink elephants

This is an old expression indicating either that someone was habitually drunk or that they were an alcoholic experiencing the DTs due to lack of booze. Nowadays, the expression has mostly fallen out of use with the understanding that alcoholism is a disease, and nothing to joke about, although it’s still a part of pop culture because of Disney’s original 1941 version of Dumbo. But that film isn’t the origin of the expression or the idea. And while it is frequently attributed to Jack London in his 1913 novel John Barleycorn, it actually goes back a bit farther than that, to sometime between 1883 and 1903, at least a decade before that book came out. It had a lot to do with the disappointment of audiences who were expecting to see a rare white elephant — white because of its albinism — when the beasts actually turned out to be closer to pink. In case you haven’t seen it, the scene in Dumbo is an incredible bit of animated surrealism called “Pink Elephants on Parade” — and I swear that the animators hid one of those infamous Disney toon penises at about the 2:40 mark. Watch the elephant’s trunk.

What’s your favorite slang expression that’s a lot older than people think?

Going back up the family tree

I became fascinated with genealogy years ago, and used to spend many a Wednesday evening in the Family History Center next to the Mormon Temple near Century City in Los Angeles. Say what you want about them as a religion, but their work in preserving family history has been invaluable and amazing, even if it did originally start out for the most racist of reasons wrapped in a cloak of theological justification. Fortunately, the nasty justifications have long since been removed, and if it takes believing that all family members throughout time are forever bound together in order for the Mormons to keep on doing what they do in this area, then so be it.

It had been a while since I’d actively done any research, largely because I no longer had time for it, but back in the day, I did manage to follow one branch, the ancestors of my father’s father’s mother’s mother, also known as my great-great grandmother, to find that at some point this line had been traced back to the magic date of 1500.

Why is that date magic? Well, if you do genealogy, you know. If you manage to trace all of your own family lines back that far, you can turn your research over to the LDS, and they will do the rest for you. Keep in mind, though, that it isn’t easy to get all of your branches back to 1500, and certain ancestries naturally create blocks to progress. For example, if you’re descended from Holocaust survivors, you’re probably SOL for any time during or prior to WW II. Likewise if you’re descended from slaves, or your ancestors immigrated from Ireland, you’re not going to find many records more than a few generations back.

This is, of course, because paper records can easily be lost. For example, almost all of the records from the U.S. Census of 1890 were destroyed by a fire in 1921. During the period from June 1, 1880 to June 2, 1890 — the span between the two censuses — around 5.2 million people legally immigrated into the country. At the same time, the population grew from just over fifty million to just under sixty-three million. Or, in other words, the major and official historical record of just over eleven million people newly arrived in the country, through birth or immigration, was destroyed forever, with no backup.

Fortunately, over the last decade or so, science has developed a way of researching genealogy that cannot be destroyed because every single one of us carries it within us, and that’s called DNA, which can now be tested to match family members. On the upside, it can reveal a lot about your ancestry. Oh, sure, it can’t reveal names and dates and all that on its own, but it can tell you which general populations you’re descended from. Of course, this can be a double-edged sword. At its most benign, you might find out that the ancestry you always thought you had is wrong. At its worst, you may learn about family infidelities and other dark secrets.

I haven’t had my DNA tested yet, but my half-brother did, and his girlfriend recently contacted me to reveal that at least one family secret fell out of it, although it doesn’t involve either my brother or me. Instead, it looks like a cousin of ours fathered an illegitimate child in the 1960s and, oddly enough, that woman lives in the same town as my brother’s girlfriend.

Of course, the test also came with a minor existential shock for me, since she gave me the logon and password to look at the data. It turns out that my half-brother’s ancestry is 68% British Isles and 15% each from Scandinavia and Iberia. Now, since we have different mothers, the latter two may have come from her side, but the surprising part was that there is nary a sign of French or German, although our common great-grandfather, an Alsatian, is documented to have emigrated from the part of Germany that regularly gets bounced back and forth with France, and the family name is totally German. I even have records from a professional genealogist and historian who happened to find the small village my great-grandfather came from, and my brother’s girlfriend tracked down the passenger list that documented his arrival in America from Germany on a boat that sailed from France.

But that wasn’t the troublesome part of the conversation. What was troubling was finding out that one of my cousins, her husband, and two of their kids had all died, most of them young, and I had no idea that they were all gone. This led me to search online for obituaries, only to wind up at the Mormon-run online genealogy website and decide to create an account. Once I did, I searched to connect my name to my father’s, and… boom.

See, the last time I’d done any family research, which was at least a decade ago, I’d only managed to creep up one line into ancient history, as in found an ancestor that the Mormons had decided to research. This was the line that told me I was descended from Henry II and Eleanor of Aquitaine via an illegitimate child of King John of England. This time, things were different, possibly due to DNA testing, possibly due to better connection of data. Whatever it was, though, wow.

Suddenly, I started out on my father’s father’s father’s side of things and kept clicking up and… damn. After a journey through England and back to Scottish royalty and beyond, I wound up hitting a long chain of Vikings that eventually exploded into probably legendary bullshit, as in a supposed ancestor who is actually mentioned in the opening chapter of Beowulf. That would make my high school English teacher happy, but it’s probably not true.

The one flaw of Mormon genealogy: Their goal is to trace everyone’s ancestry back to Adam, and so shit gets really dubious at some point.

But… if you’re willing to write off everything claimed for you before maybe Charlemagne’s grandmother, then you will find interesting stuff, and the stuff I found after clicking up a few lines was, well… definitely interesting, and maybe reinforced the idea that, despite a German great-great-granddad, my half-bro and I are apparently British as bollocks for one simple reason: Everybody and his uncle invaded Britain over the centuries, including the Romans, the Angles and Saxons, the Vikings and Danes, the Normans, and so on.

And, true enough… up one line, I wind up descended from nothing but Vikings. Up another, from nothing but Vandals and Goths. Several lines tell me I’m descended from a King of Denmark. Along another path, it’s the Franks, house of Charlemagne, except that the Mormons tell me I’m descended from that line long before Karl Magnus himself. Along several other lines, including that King John one, I’m more Welsh than the Doctor Who production company. And there are all the royal houses: Swabia, Burgundy, Thuringia, and so on, as well as several Holy Roman Emperors, and kings of France, the Franks, the Burgundians, and the English, all dancing a pavane in every cell in my body.

So, what does it all mean? On the one hand, it’s nice to be able to flip back through history and look up people from past centuries — bonus points if they made enough of a dent in time to at least have some records to look up, and big ups if they appear in Wikipedia. On the other hand, you only have to go back six generations, to your 64 great-great-great-great-grandparents, to find a point where each of them contributed, on average, less than one whole chromosome to your genetic make-up. About 40 generations back, each ancestor could not have contributed more than a single atom of that DNA to you, and before that, it gets meaningless. (I’ll leave you to do the math, but it’s about 8.5 billion atoms per chromosome, times 46.)
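If you don’t feel like doing that math by hand, here’s a minimal Python sketch of it, assuming the rough figures above (8.5 billion atoms per chromosome, 46 chromosomes) and ignoring pedigree collapse, i.e. the fact that far enough back, the same person fills many ancestor slots:

```python
# Rough figures from the text above (assumptions, not precise biology).
ATOMS_PER_CHROMOSOME = 8.5e9
CHROMOSOMES = 46
GENOME_ATOMS = ATOMS_PER_CHROMOSOME * CHROMOSOMES  # ~3.9e11 atoms total

def ancestor_slots(generations_back: int) -> int:
    """Ancestor slots n generations back, doubling each generation."""
    return 2 ** generations_back

# First generation where ancestor slots outnumber your chromosomes,
# so the average contribution drops below one whole chromosome:
gen_chromosome = next(g for g in range(1, 100)
                      if ancestor_slots(g) > CHROMOSOMES)
print(gen_chromosome)  # prints 6: with 64 slots, 46/64 is under one each

# First generation where slots outnumber the atoms in your DNA:
gen_atom = next(g for g in range(1, 100)
                if ancestor_slots(g) > GENOME_ATOMS)
print(gen_atom)  # prints 39, i.e. "about 40 generations back"
```

Of course, real inheritance recombines in chunks rather than averaging out smoothly, so past that point the figure is only a probability: most of those distant ancestors contribute no DNA to you at all.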

Yet… life and time march on. A lot of our history is oral or traditional or recorded on paper. A lot of it is false, although science is marching us toward a sort of truth. Maybe I’m not as German as I thought, but I won’t know until I test my own DNA, and I may very likely run into the ancestral roadblock on my mother’s side common to people of Irish descent — ironically, because people of English descent were such right bastards a few hundred years ago. That’s one set of ancestors trying to wipe out another.

But if you go back far enough, what you learn about humans is what you learn about air and water. By this point in time, every molecule of air has been through countless lungs and every molecule of water has been through countless plants, animals, and people. All of us now living have literally breathed the same air and drunk and excreted the same water. We have shared precious resources that keep us alive. Likewise, our human DNA has been through each of us, has existed long before any of us, and ultimately came from the same primordial ooze of long ago, and is also essential to our continued existence as a species.

Or, in other words, while it’s fun to do genealogy to try to pin specifics on our ancestors, there’s really only one truth. We are all related to each other. We should all treat each other like family. And this circles back to the Mormons. While they might try to justify their interest in family history based on some sort of theological belief, they’re still on the right track. Yes — all family members are sealed to each other throughout history. The thing is, all humans are family.

That’d be all humans, no exceptions. And that, perhaps, is the most amazing thing about studying genealogy. All roads lead to the idea that borders, nationalities, differences in belief, and separations by geography are complete and total bullshit. There’s another religion that put it succinctly and nicely. They were founded about twenty years after Mormonism, and they’re known as the Bahá’í. Their motto is “One planet, one people, please.”

I think that’s a motto we can all get behind right now. It’s one we need to. Otherwise, we’re not going to leave any people on this planet to carry on our DNA.

Nothing changes until we change it

I missed mentioning the anniversary of California’s admission to the U.S. last week, so here’s a flashback piece with a bit of California history for you.

You’ve probably never heard of Milton Slocum Latham unless you’re a serious California history nerd. I’d never heard of him until today, but I discovered him because I looked up a list of California governors. I did this because the Chief Justice of the California Supreme Court, Tani Cantil-Sakauye, announced that she was giving up her current party affiliation in order to become independent. I was curious as to which governor had put her on the court and who made her Chief Justice.

Note that I don’t really want to discuss partisan politics here. You can look up the particulars yourself. Suffice it to say that Cantil-Sakauye was appointed by a governor of her own party, then made chief justice by a governor from the other party. But what really caught my eye was going down that list of California governors and realizing that there have been a lot of tumultuous changes.

For one thing, a lot of governors served very short terms, and either resigned or were not re-elected or even nominated. This seemed particularly common in the 19th century, which makes sense considering that California came into the union in 1850 as a free state (i.e., slavery was illegal), but seemed to have a lot of Democratic governors around the time of the Civil War. And, if you’re not ignorant of history, you know that, at that time, the Democratic Party was mostly on the pro-slavery side while the Republicans were anti-slavery. This was before the great reversal of sides begun under FDR and completed while LBJ was president.

The first Republican governor of California was Leland Stanford — you might recognize his name from that little university in the northern part of the state. Elected in 1861, he served only one term, at a time when the governor’s term was just two years. The law changed to double that term as soon as he left office, of course, although he did go on to serve as a U.S. Senator for California for eight years, until his death in 1893.

Stanford isn’t the only governor to have namesake places in the state. The city of Downey was named for the seventh governor, John Gately Downey, who, until Arnold Schwarzenegger, was the only governor of the state not born in the U.S. (he was Irish). On the other hand, while it’s been claimed that Haight Street in San Francisco was named for Henry Huntly Haight, the tenth governor, that’s probably not true. This claim was first made in 1989, but the oldest mention of the street’s namesake, from 1916, says that it was probably named for Fletcher Haight, a local lawyer and district judge who, coincidentally perhaps, died the year before the other Haight became governor. And it does make sense. Governors tend to get things bigger than streets named after them.

But let’s get back to Milton Slocum Latham, the sixth governor of California, and the person to hold the singular distinction of having served the shortest term to date in that position: five days, from January 9, 1860 to January 14, 1860. He immediately preceded the aforementioned Governor Downey, by the way.

Now, why was Latham’s term so short? Did a scandal throw him out of office? Was his election invalidated, or did he pull a William Henry Harrison and drop dead? Perhaps he changed his mind and quit? Nope. None of the above, but definite proof that some things in politics never change.

See, just after Latham’s election, one of California’s Senators, David Colbreth Broderick, went and got himself killed in a duel that was most definitely related to the contentious issue of slavery, although Broderick was also apparently quite corrupt, and had made a fortune running San Francisco the same way that Tammany Hall (a thing, not a person) had run New York. All this makes me rather ashamed to admit that ol’ Brod and I have the same birthday. Dammit.

On the other hand, he was part of an attempted offshoot of the Democratic Party at the time, the Free Soil Democrats. They were the ones opposed to slavery expanding into the west. (Note: They were not necessarily anti-slavery. They just didn’t want it moving to other states.) After a little insult battle between Broderick and David S. Terry, a former California Chief Justice no less, the two met to duel. Broderick’s pistol anti-Hamiltoned and threw away its shot by firing as he drew and putting the bullet into the ground. Terry then nailed him in the right lung.

The duel happened six days after the general election that Latham won with 60% of the vote. That election was on September 7, 1859, the duel was September 13, and Broderick died on the 16th. So at least we can say Latham did not run with the intent of taking that senate seat, right?

That didn’t stop him once he was in office and, since this was back in the days when Senators were still appointed by their states instead of elected, Latham did a little wheeling and dealing, and the rest was rather dubious history.

He was not re-elected to a second senate term and died in 1882, in New York, at the age of 54.

But now to the point of this history lesson. There’s really nothing new in politics. Only the names of the people and parties and the methods through which information is exchanged evolve. I’m sure that Broderick’s duel and Latham’s gambit were covered in the newspapers and periodicals at the time, discussed in the private clubs, and propagated by telegraph.

And regardless of the parties involved, I think we can all agree that somebody being elected to one office only to lobby for a sudden vacancy in a higher office after less than a week shows heinous disregard for the people who elected them — especially when that election came with a 60% majority.

Yet we see this sort of thing all the time, as an elected official will suddenly start campaigning for an office higher up, sometimes right after they’ve been sworn in. It seems particularly bad with governors who want to run for senator or president, and senators who want to run for president, but it happens at all levels. I’ve seen city council members start to stump to become the next mayor less than halfway through their first term, mayors campaigning for governor once they’ve moved into city hall, and so on.

Now I have no objection whatsoever to an elected official wanting to work their way up the food chain. That’s how it should be. I just think that we need to make them take some time to do it, which is why I think we need a little adjustment to the law. Well, two adjustments.

First, does anyone else think that it’s insane, in this day and age, that people elected to the U.S. House of Representatives serve only two years? In effect, any Rep ends up spending most of their term campaigning for their next election. The California gubernatorial term doubled from two to four years well over a century ago. We need to update the House of Reps to at least four years as well.

And, for that matter, why does the Senate get six? I can understand the idea of staggering those elections into three classes, like they are now, but why not four year terms for everyone, staggered into two classes, half elected every two years? Although, given recent behavior, it really should be flipped: House term of six years, Senate term of two. Just a thought.

But the real proposal is this one:

  1. No person elected for the first time to any position in the government of the United States or any of its states, counties, cities, or other political jurisdictions, shall seek, campaign for, file for, raise funds for the purpose of, or otherwise pursue, election to a different position within the aforementioned governmental jurisdictions prior to the completion of one (1) complete term to which they have been elected.
  2. Any incumbent elected official in any of the jurisdictions mentioned in §1 shall not seek, campaign for, file for, raise funds for the purpose of, or otherwise pursue, election to any different position within the government unless the term for which they would be newly elected begins on or after the date that their current term would normally expire according to applicable law. This exception does not apply to a first-term official in any capacity.
  3. None of the above restrictions shall apply to an elected official seeking to be re-elected to the same position they already occupy; nor to previously elected officials who are not currently in office for reasons other than impeachment, censure, or conviction of felonies; nor to an elected official who is not eligible to run for the same office again due to term limits.

I think those rules are fair all the way around. If you want the job, at least do it for the full term you contracted for. If you want to apply for another job, make sure it starts after this one ends. If you want to keep your job by reapplying, or go back to work after leaving, or are going to get laid off — then do what you want.

If your only purpose in running for office is to leap-frog your way to the top of the pile as quickly as possible for the sake of power, then we don’t need you in office. Milton Slocum Latham learned that lesson firsthand. There’s also a very local and specific example from Los Angeles, but I won’t mention any names here. The important part is that, as with Latham, the voters figured it out and soon said “No.” But we really need to enshrine that automatic no into the law.

And that’s not really a political position one way or the other, since this really is a case of “both sides do it.” It’s just common sense, and another way to try to restore some sanity to our political system.

A company town

I first posted this back in December 2018, but in light of current events and the effect that COVID-19 has had on the entertainment industry, it’s worth the reminder: a good part of the economy of L.A. is powered by the entertainment industry, which has been shut down — and attempts to start it up again have not always been successful. It’s also an appropriate reminder for Labor Day.

Despite its size, Los Angeles is a company town, and that company is entertainment — film, television, and music, and to a lesser extent gaming and internet. So, growing up here, seeing film crews and running into celebrities all over the place was always quite normal. Hell, I went to school with the kids of pretty big celebrities and never thought much of it. “Your dad is who? Whatever.”

But here’s one thing I don’t think a lot of non-locals understand: None of the major studios are actually in Hollywood. How the city of Hollywood — which is where I was actually born — became conflated with the movies is a very interesting story. Once upon a time, there were some studios there. Charlie Chaplin built his at La Brea and Sunset in 1917. It was later owned by Herb Alpert, when it was A&M Studios and produced music. Currently, it’s the location of the Jim Henson Company. The Hollywood Hills were also a popular location for celebrities to live, and a lot of the old apartment buildings in the city were originally designed for young singles who worked in the industry.

Come to think of it, they still serve that purpose, although given the cost of rent in this town, a lot of those studio units are cramming in two tenants.

The one thing that Hollywood did have in abundance: movie premieres, and that’s still the case to this day. The Chinese, the Egyptian, and the El Capitan are perennial landmarks, and the Boulevard itself is quite often still closed down on Wednesdays for red carpet openings. Although Broadway downtown also boasts its own movie palaces from the golden age of cinema, it was always Hollywood Boulevard that had the great grand openings. It’s also still home to the Pantages, which is the biggest live theater venue outside of downtown, although they generally only do gigantic Broadway-style musicals. (Side note on the Chinese Theater — although it’s technically called the TCL Chinese because of its current owners, nobody refers to it that way, and you’re still more likely to hear it called what it always was: Grauman’s Chinese Theater. Want to sound like a local? That’s how you do it. You’re welcome.)

There is one Hollywood tradition that does not date from the golden age of cinema, though, and it might surprise you. The Hollywood Walk of Fame wasn’t proposed until the 1950s, and construction on it didn’t begin until 1960 — long after all of the movie studios had left the area.

In case you’re wondering where those studios went, a number of them are in the oft-derided Valley: Universal in Studio City (they like to call themselves “Hollywood” but they’re not), Warner Bros. in Burbank, Disney in Burbank and Glendale, and Dreamworks Animation SKG in Glendale (across from Disney Animation!) all come to mind — and damn, I’ve worked for three out of four of them. On the other side of the hill, in L.A. proper, Sony is in Culver City, 20th Century Fox is in Century City (which was named for the studio), and Paramount is in L.A. proper, right next to RKO, which really isn’t doing much lately, both due south of Hollywood and right behind the Hollywood Forever Cemetery — which isn’t in Hollywood either, but which has a large number of dead celebrities. I think that covers most of the majors. YouTube Studios is in Playa del Rey, on the former site of the Hughes helicopter factory that also happens to be right below the university I went to for film school, Loyola Marymount.

Like I said, company town.

The other fun part about growing up here is all of the film locations that I see every day, and there are tons. Ever see Boogie Nights? Well, most of that film was basically shot within a five-mile radius of where I grew up, with only a few exceptions. Dirk Diggler’s fancy new house once he became a porn star? Yeah, my old hood. Location of the club where Burt Reynolds’s character finds Mark Wahlberg’s character? I took music lessons a few blocks away from there. Parking lot where Dirk is mistakenly gay-bashed? Pretty close to the public library where I fell in love with reading.

Remember The Brady Bunch or the movies? Well, that house is only a couple of miles away from where I live now. The OG bat cave? Let me take you to Griffith Park. If you’ve ever seen Myra Breckenridge (you should if you haven’t) the place where Myra dances in the opening is right next to where Jimmy Kimmel does his show now and two doors down from the now Disney-owned El Capitan.

The Loved One (an amazing movie) — Forest Lawn Glendale, where I happen to have at least four ancestors buried. Xanadu? The major setting was the Pan Pacific Auditorium, which was a burned-down wreck in my day, but it’s where my dad used to go on date night to roller skate. Go to the Vista Theatre? It sits on the site where D.W. Griffith built one of his biggest sets for Intolerance, his “mea culpa” for making The Birth of a Nation.

I’m not even going to get into how many times the complex I live in has been used for various epic TV shoots (which is a lot) or, likewise, how the area in NoHo I work in is used by everybody, from YouTubers to major studios. Although, I can tell you that having to put up with film crews and their needs is always a major pain in the ass, especially when the parking vanishes. That’s right — there’s really no glamor in show biz outside of that red carpet.

But I guess that’s the price of admission for growing up and living in a company town and, honestly, I’ve never had a single adult job that wasn’t related to that company. (We won’t count my high school jobs as wire-puller for an electrical contractor and pizza delivery drone.)

Otherwise, though — yep. Whether it’s been TV, film, theater, or publishing, I’ve never not worked in this crazy stupid industry that my home town is host to. And I really wouldn’t have it any other way. What? Wait tables? Never. Although sharing my home town with tourists is a distinct possibility. I love this place. A lot. And you should too, whether you’re a visitor or a transplant. Welcome!