Power up

You could say that May 16 can be an electrifying day in history. Or at least a very energetic one. On this day in 1888, Nikola Tesla described what equipment would be needed to transmit alternating current over long distances. Remember, at this time, he was engaged in the “War of the Currents” with that douche, Edison, who was a backer of DC. The problem with DC (the kind of energy you get out of batteries) is that, at the low voltages Edison used, so much power burns off in the wires that you need a generating station every mile or so. With Tesla’s version, you can use transformers to step the voltage way up and send that power a long way down the wires before it needs any bump up in energy.
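To put some rough numbers on that (my own back-of-the-envelope sketch, not anything from Tesla’s lecture): line loss goes as the square of the current, so delivering the same power at a hundred times the voltage cuts the loss by a factor of ten thousand. The resistance figure here is purely an assumption for illustration.

```python
# Why high voltage wins: for the same delivered power P, the current is
# I = P / V, and the power lost to wire resistance R is P_loss = I^2 * R.

power = 100_000.0   # watts we want to deliver
resistance = 5.0    # ohms of total line resistance (assumed, for illustration)

for volts in (120.0, 12_000.0):
    current = power / volts
    loss = current ** 2 * resistance
    print(f"{volts:8.0f} V -> {current:7.1f} A, {loss:12.1f} W lost in the line")

# At 120 V the line eats roughly 3.5 MW (more than we are sending);
# at 12,000 V, only about 350 W.
```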

Of course, it might help to understand in the first place what electric charge is. Here’s Nick Lucid from Science Asylum to explain:

But if you think that electric current flows through a wire like water flows through a pipe, you’re wrong, and the difference between the two is big and interesting, as is the difference between AC and DC. DC, meaning “direct current,” only “flows” in one direction, from higher to lower potential. This is why it drains your batteries, actually — all of the energy potential contained therein sails along its merry way, powers your device, and then dumps off in the lower energy part of the battery, where it isn’t inclined to move again.

A simplification, to be sure, but the point is that a direct current from a battery spends down a fixed store of energy as it flows. Although here’s the funny thing about it, which Nick explains in this next video: neither current moves through that wire like it would in a pipe.

Although the energy in direct current moves from point A to point B at nearly the speed of light, the actual electrons wrapped up in the electromagnetic field do not, and their progress is actually rather slow. If you think about it for a minute, this makes sense. Since your battery is drained when all of the negatively charged electrons move down to their low energy state, if they all moved at the speed of light, your battery would drain in nanoseconds. Rather, it’s the field that moves, while the electrons take their own sweet time drifting down the crowded center of the wire — although move they do. It just takes them a lot of time because they’re bouncing around chaotically.
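How slow? Here’s a rough back-of-the-envelope (my own sketch, using a textbook free-electron density for copper and a wire size I picked for illustration), based on the standard drift-velocity formula v = I / (nqA):

```python
import math

# Electron drift speed: v = I / (n * q * A). Rough numbers, not a measurement.
current = 10.0       # amps through the wire
n = 8.5e28           # free electrons per cubic meter in copper (textbook value)
q = 1.602e-19        # electron charge in coulombs
diameter = 1.63e-3   # wire diameter in meters (roughly 14-gauge household wire)

area = math.pi * (diameter / 2) ** 2
v_drift = current / (n * q * area)
print(f"Drift speed: {v_drift * 1000:.3f} mm/s")  # about a third of a millimeter per second
```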

As for alternating current, since its thing is to let the field oscillate back and forth between source and destination, it doesn’t spend down a store of charge the way a battery does, but it also keeps its electrons on edge, literally: they tend to crowd along the outer edges of the wire (the famous “skin effect”). However, since they’re just as likely to be on any edge around those 360 degrees, they have an equally slow trip. Even more so, what’s really guiding them isn’t so much their own momentum forward as it is the combination of electricity and magnetism. In AC, it’s a dance between the electric field in the wire and the magnetic field around it, which is exactly why the current winds up behaving like a wave between points A and B rather than a river of particles.

I think you’re ready for part three:

By the way, as mentioned in that last video, Ben Franklin blew it when he defined positive and negative, and science blew it again by never changing the nomenclature, so the particle that actually carries the charge, the electron, is “negative,” while we think of energy as flowing from the positive terminal of batteries.

It doesn’t. It flows backwards into the “positive” terminals, but that’s never going to get fixed, is it?

But all of that was a long-winded intro to what the Germans did on this same day three years later, in 1891, at the International Electrotechnical Exhibition in Frankfurt, where they proved Edison dead wrong about which form of energy transmission was more efficient and safer. Not only did they use magnetism to create and sustain the energy flow, they used the three-phase electric power that Tesla championed. (And no, the three prongs on your home outlets, frequently in an unintended smiley face arrangement, aren’t three-phase; that third prong is just a safety ground. Three-phase, though, is almost certainly what feeds the lines into your neighborhood.)
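If “three-phase” sounds abstract, here’s a quick numerical sketch of what it means (my illustration, not anything from the exhibition): three identical sine waves spaced 120 degrees apart, whose instantaneous sum in a balanced system is always zero, which is a big part of why the arrangement moves power so smoothly.

```python
import math

# Three identical sine waves, 120 degrees apart. Arbitrary units.
# In a balanced three-phase system, the three always sum to zero.
for t in range(0, 360, 60):
    a = math.sin(math.radians(t))
    b = math.sin(math.radians(t - 120))
    c = math.sin(math.radians(t + 120))
    print(f"t={t:3d} deg  A={a:+.3f}  B={b:+.3f}  C={c:+.3f}  sum={a + b + c:+.3f}")
```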

Twelve years later, Edison’s film crew would shoot the electrocution of an elephant in order to “prove” the danger of AC, but he was fighting a losing battle by that point. Plus, he was a colossal douche.

Obviously, the power of AC gave us nationwide electricity. The earliest telegraph systems, in effect the great-grandparent of the internet, actually ran on battery DC, but later on, things sort of went hybrid, with the external power for landlines coming from the AC grid, then getting stepped down and rectified to operate the internal electronics via DC.

In fact, that’s the main reason that Edison’s version wound up sticking around: the rise of electronics, transistors, microchips, and so on. Powering cities and neighborhoods requires the oomph of AC, but dealing with microcircuits requires the “directionality” of DC.

It does make sense though, if we go back to the water through a pipe analogy, wrong as it is. Computer logic runs on transistors, which are essentially tiny switches; wire a few together and you get a one-way logic gate — input, input, compare, output. This is where computers and electricity really link up nicely. Computers work in binary: 1 or 0; on or off. So does steady DC: 1 or 0; positive voltage, no voltage. Alternating current is just going to give you a fog of constant overlapping 1s and 0s. Direct current can be either/or. And that’s why computers convert one to the other before the power gets to any of the logic circuits.
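As a toy illustration of that either/or (my own sketch, treating a transistor as nothing more than a voltage-controlled switch, which waves away the actual electronics): put a couple of those switches in series and you get a NAND gate, and from NAND alone you can stack up every other logic operation.

```python
# Toy model: 1 = voltage present, 0 = no voltage, the either/or of steady DC.

def nand(a: int, b: int) -> int:
    """Two 'transistor switches' in series pulling the output low."""
    return 0 if (a and b) else 1

# NAND is universal: every other gate can be built from it.
def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))

for a in (0, 1):
    for b in (0, 1):
        print(f"a={a} b={b}  AND={and_(a, b)}  OR={or_(a, b)}")
```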

There’s one other really interesting power-related connection to today, and it’s this: on May 16, 1960, Theodore Maiman fired up the first optical LASER, the device he is credited with creating, in Malibu, California. Now… what does this have to do with everything before it? Well… everything.

LASER, which should only properly ever be spelled like that, is an acronym for the expression Light Amplification by Stimulated Emission of Radiation.

But that’s it. It was basically applying the fundamentals of electromagnetism (see above) to electrons and photons. The optical version of electrical amplification, really. But here’s the interesting thing about it. Once science got a handle on how LASERs worked, they realized that they could use them to send the same information that they could via electricity.

So… all those telegraphs and telephone calls that used to get shot down copper wires over great distances in analog form? Yeah, well… here was a medium that could do it through much cheaper strands of glass called fiber optics, transmit the same data much more quickly, and do it with far less energy loss over the same distances.

And, ironically, it really involved the same dance of particles that Tesla realized in figuring out how AC worked way back in the day, nearly a century before that first LASER.

All of these innovations popped up on the same day, May 16, in 1888, 1891, and 1960. I think we’re a bit overdue for the next big breakthrough to happen on this day. See you in 2020?

What is your favorite science innovation involving energy? Tell us in the comments!

Fangry

Unless you’ve been living under a rock, you’ve probably heard about the petition started by fans demanding a re-do of Season 8 of Game of Thrones, and this may have given you a flashback to last year, when fans of Star Wars demanded the same thing in the same way for The Last Jedi. Hm. Oddly enough, that was Episode VIII, but I’m sure that’s just a coincidence.

Of course, there’s no chance in hell that any of this is going to happen. Personally, if I were one of the producers on the receiving end of that petition, my response would be, “Okay, sure. Season 8 cost $90 million. When I checked, 218,141 of you signed the petition. So if each of you sends us $412.58, we’ll do it.” (Note: I am not going to link to the petition at all, and the reasons why not should become obvious shortly.)

This is called “putting your money where your mouth is,” although I’m sure that many of these fans who are complaining are either torrenting the series illegally or sharing HBO Go passwords with each other, which just makes it more infuriating.

As an artist, nothing galls me more than armchair quarterbacking from the fans. Note that this is different than critiquing. If a fan sees one of my plays or reads one of my books and says, “I really didn’t like how the story played out,” or “I couldn’t relate to the lead character,” or similar, then that is totally valid. But as soon as a fan (or a critic) gets into, “It should have ended like this,” or “I would have written it like that,” or “this character should have done this instead,” then you’ve gone over the line.

Note, though: Professional critics do not do this. That’s what sets them apart from angry fanboys.

Thanks to the internet, we’ve moved into this weird area where what used to be a consumer culture has morphed into a participatory culture. Sorry to go Wiki there, but those are probably the most accessible ways into what are very abstract concepts involving economics, marketing, and politics.

There are good and bad sides to both, which I’ll get to in a moment, and while the latter has always been lurking in the background, it didn’t become prevalent until very recently. Again, not necessarily a bad thing, but it needs understanding and context to work.

So what do we mean by consumer and participatory? The short version is “buy stuff” vs. “give stuff.” A consumer culture focuses on getting people to spend money in the pursuit of having a better life in a capitalist economy. Its marketing mantra is, “Hey… you have problem A? Product X will solve it!” It is also aimed at large groups based on demographics in order to bring in the herd mentality. Keeping up with the Joneses writ large. “Everybody is doing it/has one!”

Ever wonder why people line up down the block at midnight in order to get the latest iPhone or gaming console on the day it comes out? It’s because they have been lured, hook, line, and sinker into consumer culture. But here’s the thing people miss, or used to miss because I think we’re becoming a bit more aware. Because demographics are very important to consumer culture, you are also a product. And if some corporation is giving you something for free — like Google, Facebook, Instagram, etc. — then you are the only product.

Participatory culture is one in which people do not just buy, watch, or read the products, but in which they give input and feedback, and the rise of the internet and social media has pushed this to the forefront. Ever commented on a post by one of your favorite brands on how they could make it better? Ever snarked an elected official for whom you’re a constituent? Ever blasted a movie, show, or sketch on a mass media corporation’s website? Congratulations! That’s participatory culture.

As I mentioned above, it’s not new. In the days before the internet, people could always write letters to newspapers, legislators, corporations, and studios. The only difference then was that it was a bit harder — physically creating the message, whether with pen and paper or typewriter, then putting it in an envelope, looking up the address via dead tree media, taking the thing to a post office, putting a stamp on it, and dropping it off.

Phew. That’s some hard work. Now? Fire up Twitter, drop an @ and some text, click send, done.

And, again, this isn’t necessarily a bad thing. I’ve had more direct responses from my own elected officials to my social media comments than I ever did back in the days of mail, whether of the E or snail variety. The mail responses were always form letters with the subtext of, “Yeah, we get this a lot, we don’t care, here’s some boilerplate.” Social media doesn’t allow for that because it becomes too obvious.

But where participatory culture goes too far is when the fans turn it into possessory culture. Again, this isn’t a new phenomenon. It’s only become more common because being a participant and not just a consumer has become so much easier.

Here’s the anecdotal part. I’ve spent a lot of my working career in the entertainment industry, particularly film and television, and a lot of that dealing directly or indirectly with fans. And one thing that I can say for certain is that people who aren’t in the industry — termed “non-pro” by the trades and often called “muggles” by us — don’t have a clue about how it all works.

If you don’t know what “the trades” are, then you probably fall into the muggle category. Although it’s really a dying term, it refers to the magazines that covered the industry (“the trade”) from the inside, and which were read voraciously every day — principally Variety, The Hollywood Reporter, and Billboard.

But I do digress.

In college, I interned for a game show production company, and one of my jobs was reading and properly directing fan mail, or replying to it with one of a dozen form letters they had printed out en masse, because yes, the questions or complaints were so predictable. One of the big recurring themes was the mistaken belief that the host of the game show personally wrote, directed, edited, and selected contestants for the entire thing. Yeah, no. Unless the host was an executive producer (and the only example that comes to mind is Alex Trebek, for whom I almost worked), then the only thing the host did was show up for the taping day, when they would do five half-hour shows back to back.

And so… I would read endless letters with sob stories begging the host to cast them, or complaints about wanting them to fire one or another guest celebrities, or, ridiculously often, outright requests for money because reasons (always from red states, too), prefiguring GoFundMe by a decade or two.

A lot of these letters also revealed how racist a lot of Americans were then (and still are) and yes, the response to that crap was one of our most sent-out form letters.

This pattern continued, though, on into the days of the internet and email. When I worked on Melrose Place, we would constantly get emails telling the stars of the show things like, “I hated what you did to (character) in that episode. Why are you such a bitch?” or “Why don’t you change this story line? I hate it.”

Really? Really.

Gosh. I guess I never realized that scripted TV had so damn much improv going on. Yes, that was irony. And here’s a fun fact: While a lot of it may seem like it’s improv, SNL is actually not, and doing improv there is the quickest way to never get invited back.

At least those comments were much easier to respond to. “Thank you, but Heather Locklear does not actually write her parts, she only performs them. We will pass your concerns on to the producers.” (Which we never did, because, why?)

Still… misguided but fine. And even things like fan fiction are okay, because they aren’t trying to change canon so much as honor it — although it can sometimes spin off the rails, with Fifty Shades of Grey being the ur-example of a fangirl turning a Twilight fanfic into a super dumpster fire of bad writing and terrible movies and still somehow making a fortune off of it — the perfect storm of participatory culture turning around to bite the ass of consumer culture. I’m not sure whether that’s good or bad, but if anybody did this to my work, I’d probably want to punch them in the throat.

Of course, there are always textual poachers, who approach fanfic from a slightly different angle. Their aim isn’t to make their own fortune off of rewriting stuff. Rather, it’s to do what the book Textual Poachers describes: “Fan fiction is a way of the culture repairing the damage done in a system where contemporary myths are owned by corporations instead of owned by the folk.”

So that’s perfectly fine. If you’re not happy with how Star Wars or Game of Thrones turned out, then write your own damn version yourself. Do it on your own time and at your own expense, and enjoy. But the second you deign to demand that any other artist change their work to make you happy, you have lost any right whatsoever to complain about it.

See how that works? Or should I start a petition demanding that the other petition be worded differently? Yeah. I don’t think that would go over so well with the whiny fanboys either.

The perception of art is completely subjective while the creation thereof is completely under the artist’s control. If you don’t like it, don’t look at it, don’t watch it, don’t buy it. But, most of all, don’t tell the artist how they should have done it. Period. Full stop.

Icons passing

One sure sign of incredible talent is becoming a cultural icon. What defines a cultural icon? Somebody who is famous for generations after they’ve actually done their final work. One of the major examples in the Western World is, of course, William Shakespeare. You know his name. You know his plays. All of this even though he died 403 years ago, which is 287 years before anyone now living was born. Yes, you read both of those numbers correctly.

Closer to home, though, there are names of people I can mention who did their final work and/or died long ago that are still known to all current living generations, right down to Millennials, and probably even Gen-Z: Jimi Hendrix. Jim Morrison. Marilyn Monroe. James Dean. The Marx Brothers. Charlie Chaplin. Buster Keaton. I mean, just the fact that every one of those links goes to an official site for the named person should tell you a lot, considering that they all died before the internet was officially born.

It can go back even further — Van Gogh, Da Vinci, Michelangelo, Dante, Aeschylus. And if you throw in political leaders like presidents and monarchs and emperors, the list gets really long. In your own lives, it includes your parents and grandparents and, if you’re lucky, maybe even a great-grandparent or two, if not a great-great.

So when we lose true icons during our own lifetimes, they become a matter of mass mourning across generations, and we lost two of them this week. I’m referring, of course, to Doris Day and Tim Conway. It’s a perfect example of how humans are naturally drawn to contrasts — it is far more tragic when comedic actors pass away.

It’s also very telling that their deaths blew up social media.

I saw posts from people of all generations about both of them, even though Day was 97 and Conway was 85. She made her last two films in 1968, then went on to focus on animal welfare, only coming back to do a brief TV talk show in the 1980s, most notable for her interview of previous co-star and good friend Rock Hudson, who was visibly emaciated due to AIDS. He would die of it just a few months later — the first high-profile public figure to be outed in this manner. Ironically (because he’d always been closeted until then), this was a big impetus for AIDS activism, from treatment advocacy to ACT UP. Doris stuck by him through it all and all the way to the end, which says a lot about her character. This also made her a gay icon, more on which below.

Her film career and music career almost completely overlap — 1948 to 1968 for the former, and 1945 to 1967 for the latter.

As for Conway, although he kept working into this century, after doing one episode of 30 Rock in 2008 he only made two more appearances, in 2013 and 2015, on TNT and the Hallmark Channel. He is probably best known for his run on The Carol Burnett Show from 1975 until it ended in 1978 — kind of surprising, really, since the show ran for eleven seasons, beginning in 1967, and yet he is mainly associated with those final years. The big reason Conway became iconic for those three years is that the show was syndicated and, like I Love Lucy, has been rerun almost continuously since it went off the air.

There’s another icon for you. Lucille Ball. When Gillian Anderson popped up playing her in American Gods, you didn’t need any explanation no matter how young you are. See how that works?

For me, I first saw a lot of those classic Doris Day films in the 80s and 90s thanks to the miracle of video rental. And since, by that point, we all knew that Hudson had been gay and that he was dead, it made those rom-coms they made together in the 50s and 60s all the more… interesting. She always had this reputation as being virginal and he’d always had the reputation of being homosexual, so they were sort of the perfect couple. Toss in Tony Randall — who was the prissiest straight man on the planet — and it became really entertaining high camp.

There’s a reason that Doris became a gay icon, at least in WeHo in the 80s and 90s, and a lot of that had to do with a place called Video West — sadly, another victim of the internet and streaming. They had all of her movies, and I think they might have even had a Doris Day section, so the old queens who ran the place passed the torch to us twinks who were renting.

And so on.

But she also became an icon to everyone else for very similar reasons. She did the right thing when it was necessary, and she made some really entertaining films over the course of only twenty years. Imagine that for a second. Her film career was only about one fifth of her life.

As for Conway, as I mentioned above, he actually benefited from the internet, because so many of his clips from The Carol Burnett Show wound up online thanks to that show being replayed constantly, and a YouTube search for “Tim Conway Carol Burnett” will turn up a treasure trove of clips. (Currently, of course, it will also result in a lot of news stories lamenting his passing, but that’s just how it works.)

One thing I loved about Tim was that he could make anyone else on stage with him crack in a heartbeat while keeping a straight face, and one of the most famous moments in which he did that is his “Elephant Story” from a “Mama’s Family” sketch on The Carol Burnett Show. Here it is:

If this one doesn’t make you fall on the floor laughing, you have no soul. He’s clearly making it up on the fly, so he’s an improviser after my own heart, but the more sincerely he does it, the harder he makes it for everyone else not to just lose it. This is comedic brilliance, and it is why Mr. Tim Conway is an icon, now and forever, even if you were born two or three decades after he last appeared on Carol’s show. (And Vicki Lawrence is no slouch for having added the button to the scene that kills everyone.)

As for Doris, let me leave you with this — one of her most famous songs in a famous Hitchcock film, Que Será Será from his second version of The Man Who Knew Too Much.

By the way, she really nails the Spanish pronunciation, too. In context, she’s singing the song in order to send a signal to her kidnapped son that Mom and Dad are here, which makes the lyrics even more meaningful near the end. This is basically a woman with a metaphorical gun held to her head trying to put on a brave face, and Doris nailed it.

So there you go. There are reasons that people become icons, and Doris and Tim definitely earned that status. The Earth is a sadder place for them having left it, but we are fortunate that what they left behind is so damn wonderful. Search them up, watch their stuff, and enjoy. They’d like that.

Who are your favorite icons who died long before you were born? Share in the comments!

Bad movies that really aren’t

Judging any art form is really subjective. After all, one man’s masterpiece is another man’s mishegoss. And you can’t really measure the entire world of creativity based on just your standards. Sure, it’d be nice if everything conformed to your taste, but why does it have to? You don’t have to watch it if you don’t like it.

I mean, if I ruled the world of entertainment, then most reality shows would not exist, no one would have ever heard of the Kardashians or the residents of the Jersey Shore, and professional wrestling would have died in the 1950s, along with a lot of other things. And sorry, but there would also be no MCU or DC movies.

If all of that pisses you off, good. It should. Because, like I said, if there’s room for my fanboy stuff, there’s room for yours. If I don’t like your stuff, I don’t have to watch it, and vice versa.

This isn’t to say that everything ever produced is perfect, or that all critics are wrong. Sometimes, a hot mess can be damn entertaining despite, or even because of, its flaws, and here are my ten examples of movies that, IMHO, are much better than they have any business being.

  1. Myra Breckinridge (1970)

Adapted from Gore Vidal’s “book that couldn’t be written,” this “motion picture that couldn’t be made” is actually much better and far more subversive than it was given credit for at the time. Then again, maybe it was too far ahead of its time, since it dealt with issues of gender identity, sexual orientation, feminism, and the capitalistic rape of the arts at a time when American society wasn’t ready for that discussion. As if we really are now.

Vidal disowned the film, and a lot of the cast involved, especially Raquel Welch in the titular (ahem) role, bad-mouthed it before it came out. Some critics think it’s the one thing that prevented her meteoric rise to stardom from continuing, which is a shame. Rex Reed also isn’t half bad in his debut, but the rest of his onscreen acting career amounted to small parts, cameos, or appearing on game shows, although he did frequently appear as himself in documentaries about other performers.

Still… viewed through the lens of the world almost fifty years later, the film comes across as a wry and knowing satire that somehow managed to understand the marginalized, even if the director was a straight and probably homophobic moron.

  2. Caligula (1979)

This one is an interesting milestone mainly because it’s the only example I can think of that had a big name, famous cast combined with hardcore porn. Oh, sure. None of the stars were involved in the actual boinking, but nonetheless there was plenty of real sex happening onscreen in this film, money shots and all — and some of the big names did do a lot of faking it.

But here’s the thing. I’ve been a fan of Roman history for a long time, and had read Suetonius long before I saw this film in an art house revival, and if anything, it actually holds back a little bit from the reality, despite all that jizz and gore on screen.

If you can handle all the ick, though, what’s not to like? We have Malcolm McDowell, Helen Mirren, Peter O’Toole, and John Gielgud, leading a cast of mostly Italian actors who were probably doing the old “it’s getting dubbed later” trick with their dialogue. But, anyway… for the most part, Caligula follows Suetonius pretty accurately, paints a really nice portrait of Rome circa 40 C.E., has a little bit of something for everyone, and has some really nice dark humor.

Bonus points and a connection to the first entry: the screenplay was written by… Gore Vidal, who also disowned it and insisted that his name be taken off. Somewhere in my collection, I think I still have a rare paperback edition of the novelization of the film that credits him as the author. He would have hated that.

  3. The Apple (1980)

Kudos for this one, because it happened right at a point when Hollywood musicals seemed dead — although it didn’t manage to get the same attention as the next entry on the list. This is definitely a B movie, set in the then-far-off world of 1994, where “life is nothing but show business.” The only thing they got wrong was in jumping the gun a little bit, but not by much.

I’d classify this one as pretentious silliness, but the musical numbers are enjoyable enough and well-choreographed, and the issues of reality shows with audience manipulation to tinker with the results still ring true today. Bonus points for the Big Bad being played by famous Polish character actor Vladek Sheybal in what is, as far as I know, his only musical role. He made a career out of playing dubious Soviets during the Cold War, but is probably best known to mainstream audiences for his role in the James Bond flick From Russia with Love. Here, it’s a hoot seeing him play a saucy singing and dancing stand-in for Satan.

Oh, yeah. In case the symbolism in the title was too subtle for you, yes, it’s that apple ultimately, with Mr. Boogalow and Mr. Topps competing for the souls of innocents Alphie and Bibi. I’m sure the symbolism of the protagonist’s and antagonist’s names will probably jump right out at you, too.

  4. Xanadu (1980)

Another musical dealing with religious mythology, although this time around it’s Greek, and involves a muse (Olivia Newton-John) who comes down to Earth. (She’s Terpsichore, the muse of dance, in case you’re keeping track.) The plot involves some silliness about re-opening a long-closed roller rink as a failed mural artist (Michael Beck) teams up with an old-time band leader (Gene Kelly), and they all sing, dance, and skate around, combined with some really cheesy 1980 visual effects that were in that awkward slot between purely optical and purely CGI.

Still… it’s an entertaining romp if you just let your brain go and marvel at this attempt to combine the au courant (Olivia) with the past (Kelly), and an even further past (the Pan-Pacific Auditorium, which was another character, really), and set it all to cheesy-as-hell pop songs. Hey, it was good enough to be unironically turned into a Broadway musical.

  5. Meet Joe Black (1998)

The main critique I hear of this film is that it’s just too damn long, but come on. It’s within a few minutes of the same length as Avengers: Endgame. What I enjoy about this film, though, besides the amazing pairing of Brad Pitt and Anthony Hopkins in the leads, is how much of a throwback to 1930s and 40s Hollywood films it is, particularly Heaven Can Wait (1943) and Death Takes a Holiday (1934) — which was the direct inspiration for this film.

After Joe Black takes over the body of a young man who caught the interest of Hopkins’ character’s daughter before he was hit by multiple cars and apparently killed, it becomes a meditation on the need for love and the inevitability of death and, indeed, how the former can conquer the latter. This is a film about big ideas, and it takes its time, which is probably why it put a lot of people off.

  6. Battlefield Earth (2000)

This one is at the top of my “So goddamn bad it’s gold” list. Cheesy as hell? Oh yeah. Cosmic shit show? In spades. Worth watching? Definitely. Here’s the review I wrote when it came out. What wasn’t to hate about it? Crazed cult member spends millions on vanity project with no apparent oversight, chews up and spits out the scenery, and everything in it seems derivative.

On that last point, here’s where I have to give props to L. Ron, though. Sure, there are bits that seem to have been ripped from Logan’s Run and Blade Runner and other stuff. However, Hubbard had been cranking out pulp space epics since the late 1930s, long before those books and movies ever came out. So this is chicken and egg stuff. Still…

The best part of Battlefield Earth is that if you know it’s a thinly veiled explanation of Scientology, then everything in it makes that pseudo religion look so goddamn ridiculous that this movie is practically an anti-recruiting tool — and Travolta couldn’t even see that. And that was L. Ron Hubbard’s joke, really, I think, because he parodied hard the religion he created and its structure. Who are the villains in this story? The Psychlos. And even though he gave them a name reminiscent of the people Scientologists consider the villains — psychiatrists and psychologists — that was just a dodge, because everything the Psychlos do and say, especially to each other, is right out of the Scientology rule book.

So, yeah. This movie more than anything reminds me of what an evil genius L. Ron was. He managed to create a cult and then mock them quite openly in his fiction, knowing that they’d never get it because he’d blinded them to it. Brilliant!

  7. National Treasure (2004)

History: 0. Fun: 10. That’s all I really have to say about this one and its sequels. It’s a romp that may teach some people some stuff, and it’s sort of an Americanized Dan Brown, except without quite so much made up bullshit. Okay, a modicum of made up bullshit, but at least it’s not stolen from other writers who made it up first.

  8. John Carter (2012)

The only reason that John Carter tanked is this: Disney was buying Lucasfilm (the deal was announced later that same year). Period. Why did that have an effect? Simple. They didn’t want to start supporting another science fiction franchise in the wake of the behemoth they were reining in. So all PR and marketing for this film stopped abruptly before it opened, and more’s the shame, because it’s a pretty accurate take on what is arguably one of the earliest American science fiction franchises, and Mr. Carter deserved a hell of a lot more.

I mean, come on. Is Disney really that blind that they don’t realize how damn many fourteen-year-old boys (and girls) they could have gotten to come see A Princess of Mars? Otherwise, John Carter is a well-done ripping adventure that combines every desert planet from Star Wars with all of that MCU jumping about.

  9. Jupiter Ascending (2015)

This one was misunderstood by people who don’t like comedy or satire in their science fiction. (“You got chocolate in my peanut butter!”) But, come on. It’s funny and off-kilter, and it’s meant to be. The other thing to keep in mind: during the time this film was in production, the Wachowskis were going through some difficult personal times, just after Lana’s public transition and just before Lily’s — and one of them was outed as transgender against her will. So take that title, as well as the plot, as symbolic.

Is the whole thing meant to be camp and with a double meaning? Oh, hell yes.

  10. Valerian and the City of a Thousand Planets (2017)

I consider this film to be the unofficial sequel to Luc Besson’s amazing The Fifth Element, because it really feels like it’s set in the same world, and it starts off with an absolutely amazing opening sequence (with Rutger Hauer and Bowie bonus points) and then includes this amazing bit of stuff from Rihanna that made me question my sexuality. What’s not to like?

Which movies that are generally considered “bad” do you really love and why? Tell us in the comments.


You have the right to remain silent

I’ve often told people that I’m glad I grew up in an English-speaking country, although not out of any kind of chauvinism. Rather, it’s just that if I hadn’t learned English as my first language, I doubt that I ever would have been able to learn it as my second, and a huge part of that is because the spelling and pronunciation of things just seem to make no damn sense. There’s an example right there: we spell it “pronunciation” as a noun, but as a verb it’s “pronounce.” Ta… what? Where’d that extra “o” come from?

The only other language I can think of off the top of my head where the spelling seems to make no sense is Irish Gaelic. Let’s just look at a few names. The example a lot of people probably know is Sinéad, as in Sinéad O’Connor. Now, if you didn’t know, you’d probably think it was “Sineed” or “SinEE-ad,” but it’s not. It’s “shi-NAYD.” A couple of Oscar shows back, we all learned that Saoirse wasn’t “sao-irse” or “sa-oyers,” but “SEER-sha.”

So what would you make of the names Niamh or Caoimhe? Neeam and Cammy, right? Nope. Neev (or NEE-av) and KEE-va.

Now, I’m assured that the rules of pronouncing words in Gaelic are completely consistent and easy to remember, but I’ve tried to learn the language, since it is part of my genetic background, and failed miserably. Then again, looking at the last three names together, it does start to make sense, although it’s still a brain breaker.

No such luck in English. It’s tough enough to plough through without silent letters messing things up. Even if you had read it in your head before you read it out loud, you could still make big mistakes if you’re not completely fluent.

I’m not even going to get into all the multiple ways various vowels and diphthongs can be pronounced — and note that diphthong can either be pronounced as “dipthong” (more common) or “difthong” (rarer.) I’m more interested in one particular culprit for this post, though: The Silent E.

In English, the pronunciation of vowels is not consistent the way it is in a lot of other Indo-European languages, particularly the Romance languages. In the latter, whatever their vowels are — typically A, E, I, O, U — each has one consistent pronunciation. In Spanish, for example, they are ah, eh, ee, oh, oo. To jump to Germanic, they are very similar in Deutsch, too: ah, ay, ih, oh, oo.

Any changes come through putting two vowels together, and they’re also consistent. For example, in German, put “ie” together and you get “ee.” In Spanish, put “ui” together and get “uee.” On the other hand, other combos in Spanish just give you two syllables. “AE” in a word like “caer,” for example, gives you “kah-AIR,” the infinitive form of the verb “to fall.”

There’s another concept Spanish has that English doesn’t: strong and weak vowels. A, O, and U are strong. E and I are weak. And it plays out like this — by affecting certain consonants that come before the vowels, as well as how the vowels combine. In Spanish, the affected consonants tend to be C and G. When the C comes before a strong vowel, then it has the hard K sound (casa — kah-sa); when it comes before a weak vowel, then it’s an S (ciudad — see-ooh-dahd). Likewise, when G comes before a strong vowel, it’s more of a hard G (dame gasolina… that second word is pronounced just like in English) and before a weak vowel, more of an H: general, “hen-eh-RAHL.”

Final note: notice that the “CIU” combo in “ciudad” is pronounced “see-ooh.” That happens when you put a weak vowel before a strong one. It’s the opposite of the “UI” combo. When the strong vowel comes first, the weak one gets absorbed, more or less.
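And because the C and G rule really is that consistent, you can write it down as a couple of lookups. A toy sketch (mine, assuming Latin American pronunciation and ignoring digraphs like “gue” and every other phonetic nicety):

```python
STRONG = set("aou")  # the "strong" vowels from the rule above
WEAK = set("ei")     # the "weak" ones

def c_or_g_sound(consonant: str, vowel: str) -> str:
    """Rough sound of 'c' or 'g' given the vowel that follows it."""
    c, v = consonant.lower(), vowel.lower()
    if c == "c":
        return "k" if v in STRONG else "s"       # casa -> 'k', ciudad -> 's'
    if c == "g":
        return "hard g" if v in STRONG else "h"  # gasolina -> hard g, general -> 'h'
    raise ValueError("this sketch only handles 'c' and 'g'")

print(c_or_g_sound("c", "a"))  # k
print(c_or_g_sound("c", "i"))  # s
print(c_or_g_sound("g", "e"))  # h
```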

None of which has anything at all to do with how fucked up English vowels are, except as an example of a language with easy and consistent rules. Know how the vowels and diphthongs in Spanish or German or Italian work? Then you’re good to go, and can read and pronounce any word you run across. Period.

Meanwhile, in English, we have little word pairs like these: cat, Cate; fat, fate; gat, gate; hat, hate; mat, mate; Nat, Nate; pat, pate; rat, rate; sat, sate; bit, bite; kit, kite; sit, site; bon, bone; con, cone; don, done; non, none; ton, tone; dun, dune; run, rune.

There are probably a lot more, but I stuck to single-consonant starts. The interesting thing to notice, though, is that we have examples for every first vowel except for E. The only example I can kind of stretch out of it is “Ben” and “Bene” (ben and bay-nay), but that only works because the latter word is Latin, and both of its E’s are pronounced.

Another thing to note: in most other Germanic and Romance languages, the final E is pronounced (French, with its mute final E, being the big exception). For example, in Italian, the words “molto bene” and “calzone” are pronounced “mole-toe bay-nay” and “kal-zo-nay.” (At least they are by modern Italians. Italian-Americans, who came here before the language was codified after WW II, get it “wrong.” At least according to modern Italians.) And, in German, a good example is the word “heute,” which means “today.” It’s pronounced “hoy-tuh,” with a great diphthong to start and a pronounced E that doesn’t affect the vowels to end it.

Oh, by the way, the Spanish word for “today” is “hoy,” which is pronounced almost the same as the German word without that little extra syllable at the end.

And, honestly, “syllables at the end” is kind of the trick to it because, once upon a time, before the Great Vowel Shift and back in Chaucer’s day, the E on the end of English words was pronounced as its own syllable. In Shakespeare’s day, the E in the last syllable was also pronounced, especially in participles, so that pronounced would have been pronounced pronounce-ed. This is why modern Shakespearean texts will be marked in one of two ways, depending on the meter… you may see the word as markéd writ, or otherwise unstressed, it is just mark’d.

And while grammarians have tried to come up with logical reasons for silent E’s on the end of words, it’s really a stretch because, again, it’s all based on the vagaries of how English is pronounced in the first place. And there’s a particularly heinous example with a word like “lead.”

If it’s a verb, it’s pronounced the same as “lede,” which is a journalistic concept referring to the most important part of the story which usually starts it off — hence, it leads the piece. However, the reason it’s spelled that way is to distinguish it from the noun, lead, which is pronounced the same as “led,” which is the past tense of the verb to lead.

Confused yet? The reason that journalism needed the easy distinction is because lead or leading (short E) refers to the space between lines of type. When type was set by hand, lines were literally separated by one or more thin strips of lead one point, or 1/72nd of an inch, thick. The term did carry over into the computer world for a long time, though, only eventually giving way to “line spacing” in modern digital publishing. But lede, lead, led, and lead’s friend read all bring up a good point: vowels in English make no damn sense.

They used to, and that brings us back to Chaucer and English before the great vowel shift — and before Samuel Johnson and Noah Webster independently sat down to decide how words “should” be spelled. (Hint: Johnson was a pedantic putz, and a big part of the reason that English spelling makes no sense. Webster tried to simplify a bit, but not enough.) See, if you read the prologue to the Canterbury Tales out loud and pronounce every word exactly how it’s spelled, remembering that every vowel is pronounced, even the last E’s in words like “bathed” and “veyne”, and that every vowel has only one pronunciation, you can recite it and sound exactly like a speaker of Chaucer’s English without even knowing the language.

Good luck to any non-English speaker trying to read a modern English work and getting it right. It would come out about as clear as me trying to read Gaelic, which is probably a good approximation of what this mutt language called English looks like to a non-speaker. Here are the first lines of Chaucer in Gaelic: “Nuair a chuir cithfholcadáin i mí Aibreáin an triomach i leataobh, is féidir go dtéann sé go dtí an fhréamh …”

Yeah. I have no idea, either. I do know that Ben Franklin tried to reform English by creating a slightly new alphabet — or alfabet — in which each letter had only one pronunciation, but it never caught on. Too bad, because most of the rest of English is actually a lot easier. After all, possible it is to greatly do much manglement to the words and syntax yet thus ensues a sentence over all intelligible still in English speech, it is. There aren’t a lot of languages you can do that to.

So I’m glad I learned this difficult chimera first. It makes it easier to deal with a lot of the others.

Photo credit: Carole Raddato, The Chimera of Arezzo, c. 400 BC, Museo Archeologico Nazionale, Florence

23 and me (and thee)

Warning: after you read this, you’re going to start seeing the numbers 23 and 5 everywhere. Sorry.

When I was 23 years old, I first encountered and read the very trippy book The Illuminatus! Trilogy by Robert Shea and Robert Anton Wilson. I’ve mentioned the latter several times here, and probably will again. Along with several others, he became one of my major writing influences early on.

Now, the thing about me coming to read the book for the first time when I was 23 is that it seemed to come about completely by happenstance. I mentioned to a coworker, who was a Wiccan, that I’d just turned 23, and she said, “Oh, you need to read this book.” I did a little research into it, thought it looked interesting, and headed down to the Bodhi Tree, the now-defunct Melrose Avenue bookshop that specialized in all things new age and esoteric.

The thing is massive — something like 800 pages, I think, and was published in trade paperback format, which is the bigger size in comparison to mass-market paperback. Trade paperbacks are close to the dimensions of standard hardcover books.

Anyway, I started to read it, and the book hooked me immediately. Why not? I was 23, and it was full of sex, drugs, and rock and roll. It also affectionately mimicked the styles and structures of things like Joyce’s Ulysses and the cut-up technique preferred by William S. Burroughs. Threads of the story weave in and out of each other in constant interruptions, the narrative voice keeps shifting from omniscient third person to first person among the characters — some of whom seem aware that they are characters in a novel, maybe — and the whole thing plays out as a neo noir detective mystery wrapped around a psychedelic conflation of every far right and far left conspiracy theory of the time, with a healthy dose of science fiction, fantasy, and eldritch horror.

Besides Joyce and Burroughs, H.P. Lovecraft and his universe receive various nods, and one of the protagonists (?) travels about in a golden submarine that evokes both the Beatles and Captain Nemo at the same time.

One of the running ideas in the book is the mystical importance of the number 23, which pops up constantly in the narrative. This also implies the importance of the number 5, which is the sum of 2 and 3. This is also why, in later years, it was tradition for Wilson to always publish his newest book on May 23rd.

There are some very interesting facts about the number, actually — and it shouldn’t escape notice that Wilson’s last initial, W, is the 23rd letter of the Latin alphabet. Those facts do go on and on, too. Here’s another list that has surprisingly little overlap with the first.

William S. Burroughs was obsessed with the number 23, which is mentioned in the novel, and many works created post-Illuminatus! capitalize on the concept by using it. You’ll find 23s in things like the TV show Lost, various films including Star Wars Episode IV, and two films that specifically deal with it, the American film The Number 23 and the German film 23, although the latter would be more properly called Dreiundzwanzig.

There are, of course, also plenty of examples of the number 5 doing interesting things as well.

So here I was, reading this amazing brain-bender of a book at the young age of 23, and I started to wonder whether there was any truth to this idea. You know what happened? I started seeing the number 23 everywhere. It would be on the side of taxis and buses — bonus points, sometimes I’d see 523, 235, 2355 or similar combinations. It would show up on receipts — “You’re order number 23!” It would be one of the winning numbers or the mega number for the current lottery winner. The total when shopping would end in 23 cents, or else 77 cents, meaning that I’d get 23 cents in change.

Wilson eventually gives up the secret to the secret, although not in this book. He does offer another interesting exercise that worked for me at the time, although probably not so much anymore since people don’t tend to carry change around any longer. He referred to it as The Quarter Experiment, although I think of it as “Find the Quarter,” and it’s exactly what it sounds like. When you’re out and about walking around, visualize a quarter (or local coin in your currency of similar size, about 25mm) and then look for one that’s been dropped on the ground.

Back in the day, Wilson claimed success with this and, sure enough, so did I. It’s worth it to click the link above and read the explanation, as well as the several ways to interpret it. (It’s also worthwhile to check out and do the other exercises listed, but especially number four. Too bad the list didn’t make it to five.)

But, again, people just aren’t as likely to drop quarters because they probably only trot them out to do laundry, especially with most parking meters accepting debit and credit cards now. A lot of public washers and driers are also doing the same, so we may be swiftly approaching a day where the only likely place someone might drop some coins is in front of one of those grocery store change converter machines.

Still, you can probably do this experiment fnord with any other object likely to be dropped, like a pen, or a receipt, or keys.

After I finished my first read of Illuminatus!, I went on to read almost all of the rest of Wilson’s oeuvre, both fiction and non. He wrote a number of books outlining his philosophy, like Prometheus Rising and Right Where You Are Sitting Now, as well as his Cosmic Trigger series, which is a cross between autobiography and philosophy, and the amazing novel Masks of the Illuminati, in which James Joyce, Albert Einstein, and Aleister Crowley walk into a bar in Geneva and things get trippy. I’ve always wanted to adapt this one into a play or film and, in fact, it was influential in the creation of my own play Three Lions, which involved Crowley, Ian Fleming, and Hermann Hesse. (Available for production, if you’re interested — here’s the first scene.)

Okay, Wilson has got too many works to cite individually, so just go check out his website for the full list. Meanwhile, this is where we’re going to go meta and full circle.

I’ve re-read Illuminatus! multiple times, and in fact started another read-through about (d’oh!) five weeks ago. Every time through it, it’s a completely different work and I get different things out of it. When I was 23, it was one story. Each of three times after that, it was another. Now, it’s yet again completely different and I just realized that this is, in fact, my fifth pass through the text.

So it was weirdly appropriate when I found out that a friend of mine from our improv company was going to turn 23 on April 30. That date itself is significant because a large part of the present story of the book takes place in April and May, but on top of that I suddenly had the chance to return the favor that my coworker had done for me oh so long ago, so I gifted my young friend a dead-tree copy of the anthology version.

Hey, I survived that journey and I think it made me a better person. Might as well share the love, right? My only hope is that somewhere down the line, after he’s read it a bunch of times, he’s in the position to pass the torch to another 23-year-old.

Pictured: My photo of the covers of my original U.S. paperback versions of the books, which I was lucky enough to find in a used bookstore for cheap a few years back. Interestingly enough, that bookstore is called Iliad Books, and it used to be next door to a video rental place called Odyssey. Both of those also figure into the fnord book. Yes, it’s quite the rabbit hole.

Not pictured: my autographed by RAW himself original edition and my later “checkerboard” cover version from the 90s.