Friday Free-for-All #59: Multiple viewings, theater or home, hobby time, techsplosion

The next in an ongoing series in which I answer random questions generated by a website. Here are this week’s questions. Feel free to give your own answers or ask your own questions in the comments.

What movie have you seen more than seven times?

For starters, I know that I’ve watched Stanley Kubrick’s 2001: A Space Odyssey way more than seven times. Once home video and then DVD came along, watching 2001 on New Year’s Day instead of a certain parade became a long-standing tradition of mine.

The more-than-seven tally also holds for several of his other films, including Dr. Strangelove or: How I Learned to Stop Worrying and Love the Bomb and A Clockwork Orange.

I can’t leave off The Rocky Horror Picture Show. I’m pretty sure I saw that more than seven times in high school alone, and The Wizard of Oz, It’s a Wonderful Life, and The Ten Commandments also make the list because they are still being rerun at least once a year on TV.

I can’t forget the Star Wars Original Trilogy and most of the Prequel Trilogy. The Sequel Trilogy hasn’t been around long enough yet. As for Star Trek, The Wrath of Khan and The Voyage Home are the only ones I’ve definitely seen that often.

There are a few James Bond films — definitely Goldfinger, Live and Let Die, and Moonraker (one good, one okay, and one cheesy as hell) again because of the TV return thing.

I’m not sure, but I think that Willy Wonka and the Chocolate Factory (that’s the amazing Gene Wilder-starring version and not the Tim Burton travesty) probably also makes the list. Oh. Also, Cabaret, All That Jazz, and West Side Story.

There are probably others, but these are the ones that I can definitely put in the more than seven list.

Do you prefer to watch movies in the theater or in the comfort of your own home?

This is an answer that’s changed enormously. Once upon a time, my reply would have been absolutely in a theater, because that’s where they were made to be seen.

But then, as my interest in seeing all the latest MCU/DCEU franchise films fell to zero, waiting for home video or streaming mostly became enough — although I would still go out for the big event films that interested me, mainly Star Wars installments and Blade Runner 2049.

The last film I did see in an actual theatre was Star Wars: The Rise of Skywalker, back in February 2020. It was a mid-weekday thing and there were about four of us in the place.

So, having already discovered the joys and convenience of streaming, not to mention the lower cost if it’s something on a service you already have, by the time the theaters shut down it was a no-brainer, and I’m not really inclined to go back anytime soon.

Honestly, seeing a Marvel movie on a big screen doesn’t really add much to it, not compared to the quality I can get at home. Plus I also don’t have to put up with other people, sticky floors, or an endless parade of pre-show trailers and adverts.

What hobby would you get into if time and money weren’t an issue?

I would become a total model train geek, although it would be about more than just the trains. I’d want to create an entire miniature city in a dedicated room, like a full basement, and build it in something like N scale, which is 1:160, or about 0.075 inches to the foot.

This would make a model of the Empire State building just over 9 feet tall at the tip of its mast, although it would take 33 linear feet of model to make up one mile of street, so it wouldn’t be a very big city. (Z scale would cut this down to 24 feet per mile, but definitely sacrifice some realism.)

To get a scale model of all of San Francisco into an area 33 feet on a side, you’d wind up with city buses being just under half an inch long and a tenth of an inch wide. You’d only need to cut the N scale in half to model two square miles of Midtown Manhattan.
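If you want to play with the arithmetic behind those figures, here’s a quick sketch in Python. The function name and the rough 1:1120 ratio for fitting San Francisco into a 33-foot square are my own back-of-the-envelope figures, not anything official:

```python
# Scale-model size calculator. A 1:160 (N scale) model shrinks every
# real-world foot down to 12/160 = 0.075 inches.

def model_inches(real_feet: float, scale: float) -> float:
    """Convert a real-world length in feet to model inches at 1:scale."""
    return real_feet * 12 / scale

# One mile of street at N scale (1:160): 33 linear feet of model
print(model_inches(5280, 160) / 12)

# Empire State Building, 1,454 ft to the tip of its mast: just over 9 feet
print(model_inches(1454, 160) / 12)

# One mile at Z scale (1:220): 24 feet
print(model_inches(5280, 220) / 12)

# A 40-foot city bus at roughly 1:1120 (all of SF in a ~33-foot square):
# just under half an inch long
print(model_inches(40, 1120))
```

The 1:1120 figure comes from San Francisco being roughly seven miles across: 7 × 5,280 ÷ 1,120 ≈ 33 feet.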

But wait… it does say that time and money aren’t an issue, right? So instead of building a single square mile of city in a basement, why not go for a warehouse or buy an abandoned big box store? Aim for something that would fit fifty or a hundred square miles of city, and if it had multiple floors, go for various layouts: urban mega-city, suburban smaller town, historical city. With a scale ten-mile footprint, you could easily build two separate 19th-century Main Street towns surrounded by countryside and connected by railroad and telegraph.

And I wouldn’t need to go it alone. Hell, it could become an entire project that would employ model and miniature makers, urban planners, painters, designers, builders, electricians, programmers, and more. Give the big city a working harbor and airport, also have miniature cars and people moving around, design it to not only have a night and day cycle but seasons and weather as well, and it could be quite a thing.

It could even become a tourist attraction. Hell, they already did it in Hamburg, Germany.

And why does the idea fascinate me so much? Maybe because I was into model trains as a kid, although I never had a neat, permanent layout. But this also led to me becoming a big fan of games like SimCity, in which I could indulge my curiosity about building and creating things and see where they led — especially urban landscapes.

Hm. Give me all the resources, and I just might make TinyTowns a major tourist destination.

Why did technology progress more slowly in the past than it does now?

I believe that this is because technological development is exponential, not algebraic. The latter is a very slow, additive process. You go from 1 to 1+1, or 2, then to 2+1 for 3, and so on. Repeat the process 100 times, and you land on 101.

Take the simplest exponential progression, though, in which each subsequent step is double the one before it. That is, go from 1 to 1×2, or 2, then 2 to 2×2 for 4, and so on. After a hundred steps, your result is about 1.27×10^30, or roughly 1 followed by 30 zeros, which is one nonillion.
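You can watch the two processes diverge in a few lines of Python:

```python
# Additive growth vs. doubling growth over the same 100 steps.
linear = 1
exponential = 1
for _ in range(100):
    linear += 1        # add one each step: 1, 2, 3, ...
    exponential *= 2   # double each step: 1, 2, 4, 8, ...

print(linear)       # 101
print(exponential)  # 1267650600228229401496703205376 ≈ 1.27 × 10^30
```

One hundred identical steps; two wildly different destinations.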

For perspective, a yottabyte — currently the largest digital storage standard yet set — is equal to one trillion terabytes, the latter currently being a very common hard drive size on a home computer. The number noted above is a million times that.

It’s exactly how we wound up with terabyte drives being so common when, not very long ago, a 30 megabyte drive was a big deal. That was really only within a generation or so. This relates to Moore’s Law, first stated in 1965 and popularly paraphrased as “the number of transistors in a computer chip doubles every 18 to 24 months.”
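As a back-of-the-envelope check, assuming the two-year doubling at the slow end of that range, a 30 megabyte drive doubled fifteen times over thirty years lands almost exactly on one terabyte:

```python
# Moore's Law, applied to storage: 15 doublings over ~30 years
# (one every two years) takes 30 MB to roughly 1 TB.
capacity_mb = 30
for _ in range(15):
    capacity_mb *= 2

print(capacity_mb)            # 983040 MB
print(capacity_mb / 1000**2)  # ~0.98 TB (decimal)
```

That’s the generation-or-so jump from “a 30 megabyte drive is a big deal” to terabyte drives as commodity hardware.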

What wasn’t stated with the law was that this doubling didn’t just affect the number of transistors, and therefore the number of simultaneous operations, that a chip could perform. It extended to every other aspect of computers. More operations meant more data, so you could either speed up your clocks or widen your data buses (i.e., the number of bits that can be handled at once), or both.

And this is why we’ve seen things like computers going from 8-bit to 64-bit operating systems, and memory size ballooning from a few kilobytes to multiple gigabytes, and storage likewise exploding from a handful of kilobytes to terabytes, with petabyte drives on the horizon.

Perspective: A petabyte drive would hold the entire Library of Congress print archive ten times over. It would probably also hold a single print archive and all the film, audio, and photographic content comfortably as well.

Now, all of this exploding computer technology fuels everything else. A couple of interesting examples: Humans went from the first-ever manned flight of an airplane to walking on the Moon in just 66 years. We discovered radioactivity in 1896 and tested the first atomic bomb 49 years later. The transistor was invented in 1947. The silicon chip integrating multiple transistors was devised in 1959, twelve years later.

And so on. Note, too, that the transistor’s big trick is that it turns old mathematical logic into something that can be achieved by differences in voltage. Wire a few transistors together and you get a logic gate: a simple gate has two inputs and an output, and depending upon how the inputs compare and what the circuit has been designed to do, it can be set up to do various things.

The beauty of the system comes in stringing multiple gates together, so that one set may determine whether digits from two different inputs are the same or not, and pass that info on to a third gate, which may be set to either increment or leave unchanged a value held elsewhere, depending on the info it receives.

Or, in other words, a series of logic gates can be set up to perform addition, subtraction, multiplication, or division. It’s something that mechanical engineers had figured out ages previously using cogs and levers and gears, and adding machines and the like were a very 19th century technology. But the innovation that changed it all was converting decimal numbers into binary, realizing that the 0 and 1 of binary corresponded perfectly to the “off” and “on” of electrical circuits, then creating transistors that did the same thing those cogs and levers did.
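To make that gate-stringing concrete, here’s a minimal sketch in Python, using the language’s bitwise operators to stand in for XOR, AND, and OR gates. The function names are mine; the half adder and full adder themselves are the standard textbook circuits:

```python
# Building addition out of nothing but logic gates.

def half_adder(a: int, b: int):
    """Two gates: XOR gives the sum bit, AND gives the carry bit."""
    return a ^ b, a & b

def full_adder(a: int, b: int, carry_in: int):
    """Two half adders plus an OR gate, handling an incoming carry."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, c1 | c2

def add4(x: int, y: int) -> int:
    """Add two 4-bit numbers by rippling the carry through full adders."""
    carry, result = 0, 0
    for i in range(4):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i
    return result | (carry << 4)

print(add4(0b0101, 0b0011))  # 8, i.e. 5 + 3
```

Nothing in there but comparisons of single bits, yet chained together they do arithmetic — exactly the cogs-and-levers trick, done in voltage.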

Ta-da! You’ve now turned analog math into digital magic. And once that system was in place and working, every other connected bit developed incredibly over time. Some people focused on making the human interfaces easier, moving from coding in obscure and strictly mathematical languages, often written via punch cards or paper tape, to not-much-improved but still infinitely better low-level languages. These still involved a lot of obscure code words and direct entry of numbers (this is where hex, or base 16, came into computing), but they were at least much more intelligible than square holes in a card.

At the same time, there had to be better outputs than another set of punched cards, or a series of lights on a readout. And the size of data really needed to be upped, too. With only four binary digits, 1111, the highest decimal number you could represent was 15. Jump it to eight digits, 1111 1111, and you got… 255. Don’t forget that 0 is also included in that set, so you really have 256 values, and voila! The reason for that being such an important number in computing is revealed.
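You can confirm those numbers directly, since Python happens to accept binary literals:

```python
# Four bits top out at 15; eight bits top out at 255.
print(0b1111)      # 15
print(0b11111111)  # 255

# Counting zero, an 8-bit byte distinguishes 256 values.
print(2 ** 8)      # 256
```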

Each innovation fueled the need for the next, and so the ways to input and readout data kept improving until we had so-called high-level programming languages, meaning that on a properly equipped computer, a programmer could type in a command in fairly intelligible language, like,

10 LET X$ = “Hello world.”

20 PRINT X$

30 END

Okay, stupid example, but you can probably figure out what it does. You could also vary it by making the first line an INPUT statement instead, in which case the user would get a question mark on screen and the display would echo back whatever they typed.

Oh yeah… at around the same time, video displays had become common, replacing actual paper printouts that had a refresh rate slower than a naughty JPG download on a 1200 baud modem. (There’s one for the 90s kids!) Not to mention a resolution of maybe… well, double digits lower than 80 in either direction, anyway.

Not surprisingly, the better things got, the better the next versions seemed to get, and faster. Memory exploded. Computer speeds increased. Operating systems became more intuitive and responsive.

And then things that relied on computers took off as well. Car manufacturers started integrating them slowly, at first. These days, your car is run more by computers than by physical controls, whether you realize it or not. Cell phones and then smartphones are another beneficiary — and it was the need to keep shrinking transistors and circuits to fit more of them onto chips in the first place that finally made it possible to stick a pretty amazing computer into a device that will fit in your pocket.

Oh yeah… first telephone, 1876. Landline phones were ubiquitous in less than a hundred years, and began to be taken over by cell phones, with the first one being demonstrated in 1973 (with a 4.4 lb handset, exclusive of all the other equipment required), and affordable cell phones not really coming along until the late 1990s.

But, then, they never went away, and then they only just exploded in intelligence. Your smart phone now has more computing power than NASA and the Pentagon combined did at the time of the Moon landings.

Hell, that $5 “solar” (but probably not) calculator you grabbed in the grocery checkout probably has more computing power than the guidance computer aboard the Lunar Module that made Neil Armstrong the first human on the Moon.

It’s only going to keep getting more advanced and faster, but that’s a good thing, and this doesn’t even account for how explosions in computing have benefited medicine, communications, entertainment, urban planning, banking, epidemiology, cryptography, engineering, climate science, material design, genetics, architecture, and probably any other field you can think of — scientific, artistic, financial, or otherwise.

We only just began to escape the confines of Analog Ville less than 40 years ago, probably during the mid to late 80s, when Millennials were just kids. By the time the oldest of them were second semester sophomores in college, we had made a pretty good leap out into Digital World, and then just started doubling down, so that two decades into this century, the tech of the turn of the century (that’d be 2000) looks absolutely quaint.

Remember — we had flip phones then, with amazing (cough) 640×480 potato-grade cameras.

Compare that to 1920 vs 1900. A few advances, but not a lot. The only real groundbreaker was that women could now vote in the U.S., but that wasn’t really a technological advance, just a social one. And if you look at 1820 vs. 1800, or any twenty-year gap previously, things would not have changed much at all except maybe in terms of fashion, who the current monarchs were, or which countries you were currently at war with.

And that, dear readers, is how exponential change works, and why technology will continue to grow in this manner. It’s because every new innovation in technology sows the seeds of both the need for and the inevitability of its next round of advancement and acceleration.

We pulled the genie out of the bottle in 1947. It’s never going back in.

Friday Free-for-All #58: Movie love, movie hate, major useless, and “normal”

The next in an ongoing series in which I answer random questions generated by a website. Here are this week’s questions. Feel free to give your own answers in the comments.

What terrible movie do you love?

This one is easy. A lot of critics and others think that the movie Caligula is total crap, despite the all-star cast. But the thing is this — it is actually a really faithful retelling of Suetonius’ The Lives of the Twelve Caesars.

Sure, Suetonius may have been totally full of shit and he may have libeled the fuck out of Caligula for the sake of kissing up to later Emperors. Still, ignore that part of it, and the film’s story follows the source pretty closely.

In fact, if anything, the producers actually held back on the sex and violence. But, come on. What’s not to love about this flick? Malcolm McDowell, Helen Mirren, Peter O’Toole, and John Gielgud, plenty of eye candy for all genders and preferences in the supporting cast, and a script by Gore Vidal, even though he disowned it (he shouldn’t have).

Seriously, ignore the scissor sisters BS that Guccione snuck into it because he could just before he had to smuggle the footage out of Italy to avoid obscenity charges, boom, done.

One really interesting aspect of the anniversary edition I own is that one of the features on the DVD is raw footage from a scene set in Tiberius’ (infamous) grotto on Capri, where it looks like all kinds of bizarre sex acts are going on in the background – but unedited and from angles not used in the film, it’s quite clear that what you thought you saw was far more graphic and nasty than what was really happening. The magic of film!

What is the most overrated movie?

Oh, there are many, but two stand out because they won Best Picture and had absolutely no goddamn business doing so.

Exhibit A: Forrest Gump.

Exhibit B: Gladiator.

I mean, come on. In the case of the first movie, it’s the glorification of stupid, and I did not ever for one second connect with or empathize with Gump. Why would I? He obviously has mental problems and, given the era, if his Mama wasn’t able to help, he would have been put into an institution, preventing the rest of the movie, period.

Still… Forrest’s character through the rest of the film is an object lesson in this: The mentally ill, despite their condition, are still quite capable of being total assholes.

Second film, Gladiator… as a Roman History buff, this stack of shit just loses from the get-go. And it only gets worse from there, for ten thousand reasons. One big one beyond the rape of history at the end?

Well, true Gladiators were not slaves. They were celebrities. Think MMA fighters now, or social media influencers. So if they got tossed into the ring, it was not to die. It was to play up a high-profile slap fight at the most.

But don’t even get me started on the whole “Pissed off Gladiator killed Commodus in the ring, in public” bullshit.

Anyway, long story short: No way in hell that Gladiator deserved a single accolade, much less “Best Picture.” Nope. It was a steaming pile of crap then, and it still is now.

What is the most useless major in college?

I’m going to have to go with Philosophy — and note that I’m pegging it as a major, not a course of study. I absolutely think that everyone should have to take two philosophy courses in college, one general and the other more specific — but beyond that, majoring in it is pretty pointless.

You learn that when you take your lower division general philosophy course and realize that quite a lot of these philosophers were basically talking out of their asses, and most of them were stuck in the same error that wasn’t even discussed in philosophy until the 20th century.

That is, they forget to include themselves and their own experiences in seeing how their philosophies formed, and instead tried to create these grand mystical rules for what is “reality.”

And it all started with the worst of them, Plato, and his “ideal” forms. This meant that for every object, there was an ideal version of it that existed in some invisible ethereal realm, and that version was the one invoked every time an earthly imitation was created.

Carpenter makes a chair? He’s just copying from that ideal. Singer creates a song – echo of the ideal, and so on. Of course, he never talked about whether that dump your kid just took was a copy of the ideal ethereal shit. What he implied, though, was that everything ever yet to be invented was just floating out there somewhere, waiting to be invoked down here.

He did have one good bit though, his parable of the “slave in a cave.” In it, a slave is chained to a rock in a cave, constrained so that he’s facing the back wall with the entrance behind him. Way beyond the entrance is a bright fire. All the slave can see of the outside world are the shadows on the wall, created by people and animals and the like passing between the fire and the entrance.

In other words, he was saying, we could not perceive the real world of these ideal forms because our perception was limited. And that’s a kind of yes, kind of no, although I’d think of it more in terms of things like we couldn’t conceive of germ theory until we’d made the microscopes to see them, or couldn’t fathom the skies above until we had telescopes and math. Lots and lots of math.

There is, though, a great parody of Plato’s Cave that I first heard from the late, self-proclaimed “guerrilla ontologist,” Robert Anton Wilson. In that version, a slave and a Buddhist are chained up in the cave, just watching the shadows. Then, the Buddhist suddenly slips his chains off and walks outside, staying there for a while.

The Buddhist finally returns to the cave calmly, sits down and puts his chains back on.

“What did you see out there?” the slave asks, excitedly.

The Buddhist replies, “Nothing.”

Anyway, don’t major in Philosophy. It’s not worth it and doesn’t translate to anything marketable.

What seemed normal in your family when you were growing up, but seems weird now?

How rarely my parents had any kind of dinners or parties or invited guests, to the extent that the few times we did host something really stand out in my mind. And the lack of invited guests growing up extended to my friends. I was expected to go play elsewhere, and god forbid that I invite one of my friends into my Mother’s Holy of Holies.

I think we did host a couple of extended family Thanksgiving dinners, as well as my Mom’s older brother when he was in town with his college debate team (he was the professor/coach, who came with his two students, and I can still remember their names to this day: Vinnie and Tim). Mom’s mom came and stayed with us twice, and one of my cousins (my mom’s niece) came and stayed with us once.

My mom did plan to host my 4th birthday party, but that happened to be the year that we had a bit of a flu epidemic, the end result being that the only guest who finally was well enough to make it was a kid down the street named Scott, whom I didn’t really know. Yeah, awkward!

When I was in Kindergarten, my parents did invite my teacher, Miss Jones, over for dinner and it wasn’t until I became an adult that I wondered. Jones was my father’s mother’s maiden name. Any relation? Although it’s such a common name, who knows.

Anyway, this all seemed normal until I grew up, and then saw friends who were constantly hosting parties or get-togethers, or frequently had relatives or distant friends visiting for a few days, and it blew my mind.

People did this? How weird. How… intrusive. And, unfortunately, I think I wound up inheriting the “No guests!” gene (definitely from my mother), and I cannot come up with more than maybe one time I hosted an overnight guest – an old friend and former roommate – and I’ve never hosted a party. Keep in mind, I’m only counting the times when I’ve lived alone. I’ve had plenty of roommates who were really into the parties and weekend visitors and the like.

And I don’t mind that. I think I was just not programmed on how to do this shit. Of course, I used to live alone in a two-bedroom apartment back when that was affordable, but nowadays it’s a one-bedroom, so unless I’m really intimate with an overnight guest, there’s really nowhere to put them.

Theatre Thursday: Remembering my real second language

As this time of lockdown and uncertainty goes on, what does become clear is that large, live events are probably not coming back soon. Live theatre, movies, concerts, and sports may take the rest of this year off, if not longer. Likewise, the fate of amusement parks of all kinds seems uncertain, or at least will be drastically changed.

Right now, we do have certain areas that have insisted on becoming field experiments, and by the time you read this, it may become clear whether the people who ran out to bars without masks last week did the right thing or made a stupid sacrifice.

Concerts may survive on live-streaming pay-per-view events for a while, and movie theaters may rediscover the drive-in, although those take a lot of real estate. Then again, indoor malls may now be officially dead, so look for their parking lots and large, blank walls to be easily converted.

Live sports are another matter because, by their very nature, they often involve full-body contact, and nobody is going to be going all-out on the field while wearing any kind of mask. Without quarantining every player, official, and support staff member, and testing each of them constantly, it’s just not feasible.

Even then, what about the live fans? It might be possible to limit attendance and assign seats so that social distancing is maintained, but that relies on trusting people to stay in the seats they’re put in, and as we all know, if someone is stuck in the outfield nosebleeds but sees plenty of empty space on the other side behind home plate, they’re going to try to get there.

One unexpected outcome is that eSports, like Overwatch League, may become the new sports simply because they absolutely can keep the players and fans apart while they all participate together.

See? The prophecy is true. After the apocalypse wipes out the jocks, the nerds will take over the world!

As for live theatre, it’s hanging on through a combination of streams of previously recorded, pre-shutdown performances, along with live Zoom shows. And, again, this is where the magic of theatre itself is a huge advantage because, throughout its history, it hasn’t relied on realistic special effects, or realism at all, to tell its stories.

Okay, so there have been times when theatre has gone in for the big-budget spectacle, but that goes back a lot further than modern Broadway. In ancient Rome, they were staging Naumachia, mock naval battles, but they were doing them as theatrical shows in flooded amphitheaters, including the Colosseum, and on a large scale.

And they’ve gone on throughout history, including Wild West shows in the U.S. in the 19th century right up to the modern day, with things like amusement park spectacles, including Universal’s WaterWorld and Terminator attractions, and Disney’s newly minted Star Wars: Rise of the Resistance attraction.

But these big-budget spectacles are not necessary for theatre to work. All you need for theatre is one or more performers and the words.

Theatre is one of the earliest art-forms that each of us experiences, probably second only to music. And we experience it the first time, and every time, that someone reads to or tells us a story, no matter how simple or complicated.

Once upon a time…

That is theatre, and that’s why I know that it will survive eventually — but not right now, at least not in a familiar form.

And yes, this is a big blow to me on two fronts. First, I know that I won’t be doing improv or performing for a live audience for a long time. Second, I know that I won’t be seeing any of my plays performed onstage for a live audience for a long time.

This current plague quashed both of those options, shutting down my improv troupe and cancelling a play production that had been scheduled to open in April, then postponed to May, then postponed until… who knows?

But I’m not marching in the streets without a mask and armed to the teeth demanding that theatre reopen because I’m not selfish like that.

First, it’s because I still have a venue in which to tell stories and write and share, and you’re reading it right now, wherever in the world you are — and I see that I do have visitors from all over — in fact, from every continent except Antarctica, but including Australia, most of the Americas and Europe, some of Africa, and just about all of Asia. Greetings, everyone!

Second, I realized quite recently that this whole situation has inadvertently handed me the opportunity to get back into the first art-form that I officially trained in but never pursued as a profession for one reason: I loved it too much to turn it into the drudgery of a career, and always wanted to keep it for my own enjoyment.

Okay, sure, I did use it a few times from middle school through just after college in order to entertain others but, again, I was doing it for my own enjoyment.

That art-form is music, and I consider it my second language, because I started taking piano lessons at seven — and I was the one who cajoled my parents into letting me do so. The end result was that I was never really into playing other people’s stuff because, once all that music theory landed in my head and made sense, I started making my own.

That seems to be a common thing with my brain: learn the way the modules work, then start sticking them together in ways that break the rules while still working. This is probably also the reason why I took to programming and coding early, and why I abuse Excel the way that I do.

Dirty little secret: Music is just math that sounds good. However, the great thing about it is that music also takes all of the pain out of math because it turns it into feelings. When I’m playing, improvising, and composing, my brain is absolutely not thinking in terms of what specific chord I’m playing, how it relates to the others, how it’s going to get from Point X to Y to make Z make sense, etc.

The thing about music and me is that its rules are buried so deeply into my subconscious that, well, like I said… I consider it to be my second language. And, when you’re fluent in any language, you don’t need to think. You just speak, whether it’s via your mouth and tongue, or via your heart and fingers.

So… live performance has been taken away from me by this virus for a while but that’s okay — because online research and ordering still exist, and stuff is on the way. So… I’m diving back into the most direct, emotional and, most importantly, non-word-dependent form of communication humans have ever invented.

Watch this space. Or… well, listen.