Friday Free-for-All #59: Multiple viewings, theater or home, hobby time, techsplosion


The next in an ongoing series in which I answer random questions generated by a website. Here are this week’s questions. Feel free to give your own answers or ask your own questions in the comments.

What movie have you seen more than seven times?

For starters, I know that I’ve watched Stanley Kubrick’s 2001: A Space Odyssey way more than seven times. Once home video and DVD happened, watching 2001 on New Year’s Day instead of a certain parade became a long-standing tradition with me.

More than seven viewings also holds true for several of his other films, including Dr. Strangelove or: How I Learned to Stop Worrying and Love the Bomb and A Clockwork Orange.

I can’t leave off The Rocky Horror Picture Show. I’m pretty sure I saw that more than seven times in high school alone, and The Wizard of Oz, It’s a Wonderful Life, and The Ten Commandments also make the list because they are still being rerun at least once a year on TV.

I can’t forget the Star Wars Original Trilogy and most of the Prequel Trilogy. The Sequel Trilogy hasn’t been around long enough yet. As for Star Trek, The Wrath of Khan and The Voyage Home are the only ones I’ve definitely seen that often.

There are a few James Bond films — definitely Goldfinger, Live and Let Die, and Moonraker (one good, one okay, and one cheesy as hell) — again because of the TV rerun thing.

I’m not sure, but I think that Willy Wonka and the Chocolate Factory (that’s the amazing Gene Wilder-starring version and not the Tim Burton travesty) probably also makes the list. Oh. Also, Cabaret, All That Jazz, and West Side Story.

There are probably others, but these are the ones that I can definitely put in the more than seven list.

Do you prefer to watch movies in the theater or in the comfort of your own home?

This is an answer that’s changed enormously. Once upon a time, my reply would have been absolutely in a theater, because that’s where they were made to be seen.

But then, as my interest in seeing all the latest MCU/DCEU franchise films fell to zero, waiting for home video or streaming mostly became enough, although I would still go out for the big event films that interested me, mainly Star Wars installments and Blade Runner 2049.

The last film I did see in an actual theater was Star Wars: The Rise of Skywalker, back in February 2020. It was a mid-weekday thing, and there were about four of us in the place.

So, having already discovered the joys and convenience of streaming, not to mention the lower cost when it’s something on a service you already have, by the time the theaters shut down it was a no-brainer, and I’m not really inclined to go back anytime soon.

Honestly, seeing a Marvel movie on a big screen doesn’t really add much to it, not compared to the quality I can get at home. Plus I also don’t have to put up with other people, sticky floors, or an endless parade of pre-show trailers and adverts.

What hobby would you get into if time and money weren’t an issue?

I would become a total model train geek, although it would be about more than just the trains. I’d want to create an entire miniature city in a dedicated room, like a full basement, and build it in something like N scale, which works out to about ¾ inch to 10 feet, or 1:160.

This would make a model of the Empire State Building just over 9 feet tall at the tip of its mast, although it would take 33 linear feet of model to make up one mile of street, so it wouldn’t be a very big city. (Z scale would cut this down to 24 feet per mile, but would definitely sacrifice some realism.)

To get a scale model of all of San Francisco into an area 33 feet on a side, you’d wind up with city buses being just under half an inch long and a tenth of an inch wide. You’d only need to cut the N scale in half to model two square miles of Midtown Manhattan.
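If you want to check my math, here’s a quick back-of-the-envelope sketch in Python. The real-world dimensions (1,454 feet for the Empire State Building to the tip, 40 feet for a city bus) are ballpark figures I’m assuming for illustration:

# Back-of-the-envelope scale-model math.
# Real-world dimensions are ballpark assumptions, not gospel.
FEET_PER_MILE = 5280

def model_feet(real_feet, scale):
    """Length of the model, in feet, at 1:scale."""
    return real_feet / scale

print(model_feet(1454, 160))           # Empire State Building in N scale: ~9.1 ft
print(model_feet(FEET_PER_MILE, 160))  # one mile of street in N scale: 33 ft
print(model_feet(FEET_PER_MILE, 220))  # one mile in Z scale: 24 ft

# Squeezing a roughly 7-mile-wide San Francisco into 33 feet
# needs about 1:1120, which shrinks a 40 ft bus to under half an inch:
sf_scale = 7 * FEET_PER_MILE / 33      # ~1120
print(model_feet(40, sf_scale) * 12)   # bus length in inches: ~0.43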

But wait… it does say that time and money aren’t an issue, right? So instead of building a single square mile of city in a basement, why not go for a warehouse, or buy an abandoned big box store? Aim for something that would fit fifty or a hundred square miles of city, and if it had multiple floors, go for various layouts: urban mega-city, suburban smaller town, historical city. With a scale ten-mile footprint, you could easily build two separate 19th-century Main Street towns surrounded by countryside and connected by railroad and telegraph.

And I wouldn’t need to go it alone. Hell, it could become an entire project that would employ model and miniature makers, urban planners, painters, designers, builders, electricians, programmers, and more. Give the big city a working harbor and airport, also have miniature cars and people moving around, design it to not only have a night and day cycle but seasons and weather as well, and it could be quite a thing.

It could even become a tourist attraction. Hell, they already did it in Hamburg, Germany, with Miniatur Wunderland.

And why does the idea fascinate me so much? Maybe because I was into model trains as a kid, although I never had a neat, permanent layout. But this also led to me becoming a big fan of games like SimCity, in which I could indulge my curiosity about building and creating things, especially urban landscapes, and see where they led.

Hm. Give me all the resources, and I just might make TinyTowns a major tourist destination.

Why did technology progress more slowly in the past than it does now?

I believe that this is because technological development is exponential, not arithmetic. The latter is a very slow, additive process. You go from 1 to 1+1, or 2, then to 2+1 for 3, and so on. Repeat the process 100 times, and you land on 101.

Take the simplest exponential progression, though, in which each subsequent step is double the one before it. That is, go from 1 to 1×2, or 2, then 2 to 2×2 for 4, and so on. After a hundred steps, your result is about 1.27×10^30, or roughly 1 followed by 30 zeros, which is one nonillion.

For perspective, a yottabyte (currently the largest digital storage standard yet set) is equal to one trillion terabytes, the latter currently being a very common hard drive size on a home computer. The number noted above is over a million times that.
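To see the gap for yourself, here’s a minimal sketch of the two progressions, using the decimal definition of a yottabyte, 10^24 bytes:

# Linear vs. exponential growth over the same 100 steps.
linear, doubling = 1, 1
for _ in range(100):
    linear += 1    # additive: 1, 2, 3, ...
    doubling *= 2  # multiplicative: 1, 2, 4, 8, ...

print(linear)    # 101
print(doubling)  # 1267650600228229401496703205376, i.e. ~1.27e30

# Perspective: a yottabyte is 10**24 bytes, a trillion terabytes.
print(doubling / 10**24)  # ~1.27 million times a yottabyte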

It’s exactly how we wound up with terabyte drives being so common when, not very long ago, a 30 megabyte drive was a big deal, and that was really only within a generation or so. This relates to Moore’s Law, first observed in 1965 and commonly paraphrased as “the number of transistors in a computer chip doubles every 18 to 24 months.”

What wasn’t stated in the law was that this doubling didn’t just affect the number of transistors, and therefore the number of simultaneous operations, that a chip could perform. It extended to every other aspect of computers. More operations meant more data, so you could either speed up your clocks or widen your data buses (i.e., the number of bits a single chunk of data can carry), or both.

And this is why we’ve seen things like computers going from 8-bit to 16-, 32-, and 64-bit operating systems, memory size ballooning from a few kilobytes to multiple gigabytes, and storage likewise exploding from a handful of kilobytes to terabytes, with commercial petabyte drives perhaps not far off.
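As a toy illustration of how those doublings stack up, here’s the 30-megabyte-to-terabyte jump worked out under an assumed flat doubling every two years (the timeline is illustrative, not a precise history):

import math

# How many doublings take a drive from 30 MB to 1 TB (decimal units)?
start_mb, target_mb = 30, 1_000_000
doublings = math.log2(target_mb / start_mb)

print(round(doublings, 1))   # ~15 doublings
print(round(doublings * 2))  # ~30 years at one doubling every two years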

Perspective: A petabyte drive would hold the entire Library of Congress print archive many times over. It would probably also hold a single print archive plus all the film, audio, and photographic content comfortably as well.

Now, all of this exploding computer technology fuels everything else. A couple of interesting examples: Humans went from the first-ever manned flight of an airplane to walking on the Moon in under 66 years. We discovered radioactivity in 1896 and tested the first atomic bomb in 1945, not quite fifty years later. The transistor was invented in 1947. The silicon chip integrating multiple transistors was devised in 1959, only twelve years later.

And so on. Note, too, that a transistor’s big trick is that it turns old mathematical logic into something that can be achieved by differences in voltage. A logic gate built from a handful of transistors has two inputs and an output, and depending on how the circuit has been designed, it can be set up to do various things based on how the inputs compare.

The beauty of the system comes in stringing multiple gates together, so that one set may determine whether digits from two different inputs are the same or not and pass that info along to a third gate, which may be set to either increment or leave unchanged another value, depending on the info it receives.

Or, in other words, a series of transistors can be set up to perform addition, subtraction, multiplication, or division. It’s something that mechanical engineers had figured out ages previously using cogs and levers and gears, and adding machines and the like were very much a 19th-century technology. But the innovation that changed it all was converting decimal numbers into binary, realizing that the 0 and 1 of binary corresponded perfectly to the “off” and “on” of electrical circuits, then creating transistor circuits that did the same thing those cogs and levers did.
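To make that concrete, here’s a minimal sketch of the idea in Python, with the ^ (XOR), & (AND), and | (OR) operators standing in for gates. This is the standard textbook half adder and full adder, not anything specific to one chip:

# Half adder: two one-bit inputs in, a sum bit and a carry bit out.
def half_adder(a, b):
    return a ^ b, a & b   # XOR gives the sum, AND gives the carry

# Full adder: handles the carry from the previous column, so these
# can be chained to add numbers of any width, like cogs and gears did.
def full_adder(a, b, carry_in):
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, c1 | c2    # a carry can come from either stage

print(half_adder(1, 1))     # (0, 1): 1 + 1 = 10 in binary
print(full_adder(1, 1, 1))  # (1, 1): 1 + 1 + 1 = 11 in binary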

Ta-da! You’ve now turned analog math into digital magic. And once that system was in place and working, every other connected piece developed incredibly over time. Some people focused on making the human interfaces easier, moving from coding in obscure and strictly mathematical languages, often written via punch cards or paper tape, to low-level languages. These still involved a lot of obscure code words and direct entry of numbers (this is where hex, or base 16, came into computing), but they were at least far more intelligible than square holes in a card.

At the same time, there had to be better outputs than another set of punched cards or a series of lights on a readout. And the size of data really needed to be upped, too. With only four binary digits, 1111, the highest decimal number you could represent was 15. Jump it to eight digits, 1111 1111, and you got… 255. Don’t forget that 0 is also included in that set, so you really have 256 values, and voila! The reason 256 is such an important number in computing is revealed.
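The pattern generalizes: n bits give you 2^n distinct values, the largest of which is 2^n − 1. A couple of lines of Python make the point:

# Largest unsigned value for a given bit width: 2**n - 1.
for bits in (4, 8, 16, 32):
    print(bits, 2**bits - 1)  # 4 -> 15, 8 -> 255, 16 -> 65535, 32 -> 4294967295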

Each innovation fueled the need for the next, and so the ways to input and read out data kept improving until we had so-called high-level programming languages, meaning that on a properly equipped computer, a programmer could type in a command in fairly intelligible language, like,

10 X$ = "Hello world."

20 PRINT X$

30 END

Okay, stupid example, but you can probably figure out what it does. You could also vary it by changing line 10 to INPUT X$, in which case the user would get a question mark on screen, and the program would print back whatever they typed.

Oh yeah… at around the same time, video displays had become common, replacing actual paper printouts that had a refresh rate slower than a naughty JPG download on a 1200 baud modem. (There’s one for the 90s kids!) Not to mention a resolution of maybe… well, double digits lower than 80 in either direction, anyway.

Surprisingly, the better things got, the better the next versions seemed to get, and faster. Memory exploded. Computer speeds increased. Operating systems became more intuitive and responsive.

And then things that relied on computers took off as well. Car manufacturers started integrating them slowly at first; these days, your car is run more by computers than by physical controls, whether you realize it or not. Cell phones and then smart phones are another beneficiary, and it was the need to keep shrinking transistors and circuits to fit more of them onto chips in the first place that finally made it possible to stick a pretty amazing computer into a device that fits in your pocket.

Oh yeah… first telephone, 1876. Landline phones were ubiquitous in less than a hundred years and began to be taken over by cell phones, the first of which was demonstrated in 1973 (with a 4.4 lb handset, exclusive of all the other equipment required); affordable handsets didn’t really come along until the late 1990s.

But phones never went away, and then they just exploded in intelligence. Your smart phone now has more computing power than NASA and the Pentagon combined had at the time of the Moon landings.

Hell, that $5 “solar” (but probably not) calculator you grabbed in the grocery checkout probably has more computing power than the guidance computer on the lunar module that made Neil Armstrong the first human on the Moon.

It’s only going to keep getting more advanced and faster, but that’s a good thing, and this doesn’t even account for how explosions in computing have benefited medicine, communications, entertainment, urban planning, banking, epidemiology, cryptography, engineering, climate science, materials design, genetics, architecture, and probably any other field you can think of — scientific, artistic, financial, or otherwise.

We only just began to escape the confines of Analog Ville less than 40 years ago, probably during the mid to late 80s, when Millennials were just kids. By the time the oldest of them were second semester sophomores in college, we had made a pretty good leap out into Digital World, and then just started doubling down, so that two decades into this century, the tech of the turn of the century (that’d be 2000) looks absolutely quaint.

Remember — we had flip phones then, with amazing (cough) 640×480 potato-grade cameras.

Compare that to 1920 vs. 1900. A few advances, but not a lot. The only real groundbreaker was that women could now vote in the U.S., but that wasn’t really a technological advance, just a social one. And if you look at 1820 vs. 1800, or any earlier twenty-year gap, things would not have changed much at all, except maybe in terms of fashion, who the current monarchs were, or which countries you were currently at war with.

And that, dear readers, is how exponential change works, and why technology will continue to grow in this manner. It’s because every new innovation in technology sows the seeds of both the need for and the inevitability of its next round of advancement and acceleration.

We pulled the genie out of the bottle in 1947. It’s never going back in.
