The return of Friday free-for-all #88: Tech, retirement, food

Friday free-for-all is back for 2022! Here are some more random questions.

Happy New Year! Now it’s time to go back to our more regular schedule, so here’s the next in an ongoing series in which I answer random questions generated by a website. Feel free to give your own answers or ask your own questions in the comments.

Does technology simplify life or make it more complicated?

Yes.

It actually does both, and it all comes down to the human user. If you know how to use the technology and do it properly, it can greatly simplify life. If you never bother to learn how to use it the right way or all the tricks and tips to making it work for you, then it will complicate your life.

I have seen this in every office job I’ve ever had. Hell, any job that involved computers, which has been all of them, not to mention copiers, fax machines back in the day, any telephone more complicated than the buttons needed to dial, smart phones now, and so on.

I’ll also toss in VCRs, DVD players, modern TVs, and anything with a built-in clock that sometimes needs to be set.

If I walk into your place and the time on your VCR is flashing “12:00,” I’m going to judge you — first for not figuring out how to do one of the simplest things on a VCR, and second for still having a goddamn VCR. (It’s sitting right next to your turntable for your vinyl collection, isn’t it? Fucking hipster.)

Of course, I’ll also notice this if your stove or microwave is either flashing “12:00” or the time is arbitrarily off by any number of hours or minutes.

It’s just that most things nowadays auto-adjust themselves for the beginning and ending of Daylight Saving Time as well as reset after a power outage or battery removal/shut down. Hell, I’ve got an alarm/sleep-sound generator that has to be about fifteen years old by now, and even it self-adjusts for DST.

But, beyond that, if you’re going to be using software, take the time to learn how to get it to do what you want. Another way I judge people’s skills is by looking at a Word document from them, then seeing if it’s set for the default font (Calibri — ech!) with the useless BS paragraph settings of 1.5 lines and 10 pts before or after each or, worse, both.

Also, they tend to never turn off the automatic double-space after a period, which is absolutely useless and wrong, or the automatic superscripting of abbreviated ordinals — st, nd, rd, and th.

Whenever I have to update or reinstall Word, these are the first things I change. In my case, it’s usually Times New Roman 12 pt, single lines, and no forced paragraph spacing, and that stupid two spaces after a period goes right off, along with the superscripts.

There are ways to tell in Excel as well, which mostly revolve around word wrapping (as in turned off) and number formatting (as in whatever the cell defaulted to). There’s also a definite lack of complicated formulae, so that someone might enter =A1+B1+C1+D1+E1+F1 in a cell instead of =SUM(A1:F1).

This all falls under the category of “Tell me that you never learned to use this software properly without telling me you never learned to use this software properly.”

It has become fun, though, to watch people in Zoom meetings edit a Word doc on their screen and see that they only know one (tedious) way to do it: type the stuff, highlight it, then find the right ribbon at the top of the screen in order to apply whatever format you’re trying to apply.

It’s really not that hard to memorize the essential shortcut keys, which has the great advantage of not interrupting your typing flow. If I want to go bold mid-sentence, I don’t have to do the highlight, pick from the menu, and click BS. I can literally turn it on and off with a two-key combo (Ctrl+B).

Incidentally, it seems like the higher up someone is, the less they know about how to use technology — or when not to. Most of the productivity software they pick (and I’ve dealt with this for years) actually makes it harder for the team to function, not easier.

So, as with a lot of things, what you get out of technology is what you put into it. Bother to learn it and it will reward you. Shirk off, and you’ll wind up hating it.

When do you want to retire, and what do you want to do after you retire?

Well, being a creative person, this is kind of a trick question. When it comes to working at being creative, writing every day, and so on, then I am never going to retire. I am going to do this until the day they have to pry my keyboard from under my cold, dead hands.

As for when would I want to retire from selling my time to someone else for money, that’s also going to be a while. First, I do like the money coming in, and right now it’s for what I’d be doing anyway. It’s also nice to be able to work remotely so that I could theoretically live anywhere in the world as long as I had an internet connection.

I’m probably going to be doing the working for someone else thing for as long as they’ll have me or until I win enough in a lottery to be able to buy a modest home somewhere and cover my living expenses for thirty or forty years (with other retirement contingencies padding that out.)

As for what I’d like to do after I retire, the big thing would be to expand my creativity, since I’d finally have the time to get back to graphic arts and design, music, and video production — all of which are very time-consuming — but all of which could also come together into one big project or a series of projects written, directed, filmed, edited, scored, and produced by me.

Oh — on top of being time-consuming, they’re also very expensive, unless you luck into a good prosumer editing program with regular and cheap updates (which I did), and your ancient graphics editing software continues to be compatible with newer computers (which it finally didn’t).

The one advantage to having used the latter for so long, by the way, was that I was having to figure out how to do things that had long since been turned into new functions in later versions, like auto-masking foreground objects, color matching, and so forth.

What food do you absolutely hate?

I know that you’re probably expecting something specific, like brussels sprouts, but that’s not what I’m going to list. I mean, I could rattle off green beans, string beans, beets, cauliflower, olives (black or red), most fruit that hasn’t been turned into juice or jelly (it’s a texture thing), and definitely melons of all kinds.

But that’s not what I’m going to list here.

No. The food that I absolutely hate is any kind of “dare you to like” culinary bullshit that oozes out of the fetid taste of some pretentious chef (especially of the celebrity kind) and particularly if the word “gastro” appears anywhere in the name of the establishment and/or on the menu.

If I see a place advertised as a gastro-pub, I run the other way for two reasons. One, I know that I’m not going to like the food at all. Two, I know that I’m not going to like the people who do.

These chefs have an amazing ability to take classic fare and absolutely ruin it. Just searching at random, I found one place offering a “Ruben” sandwich (it’s actually Reuben) that pays lip service to shaved pastrami, coleslaw, and horseradish, but then uses something called “sour cherry Dijon mustard,” which is exactly the abomination it sounds like, and then, instead of putting it on rye, uses something called “townie focaccia,” which is exactly the wrong kind of bread.

And, trust me, nobody can fuck up a good cheeseburger like one of these gastrolls can. They’ll either seem to be going along normally until the last ingredient, which makes it inedible — like you’re reading along and it sounds great until they add mint-infused Thai peanut sauce reduction — or it just goes south from the beginning, through everything and the kitchen sink on top of that poor, innocent meat.

Avoid places that use terms like infusion, reduction, sous vide, sea salt, compote, or jam or jelly in connection with anything not normally made into either. Also, find out whether they ever use liquid nitrogen while “cooking,” because this is a huge red flag.

I think the only reason that these gastrochefs pull this shit is because they hate really rich people and want to play Emperor’s New Clothes with them constantly. There’s probably a constant gambling pool going on in the kitchen, too — whoever can concoct the most disgusting combination and not only get people in the restaurant to eat it and say they love it but to get a good review from a food critic for that item wins the entire pot for that week.

I’m probably not wrong, but I’m definitely not eating their shit.

Friday Free-for-All #83: Double tech, impact, business

More random internet questions: Technology in education and in general, the pros and cons of IQ tests, and my ideal business

Here’s the next in an ongoing series in which I answer random questions generated by a website. Here are this week’s questions. Feel free to give your own answers or ask your own questions in the comments.

How can technology improve education? Can it hurt education?

Technology, when it comes to education, is a double-edged sword. On the one hand, it makes access to information so much easier. When I finished high school, we did have computers, but the internet was only just starting to develop, and it hadn’t gotten far enough while I was in college to make it all that useful.

Unfortunately, my school was not among the first six to receive a .edu domain and hence email for everyone. Hell, they hadn’t even received one by the time the 90s rolled around and I can’t remember the first time I got an email begging for money to which I wanted to reply, “Bitches, I already gave you enough that I’m still paying off, so just back off.”

But… technology is a double-edged sword in education because on the one hand, it can make learning and research too easy and, on the other hand, it can make learning and research too easy.

If that sounds like a contradiction, it’s not. These are two sides of the coin, which means that no matter how advanced technology gets or how integrated it becomes with our educational experience, it will always need experienced, trained humans to guide students through it.

On the positive side, if you need to look up quick facts about things, you can now do it in seconds from your own home or phone, with no need to make a trip to the library or pull that encyclopedia volume off the shelf. Or, right, no one has had a set of encyclopedias in their home since maybe the mid-90s.

But note that this only refers to simple facts, and you still have to be wary of your sources. Remember: Anyone can edit Wikipedia, and if you happen to hit the page on Edgar Allan Poe during that ten minutes when some joker added American Psycho to his list of works and you don’t know better, you’re going to wind up with a bad book report.

Not to say that Poe wouldn’t have written American Psycho if he lived in the last century, but he didn’t.

But with a proper teacher, or a curated and guided site, like Khan Academy, then education can be in very good hands via technology. But don’t fall prey to Prager “University” or its ilk, because that is the opposite of education. It’s indoctrination.

The other great benefit technology offers for education is the one we’ve seen over the last year and a half plus: Remote learning. It’s a boon for parents who also work from home, very helpful for children who might otherwise have issues interacting in person, and can also allow for parental involvement in their child’s educational process, which is very important to learning.

Well, as long as the parents just shut up and defer to the teacher. Because the dark side of that is what we’ve seen in contentious school board meetings with angry and misguided parents protesting everything from mask and vaccine mandates to the actual content and curriculum, often with violent threats. And, yeah. We don’t need that shit. That helps no one.

Another downside to learning remotely can be social isolation. However, like it or not, the current generation of kids born in the last decade is probably going to grow up knowing as many people online as they know in real life, if not more, and will probably never meet most of them in person.

And you know what? That’s fine. I’ve been living in that world for at least the last 25 years or so, and since everyone has now been exposed to life via Zoom in the last 19 months, that’s only going to become more normal.

Then again, I was regularly doing video remote meetings fifteen years ago in a ridiculously high-tech room in which an entire wall was covered in super hi-def video screens, and we would have live meetings with the staff of our co-production company. We were in Glendale, U.S., and they were in Bristol, UK, but that tech created one long boardroom table that all of us sat around.

Okay, sure. No one had the tech or bandwidth to do this at home at that time, but just look at us now, and imagine where technology will be in another ten or twenty years.

Can it help education? Oh, hell yeah. But with one gigantic caveat. We will still and always need educators to keep rein on the tech to make sure that bad information is not leaking through. We will always need teachers no matter how well we think that our AI can teach.

And that is why this should become one of the most highly paid professions before the end of 2022.

Is there a limit to what humans can create through technology and science?

Of course there are, and those limits are written into the universe itself. We can never create a system that will propel anything with mass faster than light-speed — although we may be able to figure out how to travel through space without moving through space, effectively creating a warp drive or fold that will get us from point A to B without violating the universal speed limit.

We will probably also never be able to negate the force of gravity because it doesn’t seem to be a force mediated by a field or particle, but rather an intrinsic property of space and time. We might be able to manipulate space via achieving some sort of control over matter, and hence being able to concentrate gravity, though.

But this is all getting into Kardashev scale territory, which ranks a society based on how much energy they are able to exploit. We’re close to but not quite at Type I, which is harnessing all of the energy that reaches a planet from its star but, of course, all of the energy on our planet came from the Sun in the first place.

If we want to get to Type II, we’d need to harness all of the energy of our Sun, which would mean surrounding it in something like a Dyson Sphere, although this would be bad for planets that we don’t hook up to this energy boon. Remember: We’re only getting a little cone of sunlight that only hits half of our planet at a time. A sphere capturing everything would increase that power output enormously because it would expand that tiny cone to include the entire surface and circumference of the Sun in three dimensions.

Imagine the difference between shining the light and heat of an incandescent bulb through a small hole punched in a piece of cardboard, and then imagine the light and heat created if you surrounded that bulb with a spherical screen that was entirely mirrored on the inside.
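To put rough numbers on that cone, here’s a quick back-of-the-envelope sketch in Python. The constants are rounded, but the punchline holds: Earth intercepts less than a billionth of the Sun’s total output.

import math

L_SUN = 3.8e26      # total solar output, in watts (rounded)
R_EARTH = 6.37e6    # Earth's radius, in meters
AU = 1.5e11         # Earth-Sun distance, in meters

# Earth presents a disc of pi * r^2 to the Sun, out of a whole sphere
# of 4 * pi * d^2 at our distance. That ratio is the "little cone."
fraction = (math.pi * R_EARTH**2) / (4 * math.pi * AU**2)

print(f"Fraction of sunlight Earth catches: {fraction:.1e}")   # ~4.5e-10
print(f"Power Earth catches: {L_SUN * fraction:.1e} watts")    # ~1.7e17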

To get to Type III, good luck — you have to harvest all of the energy available in your own galaxy, which would probably make your galaxy go dark to the rest of the universe and might be a dead giveaway. Then again, if you can harness the energy of an entire galaxy, I don’t think that any non-Type III society would be a threat at all.

Kardashev never postulated a Type IV, but that society would be able to harness the power of the entire universe, although what they could actually do with it would be questionable. Maybe they could accelerate a ship with substantial mass to 99.99% the speed of light, but given universal distances, that would still be incredibly slow and, unless all that extra energy can somehow greatly extend the lifespans of organic creatures, it seems a useless party trick, really.

Still, there’s reason for optimism. Earth right now is at the Type 0 level, but we’re only a century or two away from Type I if we keep trying and, literally, aiming for the stars.

What single event has had the biggest impact on who you are?

I’ve discussed this before but, ultimately, it was that fucking IQ test my school gave me when I was seven, and I can’t believe that this bullshit persisted for so long. The long and short of it was that IQ tests were created early in the 20th century as yet another facet of institutional racism, founded in the ridiculous theory that some races were not as smart as others.

Of course, when your race is creating the test, you can skew it to prove whatever you want it to prove, and that is exactly what IQ tests in western white society did.

If you were white and middle class, the whole thing was biased to fit right into your experiences, which is something you weren’t supposed to notice when you were seven years old, which is when they tested us.

And, surprise, surprise… little white kids tended to test much higher, while little black kids didn’t. It had nothing to do with intelligence, and everything to do with the test designers putting their thumb on the scale in favor of… well, you know.

Of course, by the time I took it, the melting pot of America had at least stirred enough that two other groups also did very well on the tests — Jews and Asians — but that was mostly a byproduct of decades and multiple generations of assimilation.

Their success would have driven the originators of the tests nuts, but it was actually a good thing, because I wound up with most of my friends through school being Asian and Jewish.

And how did that happen? Simple. The IQ test was pretty much a filing system for students, and what was determined in that short period of time in first grade changed everything that came after.

I happened to land in the “Profoundly Gifted” category, and that launched me into the school track that actually stuck me in the “Hey, y’all are super-privileged” slot. The two problems were that, for one thing, I didn’t know this and, for the other, while I may have been super-privileged in school, my parents were not super-connected, so it really didn’t advantage me much at all.

Maybe that’s a good thing, though. Because I always identified more with those other kids — the Jews and the Asians — and set out to be an artist. But because of that test, I wound up having the friends I did, making the choices I did, and never really bonding with other classmates who weren’t on my same track.

These were the ones who scored below me and here’s a funny thing. The ones who wound up in kind of the average track were also the ones who landed, in their adult lives, in conservative-ville and, sadly, are still living there. And I won’t say that it’s because they’re stupid. It’s just that they were steered in a direction that gave them less of an advantage in education.

I can only imagine what would have been different if I had tested three or four levels below where I landed. But everything that came later was born out of that, for good or for ill.

If you opened a business, what kind of business would it be?

This question has been on the list a long time, but every time it’s come up I’ve tended to ignore it. I guess I might as well answer. I don’t know why I was avoiding it because the answer was always obvious to me. Maybe it just seemed too simple or obvious to state.

Assuming unlimited funds, I would open a non-profit theatre center dedicated to three things: education in all aspects of theatre — writing, performance, directing, design, tech, etc.; development of new artists and their works; and actual production of those works, in several variations.

I imagine the place as having one large mainstage producing new works and pushing the boundaries of theatre technology and content, but there would also be a studio space which would produce the works of the students as a part of their training. You can’t really learn about theatre, after all, until you do productions, and it would work a lot like a university theatre program — everybody does everything at least once, whether it’s in their emphasis or not.

Ideally, the mainstage productions finance the training and studio. Attached to both would be the development labs, designed for writers and creators, with the student actors and dancers available for developmental workshops. These would eventually lead to productions in the studio space, with selected works possibly moving on to the mainstage or pitched to regional theatres.

It would take a space about the size of the L.A. Theatre Center, although maybe not quite as many stories. That building downtown is five stories up and five stories down, although the public usually only sees three of them, plus a small part of the first basement, which is where the restrooms are, located in what was originally a vault when the building was a bank.

In case you’re wondering, yes theatres do need that kind of height, although five stories is a bit unusual. In the case of LATC, it’s because of the way the theatres are arranged, with one of them actually being partly underground and going up several stories. Meanwhile, the largest house has a very steep audience section, and the smaller space that’s on the second level of the building itself goes up a couple of stories.

What you don’t see is what’s above those theatres, several of which have so-called fly-space, which has to be at least as high as the stage itself. That’s because these spaces hold set pieces or flats that are lowered onto the stage when needed. That process of lowering in theatre is also known as “flying in,” hence “fly space,” because these flats and such are often just referred to as flies.

Above the entire theatre, you can also find the light grids, which is where most of the various units that will be illuminating the stage will be living. This includes not only lights, but projectors, although any of these can also be located on the stage and in the wings.

So, anyway, it can be quite easy to wind up with a large theatre that goes up five stories, even if part of that space is also below ground. Surrounding the unused space above the public second floor were offices, costume shops, and various rehearsal spaces.

Meanwhile, downstairs was where they kept the prop and scenery shops, the dressing rooms, and so on. The first basement was also the floor accessible via the truck ramp off the back alley that led to the elephant doors. You need big doors to get big set pieces in and out.

Of course, even smaller theatres can have a bit of height and depth to them. When I worked at the El Portal with ComedySportz, we only had two theatres. One was about 300 seats, and the other was 49. But the main stage probably took up three stories as well, at least, although it was probably closer to four, because I don’t think people realized that the front end of the house after going down the various steps past the audience seats was actually a full story below street level.

And something even I didn’t know until one night when I was the last one out and the alarm system told me that there was a door ajar somewhere in the bowels of the place — the building had not one but two basements below the basement that was behind the platform under the stage itself.

It had the typical dressing rooms and storage and such, but the way it was designed, you just had to walk all the way through one level to take the steps down to the next. It was kind of a labyrinth in that regard. At least there was no Minotaur, and I didn’t have to leave a thread to find my way back out.

But I do digress. A building with the footprint size and height of the El Portal would actually be perfect for my imagined theatre center, although I would make damn sure that the offices on the second and third floors had windows. That, and go for a much more mid-century modern/futurist design aesthetic, rather than attempted 1890s brothel.

Oh — and parking. The place would have to have plenty of parking on site for students, staff, and guests, with students and staff having designated spots and permits, and guests never having to pay. I guess that might add a couple of stories to it, or we could just use the front half of the basement levels for parking and the back half for all of the dressing room and design space.

But, sadly, it’s one of those dreams only achievable with a major lottery win or some other sudden pot of gold moment.

Wonderous Wednesday: 5 Things that are older than you think

A lot of our current technology seems surprisingly new. The iPhone is only about fourteen years old, for example, although the first BlackBerry, a more primitive form of smart phone, came out in 1999. The first actual smart phone, IBM’s Simon Personal Communicator, was introduced in 1992 but not available to consumers until 1994. That was also the year that the internet started to really take off with people outside of universities or the government, although public connections to it had been available as early as 1989 (remember CompuServe, anyone?), and the first experimental internet nodes were connected in 1969.

Of course, to go from room-sized computers communicating via acoustic modems along wires to handheld supercomputers sending their signals wirelessly via satellite took some evolution and development of existing technology. Your microwave oven has a lot more computing power than the system that helped us land on the moon, for example. But the roots of many of our modern inventions go back a lot further than you might think. Here are five examples.

Alarm clock

As a concept, alarm clocks go back to the ancient Greeks, frequently involving water clocks. These were designed to wake people up before dawn, in Plato’s case to make it to class on time, which started at daybreak; later, they woke monks in order to pray before sunrise.

From the late Middle Ages, church towers became town alarm clocks, with the bells set to strike at one particular hour per day, and personal alarm clocks first appeared in 15th-century Europe. The first American alarm clock was made by Levi Hutchins in 1787, but he only made it for himself since, like Plato, he got up before dawn. Antoine Redier of France was the first to patent a mechanical alarm clock, in 1847. Because of a lack of production during WWII due to the appropriation of metal and machine shops to the war effort (and the breakdown of older clocks during the war), alarm clocks became one of the first consumer items to be mass-produced just before the war ended. Atlas Obscura has a fascinating history of alarm clocks that’s worth a look.

Fax machine

Although it’s pretty much a dead technology now, the fax machine was the height of office high tech in the 80s and 90s. Today, you’d be hard pressed to find a fax machine that isn’t part of the built-in hardware of a multi-purpose networked printer, and that’s only because it’s such a cheap legacy feature to include. But it might surprise you to know that the prototypical fax machine, originally an “Electric Printing Telegraph,” dates back to 1843.

Basically, as soon as humans figured out how to send signals down telegraph wires, they started to figure out how to encode images — and you can bet that the second image ever sent in that way was a dirty picture. Or a cat photo.

Still, it took until 1964 for Xerox to finally figure out how to use this technology over phone lines and create the Xerox LDX. The scanner/printer combo was available to rent for $800 a month — the equivalent of around $6,500 today — and it could transmit pages at a blazing 8 per minute. The second generation fax machine only weighed 46 lbs and could send a letter-sized document in only six minutes, or ten pages per hour. Whoot — progress!

You can actually see one of the Electric Printing Telegraphs in action in the 1948 movie Call Northside 777, in which it plays a pivotal role in sending a photograph cross-country in order to exonerate an accused man.

In case you’re wondering, the title of the film refers to a telephone number from back in the days before what was originally called “all digit dialing.” Up until then, telephone exchanges (what we now call prefixes) were identified by the first two letters of a word, and then another digit or two or three. (Once upon a time, in some areas of the US, phone numbers only had five digits.) So NOrthside 777 would resolve itself to 667-77, with 667 being the prefix. This system started to end in 1958, and a lot of people didn’t like that.

Of course, with the advent of cell phones, prefixes and even area codes have become pretty meaningless, since people tend to keep the number they had in their home town regardless of where they move to, and a “long distance call” is mostly a dead concept now as well, which is probably a good thing.

CGI

When do you suppose the first computer animation appeared on film? You may have heard that the original 2D computer generated imagery (CGI) used in a movie was in 1973 in the original film Westworld, inspiration for the recent TV series. Using very primitive equipment, the visual effects designers simulated pixelation of actual footage in order to show us the POV of the robotic gunslinger played by Yul Brynner. It turned out to be a revolutionary effort.

The first 3D CGI happened to be in this film’s sequel, Futureworld in 1976, where the effect was used to create the image of a rotating 3D robot head. However, the first ever CGI sequence was actually made in… 1961. Called Rendering of a planned highway, it was created by the Swedish Royal Institute of Technology on what was then the fastest computer in the world, the BESK, driven by vacuum tubes. It’s an interesting effort for the time, but the results are rather disappointing.

Microwave oven

If you’re a Millennial, then microwave ovens have pretty much always been a standard accessory in your kitchen, but home versions don’t predate your birth by much. Sales began in the late 1960s. By 1972 Litton had introduced microwave ovens as kitchen appliances. They cost the equivalent of about $2,400 today. As demand went up, prices fell. Nowadays, you can get a small, basic microwave for under $50.

But would it surprise you to learn that the first microwave ovens were created just after World War II? In fact, they were the direct result of it, due to a sudden lack of demand for magnetrons, the devices used by the military to generate radar in the microwave range. Not wanting to lose the market, their manufacturers began to look for new uses for the tubes. The idea of using radio waves to cook food went back to 1933, but those devices were never developed.

Around 1946, engineers accidentally realized that the microwaves coming from these devices could cook food, and voilà! In 1947, the technology was developed, although only for commercial use, since the devices were taller than an average man, weighed 750 lbs, and cost the equivalent of $56,000 today. It took 20 years for the first home model, the Radarange, to be introduced for the mere sum of $12,000 of today’s dollars.

Music video

Conventional wisdom says that the first music video to ever air went out on August 1, 1981 on MTV, and it was “Video Killed the Radio Star” by The Buggles. As is often the case, conventional wisdom is wrong. It was the first to air on MTV, but the concept of putting visuals to rock music as a marketing tool goes back a lot farther than that.

Artists and labels were making promotional films for their songs back at almost the beginning of the 1960s, with the Beatles a prominent example. Before these, though, was the Scopitone, a jukebox that could play films in sync with music popular from the late 1950s to mid-1960s, and their predecessor was the Panoram, a similar concept popular in the 1940s which played short programs called Soundies.

However, these programs played on a continuous loop, so you couldn’t choose your song. Soundies were produced until 1946, which brings us to the real predecessor of music videos: Vitaphone Shorts, produced by Warner Bros. as sound began to come to film. Some of these featured musical acts and were essentially miniature musicals themselves. They weren’t shot on video, but they introduced the concept all the same. Here, you can watch a particularly fun example from 1935 in 3-strip Technicolor that also features cameos by various stars of the era in a very loose story.

Do you know of any things that are actually a lot older than people think? Let us know in the comments!

Photo credit: Jake von Slatt

More stupid Excel tricks: A secret power of IF

The hardest part about working with data, especially in large sets, is the people who input it in the first place. They make it so difficult because they’re inconsistent, not only in their day-to-day habits, but between one or more different people all entering info into the same database.

When you’re creating something solely for yourself, then by all means be as inconsistent or idiosyncratic as you want. But if it’s a group project creating information that someone like me is going to have to derive useful information from at some point in the future, inconsistency can make my job infinitely more difficult.

This is the reason why things like style guides were created — and they don’t just exist for the written word. Accounting and data management have their own style guides. So does computer programming, although that field has the advantage, because the program itself won’t let you get it wrong. Excel is the same way, although it won’t always tell you how to make it right.

Little things can cause problems and cost a business money. Sally may prefer to spell out words in addresses, like Avenue or Boulevard, while Steve likes to abbreviate with Ave or Blvd. Sam is also big on abbreviations, but always with periods. Seems innocuous, doesn’t it?

It does until the only way to make sure that a massive mailing doesn’t go to the same household at the same address twice is to compare the addresses to each other. That’s because, to a computer, 1234 Main Street, 1234 Main St, and 1234 Main St. are all completely different addresses. There’s no easy way to fix that because computers don’t have a “kinda sorta look the same” function.
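As a minimal sketch of what the fix looks like, here’s the kind of normalization pass (in Python) that has to happen before comparing. The abbreviation table is purely illustrative; real address hygiene would use something like the full USPS standard.

# Map common suffix variants to one canonical form before comparing.
# This tiny table is illustrative only; a real cleanup needs far more.
SUFFIXES = {"street": "st", "st.": "st", "avenue": "ave", "ave.": "ave",
            "boulevard": "blvd", "blvd.": "blvd"}

def normalize(address: str) -> str:
    words = address.lower().replace(",", " ").split()
    return " ".join(SUFFIXES.get(word, word) for word in words)

variants = ["1234 Main Street", "1234 Main St", "1234 Main St."]
print({normalize(v) for v in variants})   # one address left: {'1234 main st'}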

Garbage in, garbage out

It’s also important that a database be designed properly. For example, names should always be entered as separate units — title/prefix, first name, middle name, last name, suffix. They can be combined later when necessary. A lot of good databases do this, but it’s completely worthless if somebody enters the first and middle names in the first name field or adds the suffix to the last name. You may have heard the expression “garbage in, garbage out,” and this is a prime example of that. All of the right fields were there, but if used improperly, it doesn’t matter.

Of course, the proper fields aren’t always included. One example I had to wrestle with recently in a former career was a database showing the various insurance policies people had with the agency. Now, that is useful and necessary information, as well as something that legally needs to be maintained. And it’s all right that a person gets one row of data for each policy that they’ve had. Some people will have one or two rows, others might have a dozen or more.

So what’s the problem? This: There are no data flags to indicate “this is the policy currently in effect.” This was doubly complicated since it’s Medicare-related health insurance, so someone can have up to two active policies at a time, one covering prescription medications and the other a Medicare supplement. Or a policy may have expired after they decided to drop an MAPD and go back to “original” Medicare, but the only way to know that is to look for an ending or termination date — if it was ever entered.

The secret power of “IF”

This is where one of my “stupid Excel tricks” came into it. You may or may not be familiar with some of the numeric functions dealing with columns or rows of numbers, but they basically operate on a whole range. They include functions like SUM, MAX, MIN, and AVERAGE. The usual usage is to apply them to a defined range or series of cells, and they have no operators, so you get things like:

=SUM([Range])
=MAX([Range])
=MIN([Cell1],[Cell2],[Cell3],...[Cell(n)])

Here’s the fun trick, though. If you add one or more “IF” statements within any of these functions, you can perform the operation on a sub-range of data defined by certain criteria. In the example I’m giving, it would look at all of the insurance effective dates for one person and determine the most recent one, which is usually a good indicator of which policy is in effect.

Generally, each item you’re evaluating is in the form of [DataRange]=[CellValue], or in actual Excel terminology, it might look like “$A$1:$A$470=A12” for the version entered in row 12. After the criteria ranges, you enter the range that you want to perform the operation on, close out the parentheses, then enter.

So let’s say that we have last name in column B, first name in column D, and the dates we want to look at to find the latest are in column N. Our formula would look like this, assuming that the first row has the field headers and the data starts in row two:

=MAX(IF($B$1:$B$525=B2,IF($D$1:$D$525=D2,$N$1:$N$525)))

If you’ve entered it right (in older versions of Excel, that means confirming it with Ctrl+Shift+Enter so it calculates as an array formula), the formula should be displaying the right number. In effect, you’ll have created a column down the far right side in which the value opposite any particular person’s name equals the maximum date value, meaning the latest. Then you can do an advanced filter (oh, google it!) to pull out just the unique name data and date, then use that to do an INDEX and MATCH to create a dataset of just the most recent plan and effective date. (I covered those two functions in a previous post.)
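If you’d rather skip the Excel gymnastics entirely, here’s a rough equivalent of the whole dance in Python with pandas. The column names are hypothetical stand-ins for columns B, D, and N:

import pandas as pd

df = pd.DataFrame({
    "last":  ["Smith", "Smith", "Jones"],
    "first": ["Pat", "Pat", "Lee"],
    "effective_date": pd.to_datetime(["2019-01-01", "2021-06-15", "2020-03-01"]),
})

# The MAX(IF(...)) part: latest effective date within each person's group.
df["latest"] = df.groupby(["last", "first"])["effective_date"].transform("max")

# The advanced filter + INDEX/MATCH part: keep each person's newest policy row.
current = df[df["effective_date"] == df["latest"]]
print(current)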

Or… the original database administrator could have just put those current plan flags in the data in the first place, set them to automatically update whenever a newer plan of the same type was added, and voilà! Every step since I wrote “This is where one of my stupid Excel tricks came into it” 396 words ago would have been unnecessary. Time and money saved and problem solved, because there was never a problem in the first place.

The art of improv in Excel

On the other hand… solving these ridiculous problems of making large but inconsistent datasets consistent with as little need as possible to look at every individual record just lets me show off my ninja skills with Excel.

It’s really no different than improv. Badly entered data keeps throwing surprises at me, and I have to keep coming up with new and improved ways to ferret out and fix that bad data. In improv, this is a good thing, and one of our mottos is, “Get yourself in trouble,” because that creates comedy gold as things in the scene either get irredeemably worse or are suddenly resolved. Damn, I miss doing Improv, and long for the days when we can finally return to the stage, which has been seeming even more remote by the day. But I do digress…

Back to the point: In real life, not so much for easy resolution. It’s a pain in the ass to have to fix the curveballs tossed at us by other people’s laziness and lack of diligence — unless we approach it like a game and an interesting challenge. Then, real life becomes improv again in the best sense.

And I’ll find it forever amusing that the same rules can apply to both a spontaneous, unplanned, free-wheeling art form, and an unyielding, rigid, and unforgiving computer program. They both have their rules. Only the latter won’t allow them to be bent. Okay, some improv games have rules that are not supposed to be bent. But half the fun is in bending those rules, intentionally or not.

With Excel and data-bashing, all of the fun is in following Excel’s rules, but getting them to do things they were never intended to.

Image source: Author, sample output from a completely randomized database generator in Excel used to create completely artificial test data for practicing functions and formulae without compromising anyone’s privacy. Maybe I’ll write about this one some day, if there’s interest.

Why astrology is bunk

A lot of people believe in astrology — but not only is there no basis in fact for it, believing in it can be dangerous.

This piece, which I first posted in 2019, continues to get constant traffic and I haven’t had a week go by that someone hasn’t given it a read. So I felt that it was worth bringing to the top again.

I know way too many otherwise intelligent adults who believe in astrology, and it really grinds my gears, especially whenever I see a lot of “Mercury is going retrograde — SQUEEEE” posts, and they are annoying and wrong.

The effect that Mercury in retrograde will have on us: Zero.

Fact

Mercury doesn’t “go retrograde.” It catches up with and then passes us, since it orbits faster than we do, so it only looks like it’s moving backwards. It’s an illusion, and entirely a function of how planets orbit the sun, and how things look from here. If Mars had (semi)intelligent life, they would note periods when the Earth was in retrograde, but it’d be for the exact same reason.

Science

What force, exactly, would affect us? Gravity is out, because the gravitational effect of anything else in our solar system or universe is dwarfed by the Earth’s. When it comes to astrology at birth, your OB/GYN has a stronger gravitational effect on you than Mars does.
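Don’t take my word for it. Here’s the back-of-the-envelope version in Python, using Newton’s g = G x M / r^2 with rounded figures:

G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2

def pull(mass_kg: float, distance_m: float) -> float:
    # Gravitational acceleration from a mass at a given distance.
    return G * mass_kg / distance_m**2

# An 80 kg obstetrician half a meter away, vs. Mars at its closest (~5.6e10 m).
print(f"Doctor: {pull(80, 0.5):.1e} m/s^2")          # ~2.1e-8
print(f"Mars:   {pull(6.42e23, 5.6e10):.1e} m/s^2")  # ~1.4e-8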

On top of that, the Sun has 99.9% of the mass of our solar system, and mass is what drives gravity, so the Sun has the greatest gravitational influence on all of the planets. We only get a slight exception because of the size of our Moon and how close it is, but that’s not a part of astrology, is it? (Not really. They do Moon signs, but it’s not in the day-to-day.)

Some other force? We haven’t found one yet.

History

If astrology were correct, then one of two things should be true. A) It would have predicted the existence of Uranus and Neptune, and possibly Pluto, long before they were discovered, since astrology goes back to ancient times, but those discoveries happened in the modern era. Or B) It would not have allowed for the addition of those three planets (and then the removal of Pluto) once discovered, since all of the rules would have been set down. And it certainly would have accounted for the 13th sign, Ophiuchus, which, again, wasn’t found until very recently, by science.

So… stop believing in astrology, because it’s bunk. Mercury has no effect on us whatsoever, other than when astronomers look out with telescopes and watch it transit the Sun, and use its movements to learn more about real things, like gravity.

Experiment

The late, great James Randi, fraud debunker extraordinaire, did a classroom exercise that demolishes the accuracy of those newspaper horoscopes, and here it is — apologies for the low quality video.

Yep. Those daily horoscopes you read are general enough to be true for anyone, and confirmation bias means that you’ll latch onto the parts that fit you and ignore the parts that don’t, even though, again, they’re designed to fit anyone. And no one is going to remember the generic advice or predictions sprinkled in. If they do, confirmation bias will kick in again, and they’ll only recall the ones they think came true.

“You are an intuitive person who likes to figure things out on your own, but doesn’t mind asking for help when necessary. This is a good week to start something new, but be careful on Wednesday. You also have a coworker who is plotting to sabotage you, but another who will come to your aid. Someone with an S in their name will become suddenly important, and they may be an air sign. When you’re not working on career, focus on home life, although right now your Jupiter is indicating that you need to do more organizing than cleaning. There’s some conflict with Mars, which says that you may have to deal with an issue you’ve been having with a neighbor. Saturn in your third house indicates stability, so a good time to keep on binge-watching your favorite show, but Uranus retrograde indicates that you’ll have to take extra effort to protect yourself from spoilers.”

So… how much of that fit you? Or do you think will? Honestly, it is 100% pure, unadulterated bullshit that I just made up, without referencing any kind of astrological chart at all, and it could apply to any sign because it mentions none.

Plus, for the record, Uranus actually does appear to go retrograde from the Earth’s point of view for a few months every year, just like the rest of the outer planets. It affects us exactly as much as Mercury’s version does: not at all.

Conclusion

If you’re an adult, you really shouldn’t buy into this whole astrology thing. The only way any of the planets would have any effect at all on us is if one of them suddenly slammed into the Earth. That probably only happened once, or not, but it’s probably what created the Moon. So ultimately not a bad thing… except for anything living here at the time.

Wednesday Wonders: Adding depth

In April 1953, the first-ever experimental broadcast of a TV show in 3D happened, via KECA-TV in Los Angeles. If those call letters don’t sound familiar to any of my Southern California audience, that’s because they only lasted for about the first four-and-a-half years of the station’s existence, at which point they became the now very familiar KABC-TV, the local ABC affiliate also known as digital and broadcast channel 7.

The program itself was a show called Space Patrol, which was originally a 15-minute program that was aimed at a juvenile audience and aired daily. But once it became a hit with adults, ABC added half-hour episodes on Saturday.

Remember, at this point in television, they were at about the same place as internet programming was in 2000.

By the way, don’t confuse this show with the far more bizarre British production of 1962 with the same name. That one was done with marionettes, and judging from this promotional trailer for a DVD release of restored episodes, it was incredibly weird.

Anyway, because of its subject matter and popularity, it was a natural for this broadcast experiment. This was also during the so-called “golden age” of 3D motion pictures, and since the two media were in fierce competition back in the day, it was an obvious move.

Remember — at that time, Disney didn’t own ABC, or anything else. In fact, the studios were not allowed to own theaters, or TV stations.

The original 3D broadcast was designed to use glasses, of course, although not a lot of people had them, so it would have been a blurry mess. Note that color TV was also a rarity, so the glasses would have been polarizing lenses rather than the red/blue type possible in movies.

Since it took place during the 31st gathering of what was then called the National Association of Radio and Television Broadcasters (now just the NAB) it was exactly the same as any fancy new tech rolled out at, say, CES. Not so much meant for immediate consumption but rather to wow the organizations and companies that could afford to develop and exploit it.

Like pretty much every other modern innovation in visual arts and mass media, 3D followed the same progression through formats: still photography, motion pictures, analog video and broadcast, physical digital media, streaming digital media.

It all began with the stereoscope way back in 1838. That’s when Sir Charles Wheatstone realized that 3D happened because of binocular vision, and each eye seeing a slightly different image, which the brain would combine to create information about depth.

Early efforts at putting 3D images into motion were akin to later animated GIFs (hard G, please), with just a few images repeating in a loop.


While a system that projected separate images side-by-side was patented in 1890, it was too cumbersome to be practical, and the first commercial test run with an audience came in 1915, with a series of short test films using a red/green anaglyph system. That is, audience members wore glasses with one red and one green filter, and the two images, taken by two cameras spaced slightly apart and dyed in the appropriate hues, were projected on top of each other.

The filters sent each of the images to a different eye and the brain did the rest, creating the illusion of 3D, and this is how the system has worked ever since.

The first actual theatrical release in 3D premiered in Los Angeles on September 27, 1922. It was a film called The Power of Love, and it screened at the Ambassador Hotel Theater, the first of only two public showings.

You might think that 3D TV took a lot longer to develop, since TV had only been invented around this time in 1926, but, surprisingly, that’s not true. John Logie Baird first demonstrated a working 3D TV set in 1928. Granted, it was an entirely mechanical system and not very high-res, but it still worked.

Note the timing, too. TV was invented in the 1920s, but didn’t really take off with consumers until the 1950s. The internet was created in the 1960s, but didn’t really take off with consumers until the 1990s. You want to get rich? Invest in whatever the big but unwieldy and expensive tech of the 1990s was. (Hint, related to this topic: 3D printing.)

That 30 year repeat happens in film, too. As previously noted, the first 3D film premiered in the 1920s, but the golden age came in the 1950s. Guess when 3D came back again? If you said the 1980s, you win a prize. And, obviously, we’ve been in another return to 3D since the ‘10s. You do the math.

Oh, by the way… that 30 year thing applies to 3D printing one more generation back as well. Computer aided design (CAD), conceived in the very late 1950s, became a thing in the 1960s. It was the direct precursor to the concept of 3D printing because, well, once you’ve digitized the plans for something, you can then put that info back out in vector form and, as long as you’ve got a print-head that can move in X-Y-Z coordinates and a way to not have layers fall apart before the structure is built… ta-da!

Or, in other words, this is why developing these things takes thirty years.

Still, the tech is one step short of Star Trek replicators and true nerdvana. And I am so glad that I’m not the one who coined that term just now. But, dammit… now I want to go to Tennessee on a pilgrimage, except that I don’t think it’s going to be safe to go there for another, oh, ten years or so. Well, there’s always Jersey. Or not. Is Jersey ever safe?

I kid. I’ve been there. Parts of it are quite beautiful. Parts of it are… on a shitty reality show. Pass.

But… I’d like to add that 3D entertainment is actually far, far older than any of you can possibly imagine. It doesn’t just go back a couple of centuries. It goes back thousands of years. It also didn’t require any fancy technology to work. All it needed was an audience with a majority of members with two eyes.

That, plus performers acting out scenes or telling stories for that audience. And that’s it. There’s your 3D show right there.

Or, as I like to remind people about the oldest and greatest art form: Theatre is the original 3D.

Well, nowadays, the original virtual reality as well, but guess what? VR came 30 years after the 80s wave of 3D film as well, and 60 years after the 50s. Funny how that works, isn’t it? It’s almost like we’re totally unaware that our grandparents invented the stuff that our parents perfected but which we’re too cool to think that any of them are any good at.

So… maybe let’s look at 3D in another way or two. Don’t think of it as three dimensions. Think of it as two times three decades — how long it took the thing to go from idea to something you take for granted. Or, on a generational level, think of it roughly as three deep: me, my parents, and my grandparents.

Talk about adding depth to a viewpoint.

Image licensed by (CC BY-ND 2.0), used unaltered, Teenager wears Real 3D Glasses by Evan via flickr.

Being a basic bit

Zeroes and Ones are the building blocks of what’s known as binary, and the juice that our digital world runs on. Another way to think of it is this. In our so-called Base 10 world, we deal with ten digits, and depending upon what you’re counting, you can either go from 0 to 9 (things) or 1 to (1)0 (dates, place order). Since 9 is the highest digit, when any column hits it, that 9 rolls back to 0 and the digit to its left increments up, appearing as a new 1 if there wasn’t a digit there before.

So after the number 9, we get 10. After 19, we get 20, after 99, it’s 100, and so on. Also note that 100 happens to be 10 x 10, 1,000 is 10 x 100, 10,000 is 100 x 100, etc. This will be important in a moment.

In the binary world, things roll over faster. In fact, the only digits you have are 0 and 1, so counting works like this: start with 0, then 1. But 1 is as high as we can go, so after 1 comes 10, which, in binary, represents 2.
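You can watch that rollover happen with a couple of lines of Python, using its built-in binary formatter:

# Counting 0 through 8, in decimal and in binary.
for n in range(9):
    print(n, format(n, "b"))
# Prints: 0 0, 1 1, 2 10, 3 11, 4 100, 5 101, 6 110, 7 111, 8 1000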

That might seem strange, but here’s the logic behind it, going back to decimal 10. What is 10, anyway? Well, it’s the number that comes after we’ve run out of digits. Since we’re used to base 10, it doesn’t require any explanation to see that 10 always comes after 9. At least in base 10. I’ll get to that in a moment, but first there’s a very important concept to introduce, and that’s called “powers.”

The powers that be

No, I’m not talking Austin Powers. Rather, raising a number to a power just means multiplying the number by itself that many times. In its basic form, you’ll often see X^n. That’s what this means. It’s just a more efficient way of writing things out:

            2 x 2 = 2^2 = 4

            3 x 3 x 3 = 3^3 = 3 x 9 = 27

            10 x 10 x 10 x 10 x 10 = 10^5 = 100 x 100 x 10 = 10,000 x 10 = 100,000

Here’s an interesting thing about powers of 10, though. The end result will always have exactly as many zeros as the exponent, or power that you raised 10 to. 10^9? Simple: 1,000,000,000. If it’s 10^2, 100, and so on.

And the two fun sort of exceptions that aren’t exceptions to keep in mind:

            X^0 x N = N, aka X^0 = 1

            X x 1 = X^1 = X.

10^1 is 10 with 1 zero, or 10; 10^0 is 10 with no zeroes, or 1.

In other words, any number to the power of zero equals 1, and any number to the power of 1 equals itself. And there you go, that’s all you need except for this: When it comes to determining what the power is, we count “backwards” from right to left. The last digit before the decimal takes the 0 power, next to the left is 1, next over from that is 2, and so on.

Everything in its place

Since places correspond to powers, in Base 10 we would call the digits, right to left, the ones, tens, hundreds, thousands, ten-thousands, hundred-thousands, and so on places. In binary, you’d have the ones, twos, fours, eights, sixteens, thirty-twos, etc.

Makes sense? Then let’s look at a four-digit number in Base 10: 1776.

But here’s an interesting trick: in computer logic, it often becomes much easier for the circuits to literally read in the digits backwards in order to do these steps upwards in the proper order. This saves the step of having to figure out how long a string is before assigning the proper power to the most significant digit, which is the last one on the left.

So, to calculate, we’ll count it from right to left, which will make it easier to follow what’s happening. Let’s go with 6771 for ease of use. The 6 is in the zero position, so it represents 6 x 10^0, in which case this is 6 x 1, meaning just plain old 6.

Next, a 7 times 10^1, which is just 10, so this spot is worth 70 and we’re up to 76.

Next, 7 times 10^2, which is 100 times 7. Add that to the rest, it’s now 776.

Finally, a 1 in the spot multiplied by 10^3, which is 10 x 10 x 10, which is 10 x 100, so… 1,000. Slap that on the rest, and there you go: 1776.

This works exactly the same way in any other base. So let’s look at a typical binary number: 1011 1110. As humans, we could deal with doing the whole thing backwards, but again, let’s make it easy for the machine, feed it in right to left, and watch the sequence in action:

Digit (D) 0    1    1    1    1    1    0    1
Power (p) 0    1    2    3    4    5    6    7
2^p       1    2    4    8    16   32   64   128
2^p x D   0    2    4    8    16   32   0    128
SUM       0    2    6    14   30   62   62   190

Or in base three or trinary, let’s look at 21221121, entered again in reverse:

Digit (D) 1    2    1    1    2    2    1    2
Power (p) 0    1    2    3    4    5    6    7
3^p       1    3    9    27   81   243  729  2187
3^p x D   1    6    9    27   162  486  729  4374
SUM       1    7    16   43   205  691  1420 5794
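Here’s the same right-to-left trick from both tables as a short Python sketch, since the logic is identical no matter the base:

def to_decimal(digits: str, base: int) -> int:
    # Read digits from the ones place up, adding digit * base^power.
    total = 0
    for power, digit in enumerate(reversed(digits)):
        total += int(digit) * base ** power
    return total

print(to_decimal("1776", 10))      # 1776
print(to_decimal("10111110", 2))   # 190, matching the binary table
print(to_decimal("21221121", 3))   # 5794, matching the trinary table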

Now, let’s take a look at an interesting property in Base 10 and see if it translates over.

Dressed to the nines

In Base 10, any number divisible by nine also has digits that add up to nine, or to another multiple of nine whose digits you can keep adding until you get down to nine. You can easily see this with the first few pairs of two-digit multiples of nine: 18, 27, 36, 45, 54, and so on. The tens digit goes up by one while the ones digit goes down by one, and that makes perfect sense. Why? Because when you add nine, what you’re really doing is the same as adding 10 and then taking away one.

It doesn’t matter how big the number is. If the digits add up to nine — or reduce to it — then the number is divisible by nine. To just pull a number out of thin air, I guarantee that 83,764,251 is evenly divisible by nine: its digits sum to 36, and 3 + 6 is 9. I could also put any number of nines anywhere in that number and it would still be divisible, or put the digits in any order. And if you have a number that uses all of the digits from 0 to 9 in any order, then it’s divisible by 9, since those digits sum to 45.

So does this property hold for other bases? What about Base 8? In that case, we should expect seven to be the magic number. I’ll spare you the torturing of Excel I did to run the test, but the answer is: Yes. If a number is divisible by seven in Base 8, then its digits add up to seven (or a multiple of it). Here’s the list from the Base 8 equivalent of 1 to 99 (which is 1 to 77): 7, 16, 25, 34, 43, 52, 61, 70, and 77 — that last one’s digits summing to 14, still a multiple of seven. Apart from 7 and 70, none of those numerals read as Base 10 is divisible by seven, but read as Base 8 they all are. Here’s how and why it works.

When you divide a number in Base 10 by 9, you start on the left, figure out how many times 9 goes into that whole number, carry the remainder to the next digit, and repeat the process. So to divide 27 by 9, you start by dividing 20 by 9. This gives you 2 times 9 = 18. Subtract 18 from 20, you get 2. Carry that over to the next place, which is 7, add 2 and 7, you get 9, which is divisible by 9. Add the 2 from the first result to 1, and your answer is 3.

Did you notice anything interesting there? The quotient digit and the remainder came out the same — 2 times 9, with a remainder of 2 carried over to the other digit. And what was the other thing we noticed? That’s right: the sum of the digits is 9, so whatever is left over when you divide the tens digit by 9 has to add to the ones digit to total 9.

This is true in any other base. Let’s look at our Base 8 example of 34. We can’t cheat by converting to Base 10, so the 3 tells us that 7 goes into the number three times. But since 3 times 7 is 3 less than 3 times 8, that’s our remainder. Add that to the 4 to get 7, and boom, done. In Base 8 34/7 = 3+1 = 4. Convert the Base 8 to Base 10 to get 28, and voila… 4 times 7 is 28. The answer is the same either way when you reduce it to a single digit.

A spot check bears this out with other bases, so it would seem to be a rule that for any Base B, any number evenly divisible by B-1 will have digits that add up to B-1 (or a multiple of it). The why is simple once you see it: B leaves a remainder of 1 when divided by B-1, so every power of B does too, which means every digit counts as just itself when you’re checking divisibility by B-1.
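And if you’d rather torture Python than Excel, here’s a quick spot check of that rule (digit_sum is my own helper name):

            def digit_sum(n, base):
                # Sum the digits of n as written in the given base.
                s = 0
                while n:
                    s += n % base
                    n //= base
                return s

            # In any base B, every multiple of B-1 should have a digit sum
            # that is itself a multiple of B-1.
            for base in (8, 10, 16):
                for k in range(1, 500):
                    n = k * (base - 1)
                    assert digit_sum(n, base) % (base - 1) == 0
            print("Rule holds for everything tested.")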

That’s the funny thing about whole numbers and integers. They have periodicity. What they do is predictable. Multiples of any integer will appear at regular intervals, without jumping around, no matter how far toward any particular ∞ you go. Irrational numbers and primes, not so much. But it’s good to know that the basis of digital computing is so reliable and regular. In fact, here’s a funny thing about binary: every single number in it is evenly divisible by 1, because the digits of every single number add up to a number divisible by… 1. And a Base 1 numbering system is impossible, because the only digit available in it would be 0. It also breaks the whole Base rule above, because nothing can be divided by 0. Base 1 is the Black Hole of numbering systems. The rest should be pretty transparent.

Sunday Nibble Extra: Power up

You could say that May 16 can be an electrifying day in history. Or at least a very energetic one. On this day in 1888, Nikola Tesla described what equipment would be needed to transmit alternating current over long distances. Remember, at this time, he was engaged in the “War of the Currents” with that douche, Edison, who was a backer of DC. The only problem with DC (the kind of energy you get out of batteries) is that you need retransmission stations every mile or so. With Tesla’s version, you can send that power a long way down the wires before it needs any bump up in energy.

Of course, it might help to understand in the first place what electric charge is. Here’s Nick Lucid from Science Asylum to explain:

But if you think that electric current flows through a wire like water flows through a pipe, you’re wrong, and there’s a really interesting and big difference between the two, as well as between AC and DC. DC, meaning “direct current,” only “flows” in one direction, from higher to lower energy states. This is why it drains your batteries, actually — all of the energy potential contained therein sails along its merry way, powers your device, and then dumps off in the lower energy part of the battery, where it isn’t inclined to move again.

A simplification, to be sure, but the point is that any direct current, by definition, loses energy as it moves. Although here’s the funny thing about it, which Nick explains in this next video: neither current moves through that wire like it would in a pipe.

Although the energy in direct current moves from point A to point B at the speed of light, the actual electrons wrapped up in the electromagnetic field do not, and their progress is actually rather slow. If you think about it for a minute, this makes sense. Since your battery is drained when all of the negatively charged electrons move down to their low energy state, if they all moved at the speed of light, your battery would drain in nanoseconds. Rather, it’s the field that moves, while the electrons take their own sweet time moving down the crowded center of the wire — although move they do. It just takes them a lot of time because they’re bouncing around chaotically.

As for alternating current, since its thing is to let the field oscillate back and forth from source to destination, it doesn’t lose energy, but it also keeps its electrons on edge, literally, and they tend to sneak down the inside edges of the wire. However, since they’re just as likely to be on any edge around those 360 degrees, they have an equally slow trip. More to the point, what’s really guiding them isn’t so much their own momentum forward as it is the combination of electricity and magnetism. In AC, it’s a dance between the electric field in the wire and the magnetic field outside of it, which is exactly why the current seems to wind up in a standing wave between points A and B without losing energy.

I think you’re ready for part three:

By the way, as mentioned in that last video, Ben Franklin blew it when he defined positive and negative, but science blew it in not changing the nomenclature, so that the particle that carries electrical charge, the electron, is “negative,” while we think of energy as flowing from the positive terminal of batteries.

It doesn’t. It flows backwards into the “positive” terminals, but that’s never going to get fixed, is it?

But all of that was a long-winded intro to what the Germans did on this same day three years later, in 1891. It was the International Electrotechnical Exhibition, and they proved Edison dead wrong about which form of energy transmission was more efficient and safer. Not only did they use magnetism to create and sustain the energy flow, they used Tesla’s idea of three-phase electric power — the same thing humming through the long-distance lines that feed your neighborhood today. (The three prongs on your home outlets, frequently in an unintended smiley face arrangement, are a different trio: hot, neutral, and ground.)

Twelve years later, Edison would film the electrocution of an elephant in order to “prove” the danger of AC, but he was fighting a losing battle by that point. Plus, he was a colossal douche.

Obviously, the power of AC gave us nationwide electricity, and it kept the telegraph network — in effect the great-grandparent of the internet — humming along, even though the earliest telegraphs ran on battery DC. Later on, things sort of went hybrid, with the external power for landlines coming from AC power, but that getting stepped down and converted to operate the internal electronics via DC.

In fact, that’s the only reason that Edison’s version wound up sticking around: the rise of electronics, transistors, microchips, and so on. Powering cities and neighborhoods and so on requires the oomph of AC, but dealing with microcircuits requires the “directionality” of DC.

It does make sense though, if we go back to the water-through-a-pipe analogy, wrong as it is. Computer logic runs on transistors, which are essentially one-way logic gates — input, input, compare, output. This is where computers and electricity really link up nicely. Computers work in binary: 1 or 0; on or off. So does electricity: 1 or 0; positive voltage, no voltage. Alternating current is just going to give you a fog of constant overlapping 1s and 0s. Direct current can cleanly be either one. And that’s why computers convert AC to DC before the power gets to any of the logic circuits.

There’s one other really interesting power-related connection to today, and it’s this: on May 16, 1960, Theodore Maiman fired up the first optical LASER in Malibu, California, which he is credited with creating. Now… what does this have to do with everything before it? Well… everything.

LASER, which should only properly ever be spelled like that, is an acronym for the expression Light Amplification by Stimulated Emission of Radiation.

But that’s it. It was basically applying the fundamentals of electromagnetism (see above) to electrons and photons. The optical version of electrical amplification, really. But here’s the interesting thing about it. Once science got a handle on how LASERs worked, they realized that they could use one to send the same information that they could via electricity.

So… all those telegraphs and telephone calls that used to get shot down copper wires over great distances in analog form? Yeah, well… here was a medium that could do it through much cheaper things called fiber optics, transmit the same data much more quickly, and do it with little energy loss over the same distances.

And, ironically, it really involved the same dance of particles that Tesla realized in figuring out how AC worked way back in the day, nearly a century before that first LASER.

All of these innovations popped up on the same day, May 16, in 1888, 1891, and 1960. I think we’re a bit overdue for the next big breakthrough to happen on this day. See you in 2020?

What is your favorite science innovation involving energy? Tell us in the comments!

Friday Free-for-All #59: Multiple viewings, theater or home, hobby time, techsplosion

The next in an ongoing series in which I answer random questions generated by a website. Here are this week’s questions. Feel free to give your own answers or ask your own questions in the comments.

What movie have you seen more than seven times?

For starters, I know that I’ve watched Stanley Kubrick’s 2001: A Space Odyssey way more than seven times. Once home video and DVD happened, watching 2001 on New Year’s Day instead of a certain parade became a long-standing tradition with me.

The more than seven viewings is also true of several of his films, including Dr. Strangelove, or How I Learned to Stop Worrying and Love the Bomb and A Clockwork Orange.

I can’t leave off The Rocky Horror Picture Show. I’m pretty sure I saw that more than seven times in high school alone, and The Wizard of Oz, It’s a Wonderful Life, and The Ten Commandments also make the list because they are still being rerun at least once a year on TV.

I can’t forget the Star Wars Original Trilogy and most of the Prequel Trilogy. The Sequel Trilogy hasn’t been around long enough yet. As for Star Trek, The Wrath of Khan and The Voyage Home are the only ones I’ve definitely seen that often.

There are a few James Bond films — definitely Goldfinger, Live and Let Die, and Moonraker (one good, one okay, and one cheesy as hell) again because of the TV return thing.

I’m not sure, but I think that Willy Wonka and the Chocolate Factory (that’s the amazing Gene Wilder-starring version and not the Tim Burton travesty) probably also makes the list. Oh. Also, Cabaret, All That Jazz, and West Side Story.

There are probably others, but these are the ones that I can definitely put in the more than seven list.

Do you prefer to watch movies in the theater or in the comfort of your own home?

This is an answer that’s changed enormously. Once upon a time, my reply would have been absolutely in a theater, because that’s where they were made to be seen.

But then, as my interest in seeing all the latest MCU/DCEU franchise films fell to zero, waiting for home video or streaming mostly became enough — although I would still go out for the big event films that interested me, mainly Star Wars installments and Blade Runner 2049.

The last film I did see in an actual theater was Star Wars: The Rise of Skywalker, back in February 2020. It was a mid-weekday thing and there were about four of us in the place.

So, having already discovered the joys and convenience of streaming — not to mention the lower cost if it’s something on a service you already have — by the time the theaters shut down it was a no-brainer, and I’m not really inclined to go back anytime soon.

Honestly, seeing a Marvel movie on a big screen doesn’t really add much to it, not compared to the quality I can get at home. Plus I also don’t have to put up with other people, sticky floors, or an endless parade of pre-show trailers and adverts.

What hobby would you get into if time and money weren’t an issue?

I would become a total model train geek, although it would be about more than just the trains. I’d want to create an entire miniature city in a dedicated room, like a full basement, and build it in something like N Scale, which is about ¾” to 10 feet, or 1:160 scale.

This would make a model of the Empire State building just over 9 feet tall at the tip of its mast, although it would take 33 linear feet of model to make up one mile of street, so it wouldn’t be a very big city. (Z scale would cut this down to 24 feet per mile, but definitely sacrifice some realism.)

To get a scale model of all of San Francisco into an area 33 feet on a side, you’d wind up with city buses being just under half an inch long and a tenth of an inch wide. You’d only need to cut the N scale in half to model two square miles of Midtown Manhattan.

But wait… it does say that time and money aren’t an issue, right? So instead of building a single square mile of city in a basement, why not go for a warehouse or buy an abandoned big box store? Aim for something that would fit fifty or a hundred square miles of city and, if it had multiple floors, go for various layouts — urban mega-city, suburban smaller town, historical city. With a scale ten-mile footprint, you could easily build two separate 19th-century Main Street towns surrounded by countryside and connected by railroad and telegraph.

And I wouldn’t need to go it alone. Hell, it could become an entire project that would employ model and miniature makers, urban planners, painters, designers, builders, electricians, programmers, and more. Give the big city a working harbor and airport, also have miniature cars and people moving around, design it to not only have a night and day cycle but seasons and weather as well, and it could be quite a thing.

It could even become a tourist attraction. Hell, they already did it in Hamburg, Germany.

And why does the idea fascinate me so much? Maybe because I was into model trains as a kid, although I never had a neat, permanent layout. But this also led to me becoming a big fan of games like Sim City, in which I could indulge my curiosity about building and creating things and see where they led — especially urban landscapes.

Hm. Give me all the resources, and I just might make TinyTowns a major tourist destination.

Why did technology progress more slowly in the past than it does now?

I believe that this is because technological development is exponential, not arithmetic. The latter is a very slow, additive process. You go from 1 to 1+1, or 2, then to 2+1 for 3, and so on. Repeat the process 100 times, and you land on 101.

Take the simplest exponential progression, though, in which each subsequent step is double the one before it. That is, go from 1 to 1×2, or 2, then 2 to 2×2 for 4, and so on. After a hundred steps, your result is about 1.27×10^30, or roughly 1 followed by 30 zeros, which is one nonillion.
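A few lines of Python make the gap vivid:

            steps = 100

            additive = 1 + steps   # one step at a time: 1, 2, 3, ...
            doubling = 2 ** steps  # doubling each step: 1, 2, 4, 8, ...

            print(additive)  # 101
            print(doubling)  # 1267650600228229401496703205376, about 1.27 x 10^30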

For perspective, a yottabyte — currently the largest digital storage standard yet set — is equal to one trillion terabytes, the latter currently being a very common hard drive size on a home computer. The number noted above is about a million times that.

It’s exactly how we wound up with terabyte drives being so common when, not very long ago, a 30 megabyte drive was a big deal. That was really only within a generation or so. This relates to Moore’s Law, stated in 1965 as “the number of transistors in a computer chip doubles every 18 to 24 months.”

What wasn’t stated with the law was that this doubling didn’t just affect the number of transistors, and therefore the number of simultaneous operations, that a chip could perform. It extended to every other aspect of computers. More operations meant more data, so you could either speed up your clocks or widen your data buses (i.e., the length in bits of an allowable piece of information) or both.

And this is why we’ve seen things like computers going from 8-bit to 64-bit operating systems, memory size ballooning from a few kilobytes to multiple gigabytes, and storage likewise exploding from a handful of kilobytes to terabytes, with commercial petabyte drives on the horizon.

Perspective: A petabyte drive would hold the entire Library of Congress print archive ten times over. It would probably also hold a single print archive and all the film, audio, and photographic content comfortably as well.

Now, all of this exploding computer technology fuels everything else. A couple of interesting examples: Humans went from the first manned airplane flight to walking on the Moon in under 66 years. We discovered radioactivity in 1896 and tested the first atomic bomb 49 years later. The transistor was invented in 1947. The silicon chip integrating multiple transistors was devised in 1959, twelve years later.

And so on. Note, too, that a transistor’s big trick is that it turns old mathematical logic into something that can be achieved by differences in voltage. A transistor has two inputs and an output and, depending on how it’s wired up, it can do various things based on how the inputs compare and what the circuit has been designed to do.

The beauty of the system comes in stringing multiple transistors together, so that one set may determine whether digits from two different inputs are the same or not, and pass that info on to a third transistor, which may be set to either increment or leave unchanged the value of another transistor, depending on the info it receives.

Or, in other words, a series of transistors can be set up to perform addition, subtraction, multiplication, or division. It’s something that mechanical engineers had figured out ages previously using cogs and levers and gears, and adding machines and the like were a very 19th-century technology. But the innovation that changed it all was converting decimal numbers into binary, realizing that the 0 and 1 of binary corresponded perfectly to the “off” and “on” of electrical circuits, then creating transistors that did the same thing those cogs and levers did.
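To make that concrete, here’s a toy Python sketch — nothing like how chips are actually laid out, but the same chaining idea: model each gate as a tiny compare-and-output function, then string them into a full adder, the building block of binary addition (all the names here are mine):

            def AND(a, b): return a & b
            def OR(a, b):  return a | b
            def XOR(a, b): return a ^ b

            def full_adder(a, b, carry_in):
                # Compare the two input bits, then fold in the carry from
                # the previous digit -- the "pass it on" chaining above.
                s1 = XOR(a, b)
                total = XOR(s1, carry_in)
                carry_out = OR(AND(a, b), AND(s1, carry_in))
                return total, carry_out

            def add_binary(x_bits, y_bits):
                # Add two equal-length bit lists, least significant bit first.
                result, carry = [], 0
                for a, b in zip(x_bits, y_bits):
                    s, carry = full_adder(a, b, carry)
                    result.append(s)
                result.append(carry)
                return result

            # 3 + 5, bits fed right to left like the base tables earlier:
            print(add_binary([1, 1, 0], [1, 0, 1]))  # [0, 0, 0, 1] = binary 1000 = 8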

Ta-da! You’ve now turned analog math into digital magic. And once that system was in place and working, every other connected bit developed incredibly over time. Some people focused on making the human interfaces easier, moving from coding in obscure and strictly mathematical languages, often written via punch cards or paper tape, into not much improved but still infinitely better low-level languages. These still involved a lot of obscure code words and direct entry of numbers (this is where Hex, or Base 16, came into computing), but they were at least much more intelligible than square holes in a card.

At the same time, there had to be better outputs than another set of punched cards, or a series of lights on a readout. And the size of data really needed to be upped, too. With only four binary digits, 1111, the highest decimal number you could represent was 15. Jump it to eight digits, 1111 1111, and you got… 255. Don’t forget that 0 is also included in that set, so you really have 256 values, and voila! The reason for that being such an important number in computing is revealed.
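The same arithmetic in quick Python form, for a few common widths:

            for bits in (4, 8, 16, 32):
                top = 2 ** bits - 1
                print(bits, "bits ->", top, "max value,", top + 1, "distinct values")

            # 4 bits -> 15 max value, 16 distinct values
            # 8 bits -> 255 max value, 256 distinct values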

Each innovation fueled the need for the next, and so the ways to input and readout data kept improving until we had so-called high-level programming languages, meaning that on a properly equipped computer, a programmer could type in a command in fairly intelligible language, like,

10 X$ = “Hello world.”

20 PRINT X$

30 END

Okay, stupid example, but you can probably figure out what it does. (The $ just marks X$ as a string variable — classic BASIC would balk without it.) You could also vary it by starting with 10 INPUT X$, in which case the user would get a question mark on screen, and the display would return whatever they typed.

Oh yeah… at around the same time, video displays had become common, replacing actual paper printouts that had a refresh rate slower than a naughty JPG download on a 1200 baud modem. (There’s one for the 90s kids!) Not to mention a resolution of maybe… well, double digits lower than 80 in either direction, anyway.

Surprisingly, the better things got, the better the next versions seemed to get, and faster. Memory exploded. Computer speeds increased. Operating systems became more intuitive and responsive.

And then things that relied on computers took off as well. Car manufacturers started integrating them, slowly at first. Present day, your car is run more by computer than physical control, whether you realize it or not. Cell phones and then smart phones are another beneficiary — and it was the need to keep shrinking transistors and circuits to fit more of them onto chips in the first place that finally made it possible to stick a pretty amazing computer into a device that will fit in your pocket.

Oh yeah… first telephone, 1875. Landline phones were ubiquitous in less than a hundred years, and began to be taken over by cell phones, with the first one being demonstrated in 1973 (with a 4.4 lb handset, exclusive of all the other equipment required), and affordable phones themselves not really coming along until the late 1990s.

But, then, they never went away, and then they only just exploded in intelligence. Your smart phone now has more computing power than NASA and the Pentagon combined did at the time of the Moon landings.

Hell, that $5 “solar” (but probably not) calculator you grabbed in the grocery checkout probably has more computing power than the Lunar Lander that made Neil Armstrong the first human on the Moon.

It’s only going to keep getting more advanced and faster, but that’s a good thing, and this doesn’t even account for how explosions in computing have benefited medicine, communications, entertainment, urban planning, banking, epidemiology, cryptography, engineering, climate science, material design, genetics, architecture, and probably any other field you can think of — scientific, artistic, financial, or otherwise.

We only just began to escape the confines of Analog Ville less than 40 years ago, probably during the mid to late 80s, when Millennials were just kids. By the time the oldest of them were second semester sophomores in college, we had made a pretty good leap out into Digital World, and then just started doubling down, so that two decades into this century, the tech of the turn of the century (that’d be 2000) looks absolutely quaint.

Remember — we had flip phones then, with amazing (cough) 640×480 potato-grade cameras.

Compare that to 1920 vs. 1900. A few advances, but not a lot. The only real groundbreaker was that women could now vote in the U.S., but that wasn’t really a technological advance, just a social one. And if you look at 1820 vs. 1800, or any twenty-year gap previously, things would not have changed much at all, except maybe in terms of fashion, who the current world monarchs were, or which countries you were currently at war with.

And that, dear readers, is how exponential change works, and why technology will continue to grow in this manner. It’s because every new innovation in technology sows the seeds for both the need and inevitability of its next round of advancement and acceleration.

We pulled the genie out of the bottle in 1947. It’s never going back in.

Momentous Monday: Meet the Middletons

Thanks to boredom and Amazon Prime, I watched a rather weird movie from the 1930s tonight. While it was only 55 minutes long, it somehow seemed much longer because it was so packed with… all kinds of levels of stuff.

The title is The Middleton Family at the New York World’s Fair, and while the content is exactly what it says on the tin, there are so goddamn many moving parts in that tin that this one is worth watching in depth, mainly because it’s a case study in how propaganda can sometimes be wrong, sometimes be right and, really, only hindsight can excavate the truth from the bullshit.

While it seems like a feature film telling the fictional story of the (snow-white, but they have a black maid!) Middleton Family from Indiana, who go back east ostensibly to visit grandma in New York but really to attend the New York World’s Fair of 1939, this was actually nothing more than a piece of marketing and/or propaganda created by the Westinghouse Corporation, major sponsors of the fair, poised on the cusp of selling all kinds of new and modern shit to the general public.

Think of them as the Apple, Microsoft and Tesla of their day, with solutions to everything, and the World’s Fair as the biggest ThingCon in the world.

Plus ça change, right?

But there’s also a second, and very political, vein running through the family story. See, Dad decided to bring the family to the fair specifically to convince 16-year-old son Bud that, despite the bad economic news he and his older friends have been hearing about there being no job market (it is the Great Depression, after all), there are, in fact, glorious new careers waiting out there.

Meanwhile, Mom is hoping that older daughter Babs will re-connect with high school sweetheart Jim, who had previously moved to New York to work for (wait for it) Westinghouse. Babs is having none of it, though, insisting that she doesn’t love him but, instead, is in love with her art teacher, Nick.

1939: No reaction.

2020: RECORD SCRATCH. WTF? Yeah, this is one of the first of many disconnect moments that are nice reminders of how much things have changed in the 82 years since this film happened.

Girl, you think you want to date your teacher, and anyone should be cool with that? Sorry, but listen to your mama. Note: in the world of the film, this relationship will become problematic for other reasons but, surprise, the reason it becomes problematic then is actually problematic in turn now. More on which later.

Anyway, the obviously richer-than-fuck white family travels from Indiana to New York (they’re rich because Dad owns hardware stores, and they brought their black maid with them) but are too cheap to spring for a hotel, instead jamming themselves into Grandma’s house — which is pretty ritzy as well, so grandma has money too, since her place is clearly close enough to Flushing Meadows in Queens to make the World’s Fair a daily day trip over the course of a weekend.

But it’s okay — everyone owned houses then! (Cough.)

And then it’s off to the fair, and this is where the real value of the film comes in because when we aren’t being propagandized by Westinghouse, we’re actually seeing the fair, and what’s really surprising is how modern and familiar everything looks. Sure, there’s nothing high tech about it in modern terms, but if you dropped any random person from 2020 onto those fairgrounds, they would not feel out of place.

Well, okay, you’d need to put them in period costume first and probably make sure that if they weren’t completely white they could pass for Italian or Greek.

Okay, shit. Ignore that part, let’s move along — as Jimmy, Babs’ high school sweetheart and Westinghouse Shill character, brings us into the pavilion. And there are two really weird dynamics here.

First is that Jimmy is an absolute cheerleader for capitalism, which is jarring without context — get back to that in a moment.

The other weird bit is that Bud seems to be more into Jimmy than Babs ever was, and if you read too much gay subtext into their relationship… well, you can’t read too much, really. Watch it through that filter, and this film takes on a very different and subversive subplot. Sure, it’s clear that the family really wishes Jimmy was the guy Babs stuck with, but it sure feels like Bud wouldn’t mind calling him “Daddy.”

But back to Jimmy shilling for Westinghouse. Here’s the thing: Yeah, sure, he’s all “Rah-Rah capitalism!” and this comes into direct conflict with Nicholas, who is a self-avowed communist. But… the problem is that in America, in 1939, capitalism was the only tool that socialism could use to lift us out of depression and, ultimately, create the middle class.

There’s even a nod to socialism in the opening scene, when Bud tells his dad that the class motto for the guys who graduated the year before was, “WPA, here we come!” The WPA was the government works program designed to create jobs with no particular aim beyond putting people to work.

But once the WPA partnered with those corporations, boom. Jobs. And this was the beginning of the creation of the American Middle Class, which led to the ridiculous prosperity for (white) people from the end of WW II until the 1980s.

More on that later, back to the movie now. As a story with relationships, the film actually works, because we do find ourselves invested in the question, “Who will Babs pick?” It doesn’t help, though, that the pros and cons are dealt with in such a heavy-handed manner.

Jimmy is amazing in every possible way — young, tall, intelligent, handsome, and very knowledgeable at what he does. Meanwhile, Nicholas is short, not as good-looking (clearly cast to be more Southern European), obviously a bit older than Babs, and has a very unpleasant personality.

They even give him a “kick the puppy” moment when Babs introduces brother Bud, and Nicholas pointedly ignores the kid. But there’s that other huge issue I already mentioned that just jumps out to a modern audience and yet never gets any mention by the other characters. The guy Babs is dating is her art teacher. And not as in past art teacher, either. As in currently the guy teaching her art.

And she’s dating him and considering marriage.

That wouldn’t fly more than a foot nowadays, and yet in the world of 1939 it seems absolutely normal, at least to the family. Nowadays, it would be the main reason to object to the relationship. Back then, it isn’t even considered.

Wow.

The flip side of the heavy-handed comes in some of Jimmy’s rebukes of Nicholas’ claims that all of this technology and automation will destroy jobs. While the information Jimmy provides is factual, the way his dialogue here is written and delivered comes across as condescending and patronizing to both Nicholas and the audience, and these are the moments when Jimmy’s character seems petty and bitchy.

But he’s also not wrong, and history bore that out.

Now this was ultimately a film made to make Westinghouse look good, and a major set piece involved an exhibit at the fair that I actually had to look up because at first it was very easy to assume that it was just a bit of remote-controlled special effects set up to pitch an idea that didn’t really exist yet — the 1930s version of vaporware.

Behold Elektro! Here’s the sequence from the movie and as he was presented at the fair. Watch this first and tell me how you think they did it.

Well, if you thought remote operator controlling movement and speaking lines into a microphone like I did at first, that’s understandable. But the true answer is even more amazing: Elektro was completely real.

The thing was using sensors to actually interpret the spoken commands and turn them into actions, which it did by sending light signals to its “brain,” located at the back of the room. You can see the lights flashing in the circular window in the robot’s chest at around 2:30.

Of course, this wouldn’t be the 1930s if the robot didn’t engage in a little bit of sexist banter — or smoke a cigarette. Oh, such different times.

And yet, in a lot of ways, the same. Our toys have just gotten a lot more powerful and much smaller.

You can probably guess which side of the argument wins, and while I can’t disagree with what Westinghouse was boosting at the time, I do have to take issue with one explicit statement. Nicholas believes in the value of art, but Jimmy dismisses it completely, which is a shame.

Sure, it’s coming right out of the Westinghouse corporate playbook, but that part makes no sense, considering how much of the world’s fair and their exhibit hall itself relied on art, design, and architecture. Even if it’s just sizzle, it still sells the steak.

So no points to Westinghouse there but, again, knowing what was about to come by September of 1939 and what a big part industry would have in ensuring that the anti-fascists won, I can sort of ignore the tone-deafness of the statement.

But, like the time-capsule shown in the film, there was a limited shelf-life for the ideas Westinghouse was pushing, and they definitely expired by the dawn of the information age, if not a bit before that.

Here’s the thing: capitalism as a system worked in America when… well, when it worked… and didn’t when it didn’t. Prior to about the early 1930s, when it ran unfettered, it didn’t work at all — except for the super-wealthy robber barons.

Workers had no rights or protections, there were no unions, or child-labor laws, or minimum wages, standard working hours, safety rules, or… anything to protect you if you didn’t happen to own a big chunk of shit.

In other words, you were management, or you were fucked.

Then the whole system collapsed in the Great Depression and, ironically, it took a member of the 1% Patrician Class (FDR) being elected president to then turn his back on his entire class and dig in hard for protecting the workers, enacting all kinds of jobs programs, safety nets, union protections, and so on.

Or, in other words, capitalism in America didn’t work until it was linked to and reined-in by socialism. So we never really had pure capitalism, just a hybrid.

And, more irony: this socio-capitalist model was reinforced after Pearl Harbor Day, when everyone was forced to share and work together and, suddenly, the biggest workforce around was the U.S. military. It sucked in able-bodied men between 17 and 38, and the weird side-effect of the draft stateside was that suddenly women and POC were able to get jobs because there was no one else to do them.

Manufacturing, factory jobs, support work and the like boomed, and so did the beginnings of the middle class. When those soldiers came home, many of them returned to benefits that gave them cheap or free educations, and the ability to buy homes.

They married, they had kids, and they created the Boomers, who grew up in the single most affluent time period in America ever.

Side note: There were also people who returned from the military who realized that they weren’t like the other kids. They liked their own sex, and couldn’t ever face returning home. And so major port towns — San Francisco, Los Angeles, Long Beach, San Diego, Boston, New York, Miami, New Orleans — were flooded with the seeds of future GLB communities. Yes, it was in that order back then, and TQIA+ hadn’t been brought into the fold yet — that came later, starting in the 60s. There really wasn’t a name for it or a community in the 1940s.

In the 60s, because the Boomers had grown up with affluence, privilege, and easy access to education, they were also perfectly positioned to rebel their asses off because they could afford to, hence all of the protests and whatnot of that era.

And this sowed the seeds of the end of this era, ironically.

The socio-capitalist model was murdered, quite intentionally, beginning in 1980, when Ronald fucking Reagan became President, and he and his cronies slowly began dismantling everything created by every president from FDR through, believe it or not, Richard Nixon. (Hint: EPA.)

The mantra of these assholes was “Deregulate Everything,” which was exactly what the world was like in the era before FDR.

Just one problem, though. Deregulating any business is no different from getting an alligator to not bite you by removing its muzzle and then saying to it, “You’re not going to bite me, right?”

And then believing them when they swear they won’t before wondering why you and everyone you know has only one arm.

Still, while it supports an economic system that just isn’t possible today without a lot of major changes, The Middletons still provides a nice look at an America that did work because it focused on invention, industry, and manufacturing not as a way to enrich a few shareholders, but as a way to enrich everyone by creating jobs, enabling people to actually buy things, and creating a rising tide to lift all boats.

As for Bud, he probably would have wound up in the military, learned a couple of skills, finished college quickly upon getting out, and then would have gone to work for a major company, possibly Westinghouse, in around 1946, starting in an entry-level engineering job, since that’s the skill and interest he picked up during the War.

Along the way, he finds a wife, gets married and starts a family, and thanks to his job, he has full benefits — for the entire family, medical, dental, and vision; for himself, life insurance to benefit his family; a pension that will be fully vested after ten years; generous vacation and sick days (with unused sick days paid back every year); annual bonuses; profit sharing; and union membership after ninety days on the job.

He and the wife find a nice house on Long Island — big, with a lot of land, in a neighborhood with great schools, and easy access to groceries and other stores. They’re able to save long-term for retirement, as well as for shorter-term things, like trips to visit his folks in Indiana or hers in Miami or, once the kids are old enough, all the way to that new Disneyland place in California, which reminds Bud a lot of the World’s Fair, especially Tomorrowland.

If he’s typical for the era, he will either work for Westinghouse for his entire career, or make the move to one other company. Either way, he’ll retire from an executive level position in about 1988, having been in upper management since about 1964.

With savings, pensions, and Social Security, he and his wife decide to travel the world. Meanwhile, their kids, now around 40 and with kids about to graduate high school, aren’t doing so well, and aren’t sure how they’re going to pay for their kids’ college.

They approach Dad and ask for help, but he can’t understand. “Why don’t you just do what I did?” he asks them.

“Because we can’t,” they reply.

That hopeful world of 1939 is long dead — although, surprisingly, the actor who played Bud is still quite alive.

Image: Don O’Brien, Flickr, 2.0 Generic (CC BY 2.0), the Middleton Family in the May 1939 Country Gentleman ad for the Westinghouse World’s Fair exhibits.
