The Information Age? Part two

Continuing from the previous installment, which was inspired by a conversation at a grocery store checkout and looked at the rise of the Information Age. Here, I return to that conversation, which concerned whether Camilla Parker Bowles was about to become Queen of England, based on a single magazine cover.

First of all, why do the British Royals seem to be such an endless source of fascination for Americans? I mean, we did fight that whole war to get rid of them at one point, right? But thanks to new technology, they suddenly started landing in our backyards again.

The coronation of Queen Elizabeth II on June 2, 1953, was televised in the U.S. — sort of. There was no way yet to transmit television signals across the ocean, but broadcasters could transmit radio waves and send images via a primitive sort of fax machine, called a MuFax, that could send a still image by wire in “only” nine minutes.

Apparently, though, the moment when the former colonies really started to have a thing for the Royals was the investiture of Charles as Prince of Wales on July 1, 1969 — right before the Moon landing. It was broadcast around the world, quite likely as a convenient test-run of satellite technology twenty days before the Real Big Event.

Twelve years later, Charles would be back in American consciousness when he married Lady Diana Spencer on July 29, 1981.  Their worldwide TV audience comprised 750 million people. And, of course, his bride would capture worldwide attention upon her death on August 31, 1997.

I definitely remember that evening, because my then-boyfriend and I had just come home from a Sunday evening movie date, got to his place and he turned on the TV. There it was — the top story on American network news on every channel.

While the earlier weird fascination seemed to be just a fluke of technology making delusions of monarchy appealing to some Americans in a sort of fantasy/soap-opera way, the growth of the Internet blurred borders in yet another way.

By this century, the only cultural divides among the English-speaking countries seem to be superficial, like word-choices, culinary decisions, and major brands. And Americans have continued to fawn over Charles and Diana’s kids, and then their grandkids, and his second marriage, and no, I don’t get it.

What I did get, though, came when the woman finally turned the magazine toward us after asking, “When did Camilla become the queen?” It was a copy of InTouch, which answered everything.

While it’s an American magazine, it has a particular fascination with the British Royal Family. It also lies its ass off; its cover accuracy (from the editors and publisher) and content accuracy (from the writers) are both well below 25%. I knew that without looking, because the drug store next to my place, which has been a really convenient quick drop-in during the pandemic, stocks it by the registers, so I’ve had plenty of time to look at its cover stories and just shake my head.

Then again, I kind of know stuff about British royalty and how it works, outside of this pop culture fascination with them. Why? Because I learned a lot of history in school, as well as on my own, because I’m just curious like that.

Like the British history from before our revolution that later mattered to the Western world. I’ve also paid attention to the stuff behind the tabloid headlines about the Royals, which led to a nice moment, actually, because if this woman had just gone off of that wildly inaccurate, real-world click-bait headline, she could have walked away thinking that Camilla Parker Bowles was about to become Queen of England.

Fortunately, I knew enough to be able to tell her — and the checker, who seemed to be going along with the headline — that InTouch was incredibly unreliable and, if anything, the cover title was satirical at best, because “Queen Camilla” would never happen.

And that much is definitely true, because part of the deal of allowing Charles to marry Camilla after Diana died was the specific stipulation that she would never become the queen should he ascend to the throne, but rather only gain the title of princess consort, which is a polite royal way of saying “the chick the king is fucking, but his relatives totally disapprove.”

I didn’t explain it in quite those terms. I just said that because of her divorce and the deal Charles made, she could never become queen, and that seemed to settle the issue.

On the way out, it struck me — and this article was born — how is it that modern humans (and it’s not just Americans) can have access to so damn much information and yet seem to know so little about what is really going on?

No, I’m not pushing conspiracy-theory bullshit there. It’s a legitimate question. How is it that so many people do not seem to engage with actual, fact-based information?

Unfortunately, I think a big part of it is that in modern culture (with late-stage capitalism largely to blame), too many people have had learned helplessness ingrained in them.

“You cannot get ahead, so you never will, so just soldier on and buy our shit! It’s better than trying to learn, because learning is hard, right?”

Well, wrong. But this is the message that has kept people disengaged from curiosity and education. Consumer culture, particularly in the West, is focused on one thing: selling you artificial substitutes for happiness.

If they let you learn and get smart, then you’ll learn to not buy into their bullshit, so they start to try to get you when you’re young. McDonald’s Happy Meals, anyone?

Modern capitalism has really adopted an old idea that has been attributed to several sources but actually has none. St. Ignatius Loyola never said anything like, “Give me the child until seven and I will give you the adult” (in the original version, substitute “boy” and “man” for the subjects of the sentence). Nor was it said by Aristotle.

But… it somehow crawled into the Zeitgeist, especially in marketing, and voila… your kids became targets. How do you think that companies like McDonald’s and Disney became so damn huge in the first place? Each of them had you as kids, probably your parents as kids, they’ve got your kids, and they’re going to get your grandkids.

And how do these companies do their thing? Oh, they will never, ever tell you, “Not happy? Maybe you should learn a new skill!” Instead, they’ll hide the question, make the problem seem like it’s your fault, and then sell you the solution.

“Feel like a loser today? Pick yourself up with Pakolyze candy!” (Hidden message: Because you’re a stupid, useless loser without us.)

This is literally the basis of every single ad that is trying to sell you shit you don’t need, including those fact-free magazines in grocery store check-outs, but also including most of what you read on the internet.

The only way to unlearn learned helplessness is to ignore the (corporate) voices that tell you, “Nah, I can’t,” and instead focus on your inner voice, which tells you, “Oh, fuck yeah, I can, and I am going to!”

And you will. Just become your own gatekeeper and filter info — after you find the least biased sources, which an internet search actually will give you.

Then stop obsessing over the Royals because, after all, while Queen Liz Jr. is hella cool, we did kick her ancestor out a couple hundred years ago via violent revolution.

So… honor, but don’t obsess. Hey, Elizabeth II is my 36th cousin (and I think also 35th and 37th — no, really), but… no American needs to waste that much head canon on any of the Royals, okay?

Unfortunately, we moved from The Information Age into The Post-Information Age, or maybe even the Dis-Information Age at least five years ago. We’re only just making an attempt at crawling out of it now, but please stand by.

After all, if an otherwise apparently intelligent, affluent, and educated woman in a very liberal suburb in a very liberal county in a very Blue state can almost, for a moment, think that Camilla Parker Bowles is about to become the Queen of England and that Elizabeth II can do nothing to stop it all because of a trash tabloid magazine cover, then there is definitely some breakdown in keeping the people informed.

Which leads to the Internet as it now stands being either salvation or danger because, in essence, it empowers anyone on it to be their own media platform.

Think about it. If you’re reading these words, then you probably have a device that is capable of allowing you to become any or all of the following kinds of creators: film or TV producer and distributor/broadcaster; radio producer and/or broadcaster; magazine or newspaper writer and/or publisher; artist with gallery and shop, or museum curator with likewise; designer with online store; adult entertainer with your own private studio, theatre, and fanbase; and I’ve probably missed quite a few.

The point is that the technology that capitalism has given us, ironically, will also be the tool we can use to destroy the monolithic entities that have come to dominate market share of everything. The trick is that we have to be willing to let go of the stories and franchises we’ve been sold, trust our friends and acquaintances, and then trust the acquaintances of acquaintances, and so on, to create our market.

That and… tax the fuck out of the super-rich, just like Republican Eisenhower did, and look how well off the middle class was at the time. Hell, his policies created the middle class in the U.S.

Combine that with progressive policies to give everyone access to education, health care, affordable housing, equal treatment under the law, and equal representation at the ballot box, and we could create America’s Middle-Class 2.0.

That is the real way to Make America Great Again.

Image source: Old computer that used punch cards, found in the London Science Museum, (CC) BY-SA 4.0, via Wikimedia Commons

The Information Age? Part one

This was originally going to be a single piece but, as often happens, I got so into the subject at hand that I had to split it. The second installment will appear tomorrow.

“When did Camilla become the queen?” a voice behind me in line at Ralphs suddenly blurts out. The clerk and I look over to see a thirtyish woman who otherwise seems well-off and intelligent holding up a magazine with a cover story title along the lines of “Elizabeth and Camilla face off over the crown!”

The course of that conversation was rather illuminating for me — and I hope for the woman — but before I can get to that part, let’s take a little detour, shall we?

The so-called Information Age, also called the Computer Age, among other names, began around the 1970s, and its major hallmark was that the gathering, processing, and dissemination of information was rapidly moving away from analog media and devices — snail mail, paper files, and typewritten documents among them.

Telephone landlines were also moving away from analog encoding and decoding of signals, as well as from rotary dials that literally told the switches what digits you’d dialed by the number of clicks.

I could swear I’ve written about it here before, but I can’t find the link. This “number of clicks” thing is why the most populous regions at the time area codes were introduced got the ones with the fewest clicks: 212, 312, and 213, going respectively to New York City, Chicago, and Los Angeles.
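If you want to check the click arithmetic yourself, here is a minimal sketch in Python. It assumes the standard pulse-dial convention (digits 1 through 9 send that many clicks, and 0 sends ten), and the comparison codes beyond the famous three are simply examples I picked for contrast:

# Pulse-dialing arithmetic: each digit 1-9 sends that many clicks, 0 sends ten.
def pulses(number):
    return sum(10 if d == "0" else int(d) for d in number if d.isdigit())

# The famous low-click codes, plus a couple of click-heavy ones for contrast.
for code in ("212", "213", "312", "617", "907"):
    print(code, pulses(code))
# 212 -> 5 clicks, 213 -> 6, 312 -> 6, 617 -> 14, 907 -> 26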

Solid-state digital technology had existed by this point, ever since the transistor was first demonstrated in 1947. However, the earliest computers were huge — the size of entire rooms — very expensive, and ran ridiculously hot because they used vacuum tubes instead of transistors or integrated circuits, which didn’t exist yet anyway.

In addition, they were programmed either by physically flipping switches to set parameters or, in the more “advanced” models, by feeding data into them — and reading it out — on punch-encoded paper tape or the infamous punch card, yards of the former and tons of the latter. To store just one megabyte of data would have taken more than 13,000 typical 80-column punch cards, at 80 characters per card.

That’s bytes, using the 8-bit standard. The highest eight-digit binary number is 1111 1111, which equals 255. Add in zero, 0000 0000, and you get 256 possible values, which is why that number is so important in digital technology.

In contrast, a 16-bit system gives you 65,536, which happens to be the number of colors in the 16-bit “high color” graphics modes of early-1990s PCs. Beyond that, the numbers get unwieldy: a 64-bit system gives you about 1.84×10^19.

These are the physical limits on how big a number a system of a given word size can handle at one time, as well as on how many binary digits can move across its data bus at once.
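Here is a quick back-of-the-envelope sketch of those numbers in Python. It assumes the classic 80-column card holding 80 characters; other card and tape formats held different amounts, so treat it as illustrative rather than definitive:

# Punch cards needed for a single megabyte, at 80 characters per card.
BYTES_PER_CARD = 80
MEGABYTE = 1024 ** 2
print(MEGABYTE // BYTES_PER_CARD)   # 13107 cards

# Largest value each word size can hold (add one for the count of values,
# since zero is included).
for bits in (8, 16, 64):
    print(bits, 2 ** bits - 1)
# 8  -> 255 (256 values), 16 -> 65,535 (65,536 values), 64 -> about 1.84 x 10^19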

The likely reason that the Information Age finally took off is directly tied to the birth of what would become the Internet in October 1969, which kept developing on into the 1980s. At the same time, the first small-ish computers that were affordable — at first to businesses, and then eventually to more affluent consumers — started to come onto the market.

Those early machines didn’t do a whole lot, but they created the first generation to go digital: Gen X. To this day, that seems to be the dividing line, and I know very few Boomers who seem comfortable with computers if they were never exposed to them during their professional lives.

I’m not generalizing about Boomers, though — I know Millennials who couldn’t even manage to turn on a laptop, and I’ve met octogenarians who can work their way around a computer like an expert.

And then, about a decade after the first really affordable personal computers, the Internet happened. Well, it was really the World Wide Web (WWW) at the time, although people didn’t use to distinguish between the two.

The short version is that the WWW is the stuff you see online — the web pages that actually come to your machine. The Internet, meanwhile, is the vast network of computers that hosts the WWW, along with lots of other stuff, and which makes the magic happen that gets a document on somebody’s private server in Tierra del Fuego through an elaborate route that can sometimes cross the entire planet before it shows up on your laptop while you’re on vacation in Iceland, usually in under a second.

Another way to think of it is that the WWW is the letters and packages, and the Internet is the postal service. Of course, anyone who refers to the World Wide Web nowadays will just get looked at funny. There’s a reason that the “www.” part of a URL hasn’t been required for a long, long time.
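To make that split concrete, here is a minimal sketch in Python, using the reserved test address example.com purely as an illustration: the name lookup is the Internet doing its postal routing, and the bytes that come back are the Web “letter” being delivered.

import socket
import urllib.request

host = "example.com"

# Internet plumbing: turn a human-readable name into a routable address.
print(socket.gethostbyname(host))

# Web content: the "letter" that the postal service actually delivers.
with urllib.request.urlopen("http://" + host + "/") as response:
    page = response.read()
print(len(page), "bytes of web page delivered")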

End result? The Information Age, which has been a wonderful thing.

If I’d tried to write this article, or something like it, in 1970, with all of the above information, it would have involved trips to one or more physical libraries, lots of manual searches, and pulling books off the shelves to look for information in a very analog way — by reading it.

Notes would have been taken by hand, or made on the library’s copiers, paying per page. And if said information weren’t in the library, I would have somehow had to find someone with expertise in the field and either (gasp!) call them on the phone or send them a neatly typed letter nicely asking for info, then wait for however long it took them to respond, if they ever did.

I could never have adulted in that age. So that’s the upside of The Age of Information.

But there are several downsides, one having to do with quantity (and quality) of information, and the other having to do with a frequent lack of engagement, which is not unrelated to the first part.

I have a terabyte hard drive in my computer, and another terabyte external drive connected to it. These are ridiculously huge amounts of data, something we couldn’t even have conceived of needing back in the 1990s.

Now, multiply those two drives of mine by five to bring it to ten terabytes, and that’s enough storage space to hold the entire collection of the U.S. Library of Congress, digitized.

The amount of information available via the Internet dwarfs that number by a lot. It’s been estimated that just the “Big Four” sites — Google, Amazon, Microsoft, and Facebook — store at least 1,200 petabytes of data among them.

Note that it’s not clear from my source whether this data includes YouTube and Instagram under the umbrellas of Google and Facebook respectively.

A petabyte is a thousand terabytes, so that figure comes to well over a million terabytes among just those four companies — and it doesn’t even include all the other storage spaces out there for all the other people. Granted, a lot of smaller sites contract with Google, Amazon, or Microsoft for cloud storage, but I’m sure that many of them don’t.

Have an email account that isn’t Google or Microsoft? Multiply the storage they allow by their users and add it in. Toss in all the university and library servers, as well as private industry servers — banking, real estate, finance, healthcare, retail, manufacturing, media.

And then don’t forget publicly accessible government servers which, in every country, go from a national to regional to an administrative to a local level. In the U.S., that’s Federal, state, county, city. The U.S. has 50 states and well over 3,000 counties. A county can have any number of cities, townships, unincorporated areas, boroughs, parishes, or whatever.
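To put rough numbers on all of that, here is a back-of-the-envelope sketch in Python. The ten-terabyte Library of Congress figure and the 1,200-petabyte “Big Four” figure are just the estimates quoted above, and the email-provider line is purely hypothetical:

# Order-of-magnitude storage arithmetic, using binary (1024-based) units.
GB = 1024 ** 3
TB = 1024 ** 4
PB = 1024 * TB

library_of_congress = 10 * TB
big_four_estimate = 1200 * PB
print(big_four_estimate / library_of_congress)   # about 122,880 Libraries of Congress

# Hypothetical email provider: 15 GB of quota across a billion accounts.
print(15 * GB * 1000000000 / PB)                 # roughly 14,305 more petabytes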

It all adds up.

Again, on the one hand, it can be the greatest thing ever. I certainly love it for writing and researching, because when I want to create a link, I just need to tap a couple of keys and do a quick search.

Since I’m a stickler for getting it right, if I happen to be writing a period piece and want to know what the weather was in a certain place on a certain day, or what was on TV on a certain day and time, boom, done. The information is out there.

Nerd stuff. But that’s what I’m into, that’s what I love to do, and I know how to filter — as in which news and websites to ignore, which to use with caution, and which to trust.

Speaking of which, the main thing that Wikipedia is good for is a broad overview, but if you want the real story, always follow their external links to sources. Yes, I will link to them if I’m only going to give a superficial dash of info, especially if it involves pop culture, but they’re the card catalog, not the book, to dredge up an old analog metaphor.

But we live in a paradox in which information very much fits the old line from The Rime of the Ancient Mariner: “Water, water, everywhere, nor any drop to drink.”

The line refers to being stranded in the ocean, surrounded by water that, being sea water, is too salty to drink.

Modern version: “Info info everywhere, oh fuck, I’ve got to think?”

A lot of people, for various reasons, don’t or can’t take the time to filter, and each of us gets bombarded with more in an hour than, say, our grandparents would have been in a week when they were our age.

The interesting part is that analog media — like checkout lane magazines and TV and radio ads — are still very much a part of the mix. So back to the story I started with.

(To be continued…)

Image source: Old computer that used punch cards, found in the London Science Museum, (CC) BY-SA 4.0, via Wikimedia Commons

Sunday Nibble #35: A life online

The world may be going to hell in a very big handbasket, whether we all die of the plague, roast to death as temperatures rise (either drowning in the rising seas or choking on the endless smoke, or both), or perish in a WW III most likely started by a collapsing and fully fascist United States of America.

Or we could luck out and turn things around. But one thing I have to marvel at is what an amazing era of technology we live in. It’s only the beginning, but we’ve gotten pretty far, pretty fast.

Now, I happen to be of that part of Gen X that has never not been online at any point in their adult lives. In fact, I used a networked computer before I got my driver’s license, way back at the tender age of 15.

But… I was an adult before the founding of either Google (1998) or Wikipedia (2001), and although I wrote all of my scripts and such on computers, I still had to rely on analog research methods until the beginning of this century — mostly libraries and books.

For one black comedy set during the Civil War, my research was pretty much limited to the big companion book to Ken Burns’ The Civil War documentary, with occasional library trips and heavy use of my handy Columbia Desk Encyclopedia.

Damn, at one time I had a huge personal reference library full of dictionaries, specialized encyclopedias, and writers’ reference books on subjects pertaining to particular genres — I think I had Crime and Science Fiction — as well as a buttload of foreign-language grammars and to-English translation dictionaries, including ones for Old English, Hebrew, Hawaiian, Gaelic, Arabic, and Japanese.

Side note: I’ve made a sincere effort in my lifetime to learn ten languages besides English. I managed fluency in one (Spanish) and, through that, the ability to kind of read and understand one that I studied but could never hear the pronunciation of and another that I never studied (French and Portuguese, respectively), know more than I should but nowhere near enough of the language of the country my last name comes from (German), two for specific purposes of script writing (Italian and Norwegian), two just to try out non-Latin alphabets (Japanese and Russian), one because there seem to be a lot of tall, hot men from there (Dutch), one because the opportunity came up through a theatre company I was in (ASL, until our teacher moved), and one because it’s spoken in the country from whence came half of my genetic heritage (Irish Gaelic).

Funny story, though. Spanish and German are the only two languages that I studied in school. All but three of the rest were on my own, and most of those were before the internet days. At best, I managed to find recorded lessons to listen to in the car, and for a while I got pretty fluent at basic Russian, but that was about it. As for the other two, once I left school, I kind of lost my abilities in both for a long time.

I remember one particularly informative moment when I traveled to Mexico with an ex, who was himself half Mexican on his father’s side, and realized once we got down there that I couldn’t understand shit, and I couldn’t say shit beyond very simple phrases — that despite studying Spanish in school for five years.

So… I used to have to try to learn languages through books or, if I were lucky, from a human teacher, but good luck with any kind of immersion in it. Likewise, in writing any kind of reality-based fiction, the research was tedious and time-consuming.

And then came the internet. Sure, in the early days (and I was there on the ground floor) you really couldn’t look up shit. I did happen to work for one of the first companies to jump into it with both feet.

This happened to be The Community Yellow Pages, a publication for the Lesbian and Gay community started in the early 1980s by Jeanne Córdova, who is a piece of lesbian history herself, and whom I was fortunate enough to have known.

She started the guide as a very thin phonebook with both Yellow (commercial) and White (residential) pages, and it was a way to advertise businesses that were either gay-friendly or owned by gay people. The white pages part was, probably, a de facto but never officially acknowledged dating section. (It was eventually discontinued.)

Anyway… 1994 rolls around, the internet is just getting going and, because one of Jeanne’s (many) siblings lives near Silicon Valley and is very tapped into what’s going on, that sibling (a younger sister) convinces her that online is the way to go.

I only worked for the CYP a couple of years, but it was an interestingly schizo time, because we were simultaneously selling people on this paper edition that would come out once a year, along with this electronic thing that could be searched from anywhere and which could be updated if needed.

And… the paper version was by far the best-seller. Bonus points: at that time, we could have done the layout digitally, but didn’t, and so for the few months leading up to publication, we had an actual layout artist come in and physically paste up the boards that would be photocopied to create the masters for the final run.

Eventually, though, the sleeping giant of the internet’s potential awakened in quick order, first with Google indexing everything, and then Wikipedia accumulating knowledge.

And say what you want about the latter, but over time the ol’ Wiki has really become a stellar example of the “wisdom of crowds” concept. Plus which, it should never be a primary source, just a guide to finding them, and those primary sources are now also all over the internet.

So researching and writing became a lot easier, but so did learning languages, especially after the launch of Duolingo in 2012, as well as the realization that it’s possible to set devices like phones and computers into other languages — and that cars have radios, which make possible both language-learning podcasts over modern tech or, depending on language, radio stations in the target language via old tech.

So those of us with computers, tablets, phones, or other devices, have access to the biggest research library ever assembled. It definitely dwarfs the fabled Library of Alexandria, and most likely has a lot more material than the Library of Congress — which would fit on ten single terabyte hard drives, by the way.

And it’s not just books and stuff like that. It’s full of music, movies, photos, and everything else that humans have left in their wake, all of it there to access either for free or for a nominal fee.

So if we make it through this Annus Horribilis of 2020, then maybe we’ll make it further and continue to see technology make leaps and bounds that our grandparents could never have even imagined.

Friday Free-for-all #29: More questions

This originally started as me answering one random question generated by a website, but the questions eventually got to the point where they didn’t really need long answers. So, instead, it’s turned into a slow-motion interview with multiple queries. Here are this week’s questions. Feel free to give your own answers in the comments — or ask your own!

If you had to get rid of a holiday, which would you get rid of? Why?

Also known as “How to piss someone off.” There are so many possibilities, but I’m going to have to go for Christmas. But hear me out, because this is the opposite of what the “War on Christmas” people would think.

Yes, we need a solstice holiday for sure, and one that celebrates all beliefs because most cultures have a certain reverence for the winter solstice. But we need to do three things.

One: Give Christmas back to the Christians. Let them have it; they can celebrate however they want to in private, fine. The tradeoff is that we don’t need to mention it or memorialize it at all in the secular world, which, if you think about it, really just serves to diminish the religious meaning of the holiday in the first place. Next…

Two: We need to remove completely the idea that this winter holiday is all about buying each other shit that we don’t need. That was an invention of capitalism, and it is toxic. The idea of the winter holiday should be for groups of friends and family to variously gather together during the entire period between Thanksgiving and New Year’s for the sole purpose of being together.

Plague permitting, of course — but there are always virtual meetings.

But get together. Share a meal. Binge-watch a favorite show. Have a game night. Go hiking, or biking, or ice-skating, or to a museum. And agree to not exchange presents. Rather, exchange presence. Be there for each other, because that’s the real meaning of any holiday.

Three: Create your own private traditions, religious or secular, and share them. Reject the commercial crap that has been pushed on us for generations in the singular interest of making rich people richer. Sure, you can give someone you say you love a really expensive present, but in the end, that’s really pretty shallow. The greatest gift you can really give is yourself — your time, your attention, your love.

That’s what people need, that’s what they really want, and that’s what we should really be celebrating in the final month of every year.

What fad did you never really understand?

Although it’s been highlighted by the internet, especially since the rise of smart phones, it really isn’t anything new, but the idea of “challenges,” especially ones that can be physically dangerous, just boggles my brain-box.

The cinnamon challenge immediately comes to mind, and this one (like many of them) was actually dangerous. The idea was for someone to video themself swallowing a spoonful of ground cinnamon. One big problem, though: that’s basically like shoving a shitload of dust in your mouth, and that stuff flies into the air at the slightest provocation.

Or, in other words, you suddenly have a cloud of dust in your mouth and flying down your throat, and it also tends to clump when it gets wet (as your mouth and throat are wont to be), and so you can also suddenly wind up with very viscous clumps of spice jamming up your airways or even loose dust going into your lungs.

No matter which way, it’s not a great combo at all — and it can be fatal.

So can other stunts, like jumping out of a moving car and dancing next to it to a track by Drake. If that’s too easy, there’s always the internet fire challenge, which is just what it sounds like, and just as stupidly dangerous. If you don’t like fire, then you can always try the hot water challenge.

And there are many, many more. But, again, taking stupid dares is nothing new. Stupid human tricks from the past perpetrated by our grandparents, great-grandparents, and even great-greats included things like phone booth stuffing, swallowing live goldfish, sitting on poles (the object, not the nationality), or walking on the wings of airplanes.

Some of these still happen, by the way. But the nutshell answer to the original question is that I don’t understand any fad that involves a bunch of people doing really stupid and dangerous shit just to get attention.

What inanimate object do you wish you could eliminate from existence?

Guns. And in the broadest sense of the word — pistols, side-arms, handguns, rifles, shotguns… Okay, let’s shorthand it to “all ballistic weapons.” Note that this does not include useful ballistics, without which we could not put astronauts into space.

Ironically, it’s a way to make humanity more civilized by making us more primitive. You want to kill someone? Then do it the old-fashioned way — hand-to-hand, close quarters, or with a pointy weapon that has a range of one to four feet, depending on what you’re wielding.

Slingshots and bows and arrows are probably somewhat acceptable, but we’d need to determine rules of engagement on where we can aim — slings never at the head, arrows never at the head, throat, or torso.

You can stop someone with a club or a sword at close range and, provided that you also aim for those stopping points without aiming for fatalities (see above), you’re only going to put them down, not out, so everyone lives — even you, who felt threatened enough to draw that weapon.

Or you can shoot an unarmed father of three in the back seven times for absolutely no discernible reason. And that is only one of way-too-many reasons that yes, we need to take these dick-compensators out of the hands of man-babies who absolutely don’t need them.

Sunday Nibble #4

While the internet as most people know it really was born in 1989 and didn’t explode until 1993, the underlying network was born in 1969, with the first (failed) outside attempt to log on to what started as a military network designed to survive nuclear war. And while Al Gore was derided for having claimed to invent the internet, A) he never said that, and B) he actually was instrumental in crafting legislation that led to what we have today.

Also, the internet, like GPS, was a majorly expensive government program originally designed for the military that we wound up getting back because great minds said, “Hey. The people paid for this shit, and it’s really useful, so hand it over.”

But the real point of this nibble is to remind you that today, February 16, is a hidden but important anniversary in the history of the internet. It’s considered to be the birthdate, in 1978, of the first computerized bulletin board system, or CBBS, the precursor to BBSs (the “internet” of the 80s through early 90s) as well as a head-start on the whole concept of HTML and creating a mark-up language in order to allow different computers with different operating systems in different parts of the world to “talk” to each other.

The first CBBS was basically a glorified answering machine, one user at a time via a dial-up modem that must have been painfully slow compared to now. But it got the ball rolling, and it was created by a couple of dudes who were in their early 30s at the time but who, ironically, would be derided as boomers now. Well, at least the one who’s still alive. The (recently) dead one, not so much.

So as you have your morning avocado toast, or breakfast scramble, or latte to go, or whatever it is you nosh on early on a Sunday morning, just keep in mind that today is a milestone — one among many — that led directly to your ability to read this on your phone while your car tells you how to get to where you’re going.

Whee!