The Information Age? Part one

This was originally going to be a single piece but, as often happens, I got so into the subject at hand that I had to split it. The second installment will appear tomorrow.

“When did Camilla become the queen?” a voice behind me in line at Ralphs suddenly blurts out. The clerk and I look over to see a thirtyish woman who otherwise seems well-off and intelligent holding up a magazine with a cover story title along the lines of “Elizabeth and Camilla face off over the crown!”

The course of that conversation was rather illuminating for me — and I hope for the woman — but before I can get to that part, let’s take a little detour, shall we?

The so-called Information Age, also called the Computer Age, among other names, began around the 1970s, and its major hallmark was that the gathering, processing, and dissemination of information was rapidly moving away from analog media and devices — snail mail, paper files, and typewritten documents among them.

Telephone landlines were also moving away from analog encoding and decoding of signals, as well as from rotary dials that literally told the switches which digits you’d dialed by the number of clicks.

I could swear I’ve written about this here before, but I can’t find the link. In any case, this “number of clicks” business is why the most populous regions at the time area codes were introduced got the ones with the fewest clicks: 212, 312, and 213, going respectively to New York City, Chicago, and Los Angeles.

Digital technology had existed for decades by this point (the transistor itself was first demonstrated in 1947), but the earliest computers were huge — the size of entire rooms — very expensive, and ran ridiculously hot, because they used vacuum tubes instead of integrated circuits, which didn’t exist yet anyway.

In addition, they were programmed either by physically flipping switches to set parameters or, in the more “advanced” models, by feeding data into them — and reading it out — using either punch-encoded paper tape or the infamous punch card, yards of the former and tons of the latter. To store just one gigabyte of data would have taken over 16 million typical punch cards, at 64 bytes per card.

That’s bytes, using the 8-bit standard. The highest eight-digit binary number is 1111 1111, which equals 255. Count zero itself, 0000 0000, as a value too, and you get 256 possibilities, which is why the number 256 is so important in digital technology.

In contrast, a 16-bit system gives you 65,536 possible values, which happens to be the number of colors in the old 16-bit “high color” display modes of eons ago. Go much beyond that and the numbers get big enough to call for scientific notation: a 64-bit system gives you about 1.84×10^19.

These are hard limits on how big a number a system of a given bit width can represent and work with at one time, as well as on how many binary digits can move across its data bus in one go.
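If you want to check that math yourself, here’s a quick sketch in Python that reproduces the figures above, including the 64-bytes-per-card assumption. (The code is mine, purely as an illustration.)

```python
# How many distinct values an n-bit binary number can hold, plus the
# punch-card arithmetic from above (64 bytes per card assumed).

def values_for_bits(n: int) -> int:
    """Number of distinct values an n-bit binary number can represent."""
    return 2 ** n

print(values_for_bits(8))                  # 256 (0000 0000 through 1111 1111)
print(values_for_bits(16))                 # 65,536
print(f"{values_for_bits(64):.2e}")        # ~1.84e+19

# Punch cards needed to store one gigabyte:
gigabyte = 2 ** 30                         # 1,073,741,824 bytes
bytes_per_card = 64
print(gigabyte // bytes_per_card)          # 16,777,216 -- over 16 million cards
```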

The likely reason that the Information Age finally took off is directly tied to the birth of what would become the Internet in October 1969, which kept developing on into the 1980s. At the same time, the first small-ish computers that were affordable — at first to businesses, and then eventually to more affluent consumers — started to come onto the market.

Those early machines didn’t do a whole lot, but they created the first generation to go digital: Gen X. To this day, that seems to be the dividing line, and I know very few Boomers who seem comfortable with computers if they were never exposed to them during their professional lives.

I’m not generalizing about Boomers, though — I know Millennials who couldn’t even manage to turn on a laptop, and I’ve met octogenarians who can work their way around a computer like an expert.

And then, about a decade after the first really affordable personal computers, the Internet happened. Well, technically it was the World Wide Web (WWW) that happened, although people didn’t tend to distinguish between the two back then.

The short version is that the WWW is the stuff you see online — the web pages that actually come to your machine. The Internet, meanwhile, is the vast network of computers that hosts the WWW, along with lots of other stuff, and which makes the magic happen that gets a document on somebody’s private server in Tierra del Fuego through an elaborate route that can sometimes cross the entire planet before it shows up on your laptop while you’re on vacation in Iceland, usually in under a second.

Another way to think of it is that the WWW is the letters and packages, and the Internet is the postal service. Of course, anyone who refers to the World Wide Web nowadays will just get looked at funny. There’s a reason that the “www.” part of a URL hasn’t been required for a long, long time.
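For the technically curious, here’s a tiny Python sketch of that split in action: the socket connection is the postal-service part, and the HTTP request and response are the letter. It uses plain old HTTP against example.com just to keep things short; real sites today expect HTTPS.

```python
import socket

# The "postal service": open a TCP connection across the Internet.
with socket.create_connection(("example.com", 80), timeout=10) as conn:
    # The "letter": a minimal HTTP/1.1 request for a web page.
    conn.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
    reply = b""
    while chunk := conn.recv(4096):
        reply += chunk

# The first couple hundred bytes: the status line and headers of the response.
print(reply.decode("utf-8", errors="replace")[:200])
```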

End result? The Information Age, which has been a wonderful thing.

If I’d tried to write this article, or something like it, in 1970 with all of the above information, it would have involved one or more trips to a physical library (or to several), lots of manual searches, and pulling books off the shelves to look for information in a very analog way — by reading them.

Notes would have been taken by hand, or made on the library’s copiers, paying per page. And if said information weren’t in the library, I would have somehow had to find someone with expertise in the field and either (gasp!) call them on the phone, or send them a neatly typed letter nicely asking for info, then wait however long it took them to respond — if they ever did.

I could never have adulted in that age. So that’s the upside of The Age of Information.

But there are downsides, too: one having to do with the quantity (and quality) of information, and another having to do with a frequent lack of engagement, which is not unrelated to the first.

I have a terabyte hard drive in my computer, and another terabyte external drive connected to it. That’s a ridiculously huge amount of storage, something we couldn’t even have conceived of needing back in the 1990s.

Now, multiply those two drives of mine by five to bring it to ten terabytes, and that’s enough storage space to hold the entire collection of the U.S. Library of Congress, digitized.

The amount of information available via the Internet dwarfs that number by a lot. It’s been estimated that just the “Big Four” — Google, Amazon, Microsoft, and Facebook — store at least 1,200 petabytes of data.

Note that it’s not clear from my source whether this data includes YouTube and Instagram under the umbrellas of Google and Facebook respectively.

A petabyte is one step up from a terabyte (1,000 terabytes), so that estimate works out to 1.2 million terabytes — and it doesn’t even include all the other storage out there for everyone else. Granted, a lot of smaller sites contract with Google, Amazon, or Microsoft for cloud storage, but I’m sure that many of them don’t.

Have an email account that isn’t with Google or Microsoft? Multiply the storage your provider allots per user by the number of users and add that in. Toss in all the university and library servers, as well as private industry servers — banking, real estate, finance, healthcare, retail, manufacturing, media.

And then don’t forget publicly accessible government servers which, in every country, run from the national level down through regional and administrative levels to the local one. In the U.S., that’s federal, state, county, city. The U.S. has 50 states and well over 3,000 counties, and a county can contain any number of cities, townships, unincorporated areas, boroughs, parishes, or whatever.

It all adds up.
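If you like seeing the arithmetic, here’s a quick Python sketch using the figures above: decimal units, my two drives, the ten-terabyte Library of Congress estimate, and that Big Four number. (These are the estimates cited in the text, not anything I’ve measured.)

```python
# Back-of-the-envelope storage math, using decimal units
# (1 TB = 1,000 GB, 1 PB = 1,000 TB).

TB = 1_000_000_000_000            # bytes in a terabyte
PB = 1_000 * TB                   # bytes in a petabyte

my_drives = 2 * TB                # internal drive plus external drive
library_of_congress = 10 * TB     # the digitized-LoC estimate cited above
big_four = 1_200 * PB             # the "Big Four" storage estimate

print(library_of_congress / my_drives)        # 5.0 -- five of my setups
print(big_four / TB)                          # 1,200,000 -- 1.2 million terabytes
print(big_four / library_of_congress)         # 120,000 Libraries of Congress
```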

Again, on the one hand, it can be the greatest thing ever. I certainly love it for writing and researching, because when I want to create a link, I just need to tap a couple of keys and do a quick search.

Since I’m a stickler for getting it right if I happen to be writing a period piece, and I want to know what the weather was in a certain place on a certain day, or what was on TV on a certain day and time, boom, done. The information is out there.

Nerd stuff. But that’s what I’m into, that’s what I love to do, and I know how to filter — as in which news and websites to ignore, which to use with caution, and which to trust.

Speaking of which, the main thing that Wikipedia is good for is a broad overview, but if you want the real story, always follow their external links to sources. Yes, I will link to them if I’m only going to give a superficial dash of info, especially if it involves pop culture, but they’re the card catalog, not the book, to dredge up an old analog metaphor.

But we live in a paradox in which information very much fits the old line from The Rime of the Ancient Mariner: “Water, water, everywhere, nor any drop to drink.”

That line refers to being stranded at sea: surrounded by water, but all of it salt water, far too salty to drink.

Modern version: “Info info everywhere, oh fuck, I’ve got to think?”

A lot of people, for various reasons, don’t or can’t take the time to filter, and each of us gets bombarded with more in an hour than, say, our grandparents would have been in a week when they were our age.

The interesting part is that analog media — like checkout lane magazines and TV and radio ads — are still very much a part of the mix. So back to the story I started with.

(To be continued…)

Image source: old computer that used punch cards, at the London Science Museum. (CC) BY-SA 4.0, via Wikimedia Commons.

Friday Free-for-All #16

In which I answer a random question generated by a website. Here’s this week’s question. Feel free to give your own answers in the comments.

What piece of technology brings you the most joy?

This one is actually very simple: the lowly but very important integrated circuit, or IC. An IC combines a host of functions previously performed by much larger, separate components — mostly transistors, resistors, and capacitors — onto a single chip, which makes possible all sorts of tiny devices: logic gates, microcontrollers, microprocessors, sensors, and on and on.

In the old pre-IC days, transistors, resistors, and capacitors all existed on a pretty large scale, as in big enough to pick up with your fingers and physically solder into place.

Before that, old-school “integrated circuits” were big enough to hold in your hand and resembled very complicated lightbulbs. These were vacuum tubes, and each one essentially performed the same function as a transistor: either an amplifier or a switch. And yes, they were considered analog technology.

The way vacuum tubes worked was actually via heat. A piece of metal would be warmed up to release electrons, which was also the reason for the vacuum: with no air molecules to get in the way, the electrons could flow from one end (the cathode) to the other (the anode), while the current was said to flow in the other direction. (Not a typo. It’s a relic from an early misconception about how electricity works that was never corrected.)

The transition away from vacuum tubes to transistorized TV sets began in 1960, although the one big vacuum tube in the set — the TV screen itself — stuck around until the early 2000s.

But back to the vacuum tube function. Did it seem off that I described transistors as either amplifiers or switches? That’s probably because you might think of the former in terms of sound and the latter in terms of lights, but what we’re really talking about here is voltage.

Here’s the big secret of computers and other modern electronic devices. The way they really determine whether a bit value is 0 or 1 is not via “on” or “off” of a switch. That’s a simplification. Instead, what they really use is high or low voltage.

Now, granted, those voltages are never all that “high,” being just a few volts at most, but the point is that it’s the transistor that either boosts a voltage before passing it along, or decides which of an A/B input gets passed along which circuit.

Meanwhile, resistors are sort of responsible for the math, because they either knock down currents, so to speak, or let them pass as-is. Finally, capacitors are analogous to memory, because they store a received charge for later use.

Put these all together, and that’s how you get all of those logic gates, microcontrollers, microprocessors, sensors, and on and on. And when you put all of these together, ta-da: electronics.
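To see how those pieces snowball into logic, here’s a toy sketch in Python (not real electronics, obviously, just the logic) that treats 1 as “high voltage” and 0 as “low,” builds a NAND gate, and then builds other gates out of nothing but NANDs.

```python
def nand(a: int, b: int) -> int:
    """Output goes low only when both inputs are high."""
    return 0 if (a and b) else 1

def not_(a: int) -> int:
    return nand(a, a)

def and_(a: int, b: int) -> int:
    return not_(nand(a, b))

def or_(a: int, b: int) -> int:
    return nand(not_(a), not_(b))

# Truth table for OR, built entirely from NAND gates:
for a in (0, 1):
    for b in (0, 1):
        print(a, b, or_(a, b))    # 0 0 0 / 0 1 1 / 1 0 1 / 1 1 1
```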

These can be as simple as those dollar store calculators that run on solar power and can only do four functions, or as complicated as the fastest supercomputers in the world. (Note: Quantum computers don’t count here because they are Next Gen, work in an entirely different way, and probably won’t hit consumer tech for at least another thirty years.)

So why do ICs give me joy? Come on. Look around you. Modern TVs; LCD, LED, and OLED screens; eReaders; computers; cell phones; GPS; synthesizers; MIDI; CDs, DVDs, and Blu-ray; Wi-Fi and Bluetooth; USB drives and peripherals; laser and inkjet printers; microwave ovens; anything with a digital display in it; home appliances that don’t require giant, clunky plugs to go into the wall; devices that switch to or from DST on their own; most of the sensors in your car if it was built in this century; the internet.

Now, out of that list, a trio stands out: computers, synthesizers, and MIDI, which all sort of crept into the consumer market at the same time, starting in the late 70s and on into the 80s. The funny thing, though, is that MIDI (which stands for Musical Instrument Digital Interface) is still around and mostly unchanged. Why? Because it was so incredibly simple and robust.

In a way, MIDI was the original HTML — a common language that many different devices could speak in order to reproduce information in mostly similar ways across platforms and instruments. It started with sixteen channels, and it’s proven to be a ridiculously robust and backwards-compatible system.
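Just to show how simple that common language is, here’s a sketch (in Python, purely as an illustration) of what a single Note On message looks like as raw bytes: three bytes total, with the channel packed into the status byte and the two data bytes limited to seven bits each.

```python
def note_on(channel: int, note: int, velocity: int) -> bytes:
    """Build a raw MIDI 1.0 Note On message (channel 0-15, data values 0-127)."""
    assert 0 <= channel <= 15 and 0 <= note <= 127 and 0 <= velocity <= 127
    status = 0x90 | channel           # 0x90-0x9F = Note On on channels 1-16
    return bytes([status, note, velocity])

print(note_on(0, 60, 100).hex())      # '903c64' -- middle C on channel 1
```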

Over time, the number of channels and the bit depth have increased, but a MIDI keyboard from way back in the early 80s will still communicate with a device using MIDI 2.0. You can’t say the same for, say, storage media and readers from different time periods. Good luck getting that early-80s 5.25-inch floppy disk to work with any modern device.

What’s really remarkable is how much data MIDI can transfer in real time. Even more amazing is that it has been adapted to more than just musical instruments. It can also be used for things like show control, meaning that a single MIDI system runs the lights, the sound systems and, in some cases, even the practical effects in a concert or stage production.

And, again, while MIDI 1.0 was slowly tweaked over time between 1982 and 1996, it still went almost 25 years before it officially jumped from version 1.0 to 2.0, in January 2020. By comparison, Windows 1.0 was released on November 20, 1985, although it was really just an overlay on MS-DOS, and it lasted only until December 9, 1987, when Windows 2.0 came out, which was also around when Word and Excel first came to Windows.

Apple has had a similar history with its OS, and in about the same period of time that MIDI has been around, both of them have gone through ten versions with lots of incremental changes along the way.

Now, granted, you’re not going to be doing complex calculations or spreadsheets or anything like that with MIDI, and it still doesn’t really have a GUI beyond the independent capabilities of the instruments you’re using.

However, with it, you can create art — anywhere from a simple song to a complex symphony and, if you’re so inclined, the entire stage lighting and sound plot to go along with it.

And the best part of that is that you can take your musical MIDI data, put it on whatever kind of storage device is currently the norm, then load that data back onto any other MIDI device.

Then, other than the specific capabilities of its onboard sound-generators, you’re going to hear what you wrote, as you wrote it, with the same dynamics.

For example, the following was originally composed on a fairly high-end synthesizer with really good, realistic tone generators. I had to run the MIDI file through an online MIDI to audio site that pretty much uses the default PC cheese-o-phone sounds, but the intent of what I wrote is there.

Not bad for a standard that has survived, even easily dumping its original 5-pin DIN plug and going full USB without missing a beat. Literally. Even while others haven’t been able to keep up so well.

So kudos to the creation of ICs, and eternal thanks for the computers and devices that allow me to use them to be able to research, create, and propagate much more easily than I ever could via ancient analog techniques.

I mean, come on. If I had to do this blog the old way, I’d be typing everything out on paper, using Wite-Out or other correction fluid constantly to fix typos, then deciding whether it was worth having it typeset and laid out (probably not), and debating whether to have it photocopied or mimeographed.

Then I’d have to charge all y’all to get it via the mail, maybe once a month — and sorry, my overseas fans, but you’d have to pay a lot more and would probably get it after the fact, or not at all if your postal censors said, “Oh, hell noes.”

Or, thanks to ICs, I can sit in the comfort of my own isolation on the southwest coast of the middle country in North America, access resources for research all over the planet, cobble together these ramblings, and then stick them up to be blasted into the ether to be shared with my fellow humans across the globe, and all it costs me is the internet subscription fee that I would pay anyway, whether I did this or not.

I think we call that one a win-win. And if I went back and told my first-grade self, who was just having his first music lessons on a decidedly analog instrument, “in a couple of years, science is going to make this a lot easier and more interesting,” he probably would have shit his pants.

Okay. He probably would have shit his pants anyway. Mainly by realizing, “Wait, what. You’re me? Dude… you’re fucking old!”

Oh well.

Image (CC BY 3.0) by user Mataresephotos.

 

Don’t make it rocket science when it’s not

So many tools

It never ceases to boggle my mind when people don’t jump on the chance to learn and fully take advantage of the amazing modern tools we’ve been handed and which are ubiquitous. If you work in any kind of office environment at all, whether it’s some stodgy traditional business or a bleeding-edge industry like tech or gaming, at the very least you’re dealing with either Microsoft’s Word, Excel, Outlook, etc., or the Apple equivalents.

If you’re using OpenOffice or the browser-based cloud versions, then this piece probably isn’t directed at you, because you definitely get it. But otherwise… really, people? These are literally the things you use every day, and yet very few people ever seem to progress beyond the most basic ability to use any of these programs.

That is: Open document, type shit with defaults, save or send as-is.

If I open a spreadsheet you’ve worked on in an older version of Excel and see three tabs at the bottom named Sheet1, Sheet2, and Sheet3, I will know that you’re an amateur. Likewise if the font is set to that hideous Calibri. Same thing in Word, minus the tabs: same crappy font, left-aligned with a ragged right edge, with auto-spacing before paragraphs or lines.

Word to the wise, people: the first thing you should do in Word is go in and change your default formatting so that the auto-spacing before lines and paragraphs is 0 and the line spacing is single.
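If you’d rather script that cleanup than dig through Word’s dialogs every time, here’s a rough sketch using the third-party python-docx library. This assumes the library is installed, and the file name and replacement font are just placeholders.

```python
# pip install python-docx
from docx import Document
from docx.shared import Pt

doc = Document("report.docx")             # hypothetical example file
normal = doc.styles["Normal"]

# Kill the auto-spacing and force single line spacing on the default style.
pf = normal.paragraph_format
pf.space_before = Pt(0)
pf.space_after = Pt(0)
pf.line_spacing = 1.0

# And while we're at it, swap the default font away from Calibri.
normal.font.name = "Times New Roman"
normal.font.size = Pt(12)

doc.save("report-clean.docx")
```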

Why is paper still a thing?

But all of this is just an intro to a recent bit of heinousness, and it’s this: I’ve managed to stumble into a situation where a lot of coworkers prefer to do things on paper, and it makes me nuts. Simple question: why? Physical files can only be in one place, usually aren’t in the place where they’re supposed to be, and there’s no magic search function to find them other than somebody maybe remembering that they worked on the file recently and where they put it. There’s also no such thing as a standardized, legible font for handwriting, so if someone scribbles a note in that file, there’s no guarantee that someone else will be able to read it six months later.

Not to mention that it’s just wasteful. Especially wasteful when there are so many ways to avoid it and so many resources to make that easier.

Case in point: One of the things I do regularly is enter and reconcile commission statements from various vendors, but I’ve had to do it by printing the things, manually entering the data into a spreadsheet, and then doing a careful audit to fix the inevitable errors, since some of these run to hundreds of entries.

But then I figured out how to pull the data directly from the statement, drop it into Excel, format it, and then use a few formulae to pull the new info into the old spreadsheet. The great advantage is that it uses the original data directly, so there are no entry errors to deal with. The second pass then just involves putting a copy of the original statement data side by side with the data pulled in by formula, using a few more formulae to spot discrepancies due to differences in how names were spelled, making a few tweaks, and reconciling the whole thing a lot faster than before.

Pre-paperless innovation, a big statement could take me a few days (interspersed among all the other office duties) to finally balance to zero. New method? I made it through four statements in one day, each one entered and balanced in two steps instead of about six.

The thing is, this isn’t really all that difficult, and anybody could learn to do it. Among the big helps in this process were the Excel functions INDEX and MATCH (which I’ll explain in a future post), and it took all of a two-minute Google search and a read through the first good link to figure out how they worked and how to do what I needed: compare the client’s first and last names and insurance plan type in one table in order to pull a specific number out of another. And that is literally all you need to do to learn how to make your office tools work for you.
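For anyone curious what that lookup looks like outside of Excel, here’s a rough equivalent in Python using pandas. The column names and numbers are made up for illustration, but the idea is the same as INDEX/MATCH (roughly =INDEX(amount_column, MATCH(lookup_value, key_column, 0)), extended to three keys): match on the keys, pull the number, and flag the differences.

```python
import pandas as pd

# The vendor's statement, pulled straight from the source document.
statement = pd.DataFrame({
    "first": ["Ann", "Bob"],
    "last": ["Lee", "Cho"],
    "plan": ["PPO", "HMO"],
    "commission": [125.50, 89.25],
})

# The master spreadsheet being reconciled against it.
master = pd.DataFrame({
    "first": ["Ann", "Bob"],
    "last": ["Lee", "Cho"],
    "plan": ["PPO", "HMO"],
    "expected": [125.50, 90.00],
})

# Match rows on all three keys, then flag any differences to review.
merged = master.merge(statement, on=["first", "last", "plan"], how="left")
merged["difference"] = merged["expected"] - merged["commission"]
print(merged[merged["difference"] != 0])    # only the rows that need a second look
```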

Try it. Google “change the default font in Word,” or “turn off auto-correct in Word,” or “alternatives to VLOOKUP in Excel,” or any one of a number of other topics, and you’ll find the answers. It really isn’t any more complicated than reading a cookbook and making food from a recipe. Really, it’s not.

Using computers made easy

There is too much of an aura of mystery around computers, but trust me, they are simpler than you think — and I’ve been working with them for… well, for most of my life, because I was just born at the right time. All that they ultimately understand are “off” and “on.” Zero and one. Those are the only two states a switch can be in, that is what digital computing is, and it only gets two digits.

Maybe someday I’ll write a bit about how the electrons inside do what they do and turn it into intelligible information for humans, but for now suffice it to say that they pretty much only do a few things — input, store, and retrieve data through various devices; allow you to manipulate that data with various software programs; then allow you to re-store and output that data, again through various devices.
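To see the “only two digits” idea in action, here’s a tiny illustration in Python: even a scrap of text is ultimately stored as numbers, and those numbers as patterns of zeros and ones.

```python
text = "Hi"
data = text.encode("utf-8")                   # the bytes a computer actually stores
print(list(data))                             # [72, 105]
print([format(b, "08b") for b in data])       # ['01001000', '01101001']
```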

The nice thing about graphical user interfaces (GUIs) like Windows, macOS, Android, Linux, etc., is that they tend to standardize things across the programs written for them, so that every program tends to use the same conventions for the basics: Open, Close, Save, Save As, Print. Programs of the same type will also follow the same conventions — Format, Spellcheck, View, Layout, etc., for text editors; Image, Layer, Select, Filter, Effect, etc., for graphic design programs; Insert, Formulas, Calculate, Data, Sort, etc., for spreadsheets.

Finally, almost every program will have a Help function, whether it’s invoked via the F1 key or by some combination of a control/alt/Apple/shift-click plus H. Help menus, when well done, are great and, guess what? They’re basically the hyperlinked documents we’ve all come to know and love via the web, except that they’ve been around since before the World Wide Web existed. Most of the time they’ll answer the question but, if they don’t, you can always google it, as I mentioned above.

How to create job security

You may be wondering, “Okay, if my job is just doing data entry, or writing emails, or accounting, or… etc., why do I need to know so much about the software when no one else does?”

Simple. As the economy moves more and more toward service, knowledge becomes value. If you’re the one in the office who gets a reputation as the computer expert, you will get noticed, and you will save a higher-up’s cookies more than once. You’ll also earn the attention and gratitude of your co-workers if you become the one they come to when “I did something and something happened and I don’t know how to fix it,” and you know immediately upon looking that they accidentally, say, set Word to Web Layout instead of Print Layout. It’s called creating job security by taking that extra simple step that too many people refuse to take. Try it!

Image Source: NASA, Apollo 11.