Wednesday Wonders: A busy day in space

Happy New Year! And happy first day of spring!

Wait, what… you say those things aren’t today, March 25th? That the latter was six days ago and the former was almost three months ago?

Well… you’d be right in 2020, but jump back in history to when the Julian calendar was still around, and things were dated differently. This led to the adoption of the new Gregorian calendar, but since it was sponsored by the Pope, not everyone switched over right away. Long story short, Catholic countries like Spain, Portugal, and Italy adopted it immediately in 1582. Protestant countries held out, so that places like England (and the colonies) didn’t switch until 1752.

That was also when England moved New Year’s day back to January 1, which is itself ironic, since it was the Catholic Church that moved the day from January 1 to March 25 at the Council of Tours in 567, considering the prior date pagan. Which was probably accurate, since the Romans themselves had moved New Year’s from March to January 1st centuries earlier, back in 153 BCE, when their consuls began taking office on that day.

The practical reason for switching calendars was that the Julian calendar gained about 11 minutes a year on the actual solar year, which added up: by 1582, the calendar had drifted a full ten days out of step with the seasons. (Its pre-Julian predecessor was far worse, needing entire extra months stuffed in between years to set things right again.) The Gregorian calendar is much more accurate, although a bit over 3,000 years from now it will have lost a day.
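
To put numbers on it, here’s a quick back-of-the-envelope check in Python. The 11-minute figure is just the gap between the Julian year of 365.25 days and the tropical year of roughly 365.2422 days:

```python
# Julian year vs. tropical (solar) year, in days
julian_year = 365.25
tropical_year = 365.2422

# Annual drift: about 11 minutes
drift_minutes = (julian_year - tropical_year) * 24 * 60
print(f"Drift per year: {drift_minutes:.1f} minutes")   # ~11.2

# A full day of error accumulates roughly every 128 years
print(f"One day off every {1 / (julian_year - tropical_year):.0f} years")

# From the Council of Nicaea (325 CE), which pinned down the date of
# Easter, to the Gregorian reform of 1582:
days_off = (1582 - 325) * (julian_year - tropical_year)
print(f"Accumulated drift by 1582: {days_off:.1f} days")  # ~9.8, the famous ten days
```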

By the way, the religious reasoning for picking March 25th is that it was the Feast of the Annunciation, meaning the day that the Archangel Gabriel appeared to Mary to let her know that she was going to get knocked up by god — although it doesn’t get mentioned canonically until a century after the ol’ calendar switch-a-roo.

Anyway, the math isn’t hard to do. March 25th is exactly nine months before Christmas. And in the traditional reckoning, the former marked the first day of spring and the latter the first day of winter. Psychologically, at least, the Vernal Equinox, which now falls closer to the 19th or 20th, is the better New Year’s Day option because it’s when days start to get longer than nights, vegetation starts to grow anew, and nature awakes from its slumber.

Note: Your mileage in 2020 may vary.

It’s kind of ironic, then, that today marks the birth of a German astronomer and mathematician, Christopher Clavius, who was instrumental in calculating just how far the Julian calendar had drifted, and then in devising a calendar to fix it and a method to transition.

This is where the Catholic Church came into it, because Easter, being a moveable feast based on the Julian lunar calendar, had been slipping later and later into the year, threatening to move from the spring to summer. Clavius’s job was to bring it back toward the vernal equinox.

He succeeded to the degree of accuracy noted above — only a day off in 3,236 years. Not bad. This was also when New Year’s Day went back to January 1st, per the old Roman style, and while this is attributed to Pope Gregory XIII, I can’t help but think that Clavius had a hand in implementing the change.

I mean, come on. You’re handed a chance by the most powerful person in the western world at the time to move a major holiday off of your birthday so that your day is finally special on its own? Who wouldn’t do that given the power?

Good ol’ Chris did make other discoveries and get some nice presents, like a crater on the moon named after him, as well as the moon base in the movie 2001.

Still, even if the equinox did move away from March 25, the date still keeps bringing special things for astronomers. It was on this day in 1655 that the Dutch physicist and astronomer Christiaan Huygens discovered Saturn’s largest moon, Titan.

Huygens also has another time connection, though. Where Clavius gave us a calendar accurate to over 3,000 years, Huygens gave us a clock that was the most accurate for the next 300 years. His innovation? Put a pendulum on that thing and let it swing. He literally put the “tick tock” in clock.

Why was this possible? Because the swing of a pendulum follows the rules of physics and, for small swings, is almost perfectly periodic. Even as friction and drag slow it down, it covers a shorter arc at a proportionally slower pace, so that the time between tick and tock remains the same.
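
The relationship behind that is simple enough to sketch in a few lines of Python, using the standard small-angle formula for a simple pendulum:

```python
import math

def pendulum_period(length_m: float, g: float = 9.81) -> float:
    """Small-angle period of a simple pendulum, in seconds."""
    return 2 * math.pi * math.sqrt(length_m / g)

# Note that amplitude appears nowhere in the formula: a short arc and a
# slightly longer arc take the same time, which is the whole trick.
# A "seconds pendulum" (one second per swing, two per full cycle) needs
# to be just under a meter long, which is why longcase clocks are the
# height they are.
print(f"{pendulum_period(0.994):.3f} s")  # ~2.000
```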

The pendulum itself would advance a gear via a ratchet that would turn the hands of the clock, and adding energy back into that pendulum was achieved through a wound spring or, in many designs, a slowly falling weight, which is where that whole “winding the clock” thing came in. As the spring unwinds, it drives that gear every time the pendulum briefly releases it, but thanks to physics, that pendulum will always take the same time to swing from A to B, whether the arc is wide or narrow.

Back to Huygens’s discovery, though… Titan is quite a marvel itself. It is the second largest natural satellite in our solar system, taking a back seat (ironic if you know your mythology) only to Jupiter’s Ganymede. It is half again as big as our own Moon and 80% more massive. It’s even bigger than the planet Mercury, but only 40% as massive, mainly because Mercury is made of rock, while Titan may have a rocky core but is mostly composed of layers of different forms of water-ice combined with ammonia, plus a possible sub-surface ocean.

Titan also has a thick, nitrogen-rich atmosphere, the only other atmosphere in the solar system besides Earth’s to have so much nitrogen in it. In case you’re wondering, Earth’s atmosphere is almost 80% nitrogen — OMG, you’re breathing it right now! But this also makes the aliens’ Achilles heel in the movie Mars Attacks! kind of ridiculous, since the whole deal was that they could only survive in a nitrogen atmosphere. We have that, Mars doesn’t. Mars is mostly carbon dioxide, but not even much of that. But don’t get me started.

Despite all that, it’s still a fun film.

And Titan, next to Jupiter’s moon Europa, is one of the more likely places we might find life in our solar system.

One final bit of March 25th news in space for this day: In 1979, OV-102, aka Space Shuttle Columbia, was delivered to NASA. It was the first spaceworthy shuttle completed (the earlier Enterprise never flew in space), and its delivery date, after a flight that had begun on March 24th, came four years to the day after fabrication of the fuselage began. Sadly, it was also the last shuttle to not survive its mission, so there was a strange sort of symmetry in that.

While I warned you about the Ides of March, the 25th should be full of nothing but anticipation, even in a plague year. It’s a date for exploration and discovery, whether out into the cosmos, or within the confines of whatever space you’re in right now. Make good with what you have, create all you can, and take advantage of our wonderful technology to share and connect.

After all, that’s what worked for Clavius and Huygens. They worked with the tech they had, then networked once they had an idea, and look how well that worked out.

Hint: It worked out very well, for them and for us.

Image Source: Titan, by NASA.

Momentous Monday: Backwards and in high heels

The famous astronomer Herschel was responsible for a lot of accomplishments, including expanding and organizing the catalog of nebulae and star clusters, discovering eight comets, polishing and mounting mirrors and telescopes to optimize their light-gathering powers, and keeping meticulous notes on everything.

Herschel was the first woman to be awarded a Gold Medal of the Royal Astronomical Society, the first to be named an honorary member thereof, and the first to hold a government position and receive a salary as a scientist.

What? Did you think I was talking about the other one? You know — the only one most of you had heard of previously because he discovered Uranus. Oh, and he had that whole penis thing going on.

Caroline Lucretia Herschel, who was William’s younger sister by eleven years and was born on this day 270 years ago, did not have a penis, and so was ignored by history. Despite the honors she received, one of her great works, the aforementioned catalogue of nebulae and clusters that later became the basis of the New General Catalogue (NGC), was published with her brother’s name on it.

If you’re into astronomy at all, you know that the NGC is a big deal and has been constantly updated ever since.

While she lacked William’s junk, she shared his intellectual curiosity, especially when it came to space and studying the skies. It must have been genetic — William’s son John Herschel was also an astronomer of some repute — and it was his Aunt Caroline, not Dad, who gave him a huge boost.

She arranged all of the objects then in the NGC so they were grouped by similar polar coordinates — that is, at around the same number of degrees away from the celestial poles. This enabled him to systematically resurvey them, add more data about them, and discover new objects.

Caroline was not the first woman in science to be swept under history’s rug by the men. The neverending story of the erasure of women told in Hidden Figures was ancient by the time the movie came out, never mind the time that it actually happened. Caroline was in good company.

Maria Winckelmann Kirch, for example, was also an astronomer, born 80 years before Caroline and most likely the first woman to actually discover a comet. But of course history gave that honor to her husband, Gottfried Kirch, who was thirty years her senior. However, Herr Kirch himself confirms in his own notes that she was the one who found it:

“Early in the morning (about 2:00 AM) the sky was clear and starry. Some nights before, I had observed a variable star and my wife (as I slept) wanted to find and see it for herself. In so doing, she found a comet in the sky. At which time she woke me, and I found that it was indeed a comet… I was surprised that I had not seen it the night before”. [Source]

Maria’s interest and abilities in science came from a person we might think of as unlikely nowadays: a Lutheran minister, who happened to be her father. Why did he teach her? Because he believed that his daughter deserved the same education any boy at the time did, so he home-schooled her. This ended when Maria lost both of her parents when she was 13, but a neighbor and self-taught astronomer, Christoph Arnold, took her on as an apprentice and took her in as part of the family.

Getting back to Hidden Figures, though, one of the earliest “computers,” as these women of astronomy were known, was Henrietta Leavitt. Given what was considered the boring and onerous task of studying a class of stars known as Cepheid variables, she actually discovered something very important.

The length of time it takes a Cepheid to cycle from brightest to dimmest and back is directly related to its luminosity: the longer the period, the intrinsically brighter the star. This means that if you know the timing of that cycle, you know how bright the star really is. Once you know that, you can look at how bright it appears to be from Earth and, ta-da! Using very basic laws of optics, you can then determine how far away the star is.
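
In code, the whole trick fits in a few lines. A sketch, with the caveat that the period-luminosity coefficients below come from one published calibration for classical Cepheids, and that real measurements also have to correct for interstellar dust:

```python
import math

def cepheid_distance_pc(period_days: float, apparent_mag: float) -> float:
    # Period-luminosity relation (one published calibration):
    # absolute visual magnitude from the pulsation period.
    absolute_mag = -2.43 * (math.log10(period_days) - 1.0) - 4.05
    # Distance modulus: m - M = 5 * log10(d / 10 pc), solved for d.
    return 10 ** ((apparent_mag - absolute_mag + 5) / 5)

# Delta Cephei itself: period ~5.37 days, apparent magnitude ~3.95.
# Prints roughly 290 parsecs; parallax puts it nearer 270, the gap
# being mostly dust dimming the star and making it look farther away.
print(f"{cepheid_distance_pc(5.37, 3.95):.0f} pc")
```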

It’s for this reason that Cepheids are known as a “standard candle.” They are the yardsticks of the universe that allow us to measure the unmeasurable. And her boss at the time took all the credit, so I’m not even going to mention his name.

And this is why we have The Leavitt Constant and the Leavitt Telescope today.

No, just kidding. Her (male) boss, who shall still remain nameless here because, “Shame, shame,” took all of the credit for work he didn’t do, and then some dude named Edwin Hubble took that work and used it to figure out how far away various stars actually were, and so determined that the universe was A) oh so very big, and B) expanding. He got a constant and telescope named after him. Ms. Leavitt… not so much.

There are way too many examples of women as scientific discoverers being erased, with the credit being given to men, in every scientific field. You probably wouldn’t be on the internet reading this now if no one had ever come up with the founding concepts of computer programming, aka “how to teach machines to calculate stuff for us.”

For that, you’d have to look to a woman who was basically the daughter of the David Bowie of her era, although he wasn’t a very dutiful dad. He would be Lord Byron. She would be Ada Lovelace, who was pretty much the first coder ever — and this was back in the days when computers were strictly mechanical, in the form of Charles Babbage’s difference and analytical engines.

The former was pretty much just an adding machine, and literally one that could only do that. So, for example, if you gave it the problem “What is two times 27,” it would find the solution by just starting with two, and then adding two to it 26 times.
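
To be fair, the machine was cleverer than that one-trick description suggests: by the method of differences, the right starting values let addition alone tabulate any polynomial. A minimal sketch of the principle:

```python
# Tabulating f(x) = x^2 using nothing but addition, the way a
# difference engine does. For squares, the pattern is:
#   values:              1, 4, 9, 16, 25, ...
#   first differences:    3, 5, 7, 9, ...
#   second differences:    2, 2, 2, ...  (constant!)
value, first_diff, SECOND_DIFF = 1, 3, 2

for x in range(1, 11):
    print(x, value)
    value += first_diff         # next square
    first_diff += SECOND_DIFF   # next odd number

# No multiplication anywhere: two additions per table entry, which is
# exactly what gears and carry levers are good at.
```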

The latter analytical engine was much more like a computer, with complex programming. Based on the French Jacquard loom concept, which used punched cards to control weaving, it truly mimicked all of the common parts of a modern computer as well as programming logic.

Basically, a computer does what it does by working with data in various places. There’s the slot through which you enter the data (input); the spot that holds the working data (memory); the part that pulls bits out of that info, does operations on them, and puts the results back in with the working data (the processor); and the place where it all winds up, which is the user-readable output.

The analytical engine could also do all four math operations: addition, subtraction, multiplication, and division.

An analog version of this would be a clerk in a hotel lobby with a bunch of pigeonhole mail boxes behind the desk, some with mail, some not. Guests come to the desk and ask (input), “Any mail for me?” The clerk goes to the boxes, finds the right one based on input (guest room number, most likely), then looks at the box (quaintly called PEEK in programming terms).

If the box is empty (IF(MAIL)=FALSE), the clerk returns the answer “No.” But if it’s not empty (IF(MAIL)=TRUE), the clerk retrieves that data and gives it to the guest. Of course, the guest is picky, so tells the clerk, “No junk mail and no bills.”

So, before handing it over, the clerk goes through every piece, rejecting the above (IF(OR(“Junk”,“Bill”),FALSE,TRUE)), while everything else is kept by the same formula. The rejected data is tossed in the recycle bin, while the rest is given to the guest — output.

Repeat the process for every guest who comes to ask.
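
Just for fun, here’s the clerk rendered as runnable code, a loose sketch of the analogy with made-up room numbers and mail:

```python
# The hotel clerk as a program: input (a room number), memory (the
# pigeonholes), an operation (filter out junk and bills), and output.
mailboxes = {
    101: ["letter from Mom", "junk: timeshare offer"],
    102: [],
    103: ["bill: gas company", "postcard from Paris"],
}

def clerk(room: int) -> list[str]:
    mail = mailboxes.get(room, [])   # PEEK at the box
    if not mail:                     # IF(MAIL) = FALSE
        return []                    # "No mail for you."
    # Reject junk and bills; everything else goes to the guest.
    return [m for m in mail if not m.startswith(("junk", "bill"))]

for room in (101, 102, 103):         # repeat for every guest who asks
    print(room, clerk(room))
```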

Now, Babbage was great at creating the hardware and figuring out all of that stuff. But when it came to the software, he couldn’t quite get it, and this is where Ada Lovelace came in. She created the algorithms that made the magic happen — and then was forgotten.

By the way, Bruce Sterling and William Gibson have a wonderfully steampunk alternate history novel, The Difference Engine, that revolves around the idea that Babbage and Lovelace basically launched the home computer revolution over a century early, with the British computer industry basically becoming the PC to France’s Mac. It’s worth a read.

Three final quick examples: Nettie Maria Stevens discovered the concept of biological sex being passed through chromosomes long before anyone else; it was Lise Meitner, not Otto Hahn, who discovered nuclear fission; and, in the ultimate erasure, it was Rosalind Franklin, and neither Watson nor Crick, who determined the double helix structure of DNA.

This erasure is so pronounced and obvious throughout history that it even has a name: The Matilda Effect, named by the historian Margaret Rossiter for the suffragist Matilda Joslyn Gage.

Finally, a note on the title of this piece. It comes from a 1982 installment of the comic strip Frank and Ernest, and it pretty much sums up the plight of women trying to compete in any male-dominated field. They have to work harder at it and are constantly getting pushed away from advancement anyway.

So to all of the women in this article, and all women who are shattering glass ceilings, I salute you. I can’t help but think that the planet would be a better place with a matriarchy.

Sunday Nibble #8: Beware the what of when now?

Caesar’s wife Calpurnia may well have told him “Cave idibus martiis” — “Beware the Ides of March” — and history proved her to be right, whether or not her warning was made up later. In fact, the real warning may have come from a politically astute seer named Spurinna, who gave a general warning with no specifics.

There are a lot of myths around Caesar’s assassination, many of them attributable to Shakespeare taking dramatic license.

And the part that always gets left out is that Caesar had just been declared dictator for life, so contrary to Shakespeare, perhaps the murderous Senators really were the heroes in this scenario.

Hm. Heroic Senators. What a concept… Except that they probably acted entirely in their own self-interest, since Caesar went more after their own corruption than after the common citizen or the slave.

But forget all that. The real question is “What exactly is an ‘ides’ that Caesar had to beware?”

Well, for one, it’s a thing you’ve been pronouncing wrong since forever, and “ides” isn’t even the original Latin. It’s “idibus Martiis.” In this case, the endings of the words do all the work that word order and little helper words do in English. That’s how Latin works. No apostrophe stuff for them. Here, both words wear the endings of the ablative, the case Latin used for dates, with Martiis acting as an adjective agreeing with idibus.

It differs even more from English in that there’s no separate word for “on” or “of” at all. I guess the most direct, yet cumbersome, rendering in English of idibus Martiis might be “on the ides which belong to March.”

Oh yeah. Extra complication. More likely than not, the thing would have been rendered in classical style like this: “IDIBVS MARTIIS” or, as on the famous coin Brutus struck to commemorate the deed, abbreviated all the way down to “EID MAR.”

But what you’re probably really wondering about is that whole “ides” thing which, by the way, in the original Latin (idus) was pronounced closer to “ee-doos” than to “eyeds.”

First off, we need to look at the history of the Roman calendar which, like many calendars from that time and place, was lunar, not solar. It was basically a hot mess and necessitated the addition of a leap month every two or three years to keep things in sync. Q.v. the Jewish calendar, which adds a leap month every… it’s complicated.

Meanwhile, terms like the ides were basically meant to pin down the phases of the moon.

The Romans had three special words for days in their calendar, one of which gave us the word “calendar” itself. That would be kalends, which indicated the day of the new moon, i.e. no moon visible. The ides, then, indicated the day of the full moon, which would be two weeks after the kalends. Finally, the nones designated the first-quarter moon, in between.

What this meant to the Romans was that the kalends was always the first of the month, the nones could be on the 7th or 5th of the month — the former in March, May, July, and October, the latter in all others; and the ides would be on the 15th of the same months mentioned above, or the 13th of the others.
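
The rule is regular enough to write down in a few lines, a quick sketch:

```python
# Months in which the nones fall on the 7th and the ides on the 15th;
# in every other month it's the 5th and the 13th.
LONG_MONTHS = {"March", "May", "July", "October"}

def roman_markers(month: str) -> dict[str, int]:
    late = month in LONG_MONTHS
    return {"kalends": 1, "nones": 7 if late else 5, "ides": 15 if late else 13}

print(roman_markers("March"))    # {'kalends': 1, 'nones': 7, 'ides': 15}
print(roman_markers("January"))  # {'kalends': 1, 'nones': 5, 'ides': 13}
```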

What this also tells us is that, in the original scheme at least, Caesar was assassinated under a full moon on the 15th of March.

When it came to time-keeping, ancient cultures naturally latched onto the Moon. And, in fact, in many languages, the words for moon and month are very similar. This is pretty self-evident in English.

Judaism, the religion of Rome, and (later) Islam all came to settle on the same time-keeper, choosing the Moon over the Sun. At first glance, that might seem weird. After all, the Sun definitely creates our days and nights, so why shouldn’t it have been the primary calendar starter from the beginning?

Simple. The Sun seems to be constant. The Moon is not. In fact, Shakespeare even commented on it in Romeo & Juliet:

O, swear not by the moon, the inconstant moon,

That monthly changes in her circled orb,

Lest that thy love prove likewise variable.

Ironically, it was the apparent inconstancy that led us to use the Moon to mark time. And why did the Moon seem the better choice? Because the Sun was the really inconstant one.

Let’s say that humans have already decided to divide a day into 24 hours (it could be any arbitrary number) and now want to peg those hours to how long daylight lasts. “Okay,” they say. “Half of that day length will be light, and half dark.”

So they set about measuring, only to realize that it’s a moving target. If they use some physical constant to measure, like how long it takes X amount of water to drain from one bucket with a hole in it to another, then they may notice over time that while it’s daylight for sixteen buckets in June, it’s somehow only daylight for eight buckets in December.
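
Those bucket-counters weren’t imagining it, either. The standard sunrise equation shows exactly that swing at higher latitudes. A sketch that ignores refraction and treats the Sun as a point:

```python
import math

def day_length_hours(latitude_deg: float, declination_deg: float) -> float:
    """Approximate hours of daylight from the standard sunrise equation."""
    lat = math.radians(latitude_deg)
    dec = math.radians(declination_deg)
    cos_h = -math.tan(lat) * math.tan(dec)      # hour angle of sunset
    cos_h = max(-1.0, min(1.0, cos_h))          # clamp for polar day/night
    return 2 * math.degrees(math.acos(cos_h)) / 15  # Earth turns 15 deg/hour

# At 50 degrees north, with the Sun's declination at +/- 23.44 degrees
# on the solstices:
print(f"June:     {day_length_hours(50, +23.44):.1f} hours")  # ~16.2
print(f"December: {day_length_hours(50, -23.44):.1f} hours")  # ~7.9
```

Sixteen buckets and eight, almost on the nose.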


Well, that’s not a great way to measure things. But, on the other hand, here’s this thing up there that changes in a regular and predictable pattern, and it shouldn’t have taken too much observation to realize that the full cycle of phases took about 29½ days — regardless of how long day or night were relative to each other.

So we have a winner. Start with the day the Moon disappears, mark off a point when it has fully reappeared, then put a pin in a point between invisible and totally there. That’s your regular and easy cycle, and the source of your lunar calendar.

It wasn’t until people started keeping track of the longer phenomena — basically, how the Sun’s position and the apparent angle of the Earth’s axis also changed consistently, but over years, not months — that we finally realized, “Crap! A lunar calendar is going to throw us off of what time it ‘really’ is.”

But… is that a valid question or concern? Does anybody really know what time it is?

How many phases of the Moon have passed since your birth? How many years on the Jewish or Muslim calendar? Is your birthdate now still in the same month it was then?

Ultimately, does it matter? We’ve come to consider the number of times the Earth circles the Sun to be the important measure, hence birthdays based on solar time. But that is totally anthropocentric: measuring everything about the world in human terms.

But… what about all the dogs I’ve known and loved who have gone from infancy to advanced senior citizen and death in about as many orbits as it took me to go from birth to driver’s license? What about the few pet rats I’ve had and loved who lasted about as long as it took me from birth to say my first words?

And what about all those turtles that look at us humans and think, “You retire at 65? Lazy-ass bitches. Grow a shell!”

In physics, time really is just what a clock reads, nothing more nor less. After all, a clock here on Earth will read a quite different time from the same clock launched into space at a large fraction of the speed of light.

Here are the salient points: While the ides of March, 44 BCE, is the date on which Julius Caesar was assassinated, all we really need to remember for practical purposes is that this day was March 15th. His wife never predicted his doom on this day, and the one seer who gave warning only said that Caesar was moving into a politically dangerous month, and he did that back in February.

The real heroes in the story were kinda sorta the Senators who stabbed him to death with daggers (not swords) in an antechamber off of the Senate (not on the floor), in order to save everyone, except that they were totally acting in their own self-interest in a way that only inadvertently benefited the Plebes, Soldiers, Citizens, and Slaves.

Finally, everything got distorted to turn a dude who was probably a power-hungry and dangerous asshole into a martyr. At least his first successor, Augustus, had it a bit more together.

Getting back to calendars, though, our Roman calendar got more modern when what was originally the fifth month, Quintilis, was renamed in honor of Caesar after his assassination, and so we got July.

Meanwhile, August was renamed for Augustus Caesar in 8 BCE. In this case, the Senate decided to make it happen, and so the sixth month, Sextilis, took on what wasn’t even his real name, just his title. And so September, October, November, and December made sense for a while, since they meant seventh, eighth, ninth, and tenth.

It wasn’t until the winter months got names again and March was no longer new year’s month that the last four months of the year lost touch with the origin of their names.

And, finally, we had a calendar that aligned more closely with the more meaningful solar year, and only needed to be adjusted by stuffing an extra day into February every four years, and omitting that same stuffing if said leap year happened to occur on a century year (one ending in 00) that was not divisible by 400.
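
In code form, the whole Gregorian rule comes down to three tests. This is the standard rule, as found in any calendar library:

```python
def is_leap(year: int) -> bool:
    """The Gregorian leap-year rule."""
    if year % 400 == 0:    # 1600 and 2000 were leap years...
        return True
    if year % 100 == 0:    # ...but 1700, 1800, and 1900 were not
        return False
    return year % 4 == 0   # otherwise, every fourth year

print([y for y in (1600, 1700, 1900, 2000, 2020, 2021) if is_leap(y)])
# [1600, 2000, 2020]
```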

So far, it’s worked out pretty well. And, in modern America, the only real warning we need to heed on the Ides of March is that it’s one month until tax day. Otherwise, carry on!

Wednesday Wonders: Kenneth Essex Edgeworth MC

Today would have been the 140th birthday of an Irish astronomer, economist, and all-around jack of all trades you’ve never heard of known as Kenneth Essex Edgeworth.

You probably have heard of Gerard Kuiper, though, or at least the belt named after him. Since Kuiper was of Dutch descent, that first syllable is pronounced with a long I, so it’s not “Kooper.” The first syllable rhymes with kite. (If you’re an L.A. local, it’s exactly the same as Van Nuys, and for the same reasons that I won’t get into here, because they’re complicated.)

Anyway… Kuiper was about 25 years younger than Edgeworth, died just over a year after him in 1973, and wound up with his name on something that Edgeworth originally predicted and described.

Okay, sometimes it’s referred to as the Edgeworth-Kuiper belt, crediting the discoverers slash theorists in the right order, but that’s usually not the case, so that Kuiper really is kind of the Edison to Edgeworth’s Tesla.

But Edgeworth was ahead of his time in other ways. Only eight years after Pluto was discovered by Clyde Tombaugh in 1930 and declared the ninth planet, Edgeworth was expressing his doubts, saying that it was too small to be a planet, and was probably a remnant of the bits and pieces that came together to create the solar system.

He was certainly vindicated on that one, and it was part of the same ideas which gave birth to what should be called the Edgeworth Belt, but which didn’t catch on until Kuiper got in on the act in the 1950s.

Maybe a big part of the problem was that Edgeworth was more of an armchair astronomer. While he published papers, he was a theorist and not an experimenter. Then again, Albert Einstein was a theoretical physicist, not a practical one, and his theories changed the way we view the universe.

Edgeworth’s could have changed the way we view our solar system, and he also hypothesized what later became known as the Oort Cloud — named for another damn Dutch astronomer, Jan Oort, who once again came to the party long after Edgeworth proposed the idea.

When Edgeworth was a child, his family moved to the estate of his maternal uncle, who was an astronomer, and had an influence on young Kenneth. Later, the family would move to the estate of Edgeworth’s paternal grandfather, where he would develop engineering skills in his father’s workshop.

He went into the military, joining the Corps of Royal Engineers, and was posted to South Africa, where he served in the Second Boer War. His military career continued through World War I and beyond, and he retired in 1926.

However, between the Boer War and WW I, his uncle submitted his name for membership in the Royal Astronomical Society, and he was accepted in 1903. By this point, he had already written papers on astronomy, since one of them was read at the meeting during which he was elected. He studied international economics during the Great Depression and wrote five books on the subject in the 1930s and 40s. He also published various papers on astronomy, covering subjects like the solar system, red dwarfs, star formation, and redshift.

It was also at this time that he published his thoughts on Pluto, as well as the existence of both the Kuiper Belt and Oort Cloud.

After he “retired,” he published a series of letters and papers, leading to his book The Earth, the Planets and the Stars: Their Birth and Evolution, which was published in 1961. He published his autobiography, Jack of all Trades: The Story of My Life, when he was 85, in 1965, and died in Dublin in 1972, at the age of 92.

His contributions to the Kuiper Belt and Oort cloud weren’t acknowledged until 1995, although he did have an asteroid named after him in 1978, 3487 Edgeworth. Yes, a comet would have been more appropriate, but those are only named after their discoverers, and after October 10, 1972, Kenneth Edgeworth wasn’t in a position to discover anything new.

But while he was around, damn what a life. And what an unsung hero. Proof yet again that, sometimes, the ideas that sound utterly crazy at the time turn out to be the truth.

I wonder which unsung geniuses we aren’t listening to now, but whose visions will be obvious in a generation or two.

Image: Kenneth Essex Edgeworth, year unknown. Public domain.

Wednesday Wonders: Now, Voyager

Wednesday’s theme will be science, a subject that excites me as much as history on Monday and language on Tuesday. Here’s the first installment of Wednesday Wonders — all about science.

Now, Voyager

Last week, NASA pulled off something pretty incredible: bringing the Voyager 2 probe back online after a system glitch forced it to shut down. Basically, the craft was supposed to do a 360° roll in order to test its magnetometer.

When the maneuver didn’t happen (or right before it was going to), two separate, energy-intensive systems wound up running at the same time, and the probe went into emergency shutdown to conserve energy, turning off all of its scientific instruments and, in effect, causing data transmission back home to go silent.

The twin Voyager probes are already amazing enough. They were launched in 1977, with Voyager 2 actually lifting off sixteen days earlier. The reason for the backwards order at the start of the mission is that Voyager 1 was actually going to “get there first” as it were.

It was an ambitious project, taking advantage of planetary placement to use various gravitational slingshot maneuvers to allow the probes to visit all of the outer planets — Jupiter and Saturn for both probes, and Uranus and Neptune as well for Voyager 2.

Not included: Pluto, which was still considered a planet at the time. It was in a totally different part of the solar system. Also, by the time the probes got there in 1989, Pluto’s eccentric orbit had actually brought it closer to the Sun than Neptune a decade earlier, a place where it would remain until February 11, 1999. While NASA could have maneuvered Voyager 2 to visit Pluto, there was one small hitch. The necessary trajectory would have slammed it right into Neptune.

Space and force

Navigating space is a tricky thing, as it’s a very big place, and things don’t work like they do down on a solid planet. On Earth, we’re able to maneuver, whether on foot, in a wheeled vehicle, or an aircraft, because of friction and gravity. Friction and gravity conspire to hold you or your car down to the Earth. In the air, they conspire to create a sort of tug of war with the force of lift to keep a plane up there.

When you take a step forward, friction keeps your back foot in place as you push off, then lets your newly planted front foot pull you ahead. Note that this is why it’s so hard to walk on ice. It’s a low-friction surface.

The same principle works with cars (which also don’t do well on ice) with the treads on the tires gripping the road to pull you forward or stop you when you hit the brakes — which also work with friction.

Turning a car works the same way, but with one important trick that was discovered early on. If both wheels on opposite sides are on the same axle, turning the wheels does not result in a smooth turn of the vehicle. The axles need to be independent for one simple reason. The outside wheel has to travel farther to make the same turn, meaning that it has to spin faster.

Outer wheel spins faster, inner wheel slower, vehicle turns smoothly. While the idea of a differential gear doing the same thing in other mechanisms dates back to the 1st century BCE, the idea of doing it in wheeled vehicles wasn’t patented until 1827. I won’t explain it in full here because others have done a better job, but suffice it to say that a differential is designed to transfer power from the engine to the wheels at a relative rate dependent upon which way they’re aimed, in a very simple and elegant way.
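
The geometry is easy to sanity-check; here’s a sketch with round, made-up numbers for track width and turn radius:

```python
# In a turn, each wheel traces an arc whose length is proportional to
# its distance from the center of the turn.
turn_radius = 10.0   # meters, measured at the car's centerline
track_width = 1.6    # meters between the left and right wheels

inner = turn_radius - track_width / 2   # 9.2 m
outer = turn_radius + track_width / 2   # 10.8 m

# The outer wheel has to spin this much faster than the inner one,
# and the differential is what lets it:
print(f"Speed ratio: {outer / inner:.2f}")  # ~1.17, i.e. 17% faster
```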

Above the Earth, think of the air as the surface of the road and an airplane’s wings as the wheels. The differential action comes from control surfaces on the wings, which change how much lift and drag each wing generates. So… if you want to turn right, you effectively slow the right wing and let the left wing accelerate around the plane, and vice versa for a left turn.

In space, no one can feel you turn

When it comes to space, throw out everything in the last six paragraphs, because you don’t get any kind of friction to use, and gravity only comes into play in certain situations. Bookmark for later, though, that gravity did play a really big part in navigating the Voyager probes.

So, because no friction, sorry, but dog-fights in space are not possible. Hell, spacecraft don’t even need wings at all. The only reason that the Space Shuttle had them was because it had to land in an atmosphere, and even then they were stubby and weird, and even NASA engineers dubbed the thing a flying brick.

Without friction, constant acceleration is not necessary. One push starts you moving, and you’ll just keep going until you get a push in the opposite direction or you slam into something — which is just a really big push in the opposite direction with more disastrous results.

Hell, this is Newton’s first law of motion in action. “Every object persists in its state of rest or uniform motion in a straight line unless it is compelled to change that state by forces impressed on it.” Push an object out in the vacuum of space, and it will keep on going straight until such point that another force is impressed upon it.

Want to turn right or left? Then you need to fire some sort of thruster in the direction opposite to the one you want to turn — booster on the right to turn left, or on the left to turn right. Want to slow down? Then you need to fire that thruster forward.

Fun fact: there’s no such thing as deceleration. There’s only acceleration in the other direction.

Also, if you keep that rear thruster going, your craft is going to keep on accelerating, and over time, this can really add up. For example, Voyager 2 is currently traveling at 15.4 kilometers (9.57 miles) per second — a speed at which a trip from L.A. to New York would take a bit over four minutes.

Far and away

At the moment, though, this probe is 11.5 billion miles away, which is nearly five million trips between L.A. and New York laid end to end. It’s also just over 17 light-hours away, meaning that a message there and a response back takes one day and ten hours.
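
Those figures are easy to verify with back-of-the-envelope Python, using a rough great-circle L.A.-to-New York distance of 3,940 km:

```python
speed_km_s = 15.4        # Voyager 2's current speed
la_to_ny_km = 3_940      # great-circle distance, roughly

# Coast to coast at Voyager speed:
print(f"{la_to_ny_km / speed_km_s / 60:.1f} minutes")  # ~4.3

# One-way light time at 11.5 billion miles:
distance_km = 11.5e9 * 1.609344
light_km_s = 299_792.458
hours = distance_km / light_km_s / 3600
print(f"{hours:.1f} light-hours one way")    # ~17.1
print(f"{2 * hours:.1f} hours round trip")   # ~34.3, i.e. one day and ten hours
```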

And you thought your S.O. was blowing you off when it took them twenty minutes to reply to your text. Please!

But consider that challenge. Not only is the target so far away, but NASA is aiming at an antenna only 3.66 meters (12 feet) in diameter, and one that’s moving away so fast. Now, granted, we’re not talking “dead on target” here because radio waves can spread out and be much bigger than the target. Still… it is an impressive feat.

The more impressive part, though? We’re talking about technology that is over forty years old and still functioning and, in fact, providing valuable data and going beyond its design specs. Can you point to one piece of tech that you own and still use that’s anywhere near that old? Hell, you’re probably not anywhere near that old, but did your parents or grandparents leave you any tech from the late 70s that you still use? Probably not, unless you’re one of those people still inexplicably into vinyl (why?).

But NASA has a track record of making its stuff last well beyond its shelf-life. None of the Mars rovers were supposed to keep on going like they have, for example, but Opportunity, intended to only last 90 days, kept on going for fifteen years, and the NASA Mars probes that actually made it all seem to last longer than intended.

In the case of Voyager, the big limit is its power supply, provided by plutonium-238 in the form of plutonium oxide. The natural decay of this highly radioactive element generates heat, which thermocouples then convert directly into electricity; the whole assembly is known as a radioisotope thermoelectric generator, or RTG. At the beginning, it provided 470 watts of 30-volt DC power, but as of 1997 this had fallen to 335 watts.

It’s interesting to note NASA’s estimates from over 20 years ago: “As power continues to decrease, power loads on the spacecraft must also decrease. Current estimates (1998) are that increasingly limited instrument operations can be carried out at least until 2020. [Emphasis added].”
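
Those numbers are easy to sanity-check against plutonium-238’s 87.7-year half-life. A sketch, keeping in mind that a real RTG’s output falls faster than bare decay because the thermocouples degrade too:

```python
def decay_power(initial_watts: float, years: float, half_life: float = 87.7) -> float:
    """Power predicted by Pu-238 radioactive decay alone."""
    return initial_watts * 0.5 ** (years / half_life)

# Voyager launched in 1977 with about 470 W:
print(f"{decay_power(470, 1997 - 1977):.0f} W")  # ~400 W from decay alone;
# the reported 335 W by 1997 shows the converters degrading as well.
print(f"{decay_power(470, 2020 - 1977):.0f} W")  # ~335 W by 2020, again from
# decay alone; the actual available power is lower still.
```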

Nerds get it done.

Never underestimate the ability of highly motivated engineers to find workarounds, though, and we’ve probably got at least another five years in Voyager 2, if not more. How do they do it? The same way that you conserve your phone’s battery when you forgot your charger and you hit 15%: power save mode. By selectively turning stuff off — exactly the same way your phone’s power-saver mode does it by shutting down apps, going into dark mode, turning off fingerprint and face-scan recognition, and so on. All of the essential features are still there. Only the bells and whistles are gone.

And still, the durability of NASA stuff astounds. Even when they’ve turned off the heaters for various detectors, plunging them into very sub-zero temperatures, they have often continued to function way beyond the conditions they were designed and tested for.

NASA keeps getting better. Twenty-nine years after the Voyagers, New Horizons was launched, and it managed to reach Pluto’s orbit and famously photograph that not-a-planet object only 9½ years after lift-off — and with Pluto farther out — as opposed to Voyager’s 12 years.

Upward and onward, and that isn’t even touching upon the utter value of every bit of information that every one of these probes sends us. We may leave this planet in such bad shape that space will be the only way to save the human race, and NASA is paving the way in figuring out how to do that.

Pretty cool, huh?

Shoot the Moon

Previously, I covered a couple of big conspiracy theories, and why they are generally such an impossible idea. As noted there, it’s really hard for people to keep secrets, and the bigger a conspiracy, the faster it falls, which is why we happen to know about the real ones.

But people will see and believe what they want to, and so conspiracy theories exist. Here’s another famous one that just isn’t true.

We never landed on the Moon

While this one might seem like a modern conspiracy theory, it’s actually almost as old as the lunar landings, and was first promulgated by a man named Bill Kaysing, in his self-published 1976 book called We Never Went to the Moon: America’s Thirty Billion Dollar Swindle.

Of course, the James Bond film Diamonds Are Forever featured its own “Moon landing was fake” gag in 1971, and the whole thing probably caught on because it was an era when trust in government was at its lowest, what with Vietnam, Kent State and, by mid-decade, Watergate all crashing down at once. Ironically, the last one was a true conspiracy that fell apart quickly.

More fuel was added to the fire by the late-70s film Capricorn One, which postulated a manned mission to Mars that was faked by the government to avoid losing face with the USSR because the mission just wasn’t ready. Of course, the same film also hung a lantern on the biggest problem with huge government conspiracies. In order to cover it up, the plan was to kill the “astronauts” before they left the soundstage, then announce that they had died in a tragic accident upon re-entry.

Despite it being a 70s film — an era when the hero did not always win — this one did pull victory over villainy: the plot is discovered, and the last surviving “dead” astronaut pops up at his own memorial service, Tom Sawyer style, to blow the whole thing open. (In Twain’s version, it was three “dead” boys walking in on their own funeral.)

The film definitely used the main motive that Moon Hoaxers give for the landing being faked: We weren’t ready for it, but we had to make the Soviets think that we were. And it all began when President John F. Kennedy gave a speech to a joint session of Congress on May 25, 1961. His goal was simple: To put a (hu)man on the Moon before the last day of the decade. His motives were obvious. The Russians were already ahead of us in the “space race,” having launched the first satellite, Sputnik, and put the first man into space. They also put the first woman in space, beating us by exactly twenty years and two days.

If you’d like to see an incredible film that documents the prequel to this speech in the days from the first attempts to break the sound barrier to finally getting our own astronauts into orbit, check out the book and/or film versions of The Right Stuff by Tom Wolfe, which documents both the amazing and absurd involved in this process.

It also illuminates the true dilemma for the American space program. For a time, it looked like the USSR was getting ahead, and especially as Kennedy was assassinated and things got worse in Vietnam (which was a proxy hot war between the two sides in the Cold War) the idea of getting to the Moon first became a sort of goal for a moral victory.

Did you ever wonder why NASA’s command center for all lunar operations wound up in Houston? Look no further than Vice President, then President, Lyndon Baines Johnson who, like JFK before him, preferred to go by his initials, LBJ.

Did I mention that LBJ was from Texas, so that it was almost a slam-dunk that the Space Center would wind up there? As for why the launch center wound up in Cape Canaveral, Florida, there are two good reasons for it. One is that it allows for launches over a lot of open water, meaning that crashes or aborted take-offs won’t happen over land or populated areas. Second, it was (at the time) the part of the U.S. closest to the equator, and the equator is much friendlier to getting us into space.

And for everyone rightly pointing out that Hawaii is surrounded by a lot more water and is closer to the equator because it’s our southernmost state, you are absolutely correct, except that Hawaii hadn’t quite become a state yet at the time that Cape Canaveral began operations. Note that Puerto Rico is also farther south than Florida and slightly farther south than Hawaii, but we didn’t put our launch site there either.
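
How much does latitude actually buy you? The free ride from Earth’s rotation is just the equatorial spin speed scaled by the cosine of the launch site’s latitude. A sketch with rounded figures:

```python
import math

EQUATOR_SPEED = 465.1  # m/s, Earth's rotational speed at the equator

def rotation_boost(latitude_deg: float) -> float:
    """Free eastward velocity from Earth's rotation at a given latitude."""
    return EQUATOR_SPEED * math.cos(math.radians(latitude_deg))

for site, lat in [("Equator", 0.0), ("Hawaii (Big Island)", 19.7),
                  ("Cape Canaveral", 28.5), ("Washington, D.C.", 38.9)]:
    print(f"{site}: {rotation_boost(lat):.0f} m/s")

# Canaveral collects ~409 of a possible ~465 m/s: most of the benefit,
# with none of the shipping-heavy-rockets-across-an-ocean problem.
```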

I’m guessing that “really freaking heavy equipment” and “transportation by ship over substantial distances” aren’t a great combo when doing a budget for a governmental program. That, and helping elected officials in territories — you know, the ones who don’t get to vote in Congress — really doesn’t bring back any benefit to Washington, D.C.

Which really brings up another way to question the Moon Hoax conspiracy. If it was a fake, why go to all of the trouble of making sure the sites are in locations with political and scientific advantages? If it were just for show, they could have put the control center anywhere and put the launch site near D.C. or New York City or somewhere else flashy that would draw huge crowds to watch the rockets go up.

As for why people believe this theory, it’s simple. They don’t understand science or physics. There are a lot of misconceptions in everything the Hoaxers claim; way too many for this piece, so I’ll refer you to the brilliant 2001 takedown of a Fox documentary claiming that it was all true by the amazing “Bad Astronomer” Phil Plait. (In fact, this particular article is the one that launched him into internet fame and success in the first place.)

But perhaps the most bizarre take on the whole Moon Landing Hoax is this: the shots on the Moon were created by none other than… Stanley Kubrick. This was another idea to fall out of the sadly challenged brain of Kaysing, but others ran with it. Someone even went so far in 2015 as to fake a video they claimed was Kubrick confessing to it. Hey, easy to do after the person you’re besmirching has died, right?

Still, it gets even weirder, as some true believers claim that Kubrick stuffed The Shining with clues basically saying, “Hey… I confess. I faked the Moon Landing.” And yes, some people do believe it.

This theory at least achieved one good thing. It let a septuagenarian who’d actually been to the Moon (Buzz Aldrin) punch a Moon Landing denying asshole in the face and get away with it. To quote the linked article, “The Los Angeles County District Attorney’s office has declined to file charges.”

That’s the best possible outcome, really. If only Buzz had said, right before the punch, “Bang! Zoom! Straight to the Moon.”

Whole lot of shaking goin’ on?

(Warning: Betteridge’s Law alert in effect.)

Damn. Puerto Rico has been getting pounded by quakes over the last month to the point that they have visibly changed the landscape. Why so many earthquakes? Well, as they say in real estate, it’s all about location, location, and location. The island happens to be situated on top of or next to various tectonic plates and mini-plates, and it’s the collision of these pieces of the Earth’s crust that causes quakes in the first place. Well, the ones that aren’t man-made, anyway.

Puerto Rico isn’t alone in this, either. A look at significant earthquakes over the last 30 days shows the image of a very unsettled Earth. Now, it would be easy to buy into an interesting astronomical fact being the cause. That is, the Earth reaches its closest point to the Sun, perihelion, in January. This year, it was January 4th, with the centers of the Earth and Sun being only about 91.4 million miles apart. On July 4th, they will be at their most distant, at about 94.5 million miles.

Now, true, that’s only a little over a 3% difference, but that difference is still about 390 times the diameter of the Earth, and enormous masses are involved on both ends. Perihelion is also the point in the Earth’s orbit when it reaches its maximum velocity, which carries it back out toward aphelion, where it slows to its minimum velocity and then comes falling back in toward the Sun, picking up speed all the way. Lather, rinse, repeat.

Of course, the difference between maximum and minimum velocity is only about six tenths of a mile per second but, again, we’re dealing with some pretty big objects here. And, anecdotally, I can tell you that the biggest earthquake I’ve ever experienced was in January, and so was Japan’s, a year to the day later, and now Puerto Rico is shaking apart, and it must be connected, right?
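
That six-tenths figure checks out against the vis-viva equation, which gives a body’s orbital speed at any distance from the Sun. A quick sketch with standard constants:

```python
import math

GM_SUN = 1.32712e20   # m^3/s^2, the Sun's gravitational parameter
A = 1.49598e11        # m, Earth's semi-major axis (1 AU)
E = 0.0167            # Earth's orbital eccentricity

def orbital_speed(r: float) -> float:
    """Vis-viva: speed at distance r for an orbit of semi-major axis A."""
    return math.sqrt(GM_SUN * (2 / r - 1 / A))

v_peri = orbital_speed(A * (1 - E))  # perihelion, early January
v_aph = orbital_speed(A * (1 + E))   # aphelion, early July
print(f"Perihelion: {v_peri / 1000:.2f} km/s")                # ~30.3
print(f"Aphelion:   {v_aph / 1000:.2f} km/s")                 # ~29.3
print(f"Difference: {(v_peri - v_aph) / 1609.344:.2f} mi/s")  # ~0.62
```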

Right… except that it’s not. Earthquakes are not driven by orbital mechanics or the weather or any other factors like that, and any belief in “earthquake weather” or “earthquake season” is pure confirmation bias and nothing more nor less.

However… there’s one thing to keep in mind about this time of year. We are closer to the Sun, and so get more heat from it, right at the time when it’s winter in the Northern Hemisphere, but summer in the Southern Hemisphere. And why is that the case? Because of the way the Earth is tilted. Winter is the season when its axis is tilted away from the Sun. Summer is when it’s tilted toward. Spring and fall are the seasons when the axis is mostly straight up and down.

So… in the Northern Hemisphere, we get winter when we are closest to the Sun and summer when we’re farthest away. In the Southern Hemisphere, it’s exactly the opposite, and this is where we can see events in our solar system having an effect down here. Namely: Australia is burning.

Why? Climate change, hotter temperatures, drier forests, extreme weather (think thunderstorms with lightning that can start a fire), and human elements, although far from the “200 arsonists” dreamt up by the anti-climate-change crowd. More like 24 actual arsonists, and then a bunch of idiots who may or may not have started fires, but at least did something that might have. And, anyway, claiming that arson and accident somehow don’t count toward anthropogenic climate change is a bit of a stretch. Humans did it? All that smoke is going to screw up the environment. And the burning would have stopped a lot sooner if the hotter climate hadn’t pre-baked the forests.

But… it’s hard to avoid confirmation bias when the earthquake alert app on my phone has been ridiculously busy since at least January 4th. The good news is that it’s easy to survive a quake with warning, provided you’re not living in buildings basically made out of mud, stone, and hope.

Just remember this: A) Do NOT get into a doorway. That’s outdated Boomer advice. The current official guidance is drop, cover, and hold on: get under something sturdy like a solid table if you can, or down on your knees next to low furniture or your bed if you can’t, rolled forward, hands covering the back of your neck and head.

Once the shaking has stopped, if you can, grab your loved ones and go-bag (you have one, right?), get outside, shut off your gas if necessary, and escape to shelter, which could be your car if it wasn’t smashed flat in the collapse of a dingbat-style apartment. (People, really: don’t live in them.) Also try to avoid buildings that are four to eight stories tall, because they tend to sway at resonant frequencies in sync with seismic waves, and so sway harder and collapse more often.

The good news is that in a lot of places prone to earthquakes, things have been upgraded to a ridiculous and safe degree. The bad news? In a lot of places they haven’t. Fun fact: Most of the U.S. and Canada reside on a single tectonic plate, so are not naturally susceptible to earthquakes. Not fun fact: Fracking completely fracks with that, and creates seismic events (aka earthquakes) in places where they should not be. Less fun fact: the tectonic plate holding a lot of Southern California and half of the Bay Area is not the same one as the rest of North America.

Consequently, while people in other parts of the country grow up dreading tornadoes or floods, earthquakes have been my lifetime bugaboo. Good news, though. I’ve survived 100% of the ones I’ve been in… and I’ve accepted the fact that, for now, they are 100% unpredictable.