The concept of zero might seem completely intuitive to modern minds. You can’t get to any multiple of 10, 100, 1,000 or so on without it, for one thing. It’s also often the bottom number on various gauges or dials — think speedometer or volume setting — or at least a middle point on things like thermometers or equalizers.
And yet, when humans first started doing math, the concept of zero didn’t even exist. Why? Because it wasn’t necessary. The origin of human math had nothing to do with science or geometry or any of that. It was all about commerce.
Math began with counting, which began in the marketplace. What happened here was simple. Somebody with something to sell would set up a stand. Somebody looking to buy would approach. The latter would use whatever was legal tender in trade in exchange for what the former had to offer.
That legal tender could be precious metal stamped in some sort of official fashion or, earlier than that, it could be tools, jewelry, stones, or other commodities. For example, one person might be offering a lamb for six chickens.
In order to make the exchange, two things were necessary after the price was set. The seller had to be able to count out the number of things on offer and the two of them had to calculate the price, based on cost per unit times the number of units.
Hello, integers, which are those whole numbers with no decimal places. And hello the idea of multiplication, except that it wasn’t necessary per se. Multiplication is just repeated addition. Remember this, because it, along with the idea that division is just repeated subtraction, will be important later.
So the seller agrees that one lamb costs six chickens and the buyer wants four lambs. The seller counts out the four lambs and sets them aside in a pen. The buyer counts out six chickens for each lamb, but there’s never any multiplication. They might even do each transaction one at a time.
The end result, though, is that the seller winds up with four fewer lambs and twenty-four more chickens.
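That marketplace arithmetic can be sketched in a few lines. This is just an illustration of the idea above, with a made-up function name; it computes a total price purely by repeated addition, with no multiplication anywhere:

```python
# Multiplication-free pricing: add the per-unit price once per unit,
# just like counting out six chickens per lamb, one lamb at a time.
def total_price(price_per_unit, units):
    total = 0
    for _ in range(units):  # one transaction at a time
        total += price_per_unit
    return total

print(total_price(6, 4))  # 24 chickens for 4 lambs at 6 chickens each
```

The loop is exactly the “repeated addition” the seller and buyer are doing with live animals.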
What he doesn’t know is that the buyer is going to use three of those lambs to buy a cow, and then set up a very profitable business selling milk, dairy products and, thanks to his neighbor, calves for veal.
Now what’s the one thing that never enters into these transactions at all?
The seller cannot offer to give you zero lambs. The buyer cannot offer to pay with zero chickens. In the context of commerce, zero is meaningless because it’s not countable. You cannot have zero number of things.
And so math cruised on for millennia without any idea of zero.
The Sumerians had a sort of placeholder for zero by around 3000 BCE: a character used between digits in cuneiform writing to mark an empty place in the counting. The Babylonians accounted for this empty place but, at first, had no character for it. They would simply leave a gap, so that 402 would be written as 4 2. A gap at the end of a number was invisible, though, so there was no distinction between 42 and 420, which would both be written as 42.
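You can see the ambiguity directly in how positional notation works. Here is a toy illustration in base 10 (Babylonian math was actually base 60, so this is a simplification): the value of a digit string depends on explicit digits in every position, and only a written zero can push the other digits into higher places.

```python
# Positional notation: each new digit shifts everything left one place.
# Without an explicit 0 in the ones place, 420 collapses into 42.
def value(digits):
    total = 0
    for d in digits:
        total = total * 10 + d
    return total

print(value([4, 2]))     # 42
print(value([4, 2, 0]))  # 420 -- the trailing zero is doing real work
```

Leave the zero out of the list and there is simply no way to tell the two numbers apart, which is exactly the Babylonian scribe’s problem.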
This would probably make stoners who love Douglas Adams’ writings very happy.
The Mayans invented zero independently around the 4th century CE, but it wasn’t until the mid-5th century that Hindu mathematicians developed the idea. This was picked up by Arab mathematicians and it would have spread to the West except for the unfortunate thing called the Crusades.
Western mathematicians were all ready to embrace it, but since what were actually Hindu numerals were known as Arabic numerals by this point, the Catholic Church said, “Are you kidding? No good ideas can come from our enemies,” so the concept of zero was considered the devil’s work for a while.
In case you think that people can’t be that stupid about numbers for purely ideological reasons, a recent survey showed a surprising number of people opposed to teaching Arabic numerals in schools — even though they are the familiar digits we’ve all used for centuries.
Since the Hindus started using it in serious math, though, zero has proven itself to be invaluable. It provides a point at which numbering scales can change — you can’t go from positive to negative without passing through it, after all — and it serves as a universal error warning whenever a formula winds up trying to make it the divisor in an equation.
There are also some fun questions you can ask about zero. Don’t worry. There’s very little actual math involved in learning the answers. Except for the last one.
Is zero an even number?
At first glance, this seems like it’s unanswerable because zero has no numerical value. Like 1, which feels like it should be prime but isn’t, it feels like zero should be neither odd nor even. But as soon as we look at the definition of an even number… well, let’s look at that.
The first definition of an even number: It’s evenly divisible by 2. You can check that out with any random even number. For example, 14/2 = 7, or 8/2 = 4. The result can be either odd or even, and prime or not, as those two examples show. And some numbers can be divided by 2 more than once — 4/2 = 2.
So is zero divisible by 2? Oh yes, and an infinite number of times: 0/2 = 0. Lather, rinse, repeat.
Another property of an even number: It’s a multiple of 2. Again, it doesn’t matter whether you start with an odd or even number. The result will always be even: 16 x 2 = 32; 47 x 2 = 94, and so on.
And what happens with zero? We get 0 x 2 = 0, and so on. And since the first step indicated that zero is probably even, it’s still even.
One other property of an even number: It never changes the odd/even status of whatever number you add it to. The sum of two even numbers is an even number; the sum of an even and odd is odd. (I’ll leave it to you to figure out the rule of the sum of two odd numbers, which should be obvious by now.)
Now, what number never changes the status of whatever it’s added to? That’s right — our old friend zero. So, yet again, it acts like an even number.
The final test of an even number: On the whole number line, it appears between two odd integers — for example, 16 comes between 15 and 17. As for zero? Its neighbors are 1 and -1, which are both odd.
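All four of those tests can be run in one go. A quick sanity check, applying each definition above to zero (using the `%` remainder operator to test parity):

```python
n = 0

print(n % 2 == 0)                              # divisible by 2
print(n == 2 * (n // 2))                       # a multiple of 2
print((n + 7) % 2 == 7 % 2)                    # adding it preserves parity
print((n - 1) % 2 == 1 and (n + 1) % 2 == 1)   # its neighbors -1 and 1 are odd
```

All four print `True`: by every definition we have, zero is even.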
You can’t do that on television (or anywhere else)
Now, there are two things you cannot do with zero, one famously and one lesser-known. The first is that you cannot divide by zero. And no, this does not give you infinity. It gives you… well, it just breaks math, period.
Division by zero, by the way, happens to be one of the proofs that travel at the speed of light is impossible. (It does not say you can’t go faster, though, as long as you skip that one troublesome point between positive and negative.)
Remember when I mentioned that multiplication is just repeated addition and division is repeated subtraction? Well, this leaves multiplying by zero perfectly fine, because if you add any integer zero times, you get 0. Meanwhile, if you start with zero, no matter how many times you add it, you still get 0.
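Here is that repeated-addition view of multiplication as a sketch, with a hypothetical `times` function standing in for the `*` operator. Both zero cases fall out naturally:

```python
# Multiplication as repeated addition: add a to itself b times.
def times(a, b):
    total = 0
    for _ in range(b):
        total += a
    return total

print(times(7, 0))  # 0 -- add 7 zero times: the loop never runs
print(times(0, 7))  # 0 -- add 0 seven times: the total never moves
```

Neither direction causes any trouble, which is why multiplying by zero is well defined.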
But let’s look at what happens when you try to subtract zero and figure out how many times you can. Well, guess what? No matter how many times you subtract zero, you still have the original number, so you can subtract 0 from 1 every femtosecond of every day since the Big Bang and you still will not have an answer by the time the whole thing fizzles out in cosmic entropy in a few trillion years.
But… that number is not equal to infinity. Why? Because, again, it breaks math. If dividing by zero equals infinity, then 1/0 equals infinity, and so does 2/0. If both numbers over the same divisor equal the same result, then you’ve just “proven” that 1=2. In fact, you’ve just proven that any number, whole, fractional, rational, transcendent, or not, equals every other number.
So… math breaks. The preferred result of division by zero is “undefined.”
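The repeated-subtraction version of division makes the problem vivid. This sketch counts how many times you can subtract `b` from `a`; since subtracting zero never makes progress, we have to cap the loop artificially (the `max_steps` cutoff is my addition, not anything mathematical):

```python
# Division as repeated subtraction: count the subtractions until nothing is left.
def divide(a, b, max_steps=1000):
    count = 0
    while a > 0:
        if count >= max_steps:
            return None  # gave up: subtracting zero never gets anywhere
        a -= b
        count += 1
    return count

print(divide(24, 6))  # 4
print(divide(1, 0))   # None -- no number of subtractions ever finishes
```

Python itself takes the same position: evaluating `1 / 0` raises a `ZeroDivisionError` rather than answering “infinity.”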
Finally, there’s the idea that you cannot raise 0 to the power of 0. Basically, anything to the power of zero equals 1, and anything to the power of 1 equals itself. The rest follows the familiar squares and cubes and so on.
So, in theory 0 to the power of 0 equals one, but here’s the quick debunk of that. Another way to get to something to the power of 0 starts with the power of 1 — any number to the power of 1 is that number. So 2^1 = 2, 5^1 = 5, and so on.
And if you divide any number to the power of one by itself, you do get that number to the power of zero, so you get 1. Why? Because when you divide two powers of the same base, you subtract the exponent on the bottom from the exponent on top.
So 2^1/2^1 gives us the same thing as 2/2, which is 1.
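The quotient rule is easy to check numerically. Both sides of a^m / a^n = a^(m−n) come out the same for any nonzero base:

```python
# Same-base quotient rule: a**m / a**n == a**(m - n).
a, m, n = 2, 5, 3
print(a**m / a**n)     # 4.0
print(a**(m - n))      # 4

# The special case from the text: a**1 / a**1 == a**0 == 1.
print(a**1 / a**1)     # 1.0
print(a**0)            # 1
```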
You probably see the problem coming here. While 0^1 is simply 0, as soon as you write 0^1/0^1 the whole thing falls apart, because of our old bugaboo, division by zero, yet again.
So zero to the power of zero remains undefined as well.
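Interestingly, programming languages have to pick an answer anyway. Python, like most languages, adopts the convention that 0^0 is 1, since that keeps formulas like polynomial evaluation tidy, even though the expression is mathematically undefined:

```python
import math

# Python's convention: 0**0 evaluates to 1 rather than raising an error.
print(0 ** 0)          # 1
print(math.pow(0, 0))  # 1.0
```

This is a convenience convention, not a mathematical verdict; the limit arguments above still make 0^0 genuinely undefined.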
How to get from zero to one
Speaking of the 0 to the power of 0 problem, here’s a related and very interesting one. There’s a mathematical function called a factorial, which is represented by an exclamation mark. What it means is that you take the number before that mark and multiply it by every integer less than it down to one.
It’s very useful in things like statistics and calculating odds. Here’s an example. The expression 5! means to multiply 5 by the integers below it, so you get 5 x 4 x 3 x 2 x 1. This works out to 20 x 6 x 1, or 120.
Now it should be obvious, but one way to go from X! to the number below it is to calculate X!/X. Why? Because you’re removing the top term. 5!/5 removes the 5 and, in effect, gives you the digits for 4!: 4 x 3 x 2 x 1. That works out to 24, which happens to be 120/5.
This is all great, and then you get to 1!. And if you want to calculate 0!, you need 1!/1. And what does that work out to?
Well, it happens to be 1/1, or 1, meaning that 0! equals 1. Of course, you can’t go from 0! to -1! because you wind up dividing by 0.
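The whole chain of reasoning fits in a short sketch. The hand-rolled `factorial` below starts its running product at 1, which is precisely why 0! comes out as 1 (the loop simply never runs), matching Python’s built-in `math.factorial`:

```python
import math

# n! as a running product; for n = 0 the loop never runs, so 0! = 1.
def factorial(n):
    result = 1
    for k in range(2, n + 1):
        result *= k
    return result

print(factorial(5))       # 120
print(factorial(5) // 5)  # 24, i.e. 4! -- dividing by n strips the top term
print(factorial(0))       # 1, agreeing with math.factorial(0)
```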
Of course, there are other, much more complicated reasons that 0! = 1, but I’ll leave that explanation to the fabulous Professor James Grime of Numberphile to explain. Also, kudos to Numberphile for all the ideas reiterated here today. They are a great resource.