As a young man Ronald Reagan worked as a lifeguard at a riverfront park in Dixon, Illinois. He took the job seriously, and in the course of several years he pulled seventy-seven people out of trouble in the Rock River. He knew it was seventy-seven because he kept a tally, in notches cut in a log.
As president, Reagan liked to joke about his age. He was the oldest person to hold the office until then, and he employed humor to deflect concern that he was too old for the job. He would speak of George Washington as a contemporary.
What he didn’t say was that in notching that log to keep count, he was channeling someone who preceded him not by two centuries but by forty millennia. Bones from South Africa dating from 44,000 B.C. show notch marks resembling Reagan’s. What the notch-maker was tallying is impossible to know, but the notches, like Reagan’s, required some effort to cut, and so they presumably recorded repetitions or multiples important to the notcher.
Mathematics began with counting, and tally marks were the first written numbers. The practice of grouping marks by fives or tens or twenties seems to have arisen in various places. Five is suggested by the fingers on a hand, ten by two hands, twenty by hands and feet.
A grouping scheme makes handling numbers larger than a handful comparatively easy. Recognizing three groups of five is quicker than counting to fifteen. And it saves having to coin lots of words for numbers. A scheme based on five requires only five distinct names for numbers: one, two, three, four, five. Beyond that are five-and-one, five-and-two, five-and-three, five-and-four, two fives, two fives and one, etc.
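To make the mechanics concrete, here is a minimal sketch in Python of how five number-words suffice to name any modest count. The word list and the function are illustrative inventions, not historical reconstructions:

```python
# Illustrative sketch: naming counts with only five number-words,
# grouping by fives as described above. Handles counts 1 through 29,
# which covers the essay's examples.

WORDS = ["one", "two", "three", "four", "five"]

def base_five_name(n):
    """Name a count using only the five basic number-words."""
    fives, rest = divmod(n, 5)
    if fives == 0:                    # 1 through 4: a bare word
        return WORDS[n - 1]
    group = "five" if fives == 1 else WORDS[fives - 1] + " fives"
    return group if rest == 0 else group + " and " + WORDS[rest - 1]

for n in (3, 7, 11, 15):
    print(n, "->", base_five_name(n))
# 3 -> three
# 7 -> five and two
# 11 -> two fives and one
# 15 -> three fives
```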
In the scheme that became most common, the base number was ten. This required ten distinct number-words, but it kept the number of groups from getting out of hand. In English and other languages, distinct words developed for certain numbers beyond ten, but they were usually derived from the first ten numbers. “Eleven” comes from a Germanic phrase meaning “one left [over ten].” “Twelve” is “two left.” “Thirteen” is “three and ten,” and so on. After twenty—“two tens”—things are pretty straightforward. Coinages for “hundred,” “thousand,” “million” and so on guarantee convenient counting as far as anyone could want to go.
From counting followed arithmetic: combining and manipulating numbers. A farmer with two pigs bought four more and wound up with six. Addition can be accomplished simply by counting further: three, four, five, six. But repetition caused clever farmers to note patterns and develop shortcuts, leading to the equivalent of addition tables.
Subtraction was the opposite of addition. A farmer with six pigs sold four and was left with two. This can be reckoned by counting backward, or by undoing addition. But again, patterns emerged, and people who did this for a living learned subtraction tables.
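For the mechanically minded, here is a small Python sketch of arithmetic as pure counting, in the spirit of the two paragraphs above. The function names are invented for illustration, and the subtraction only works when the result is a whole number of pigs or more:

```python
# Illustrative sketch: add by counting further, subtract by counting
# back. Works only for non-negative whole-number results.

def add_by_counting(a, b):
    total = a
    for _ in range(b):       # count further: a+1, a+2, ...
        total += 1
    return total

def subtract_by_counting(a, b):
    total = a
    for _ in range(b):       # count backward: a-1, a-2, ...
        total -= 1
    return total

print(add_by_counting(2, 4))       # the farmer's 2 pigs plus 4 -> 6
print(subtract_by_counting(6, 4))  # 6 pigs minus 4 sold -> 2
```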
Subtraction threw a kink in the thinking, though. Adding two numbers of pigs always produces a number of pigs. But subtracting pigs from pigs sometimes yields . . . what? A farmer with three pigs sells three pigs, and what is he left with? No pigs. Is that a number of pigs?
Not at first sight. Rather it is an absence of pigs. And a number isn’t required to indicate absence. Indeed, for the vast majority of the period during which humans have been counting, they had no number to indicate absence or nothingness. It seemed as pointless to talk about such a concept as to assign a color to the wind or a taste to silence.
On current evidence, it seems that not until about the time of Christ did anyone devise a numerical symbol to signify what we call zero. The English word zero descends ultimately from the Arabic sifr, itself a rendering of the Sanskrit śūnya (“empty”), reflecting the path the idea traveled from India through the Middle East to Europe.
A symbol signifying an empty column in a place-sensitive number—like 0 in 203—appeared much earlier, in Mesopotamia. But this symbol doesn’t seem to have been thought of as a number itself.
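A short illustration of why such a placeholder matters: in place-value notation each column contributes its digit times a power of the base, and without a mark for an empty column the numerals 203 and 23 would look identical. A minimal Python sketch, with the function an illustrative invention:

```python
# Illustrative sketch: each column of a place-value numeral
# contributes digit * base**position.

def place_values(numeral, base=10):
    """Value contributed by each column, least significant first."""
    return [int(d) * base ** i for i, d in enumerate(reversed(numeral))]

print(place_values("203"))  # [3, 0, 200] -> sums to 203
print(place_values("23"))   # [3, 20]     -> sums to 23
# Without a symbol holding the empty tens column, both numbers
# would be written with the same two marks, 2 and 3.
```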
Nor did the system of numbers that spread farthest in classical times, the Roman numeral system, have a symbol for zero. There is no place-value (ones column, tens column, hundreds column, etc.) in the Roman system. When Romans ran out of pigs, they didn’t have any pigs; they didn’t have zero pigs.
Zero is a subtle concept. It was the first number to have no physical manifestation; it is purely abstract. Many people had difficulty grasping it. Some found it distasteful or threatening; a few called it the work of the devil.
But it was handy. A pig merchant keeping tabs on customers might record that Abel bought 6 this month, Babel bought 3, and Cleo bought 0. At the end of the year the merchant would add up the monthly numbers to get the annual totals for each customer, without having to make special allowance for months where customers bought none.
And once zero was accepted, other abstractions followed. If the merchant bought a pig from Abel, the merchant might mark a -1 in Abel’s column. Needless to say, Abel can’t have a negative pig; the -1 is an accounting fiction. But it’s a useful fiction, again allowing the merchant merely to add the monthly numbers to get the annual total (with the addition of -1 being the same as subtracting 1).
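To see how smoothly the bookkeeping runs once 0 and -1 are admitted as numbers, here is a toy ledger in Python. Only the customers’ names come from the essay; the monthly figures are invented for illustration:

```python
# Toy ledger in the spirit of the pig merchant. One uniform sum
# handles every case: months with zero sales and the month the
# merchant bought a pig back (the -1 entry).

ledger = {
    "Abel":  [6, 0, -1, 2],
    "Babel": [3, 1,  0, 0],
    "Cleo":  [0, 0,  4, 0],
}

for customer, months in ledger.items():
    print(customer, sum(months))
# Abel 7
# Babel 4
# Cleo 4
```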
At this point mathematics was off and running. It grew more and more abstract. Today most cutting-edge mathematics is far removed from the physical world. Which isn’t to say that it’s not useful. Concepts treated for decades as abstractions by mathematicians have repeatedly turned out to be just what physicists, computer scientists, statisticians, chemists and others required to understand and model the real world.
Sometimes a lot can come from nothing.
Typo: "linguisits" should be "linguists."
RE: Brands' interesting essay "Nothing to brag about: The invention of zero," I just now reread it.
Brands correctly points out that all languages seem to have some logic in their counting systems. However, I'm reminded of my Russian classes at UT ages ago and how the Russian word for "40" was "сорок," which doesn't follow the rule. The dictionary says it was inherited from Old East Slavic сорокъ (sorokŭ, "a bunch of 40 sable pelts; forty"). Another interesting word (and I'm showing my age again) is the archaic (in this context) word "bit": a unit of 12 1/2 cents. I still remember people saying things like "can I borrow two bits from you?" It's still used (what linguists call a "fossilized form") in the cheer: "Two bits, four bits, six bits a dollar! All for [fill in name of team], stand up and holler!"