Making History of Mathematics Relevant
Isaac Newton’s Shamefully Unpublished Calculus Book
(Or: The way the commercial world treats the genius.)
[This article, though now slightly revised, was first printed in the MAA (Mathematical Association of America) journal, Forum, in 1994 or 1995.]
On an
"educational" radio broadcast last year, a man billed as a computer
expert (hereinafter: "the Expert") was explaining about electronic
publishing and the Internet. Many books, he explained, will not appear on
paper at all in future years, but will exist catalogued and described in
computer network listings, to be sampled or downloaded, maybe to be read at
leisure on one's own monitor, or maybe printed on one's home printer.
All of which was true, of
course, though not exactly news to anyone with experience of computers.
But then he got more
interesting. He pointed out that these developments would reduce the role of
what today are called publishers, companies (like Random House and McGraw-Hill)
that pay writers and cause their books to be printed, advertised, distributed
and sold. Thinking only of profit, they have often stifled ideas that could
change the world,
"... and publishers have been making such
mistakes ever since one turned down Newton's Calculus on the grounds that
nobody needed a new calculus book..."
At that point I turned off
the radio to record verbatim as much as I could remember of this astonishing
thing the Expert had said. These few lines resemble one of those puzzles in the
Sunday comics which ask, "Find all the mistakes in the following
picture." Let us count.
Isaac Newton flourished in
the last part of the Seventeenth Century. He was from an early age the most
famous scientist in the world. He could publish what he wanted, immediately. He
was actually often reluctant to publish his work, and it took a lot of
persuasion from his friend Halley to get him to collect and organize the system
of mathematical physics he called the Principia,
rejected by nobody and welcomed by an expectant scientific world in 1687.
Thus:
Mistake #1 The Expert's idea that some
publisher "turned down" a book of Newton’s -- any book of Newton's -- is ludicrous.
Now what was this famous
turn-down about? A "calculus book"? In Newton's time there were no
calculus books, not even one by Newton. Newton was one of the inventors of a
branch of mathematics which these days we call "calculus," but that
sort of work was appearing in papers and books -- and personal letters --
concerned with what the authors actually thought of as geometry or algebra, or
scientific matters such as optics, astronomy and alchemy, having a mathematical
component.
The very idea of "a
calculus book" is more recent than even Newton's Principia, late as that was in Newton's life. The first
approximation was that of l'Hospital in 1696, but even that was not a textbook
in the modern, collegiate sense, and it was one of a kind. A publisher's
calculation that "nobody needed another calculus book" could have no
meaning before about 1900, when science and engineering students were beginning
to grow numerous enough in American colleges to provide a steady market for
such texts as a genre. In Europe the notion came even later. Thus:
Mistake #2 A sentiment of the form
"So and so's calculus book was turned down [by some publishers] because
they thought there were already enough calculus books" describes nothing
that could have happened before the Twentieth Century.
In passing, and in
particular, let us include:
Mistake #3 It is false to say Newton
ever wrote a Calculus.
Finally, to put the matter
into an even wider context, the whole story as told by the Expert would never
have been invented by him had he not himself assumed his audience to believe a
popular and mischievous axiom of 20th Century journalism, which is that
geniuses are generally misunderstood, not honored in their time. Otherwise he
would not so readily have accepted (or invented) the idea that a publisher
would turn down a Newton "calculus book." Newton was a genius;
geniuses are not recognized in their time; therefore Newton, unrecognized, goes
unpublished while some silly nephew of the publisher gets printed instead.
Stands to reason.
But this picture of
unrecognized genius is false. Sure, there are rare cases of it: Van Gogh and
Kafka come to mind. In the world of mathematics, however, only angle-trisectors
and circle-squarers seem to suffer from such neglect. Like those of Archimedes
and Michelangelo before him, and of Beethoven and Einstein after, Newton’s was
from the beginning a name to conjure with. Thus:
Mistake #4 Contrary to the Expert's unspoken
assumption, geniuses are generally known as such during their own times.
The Expert probably didn't
worry too much about whether his story was true or not; he figured it could be
true, which was good enough. The story illustrated what to him seemed an
important truth: that publishers can make mistakes in judging the value of a
new manuscript, and that electronic publishing will help prevent such mistakes
from holding up progress. So the story contributes to real truth, as the Expert
saw it, and that's the only kind of truth that counts.
Most liars justify
themselves this way. The literal truth, they often
think, will be misunderstood anyhow; people will only be confused by it. What
would you rather have on a noon-time radio science spot: an unimportant fact
about the way mathematical publishing worked in the year 1696, or an important,
deep truth about the evils of commercial publishing, and how computers will
help avoid that? Maybe it wasn't exactly Newton; it was bound to have happened
to somebody. Who really cares?
So, in pursuit of the higher
truth, our Expert conjured up a seventeenth century populated by students of
"calculus," to whom a publishing industry was evidently supplying a
rich choice of textbooks. Within that seventeenth century he conjured up a
Calculus written in vain by a Newton whose works, like those of most geniuses,
were ignored by practical men. It is hard to pack so much misinformation into a
few words, but the Expert did it.
Ralph A. Raimi, March 1,
1995