The Calculus Gallery: Masterpieces from Newton to Lebesgue by William Dunham (Princeton University Press, 2004).
I love this book. It's extremely hard to find something like it. On the one hand, it is a book about math that is written with a sense of drama as well as a genuine concern for history, so that you are reading a story more than an exposition of mathematical fact. This distinguishes it from a scholarly article or textbook. But on the other, it is not afraid of mathematical detail, so you actually get to see what is being talked about! This differs from most popular accounts of math, which tell the story in passionate detail but cloak the actual content of the story in vagueness.
Dunham doesn't mean for The Calculus Gallery to be a history of calculus - he specifically disclaims this in the introduction. He means for it to be a showcase of beautiful mathematical breakthroughs (hence "gallery"). So the chapters are named after mathematicians, the way exhibits in an art gallery are often organized around artists.
But the historical narrative is inescapable, since each mathematician's breakthroughs only make sense in relation to the state of the art in his time, and this is what made the book such a revelation to me. I was in my second year of teaching an AP calculus class when I read it, and it permanently changed the way I looked at the subject.
Points my students found confusing were revealed to be tensions driving the history of calculus! For example, modern treatments of calculus in textbooks define everything in terms of a precise definition of limits, but everyone who actually uses calculus also has to get comfortable thinking about infinitely small quantities. Some people (for example, the author of this review) take to infinitely small quantities right away, like little mystical artifacts that plug into the machinery of calculus and give it its tremendous power, but many people quite rightly balk at such a ridiculous idea. Something infinitely small but not zero? Absurd!
Before I read The Calculus Gallery, I saw the technical definition of the limit as nothing but a sort of gussied-up way of talking about the infinitely small. What I learned by reading it is that no, no, no! The modern definition of the limit is the way that was finally found, after much searching, to define everything in calculus without the infinitely small, so as to answer, once and for all, the objections of those who could never stomach the idea. The mystical objects were not necessary after all - the machinery runs fine on the good, wholesome logic of finite quantities (though the little trinkets do sometimes provide handy mental shortcuts, as any regular user of calculus will tell you).
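For the curious reader, here is what that "modern definition" looks like - the standard Weierstrass-style ε–δ formulation (as found in any analysis text, not quoted from Dunham's book):

```latex
% The modern (Weierstrass) definition of the limit.
% Note that every quantity appearing here is an ordinary finite number:
% no infinitely small quantities are needed anywhere.
\[
  \lim_{x \to a} f(x) = L
  \quad \Longleftrightarrow \quad
  \forall \varepsilon > 0 \;\, \exists \delta > 0 \;\,
  \text{such that} \;\,
  0 < |x - a| < \delta \implies |f(x) - L| < \varepsilon.
\]
```

The trick is that "f(x) gets infinitely close to L" is replaced by a challenge-and-response between two finite tolerances, ε and δ - which is exactly the sense in which the machinery runs on finite quantities alone.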
Furthermore, this way out of the quandary of infinitesimals was not found until the 19th century! People say "Newton and Leibniz invented calculus" but the modern idea of a limit, on which the entire architecture is now based, was totally unknown to them, and would not be developed for another 150 years.
If my harping on the pivotal importance of "the modern idea of a limit" has piqued any curiosity in you, you should read The Calculus Gallery and watch it grow - from the time of Newton and Leibniz, before the need for such an idea was even perceived; through the fabulous critique of calculus in 1734 by the philosopher and theologian Bishop George Berkeley; through the attempt by Lagrange to answer Berkeley's criticisms; to the inception of the way out by d'Alembert, its nurturing in its infancy and childhood by Cauchy, and its eventual development into a fully mature idea by Weierstrass.
To be sure, this story is not only found in The Calculus Gallery. Other treatments of the history of calculus also contain it. But Dunham's book does it a great justice by its balance of story and mathematical detail. And this is just one (though a very important one) of the book's many fascinating subplots.
I do have one objection: I don't like fawning over great mathematicians of the past. Nothing wrong with being impressed, but Dunham has already taken care of this by having the guts to put real mathematics in his book: the theorems (which are uniformly stunning) can speak for themselves. Now and again I found the book's tone of reverence for its human subjects growing tiresome. Lay people are already too obsequious about mathematics in the presence of its priests, as though mathematics were made of magic rather than logic; no need to invite them to genuflect any lower.
But this one objection is outweighed manyfold: the book is special. In terms of the aforementioned balance of history and actual math, I don't know another one like it.

©2009 Ben Blum-Smith