Calculus: The True Chronology
“What is the Fundamental Theorem of Calculus? …
Differentiation is easy; Integration is hard”
– Ted Bunn
In case you are wondering, no matter how true it might seem, that’s not the actual Fundamental Theorem of Calculus. Fortunately, though, this post is primarily concerned with the birth of differential calculus. As I glance down at my prompt, it reads, “…while derivatives were being intuitively used in the 1600’s by Fermat, the rigorous notion of continuity was not used until the 1800’s…” Surprisingly, this statement is only partially true; we can trace the intuitive use of calculus at least as far back as Archimedes, who attempted to find what we now call the “tangent” to a curve. A series of breakthroughs in what is now classified as calculus then followed in the Middle East and India. It is, however, integral (pun intended) to point out that these were isolated attempts, and none made an effort to generalize the concepts as was done in the 17th century.
No discussion of calculus is complete without mention of Isaac Newton, and neither will this one be. In 1666, while escaping the Great Plague, Newton was interested in solving problems related to the motion of physical bodies, which led him to discover (some would say “invent,” but let’s leave that debate out of this post) calculus – which he did not publish until 1687, in what is arguably the most significant book in science and mathematics, the Philosophiæ Naturalis Principia Mathematica. But Newton’s method of finding what he referred to as the “fluxion,” i.e. the derivative of a function, was not constructed on what we imagine a function to be today. Instead, he dealt solely with mechanical functions, i.e. functions traced out by mechanical bodies, which were inherently continuous and “well-behaved” (there was no quantum mechanics back in the day to crush his intuition). Meanwhile his contemporary, Gottfried Wilhelm Leibniz, also (independently) discovered calculus and published his results before Newton (which led to what is called the “Calculus Controversy” of the 17th century). Leibniz is also credited with formulating the modern notation of the integral sign and the use of “d” to denote a differential.
It is important to note here that what differentiated (also pun intended) the work of these two minds from earlier attempts was the realization that “integration” and “differentiation” (as we call them today) are inverse problems, and that both are carried out by taking two arbitrary points on a curve and letting the distance between them become “infinitesimally” small. But the choice of the word “infinitesimal” was both a feature and a bug in their work. There existed no rigorous definition of the concept; perhaps the closest came from Newton, who described an infinitesimal as a quantity greater than zero yet smaller in magnitude than any positive real number.
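To make the inverse relationship concrete – the display below is my own summary in modern notation, not something Newton or Leibniz ever wrote – the Fundamental Theorem of Calculus (the real one this time) says that for a function $f$ continuous on $[a, b]$, with $F$ any antiderivative of $f$,

$$\frac{d}{dx}\int_a^x f(t)\,dt = f(x) \qquad\text{and}\qquad \int_a^b f(t)\,dt = F(b) - F(a).$$

Differentiating the integral of $f$ gives back $f$, and integrating $f$ is answered by an antiderivative: the two operations undo each other.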
Interestingly, for two centuries after the work of Newton and Leibniz, Mathematicians and Physicists got significant mileage out of these formulations, despite their lack in rigor. In fact, many would argue that it was the very lack of rigor that gave the likes of Euler, Fourier, and Laplace the leeway they required to make progress. The discussion of the fundamentals of calculus was hence reduced to a secondary and philosophical problem. It is then no surprise that the first attack on calculus came from a philosopher, Bishop Berkeley, who wrote in his “Discourse Addressed to an Infidel Mathematician”, regarding the process of finding the derivative of a polynomial,
“If we consider $y = x^2$, taking the ratio of the differences $\frac{(x+h)^2 - x^2}{h}$, then simplifying to $2x + h$, then letting $h$ vanish, we obtain $2x$. But is $h$ zero? If it is, we cannot meaningfully divide by it; if it is not zero, we have no right to throw it away. The quantity $h$ might have signified either an increment or nothing. But then, which of these soever you make it signify, you must argue consistently with such its signification.” [1]
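The modern resolution of Berkeley’s objection – my gloss, using the limit notation developed much later – is that $h$ is never set equal to zero; one instead asks what value the ratio approaches:

$$\frac{d}{dx}\,x^2 = \lim_{h \to 0} \frac{(x+h)^2 - x^2}{h} = \lim_{h \to 0}\,(2x + h) = 2x.$$

The division is legitimate because $h \neq 0$ throughout, and nothing gets “thrown away”; what requires a precise definition is the limit itself, which is exactly where the story goes next.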
Clearly, the empire of mathematics was under attack, and the guardians of the field were under a moral obligation to defend it. That was not, however, the only reason a rigorous construction of calculus was needed. For one, without rigorous definitions of a “limit” and of “continuity”, the prospects of calculus seemed to be reaching a limit (this is no longer a pun – it’s just irony), with Lagrange calling higher mathematics “decadent” [1].
Moreover, the 17th- and 18th-century work on the algebra of inequalities provided fertile ground for the foundations of calculus to develop (think back to all the definitions and theorems we have learnt so far: they are, more often than not, inequalities). In the eighteenth century, work on inequalities had been crucial for the study of approximations. Indeed, it was in studying the bounds on errors in series approximations that Cauchy first noticed the stark similarity to the concept of a limit, which was then thought of as nothing more than a bound that could be “approached closer and closer, though not surpassed”. For many of us (at least for me), this is the intuitive image of a limit; Cauchy set out to make it a rigorous statement using the algebra of inequalities he had employed in studying error bounds. It was this that gave birth to what we now use as the “epsilon-definition” of a limit. In fact, the letter epsilon (ε) was initially used by Cauchy to symbolize “erreur” – or error!
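For concreteness – this is my own statement of the definition, phrased for real-valued functions on the real line as per the note at the end of the post – the ε-definition of a limit that grew out of Cauchy’s error bounds reads:

$$\lim_{x \to c} f(x) = L \quad\Longleftrightarrow\quad \text{for every } \varepsilon > 0 \text{ there exists } \delta > 0 \text{ such that } 0 < |x - c| < \delta \implies |f(x) - L| < \varepsilon.$$

Cauchy’s bound that can be “approached closer and closer” becomes the claim that the error $|f(x) - L|$ can be forced below any prescribed ε.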
The stage was now set to exploit these techniques, and the mathematical atmosphere (which many would not have cared for in the 18th century), to rigorously define the concept of “continuity”. Cauchy capitalized on this opportunity: “A function f(x) is continuous on a given interval if, for each x in that interval, the absolute value of the difference f(x + a) – f(x) decreases indefinitely with a.” While accurate, this still differs in rigor from the modern epsilon-delta (ε-δ) definition of continuity, which was first described by Bolzano in 1817 and then further refined by Karl Weierstrass to take its modern form, in all its glory:
“A function is continuous at a point c in A if, for all $\varepsilon > 0$, there exists a $\delta > 0$ such that whenever $|x - c| < \delta$ (and x is in A), it follows that $|f(x) - f(c)| < \varepsilon$.” [2]
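To see the definition in action, here is a small worked example of my own (not taken from [2]): the function $f(x) = 2x + 1$ is continuous at any point $c$, because given $\varepsilon > 0$ one may choose $\delta = \varepsilon/2$, and then

$$|x - c| < \delta \;\implies\; |f(x) - f(c)| = |(2x + 1) - (2c + 1)| = 2\,|x - c| < 2\delta = \varepsilon.$$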
Whether Bolzano was influenced by Cauchy’s definition is a convoluted debate, one that we avoid here. There is, however, a certain irony in the fact that the very “ε” symbol once used by Cauchy to describe errors has today come to epitomize precision and rigor in calculus!
Note: In the interest of academic honesty, I’m obliged to inform the reader that the mathematical formalisms that appear in this post are restricted to the topology of the real line. A treatment of the general notion of continuity over metric spaces can be found in [3], and references therein.
References
[1] Anderson, Marlow, Victor J. Katz, and Robin J. Wilson, eds. Who Gave You the Epsilon? And Other Tales of Mathematical History. Washington, DC: Mathematical Association of America, 2009 (and references therein).
[2] Abbott, Stephen. Understanding Analysis, Chapter 4. New York: Springer, 2015.
[3] Atanasov, Atanas. “Topology and Continuity.” Harvard Mathematics Department.