Let's start with Newton: he claims to have developed the method of "fluxions" in 1666 (when he was 23, not bad for a post-grad, eh?); however, there's not much to support this claim other than word of mouth and a small endnote in a later publication. Over 20 years later, Newton published his research in his most famous work, *Principia*. His notation for the derivative (which he called the fluxion) was a dot placed over the variable, as in ẋ. For a second derivative, he put two dots; however, this notation becomes cumbersome after four derivatives (although not much current research involves derivatives beyond the third, excepting a few remainder theorems). Further, his method of integration (which he described as the "inverse" of differentiation and whose results he called "fluents") was honestly pretty sloppy and unstandardized.

On to Leibniz: he created much of the notation that we use today for calculus. The earliest publications of his work on differential analysis date to around 1677. Further, his notation was highly standardized and versatile. For instance, his derivative notation, dy/dx, is one of the primary notations used today. He also invented the elongated-S symbol, ∫, for integration. Pretty savvy dude. The "prime" notation (f', f'', f''', f^(4), etc.) came later, invented by Lagrange.

In my opinion (everyone's got one), Leibniz takes the cake. Newton's notation was clunky, and his work wasn't published until the 1680s.

## 1 comment:

I find this really funny because for someone like me, who enjoys other subjects over math, it's like two guys arguing over who created a torture device. I will admit, however, that this torture device we still use today is incredibly important. For example, you can use derivatives to quickly calculate velocity and acceleration if you are given a position - that is, if you're into that sort of thing.
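The velocity-and-acceleration remark can be sketched in a few lines of code. This is a minimal illustration, not anything from the original post: the position function below is made up, and the derivatives are approximated numerically with central differences rather than computed symbolically.

```python
def first_derivative(f, t, h=1e-5):
    # Central-difference approximation of f'(t)
    return (f(t + h) - f(t - h)) / (2 * h)

def second_derivative(f, t, h=1e-4):
    # Central-difference approximation of f''(t)
    return (f(t + h) - 2 * f(t) + f(t - h)) / h**2

def position(t):
    # Hypothetical position function: s(t) = 5t^2 + 3t (meters)
    return 5 * t**2 + 3 * t

v = first_derivative(position, 2.0)   # velocity ds/dt at t = 2 (analytically 10t + 3 = 23)
a = second_derivative(position, 2.0)  # acceleration d²s/dt² (analytically 10)
```

For a quadratic position function these approximations match the exact derivatives up to floating-point rounding, which is why velocity and acceleration fall out so "quickly" once you have the position.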
