I couldn't agree more. People forget that the calculus of Newton and Leibniz was based on infinitesimals, not the "rigorous" epsilon-delta proofs and the concept of a limit. Calculus was found by intuition and experimentation (think of Archimedes using the method of exhaustion to approximate pi), not by following the implications of "What happens if I arbitrarily define a limit?"
Current calculus education is like teaching kids about color theory, photons, and how the eye works, without ever just letting them fingerpaint. Those details will come, but the vast majority of people just remember calculus as a painful memorization exercise.
As an aside, e is the same way. Most people are taught it as an abstract limit, lim n->inf (1 + 1/n)^n, without realizing it's really about growth (and that's how it was discovered):
http://betterexplained.com/articles/an-intuitive-guide-to-ex...
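A quick way to see that the limit and the growth story are the same thing is to just compute it. A throwaway Python sketch (names are mine, for illustration):

```python
import math

def compound(n):
    """Grow 1.0 by 100% total interest, split into n compounding steps."""
    return (1 + 1 / n) ** n

# More and more frequent compounding creeps toward e:
for n in (1, 10, 1000, 1_000_000):
    print(n, compound(n))

assert abs(compound(10**7) - math.e) < 1e-6
```

The limit definition is just "compound continuously": same number, two stories.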
Excellent article. I noticed recursively growing series all seemed to have the same ratio, but I didn't realize this was related to e. However, I did see e show up all over the place, which I didn't understand. Anyways, it's obvious I'll need to read more of your articles. I'd much prefer to understand the intuition behind mathematics than just learn the formulae so I can "get things done."
Regarding calc, I had a strange experience where I kept insisting calculus only logically works if it is based on infinitesimals, which I thought was obvious, but the people I was talking with insisted that it was based on limits. They couldn't understand that an infinitely small value has to be more than zero if an infinity of them is to add up to anything more than zero.
In general, my math education seems to have been particularly bad at dealing with infinity.
If d is an infinitesimal value, I'm not sure what d * infinity would be. If d is defined as 1 / infinity, then d * infinity would equal 1, but that doesn't seem right.
This textbook teaches calculus using the infinitesimal approach. To do rigorous infinitesimal calculus, you have to define the hyperreal number system.
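The hyperreals themselves take serious machinery (ultrafilters) to construct, but if you just want to compute with "a number plus an infinitesimal" and drop higher-order terms the way Leibniz did, the dual numbers give a toy version: pairs a + b*eps where eps^2 = 0. This is only an illustration of the flavor, not the hyperreal construction the textbook actually uses:

```python
class Dual:
    """Numbers a + b*eps with eps^2 == 0. Not the hyperreals, but they
    mechanize the old 'discard higher-order infinitesimals' step."""
    def __init__(self, a, b=0.0):
        self.a, self.b = a, b

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.a + other.a, self.b + other.b)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # (a + b*eps)(c + d*eps) = ac + (ad + bc)*eps, since eps^2 = 0
        return Dual(self.a * other.a, self.a * other.b + self.b * other.a)
    __rmul__ = __mul__

# Derivative of x^2 at x = 3: feed in 3 + eps, read off the eps coefficient.
x = Dual(3.0, 1.0)
y = x * x
assert (y.a, y.b) == (9.0, 6.0)  # value 9, slope 6
```

This is the same trick automatic differentiation uses; the eps coefficient that falls out is exactly the derivative, with no limits taken anywhere.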
The tricky thing is that infinity isn't a number as we normally think of numbers. Two quantities can both grow without bound while their ratio settles anywhere at all, so infinity / infinity does not necessarily equal 1: depending on how the two infinities arise, it can come out as 0, as any positive number, or as infinity.
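You can watch this happen numerically: take pairs of quantities that both blow up and look at their ratio for a huge n (a quick Python sketch, names mine):

```python
# "infinity / infinity" is a form, not a number: numerator and
# denominator both grow without bound, yet the quotient can settle
# at different values depending on how the two grow.
def ratio_at(f, g, n):
    return f(n) / g(n)

n = 10**9
assert abs(ratio_at(lambda n: 2 * n, lambda n: n, n) - 2) < 1e-9  # ratio -> 2
assert ratio_at(lambda n: n, lambda n: n * n, n) < 1e-8           # ratio -> 0
assert ratio_at(lambda n: n * n, lambda n: n, n) > 1e8            # ratio grows without bound
```

Same "infinity over infinity" on paper, three different answers, which is why the expression on its own is indeterminate.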
To get this intuition, think of the integers. There are obviously an infinite number of them. Now think of the rationals. There are now an infinite number of numbers between 1 and 2, so there are more rational numbers than there are integers; in fact infinitely more.
I believe set theory is the only area of mathematics where different orders of infinity actually mean anything.
So, in set theory, cardinality is a measure of the size of a set. The cardinality of the infinite set of natural numbers is aleph-0, and an infinite set has cardinality aleph-0 if it can be put in one-to-one correspondence with the naturals. http://en.wikipedia.org/wiki/Bijection Counterintuitively, this includes the rational numbers as well: they can be listed in an order that hits every one of them exactly once. Cantor's diagonal argument is the tool that shows the reals are different. http://en.wikipedia.org/wiki/Cantor%27s_diagonal_argument
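That listing of the rationals is easy to make concrete with the standard zigzag over the numerator/denominator grid. A small Python sketch (the generator name is mine, not from the thread):

```python
import itertools
from fractions import Fraction

def rationals():
    """Enumerate every positive rational exactly once, diagonal by diagonal."""
    seen = set()
    s = 2  # s = numerator + denominator; walk one anti-diagonal at a time
    while True:
        for p in range(1, s):
            f = Fraction(p, s - p)
            if f not in seen:  # skip duplicates like 2/4 == 1/2
                seen.add(f)
                yield f
        s += 1

first_six = list(itertools.islice(rationals(), 6))
# The enumeration itself is the bijection with the naturals:
# position 0 -> 1, position 1 -> 1/2, position 2 -> 2, ...
assert first_six == [Fraction(1), Fraction(1, 2), Fraction(2),
                     Fraction(1, 3), Fraction(3), Fraction(1, 4)]
```

Every positive rational shows up at some finite position in this list, which is exactly what "same cardinality as the naturals" means.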
The real numbers can't be put into one-to-one correspondence with the rational numbers; that is what the diagonal argument proves. So the infinity of the reals truly is bigger than the infinity of the rationals.
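The diagonal argument itself fits in a few lines of code, in miniature: given any list of infinite binary sequences (each modeled as a function from index to digit), build a sequence that differs from the k-th one at position k, so it can't appear anywhere in the list. This sketch uses a finite list purely for illustration; the real argument applies the same construction to a supposed complete listing:

```python
def diagonal(sequences):
    """Return a sequence that disagrees with sequences[k] at index k."""
    return lambda k: 1 - sequences[k](k)

seqs = [lambda n: 0,      # 0, 0, 0, ...
        lambda n: 1,      # 1, 1, 1, ...
        lambda n: n % 2]  # 0, 1, 0, 1, ...

d = diagonal(seqs)
assert [d(k) for k in range(3)] == [1, 0, 1]
# d differs from the k-th listed sequence at position k, for every k:
assert all(d(k) != seqs[k](k) for k in range(3))
```

Since the constructed sequence disagrees with every listed one somewhere, no list of sequences (and hence no list of reals) can be complete.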
You don't think mathematics is unified, i.e. something true in one area may not be true in another? I don't think a lot of mathematicians believe this, and it isn't clear to me that "infinity" refers to two different things in the two different realms (the other possibility).
I couldn't disagree more. Just because a method "works" does not mean a rigorous approach isn't required. It seems you are also forgetting that most science started as "intuition", but at some point a rigorous approach was required to actually turn it into science.
From the field of mathematics, set theory comes to mind.
You forget that you are not teaching it to kids who are supposed to have fun with it - you are teaching it to people who will presumably use that info in a few years, and most won't have another chance to learn it again.
I also don't know exactly how it is taught in the US, but here (Israel) there are calculus courses which deal with the practical side for physics students and such, and there are courses which deal with rigorous theory and proofs for math students and the like.
I have to add I'm taking my third calculus course this semester, and so far the second one was my favorite subject (Math major).
I think we have a false dichotomy here. What we call 'intuition' and 'rigor' are not necessarily opposed. I think the problem with math classes is not any lack or excess of rigor, but that they focus too much on details instead of the general principles from which those details are derived. Complicated formulas and procedures are presented for students to memorize without any understanding of their origin. It's mostly 'how,' a little 'what,' and never 'why.'
Oh, on that part I agree in principle. Maybe these things are taught differently here, but there is a strong enough focus on the "why" part for me to feel comfortable with it...