TFA was about mathematics, not computer programs.
Mathematically, the one-sided limit of a/b as b approaches 0 is defined to be +INF or -INF, depending on whether a and b have matching signs. The limit represents the value that a/b asymptotically approaches as b approaches 0; a/b at b=0 itself is still undefined.
For a good example of why this needs to be undefined, consider that the limit as b approaches zero of a/b is both +INF and -INF, depending on whether b is "approaching" from the side that matches a's sign or from the opposite side. At the exact singularity where b=0, +INF and -INF are both equally valid answers, which is a contradiction.
Also, in case you weren't aware, "NaN" stands for "not a number".
Pony is what prompted TFA to consider whether or not 1/0 should be defined. It's not what the article is about. Obviously anybody who writes a compiler can define / to have a specified behavior for a zero divisor; TFA is about whether that's correct. There's nothing significant about IEEE 754 choosing to define an operation that's nominally undefined, as it does not have any bearing on whether or not that behavior is correct.