
You lose simple properties of division, such as (a + b)/c = a/c + b/c.

If you set 1/0 = 1 (as the author claimed causes no inconsistency), then (2 + 1)/0 = 2/0 + 1/0 is false.

If you come to rely on division by zero behaving a certain way (as happens to all features/bugs of any language), then suddenly silly decisions about how to implement a formula can cause wildly different behavior. Good luck refactoring!
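To make this concrete, here is a minimal Python sketch, assuming the author's 1/0 = 1 is read as "division by zero returns 1" in general (the function name `div` is made up for illustration):

```python
def div(a, b):
    """Total division: a/b as usual, with division by zero defined as 1."""
    return a / b if b != 0 else 1

# Distributivity holds for nonzero divisors...
assert div(4 + 6, 2) == div(4, 2) + div(6, 2)  # 5.0 == 2.0 + 3.0

# ...but fails once the divisor is zero:
lhs = div(2 + 1, 0)          # 1
rhs = div(2, 0) + div(1, 0)  # 1 + 1 = 2
assert lhs != rhs
```

So two mathematically equivalent ways of writing the same formula now compute different results, which is exactly the refactoring hazard described above.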



(a + b)/c = a/c + b/c for c ≠ 0. Otherwise it has no meaning.

Therefore (2 + 1)/0 = 2/0 + 1/0 is not even false; it simply means nothing.

BTW, you didn't mention Infinity explicitly, but it would lead to the same inconsistency if you think about it:

(2 - 1)/0 = 2/0 - 1/0

:)


If you define a/0 to be a value, then you can ask whether (2 + 1)/0 = 2/0 + 1/0. This is not meaningless since both sides are defined. If you define a/0 = 2, then the equation is false.

Using "infinity" (which is not how it's actually done in math) requires you to rule out the possibility of operations like infinity - infinity.
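IEEE 754 floating point illustrates that escape hatch: it assigns infinities to overflowing operations but defines infinity - infinity as NaN, so the problematic difference has no numeric value. A small Python sketch (using `math.inf` directly, since Python itself raises `ZeroDivisionError` rather than returning infinity):

```python
import math

inf = math.inf

# IEEE 754 defines inf - inf as NaN ("not a number"):
diff = inf - inf
print(math.isnan(diff))  # True

# NaN compares unequal to everything, including itself,
# so an identity like (2 - 1)/0 = 2/0 - 1/0 cannot hold here:
print(diff == diff)      # False
```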



