You lose simple properties of division, such as (a + b)/c = a/c + b/c.
If you set 1/0 = 1 (which the author claimed causes no inconsistency), so that division by zero always yields 1, then (2 + 1)/0 = 2/0 + 1/0 is false: the left-hand side is 1, but the right-hand side is 1 + 1 = 2.
If you come to rely on division by zero behaving a certain way (as eventually happens with every feature and bug of any language), then seemingly innocuous decisions about how to write a formula can produce wildly different behavior; see the sketch below. Good luck refactoring!
If you define a/0 to be some value, then it is meaningful to ask whether (2 + 1)/0 = 2/0 + 1/0, since both sides are defined. If you define a/0 = 2, the equation is false: the left-hand side is 2, but the right-hand side is 2 + 2 = 4.
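A minimal sketch of the two points above, assuming a hypothetical safe_div helper (not anything from the original post) that returns a fixed fallback value instead of raising on division by zero. Two mathematically equivalent ways of writing the same formula agree everywhere except when the divisor is zero, which is exactly the refactoring hazard:

    # Hypothetical helper: division by zero returns a fixed fallback value
    # instead of raising, as in the kind of proposal being discussed.
    def safe_div(a, b, fallback=1):
        return a / b if b != 0 else fallback

    def formula_v1(a, b, c):
        # "Original" implementation: divide the sum once.
        return safe_div(a + b, c)

    def formula_v2(a, b, c):
        # "Refactored" implementation: distribute the division.
        return safe_div(a, c) + safe_div(b, c)

    print(formula_v1(2, 1, 4), formula_v2(2, 1, 4))  # 0.75 0.75 -- agree when c != 0
    print(formula_v1(2, 1, 0), formula_v2(2, 1, 0))  # 1 2       -- diverge when c == 0

With fallback=2 the same refactoring turns a result of 2 into 4, matching the a/0 = 2 example above.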
Using "infinity" (which is not how it's actually done in math) requires you to rule out the possibility of operations like infinity - infinity.