The only thing that truly matters is this:

When software engineers make a mistake dividing by 0 and end up with an exception being raised or a NaN in the output, they'll usually blame themselves.

When the result is wrong numbers all over the place, they'll blame the language.

There are two cases in which people are going to "use" x/0:

1. They made a mistake.

2. They KNOW that x/0 returns 0 and take it as a shortcut for (y == 0 ? 0 : x/y).

Is that shortcut useful? No. Is it dangerous? Yes. Hence, this is a bad idea.
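To make the danger concrete, here is a minimal sketch in Python. The `safe_div` helper is hypothetical; it mimics the proposed semantics where x/0 is defined as 0, and contrasts it with raising an error, which surfaces the mistake at the point it happens:

```python
def safe_div(x, y):
    # Hypothetical semantics: division by zero silently yields 0
    return 0 if y == 0 else x / y

total, count = 100.0, 0

# The mistake propagates as a plausible-looking number, with no signal:
average = safe_div(total, count)
print(average)  # 0

# Python's own semantics stop execution right where the bug is:
try:
    average = total / count
except ZeroDivisionError:
    print("caught division by zero")
```

The silent 0 flows downstream into later calculations, which is exactly the "wrong numbers all over the place" failure mode described above.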
