Hacker News

> It’s saying that Pony is mathematically wrong. This is objectively false.

Pff. The author wants to show off their knowledge of fields by defining a "division" operator where 1/0 = 0. Absolutely fine. I could define "addition" where 1 + 2 = 7. Totally fine.

What I can't do is write a programming language where I use the universally recognised "+" symbol for this operation, call it "addition" and claim that it's totally reasonable.

Under the standard definition of division implied by '/' it is mathematically wrong.
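To make the disagreement concrete, here's a quick sketch in Python (the `total_div` name is mine, purely for illustration of Pony's choice):

```python
# `total_div` is a made-up helper mimicking Pony's semantics:
# division is total, defined as 0 when the divisor is 0.
def total_div(a, b):
    return 0 if b == 0 else a // b

print(total_div(1, 0))   # 0 -- Pony's answer

# The conventional choice: division by zero is an error.
try:
    1 // 0
except ZeroDivisionError:
    print("undefined")   # what most languages signal instead
```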

What they should have done is use a different symbol, say `/!`. Instead they've done the classic thing and made the obvious choice unsafe and the safe choice unobvious (`/?`).



It's a question of usefulness. If in your problem domain "1+2=7" is the most useful definition, then by all means do that. Why does the semicolon terminate statements and not the universally agreed-upon period? Why does the period denote member access? Why is multiplication not denoted by the universally agreed [middle dot / cross character] (strike out the one that is not universally agreed in your country)?

The design and semantics of a programming language ought to be in service of the programs we wish to express, and informed by our decades of experience in human ergonomics. Blind reverence to religions of yore does us no good. Mathematical notation itself has gone through centuries of development and is not universal, with papers within the same field using different notation depending on what strikes the author's fancy. To treat it as sacred and immutable is to behave most un-mathematically. Hell, you can still get into a nice hours-long argument about whether the set of natural numbers includes zero (neither side will accept defeat, even though there is clearly a right answer)!


> What I can't do is write a programming language where I use the universally recognised "+" symbols for this operation, call it "addition" and claim that it's totally reasonable.

As a programmer, you're right: we have standard expectations around how computers do mathematics.

As a pedant: Why not? Commonly considered 'reasonable' things surrounding addition in programming languages are:

* (Particularly for older programming languages): If we let Z = X + Y, where X > 0 and Y > 0, any of the following can be true: Z < X, Z < Y, or (Z - X) < Y. This is what we commonly know as 'wrap-around'.

* I haven't yet encountered a language which solves this issue: X + Y has no result for sufficiently large values of X and Y (any integers whose binary representation exceeds the storage capacity of the machine the code runs on will do). Depending on whether the language supports integer promotion and arbitrary-precision integers, the values of X and Y don't even have to be particularly large.

* Non-integer addition. You're lucky if 0.3 = 0.1 + 0.2, and good luck trying to get anything sensible out of X + 0.2, where X = (2 ^ 128) + 0.1.
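For what it's worth, the first and third points are easy to demonstrate in Python (simulating fixed-width arithmetic by masking, since Python's own ints don't wrap):

```python
# Wrap-around: simulate 8-bit unsigned addition by masking,
# the way fixed-width unsigned arithmetic behaves in C.
x, y = 200, 100
z = (x + y) & 0xFF        # 300 wraps around to 44
assert z < x and z < y    # impossible under real arithmetic

# Non-integer addition with IEEE-754 doubles:
assert 0.1 + 0.2 != 0.3   # the classic rounding surprise
big = float(2 ** 128)
assert big + 0.1 == big   # 0.1 is far below big's precision step
```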


> I haven't yet encountered a language which solves this issue:

Well, Python supports arbitrary precision integers. And some other niche languages (Sail is one I know).
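And in Python the bignum behaviour really does hold, up to available memory:

```python
# Python ints are arbitrary precision: no wrap-around, no overflow.
x = 2 ** 64                   # already past the 64-bit range
assert x + x == 2 ** 65
assert 10 ** 100 + 1 - 10 ** 100 == 1   # no precision loss either
```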

I don't think "running out of memory" counts as a caveat because it still won't give the wrong answer.

For floats, I don't think it's actually unreasonable to use different operators there. Some languages do: OCaml, for instance, uses `+.` for float addition.

Fair point about wrapping.


> Well, Python supports arbitrary precision integers. And some other niche languages (Sail is one I know).

As a Lisper, I very carefully chose an example to account for arbitrary-precision integers (so X + X where X is, say, 8^8^8^8 (remember, exponentiation is right-associative, 8^8^8^8 = 8^(8^(8^8)))).
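The associativity claim is easy to check in Python with small numbers (the actual 8^8^8^8 is far too large to ever compute):

```python
# Python's ** is right-associative, matching mathematical convention:
assert 2 ** 3 ** 2 == 2 ** (3 ** 2) == 512   # not (2 ** 3) ** 2 == 64
# So 8 ** 8 ** 8 ** 8 means 8 ** (8 ** (8 ** 8)): its digit count
# alone exceeds any machine's memory, arbitrary precision or not.
```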

> I don't think "running out of memory" counts as a caveat because it still won't give the wrong answer.

Being pedantic, it doesn't give the _correct_ answer either, because in mathematics 'ran out of memory' is not the correct answer for any addition.


Right, but you can never guarantee giving the correct answer. What if someone unplugs the power mid-computation? Running out of memory is basically in the same category (for a modern desktop system, anyway).

The best you can do is "not the wrong answer".



