Creating a small wrapper type over a slice/map is trivial in Go (`type X[T] []T`), and you can then define the range functions as methods on that slice type. If Go allowed generic instance methods it would be even simpler.
That's not the only point of methods, even if it's the only one the designers of Go envisioned. Another very relevant purpose is method chaining syntax: with instance methods you can write `a.b().c()`, while with free functions you have to write `c(b(a))`. This turns out to be extremely relevant for longer chains.
Of course, other than generic methods, this could also be supported by adopting universal function call syntax. That is, the compiler could simply treat `f(x, a)` and `x.f(a)` as perfectly equivalent, regardless of whether `f` is a method of `x`'s type or a free-floating function. There is some minor complication because of backwards compatibility, but that's easily fixed (each syntax can prefer the function `f` or the method `f` whenever there is ambiguity).
On the other hand, generic methods can be extremely useful in their own right. Having generic methods in an interface, such that a type has to have a generic method to implement that interface, is a perfectly reasonable feature request - it wouldn't contradict anything in the spirit of Go. Of course, the implementation can have problems and trade-offs; I'm not claiming this is an easy feature to implement. But I don't think it's excluded.
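A hypothetical sketch of what that might look like - this is not valid Go today, since methods cannot declare their own type parameters:

```go
// Hypothetical syntax, does NOT compile in current Go:
// Map introduces its own type parameter U on a method.
type Collection[T any] interface {
	Map[U any](f func(T) U) Collection[U]
}
```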
> Creating a small wrapper type over a slice/map is trivial in Go
And yet it's specifically one thing rsc did not want. Further issues are described in the rangefunc proposals:
- it would require the desugaring to run off of method-set analysis of userland types, something which does not currently exist
- it severely complicates resource management around the iterator, as you need a 3-step iteration for resource-bearing iterators (acquire iterator, defer cleanup, perform iteration)
And one not actually listed explicitly: given the limited set of optimisations the Go compiler performs, internal iteration is a lot easier to optimise, as it pretty much inlines down to a `for` loop whose termination is much easier to analyse than bouncing through a bunch of pull calls.
Not only that, but `for range` works off of underlying type, so this is already valid Go:
    package main

    import "fmt"

    type Foo []int

    func main() {
        f := Foo([]int{1, 2, 3})
        for _, v := range f {
            fmt.Println(v)
        }
    }
One could approach it the other way: once more projects adopt all kinds of wrapper functions and types, this deficiency of Go will become more widely known, as the compiler gets progressively less able to cope with the added abstractions.
Hopefully it will put the common "Go or Rust" fallacy to rest, since the two are on opposite ends of the spectrum in weight class and capabilities. A much closer comparison is "Rust or C# or Swift or Kotlin", if one is looking for a Rust alternative that reduces decision fatigue by conceding, to a reasonable extent, certain areas where Rust excels.

In any case, for all its touted simplicity, Go sure doesn't look like a simple and straightforward language to follow anymore.
This 3-step iteration problem is also well solved[0] in C#, and the solution works even in rather complex cases like `File.ReadLines(...)`, where the line iterator internally handles IO, file handle acquisition, and disposal. Just write `foreach (var msg in File.ReadLines("messages.jsonl"))` and you won't be able to make a footgun out of it.
This also applies to the usage chained with filter/map/etc.
    var messages = File
        .ReadLines("messages.jsonl")
        .Select(line => JsonSerializer.Deserialize<Message>(line))
        .ToArray();
Any new feature requires something that currently doesn't exist in the compiler. Just because the change would be larger doesn't make it worse, or at least this can't be the only argument.
And if you do implement interface-based iteration with an `Iterator` interface, it's not hard to also add a `ClosableIterator` interface and have the loop handle the auto-close as well.