Go Proposal: first-class support for sorting slices (github.com/golang)
89 points by giovannibajo1 on Aug 16, 2016 | 104 comments


So the current approach is providing some popular functions which would normally require generics to implement as built-ins?


I was just ranting about this approach elsewhere. Go and QBasic are broken in the same way: both lack power in the core language and have special syntax for facilities that really ought to be normal calls into the standard library. In Go's case, we have goroutines, error flag omission, maps, and so on; in QBasic, we have LINE.

A language's core syntax should not privilege its standard library above other libraries.


> A language's core syntax should not privilege its standard library above other libraries.

I'm not so sure. Smalltalk had this in spades. There is a downside to this. Giving a bunch of 20-somethings the full power to basically change everything can result in code-bases which suffer from the chaos of over exuberant hubris. Go is deliberately favoring a certain set of conventions. To do this, they are also deliberately making it harder to change the language from within itself.


Encouraging consistency across the language ecosystem doesn't have to conflict with layering. The problem with having what are properly library routines in the language core syntax is that it's a layering violation: it makes the language more complex, harder to reason about, and harder to learn because it creates special cases in the grammar and core semantics. (For example, note the inconsistent behavior of indexing a nil map vs. indexing a nil slice.)
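The nil-value inconsistency mentioned above fits in a few lines (a minimal sketch, not from the thread):

```go
package main

import "fmt"

func main() {
	var s []int          // nil slice
	var m map[string]int // nil map

	s = append(s, 1)    // appending to a nil slice just works
	fmt.Println(s[0])   // 1
	fmt.Println(m["x"]) // reading a nil map also works: zero value, 0

	// ...but writing to a nil map panics at runtime:
	// m["x"] = 1 // panic: assignment to entry in nil map
}
```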

The way to discourage people from "changing everything" is to make the standard library ubiquitous and to use encapsulation to prevent people from changing its internals. If Go had generic maps in the standard library, and you couldn't get access to their internals, then the effect would be exactly the same as the built-in map that Go has today. You see this in other languages: very few people create custom dictionaries in C#, for example.


It's not about changing everything, it's about being able to make abstractions that aren't second class citizens. By giving this power, you also increase the consistency of the language.


> By giving this power, you also increase the consistency of the language.

Maybe. That's like the apocryphal story about the student asking whether a particular proof step is really obvious: the professor goes across the hall, derives things on the other blackboard for the next 30 minutes, then comes back into the lecture hall saying, "Yes, it's obvious."

Maybe you increase the consistency of the language from one point of view, but what happens from the perspective of each individual project? What if this leads to projects where templates have been used to create 3 different template-based "little languages" to make X, Y, and Z easier? You had 3 problems, but now you have 6. The oft-written reply in these debates is to limit the power of the template system -- but is that a robust goal under the group dynamics of the language community? I think not.

One subliminal goal in the design of Go seems to be about privileging certain conventions to avoid a babel of roll-your-own conventions in large projects. This is all across the language and even in the toolchain.


The problem with templates is that they're ad-hoc: they can be specialized, and they're only checked at instantiation time. C++'s type level is literally a dynamic language, and template errors are literally stack traces, only made worse by backtracking, due to the deepest unification rule and SFINAE.

But bona fide parametric polymorphism doesn't have this problem. ML's design is a constructive proof that you can have a reasonable degree of abstraction without compromising usability.


It's a lot easier to use lints, binding coding standards, compiler options, and so on to limit your team to a subset of the language than it is to bolt more power later onto a language like Go.

Being opinionated works well for some teams --- I get that. You don't need the language spec to be opinionated when you can get the same result through mechanisms that don't affect everyone.

It's just like when we're writing code: don't make global changes to achieve some local effect.


> It's a lot easier to use lints, binding coding standards, compiler options, and so on to limit your team to a subset of the language than it is to bolt more power later onto a language like Go.

This deserves a bitter laugh. For large, long-lived codebases, where there have been effectively many different teams and a number of different managers, the standards, coding styles, and tooling are almost certain to change. If you know of a large corporate codebase where this is not true, please tell me about it. Hell, you need to write a paper about this and start giving talks at conferences!

It seems like the Go maintainers are trying to solve this by moving such conventions from the level of individual project to the level of the language community. I think there is merit in this. For one thing, I suspect this will create greater social and intellectual cohesion across the language community as a whole.


> For large, long-lived codebases, where there have been effectively many different teams and a number of different managers, the standards, coding styles, and tooling are almost certain to change

Sure, but I've never been convinced consistency matters at that scale. I think it's enough to be locally consistent; Windows doesn't seem to be hurt by its codebase containing practically every style you can imagine.

> level of the language community

Sure, but if you make the language too Procrustean, you risk making that community smaller, even if it's more cohesive. Personally, I'd rather have a bigger community, even if the members don't agree on everything.


> Sure, but I've never been convinced consistency matters at that scale. I think it's enough to be locally consistent

In C++, it can result in the occasional memory management paradox to work through. (In addition to the background cognitive load for watching out for such.)


> For large, long-lived codebases, where there have been effectively many different teams and a number of different managers, the standards, coding styles, and tooling are almost certain to change.

This presupposes that change is bad. In many (most?) codebases, the change in style is good. Gecko, for example, has survived as long as it has because of the gradual migration away from '90s Don Box-style componentry toward modern C++.


> This presupposes that change is bad. In many (most?) codebases, the change in style is good.

Sure, such a thing can be good, as in, it's a good trade off. Pulling a tooth can be good in this sense, as it's better than getting further infections from an abscess. But here's the thing about trade-offs -- really you want to avoid having to make them in the first place. (Dental hygiene.)

I'm working in a C++ code base right now. It's like an archaeological dig, with each "layer" corresponding to a major revision of C++ style. It's "good" that parts of the codebase are more modern, but the price for this is the cognitive load of language mode switching. (And occasionally, memory management conundrums.)


I can't tell whether you're arguing (a) that people who originated whatever C++ codebase you're working on should have had better foresight or (b) nobody should have even tried migrating to a more modern style. I think both (a) and (b) are untenable: (a) is equivalent to "people shouldn't make mistakes"—i.e. directly at odds with the real world—and (b) is a recipe for stagnation.


> I can't tell whether you're arguing (a) that people who originated whatever C++ codebase you're working on should have had better foresight

To quote Theodore Roosevelt, "A man's hindsight is as good as his foresight, and if he doesn't use it, it's a darned sight!" I'm saying that the Go maintainers should take a look at the historical pitfalls of other languages, like C++, and try to use that information to chart a smoother path. From listening to Rob Pike talk about his motivations, I think that's exactly what happened with Go.


You're being very nonspecific again, but I'll assume that your specific criticism is that C++ should have standardized STL collections sooner. I agree, but it doesn't imply that it should have baked them directly into the language syntax!

I can't think of a single language that had problems with incompatible core collections due to having them be in the library as opposed to directly in the language. The closest thing I can think of is Haskell's explosion of String types, but one of the lessons to take from that is precisely the opposite of what Go chose—it's pernicious partially because the suboptimal String is baked into the language, and the overloaded strings extension is necessary to correct this!


> I'll assume that your specific criticism is that C++ should have standardized STL collections sooner.

You assumed wrong.

> I can't think of a single language that had problems with incompatible core collections due to having them be in the library

So? Who said that? Not me! And actually, Smalltalk did have some collection incompatibilities across vendors. (#at:put: returning the inserted value in one implementation while it returned the collection in others.) But the implementations of those weren't completely in the library. (You could treat them as such, however.)


The privileging of Go's stdlib is a core design choice. The idea is that your implementation code is simple, and the language takes care of library code in whatever way is best. If you don't like it, use another language.


I understand the desire to avoid over-architected designs, but relying on a single party (who is not associated with your product or business need) to bless every abstraction you can use in your code base seems like a recipe for pain and awkwardness. It definitely limits my desire to make significant investments in Go.



Right, so now I'm forking it and maintaining my own language to add a feature I just get for free in any number of reasonable competitors.


Rust is ready, free, and open. Use it and stop complaining about Go.


Rust has manual memory management. Most applications where you'd use Go neither need nor want the added complexity of managing memory, ownership, the learning curve, etc. It's not really a reasonable exchange.


In my case it's Python; Rust has a bit more of a learning curve / productivity hit. I'm still complaining, though, because I like what Go is doing with concurrency and simplicity, but I think it would be almost perfect if it just trusted the user a bit more.


You say you like the simplicity, yet you are complaining about it.


Besides the fact that I think it's less simple to have ad-hoc compiler extensions for the standard library types that can't be replicated in userspace (so the alternatives are code generation and type erasure), it doesn't have to be an absolute principle! We can have generics without sliding down some slippery slope of abstraction! You don't allow some kind of generics one day and wake up the next with custom operators and monads. The buck really can stop at this one pain point.


Okay, that's nice, but the Go developers already picked their side in the debate many years ago. If you don't like it, use one of the many other tools available.


> I think it would be almost perfect if it just trusted the user a bit more.

If it trusted the user just a bit more, it would be making many of the same mistakes other languages make, which Go is trying to avoid. You're free to go and use some of those other languages.


I don't understand this attitude. I do use other languages, and I can tell you type polymorphism is not a mistake. Complicated architectures full of abstraction are a mistake, which you can limit without sacrificing the ability for me to make a type-safe generic function.


> I can tell you type polymorphism is not a mistake.

It's a particular trade-off. Some opine that it's not the right trade off. You disagree.

> Complicated architectures full of abstraction are a mistake, which you can limit without sacrificing the ability for me to make a type-safe generic function.

From what I've seen, the other mechanisms for limiting over-exuberant abstraction don't work well enough over the long term, at scale.


> From what I've seen, the other mechanisms for limiting over-exuberant abstraction don't work well enough over the long term, at scale.

Can you name a specific example of people using ML-style generics (i.e. no typeclasses, no module system) to achieve "over-exuberant abstraction"?


Nope. You got me there. I'm not so sure that would be cheap to implement in Golang, though. Also, your comment prompted me to find and read this:

http://people.cs.uchicago.edu/~jacobm/pubs/templates.html


> If it trusted the user just a bit more, it would be making many of the same mistakes other languages make, which Go is trying to avoid.

Can you explain with an example that would apply to Go how introducing generics to a language was making a mistake? Be specific.


> Can you explain with an example that would apply to Go how introducing generics to a language was making a mistake? Be specific.

These things are epiphenomenal, and have to do with what happens in large codebases over a long time, with lots of programmers. It's a fallacy to suppose that neat StackOverflow-sized examples are some kind of evidence gold standard. The problems I've encountered with C++ templates have to do with the interaction of several things at once, in places I'd have to dig out of version control, in codebases I can't share. So no, I'm not signing up to do that work for you for free.


I'm not asking for examples of bad interactions between C++ templates and other features of C++--there are tons of those. I'm asking for examples of languages in which implementing simple generics was a bad idea in retrospect.

(I don't think there are any such examples, because simple ML-style generics yield a lot of power for negligible drawback, and I hope that future versions of Go add them.)


It won't happen. There was a presentation some time ago saying that the language design was done and that the focus was now on improving the runtime and tooling.

Hence why I decided to stop arguing about Go's lack of generics and instead advocate it for those who are searching for a C + GC with improved type safety.

For the rest of us there are better options.


So you want other people to maintain something for you and do it the way you like? Obviously, it's not simple to just fork a project like Go and maintain it but the point is that it is not legally impossible. A group of people inclined enough to "fix" Go for themselves are completely free to do so.

BTW, you might be interested in https://oden-lang.org/


No, I want the language to have a standard way to extend it that's compatible with the core. Generics are a standard way to do type polymorphism without having to ship people another compiler.


Well, the Go developers and most of its real users (not bloggers who dabble) don't want to go this route. So seek your solution elsewhere.


> So seek your solution elsewhere.

Yeah, how dare he have an opinion that runs counter to the known truth. He has attacked the Body. He is not one with Landru.


Well, that's how C++ started. I'm honestly surprised nobody's forked Go or made a less obnoxiously opinionated front-end.


C++ implementations were commercial products. Same with C before it, and many (possibly most) languages used in industry. Because platforms were less uniform, and even where they were, porting still took much more effort than it does today.

It's really hard to justify making a new go, python, lua, etc. implementation when the primary ones are stable and support a large number of platforms.

And if you're adding language features by altering syntax/semantics, you might as well produce an entirely new language rather than deal with the headaches of being partially compatible and having to track the original over time.


I think you overestimate the number of people who care that much about Go. It's not very widely used.

It would also be a massive effort to add generics to it at this point, because of the design choices the team made early on.


Not sure what counts as "widely used", but it's now in the top 20 on TIOBE: http://www.tiobe.com/tiobe-index/


"It's not very widely used."

Um. Well ok, maybe but not likely, as pointed out by a sibling comment, but it is used by a large fraction of companies that deal with services on an enormous scale - Google (obvi), Dropbox, Cloudflare... here, better than copy/paste: https://github.com/golang/go/wiki/GoUsers

You're going to recognize an awful lot of those companies.

Edit: not sure if it counts, but the number of companies using software written in Go in mission-critical ops is enormous (see: Docker).


> It would also be a massive effort to add generics to it at this point

Not really. The easiest approach would take like a day for PoC.


What approach would that be?


That would be the meta/templating approach.


Oden: experimental, statically-typed functional language, built for Go ecosystem

https://news.ycombinator.com/item?id=11183836


Thanks for the link, this looks pretty cool. Like Scala was on the JVM, this could be a sort of escape hatch for some people on the team.


I promise this isn't a troll question, but isn't the opinionatedness (which is a matter of taste) one of Go's biggest features? Once you remove that, what do you get over Elixir or Erlang, on a technical level? Static linking and no VM?


I think the opinionatedness doesn't have to stop; the opinions can just change in the face of evidence that a particular abstraction is more useful than the architectural hazards it presents to developers.


Can you quantify this, then?


That's a good question.

Maybe I could do some kind of poll of people hitting this or other rough snags that would be solved by generics or something similar? But I am afraid that Go fans would feel (rightfully so) that the poll could be overrun by people who have only casually dabbled in Go. Same with counting the many complaints every time this topic is brought up -- the general response seems to be that they don't want that kind of programmer here anyway, so go away.


Go++?


C++ started as a tool that generates C files that can be compiled by a standard C compiler, not as a fork of anything.


Something we're slowly coming to terms with is the consequence of having open source without open governance.

It's not clear to me that the language would be better if the project stewards accepted more feature requests, though. It is incredibly difficult to hold the line against feeping creatures and a noisy minority when the benefit of simplicity is diffuse.


forks are free, stop waiting for other people to do your work for you


I think they should at least implement a few builtin functions (that can be "real" generics) for common stuff that manipulate slices and/or maps.

I'm thinking things like min/max, which are really distracting and are concepts that are more easily expressed as a concise function rather than an explicit loop.


Or use code generation via go generate.

Go is a great C replacement for any use case where using a GC is an affordable option.

Other than that, there are lots of other languages with AOT compilation to native code and better abstractions.


If I can use GC, why would I pick go over any of the other compiles-to-native languages with vastly friendlier type systems and language features? Rust, D, Haskell, etc. all offer similar performance profiles and are much nicer for the programmer.


Me too, however I do see Go as having a sweet spot for those that don't want to move beyond C, but can accept having a GC around and more type safety.

For everyone else, there are better options.


Most likely you wouldn't but many seems to prefer it over Rust/D/Haskell.


"vastly friendlier type systems"? The type systems of all of these languages are more "bondage & discipline" than Go's. Stricter, more complex.


If they're stricter, it's only because they rule out more kinds of incorrect behavior. Go's type system is actually strict in a useless way, because it doesn't allow you to write correct parametric code (for example, code for sorting generic slices).
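A sketch of that complaint (the `contains` helper is invented for illustration): the interface{} workaround compiles for any slice, but the compiler can no longer catch a mismatched element type.

```go
package main

import "fmt"

// contains works on any slice of interface{} values, but only by erasing
// the element type; a mismatched needle is no longer a compile error.
func contains(haystack []interface{}, needle interface{}) bool {
	for _, v := range haystack {
		if v == needle {
			return true
		}
	}
	return false
}

func main() {
	xs := []interface{}{1, 2, 3}
	fmt.Println(contains(xs, 2))   // true
	fmt.Println(contains(xs, "2")) // false: the type mismatch slips through silently
}
```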


> If they're stricter, it's only because they rule out more kinds of incorrect behavior.

Sounds pretty bondage & discipline to me.


Well then bust out the gimp suit, because I want my program to run correctly.


I think that could also improve morale in your office. Might be worth your while giving it a try!


> Stricter,

That's exactly what a helpful language does:

(0) Define a region in the design space that contains the program you want.

(1) Conveniently tell you when you have accidentally stepped outside of this region.

> more complex.

How are you measuring this? I know of two good measures, and neither favors Go:

(0) The size of a formal semantics. Go's particular feature set suggests looking at Featherweight Java (a subset of pre-generics Java specifically designed to be amenable to formalization), and, well, FJ's ratio of static guarantees to language size is very low compared to most typed lambda calculi (on which ML and Haskell are based).

(1) The number of special cases in the language's design. Here Go fails miserably, due to the sheer number of built-in types and functions that require hardcoded support.


Can you give an example of what exactly generate does? All I can tell is that it lets you call other programs like yacc to preprocess some files. So is it just a replacement for make?


It's basically the storing of a verbatim one-line command as a line comment in a .go source file rather than its traditional home of a line in a shell script or makefile.

Imagine you found it intolerable to have an interpreter-dependent -- not even /bin/sh -- source file in your package directory (said another way, anything but .go source files in the package directory), but at the same time needed to somehow sneak in the functionality of a rudimentary shell script. The answer? Weave it into a .go source file and the Go toolchain will be able to pick up on it and execute it (on demand, never automatically).
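A minimal sketch of what that looks like in practice (Pill is the stock example from the stringer docs; stringer is a real tool from golang.org/x/tools):

```go
package main

import "fmt"

// The directive below is just a comment storing a verbatim command.
// `go generate` finds it and runs it on demand; `go build` never does.
// Running it would emit pill_string.go with a String() method for Pill.

//go:generate stringer -type=Pill

type Pill int

const (
	Placebo Pill = iota
	Aspirin
	Ibuprofen
)

func main() {
	fmt.Println(int(Aspirin)) // 1
}
```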


Yes, that's the approach Go takes.


It seems like due to lack of generics I see more and more interface{} and reflection stuff in code bases. This is getting ridiculous.

I've written a lot of Go. I tried my best to stick to what the language provides, but at some point you say "screw it" and start fighting with the language. That is never a good sign.


Why hasn't it happened to me? I've been writing Go for over 3 years, and I prefer the language to stay as it is. I'm more concerned about porting it to more platforms. Perhaps that's a reason I'm motivated to desire fewer language changes.


I'm curious, what platform do you want Go to support that it doesn't already? Platform support seems pretty good already, at least for my needs.


A really big and popular platform where Go isn't officially supported yet is the web (browser).

I'm trying to push in that direction as much as I can by contributing to GopherJS [0], which is IMO our best bet until wasm [1] comes out and adds support for GC languages [2].

I agree the rest of the platform support is absolutely fantastic. We have the big 3 desktop OSes, as well as iOS and Android already. But so much of what we do today is on the web, and I don't want to keep using JavaScript or to keep crossing the Go <-> JS boundary inconsistencies. That's not the kind of future I want to invest in.

[0] https://github.com/gopherjs/gopherjs#readme

[1] https://webassembly.github.io/

[2] https://github.com/WebAssembly/design/blob/master/FutureFeat...


>Why hasn't it happened to me?

Well, I don't know about your case, but I've seen e.g. people who are religious about being DRY and dislike any kind of useless boilerplate, and others who copy-paste code with wild abandon and couldn't care less.


I used to care about DRY to a _very_ great degree [0], and I still do. But I've come to realize DRY is much better to apply for high level _concepts_, ideas, and architectures, not unexported helper code.

It's unhelpful to try to apply it to little snippets of code and helpers. It's much less expensive to copy a snippet that's needed and duplicate it 1-2 times before starting to worry about factoring it out. By the time it's repeated 3+ times, you'll have a much better idea how to structure it.

[0] https://github.com/shurcooL/Conception#motivation


Also, I'm generally fine with language tradeoffs. If there is a good reason that I need to copy-paste something, so be it. I'll do it.

The Go-and-generics issue, on the other hand, feels a little different to me. It is a solved problem. Go does not have an inherent showstopper for generics. It's just not there because of... what? Stubbornness? I'm not sure. But then it irritates me to copy-paste stuff.


> It is a solved problem.

This is debatable. Is it a problem that's solved well? Is it solved in the best way possible and there could never be a better way to do generics?

I think not. I think it's solved adequately, but not great. Compared to the other facilities that Go offers, generics are typically pretty messy and add significant complexity to the language, tools, parsing, reasoning, compilation times, etc.

> It's just not there because of.. What? Stubbornness? I'm not sure. But then it irritates me to copy paste stuff.

This statement is ill-informed. Have you seen https://github.com/golang/go/issues/15292 and https://github.com/golang/proposal/blob/master/design/15292-...? Have you considered all the ways in which it will affect the language and the trade-offs?

Once generics are added to Go, they will forever be there, together with all the disadvantages and missed opportunities, and we'll have to live with them. That's not a thing to be taken lightly.

In comparison, maybe copying a little here and there isn't all that bad.


Yep, I wrote this:

https://github.com/Mparaiso/lodash-go

which is in theory "runtime type safe" in the sense that it yields an error if types do not match, but it uses reflection, which leads to a huge performance hit.

The irony is that Go is a perfectly capable functional language when one opts out of Go's type system. (Don't use that package, though; this is not idiomatic Go.)



Yes, if you look at the source code I use unification to tell whether types match or not.


You're doing a lot of repeated work, though. I was trying to point out this function, which does unification in a generic context, given some function type: https://godoc.org/github.com/BurntSushi/ty#Check ... For example, it reduces a lot of the reflection boilerplate: https://github.com/BurntSushi/ty/blob/master/fun/list.go#L80


interface{} is the new void*.

There are far better ways to handle generics in modern languages, which preserve type safety and aren't very cumbersome.


> interface{} is the new void*.

More like Java's Object, especially since downcasts are mandatorily runtime-checked. And the results are exactly what one could have expected from the first 10 years of Java (if slightly subdued, as Java only had a single magical blessed built-in type), given the language is in more or less the same class.


More like C#'s "object" than Java's "Object".


> interface{} is the new void*

There is an important difference between interface{} and void* which makes it a lot less reckless. AFAIK interface{} contains not only a pointer to the data but also a reference to the type of the data. Therefore you can cast that data safely back to the original type (or use a type switch): if origData, ok := data.(OriginalType); ok { /* safe to use origData here */ }
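A runnable version of that pattern (the `describe` helper is invented for illustration):

```go
package main

import "fmt"

// describe shows both forms of the runtime check: a single type
// assertion with the comma-ok idiom, and a type switch.
func describe(data interface{}) string {
	if n, ok := data.(int); ok { // safe: no panic on mismatch
		return fmt.Sprintf("int: %d", n)
	}
	switch v := data.(type) {
	case string:
		return "string: " + v
	default:
		return "unknown"
	}
}

func main() {
	fmt.Println(describe(42))   // int: 42
	fmt.Println(describe("hi")) // string: hi
	fmt.Println(describe(3.5))  // unknown
}
```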


Another approach to this (used to good effect in LINQ) is to skip the `Less(i,j)` and just provide a value extractor, except Go makes that super verbose because you'd have to write `func(v interface{}) interface{}`.

The idea would be if you have `type foo struct { ID int, Rank int }` you could provide `func(v interface{}) interface{} { return v.(foo).Rank }` and it would sort by rank.

Some level of late binding / genericity would be necessary to make this approach not as verbose.


> Another approach to this (this is used to good effect in LINQ)

Also in Python, it's called "key functions". Of course since Go doesn't have tuples and/or user-defined ordering, it either only works for simple comparisons (e.g. by rank but not by rank and name) or you have to hand-roll some weird-ass coercion into whatever key type the language uses.

> you'd have to do `func(v interface{}) interface{}`

And that doesn't actually work, interface{} is not orderable. Only integers, floats and strings are "naturally" orderable.
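For what it's worth, the closure-based sort.Slice that eventually landed in Go 1.8 spells out the "tuple key" by hand, since there are no tuples to compare (person is an invented example type):

```go
package main

import (
	"fmt"
	"sort"
)

type person struct {
	Rank int
	Name string
}

// byRankThenName sorts by rank first, then by name, writing out the
// composite comparison manually in the Less closure.
func byRankThenName(ps []person) {
	sort.Slice(ps, func(i, j int) bool {
		if ps[i].Rank != ps[j].Rank {
			return ps[i].Rank < ps[j].Rank
		}
		return ps[i].Name < ps[j].Name
	})
}

func main() {
	ps := []person{{2, "bob"}, {1, "carol"}, {1, "alice"}}
	byRankThenName(ps)
	fmt.Println(ps) // [{1 alice} {1 carol} {2 bob}]
}
```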


Also time.Time (which could be thought of as two int64s).

Even with string ordering, a lot is left up to the implementation as far as how it is done. Do capitals come first? Does capitalization not matter? Etc.

For non-intrinsic types you would provide the equivalent `less` function, but in my own programming the bulk of the things I order by are intrinsic.


I may be an outlier here, but I'm not bothered at all by the way sorting is currently done. It's good enough. The "tedious" type names are mostly unexported anyway, and Len and Swap methods are just a bit of copy-paste.

I don't really think we should tweak it, it's not broken. A code-generation tool would be nice instead.


>I may be an outlier here, but I'm not bothered at all by the way sorting is currently done. It's good enough.

I fear that's also the motto for a lot of Go's design choices.


Don't fear. Be assured!


That interface{} parameter makes me cringe, but it makes sense.


Go needs functions that support parametric types for collections. I'm not even talking about generics here as user-defined types. Let's forget about generics, or "the ability for developers to implement their own type-safe containers".

Let's talk about the fact that functions in Go could support type parameters in signatures, something like:

   func Map<V,W>(slice []V, f func(element V) W) []W {
      // ...
   }

What we have here is a guarantee that, at compile time, this code is safe. We didn't introduce generics; slice is a Go slice of V and the result is a Go slice of W.

The code should then be used this way :

    result := Map<string,string>([]string{"a","b"},
        func(e string) string { return e + "foo" })

This is completely type safe: no reflection is used, the compiler knows all the types at compile time, and no generic type was introduced. This is a good trade-off and merely syntactic sugar that would enable developers to get rid of the interface{} parameter + type assertions.


This is nice, but generic functions are still generics.


Go already has generic functions: append, copy, ... that are type safe at compile time. How do you think they are implemented in the source code of Go's compiler?

Just like I said: the compiler finds the "append" token and looks up whether both the arguments and the return type match -- not at run time, at compile time.
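The point can be checked directly; append and copy are checked against the slice's element type at compile time (a small sketch):

```go
package main

import "fmt"

func main() {
	s := []int{1, 2}
	s = append(s, 3) // fine: 3 matches []int's element type

	// The next line would be rejected at compile time, not at run time:
	// s = append(s, "x") // cannot use "x" (untyped string constant) as int value

	c := make([]int, len(s))
	n := copy(c, s) // copy is checked the same way: both sides must share an element type
	fmt.Println(n, c) // 3 [1 2 3]
}
```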


No disagreement. I was just pointing out that what you were asking for is in fact user-defined generics.


In fact, if I understand spriggan3's proposal correctly, that's how Java's generics work. The approach is usually called type-erasure. (https://docs.oracle.com/javase/tutorial/java/generics/erasur...)


> naming a new type (e.g. "widgetsByName") is tedious to many

More than tedium, it's a bit sinister because the convention of creating 'parallel' slice types with names like that blurs the line between what's a noun and what's a verb. With a type name like that, one would expect a value of that type to always be sorted, right? Nope! A "widgetsByName" is just a flower in the breeze hoping that the sort.Sort bumble-bee makes contact.


Yes.

Brad is correct, the current approach is tedious.


<off-topic>Since when did Go use GitHub for official proposals? Does it also use the GitHub repo for official development or is it still just a mirror?</off-topic>


The new proposal process is documented here, and goes through opening a GitHub issue: https://github.com/golang/proposal

Go "partly" uses GitHub. The git repo is a mirror, and code review / pull requests is done through Gerrit, but the issue repo is the official one.



