GHC Medium-Term Priorities

I’m very much with @Bodigrim here. If I were hypothetically on the steering committee, I would not care much about any of the topics discussed. For every proposal, I would ask only: Does this break existing code? Does it stop GHC accepting code it accepted before? If the answer is yes, I would demand that the change be gated behind a language flag, never enabled by default.

Applying to the steering committee as someone with nearly no interest in new language features in GHC seems … wrong? Most of the time, reading a proposal would be a chore I would not enjoy at all; I would likely not understand it, let alone its implications. And again, I wouldn’t really care about any proposal beyond “does it make the compiler stop accepting code it accepted before?”, and if so, demanding that it be put behind a language flag.

So the Steering Committee might need to consider that there is a strong selection bias in who applies.

6 Likes

So I guess I should just apply to make sure my voice is heard?

However, the announcement explicitly tells me I should not apply, as I most likely am not able

  • to understand such language extension proposals

And I also do not

  • have expertise in language design and implementation, in either Haskell or related languages, which they can share with us.

And as I mentioned before my contribution to each and every proposal would be identical: does it break existing code? If so, it must be behind a language pragma.

@int-index would you suggest that I apply?

2 Likes

It takes effort to understand proposals that are outside of your area of expertise. For example, it took me a while to figure out what was going on in the proposal about exception backtraces. But don’t sell yourself short (I’ve seen your GHC contributions). Besides, the onus of writing a well-motivated proposal with a clear specification lies with the proposal author.

That is only one of the possible qualifying properties; the announcement lists others. It’s ||, not &&.

Apparently, this is exactly what had to be said to prevent the situation with simplified subsumption. Now it’s guarded behind NoDeepSubsumption. So, yes, it would have been valuable if you had said it and insisted on it.
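To make that concrete, here is a minimal sketch (my own example, not from this thread) of the kind of program that simplified subsumption stopped accepting; the definitions are made up, but the DeepSubsumption extension, reintroduced around GHC 9.2.4 / 9.4.1, restores the old behaviour:

```haskell
{-# LANGUAGE RankNTypes #-}
{-# LANGUAGE DeepSubsumption #-} -- opt back in to the GHC 8.10 behaviour

module Example where

f :: forall a b. a -> b -> b
f _ y = y

g :: (forall p. p -> forall q. q -> q) -> Int
g h = h () (42 :: Int)

-- Accepted by GHC 8.10, rejected under simplified subsumption (GHC 9.0+)
-- unless eta-expanded to (\x -> f x) or DeepSumption's flag is enabled.
x :: Int
x = g f
```

This is exactly the shape of breakage the “put it behind a language flag” rule is about: the default changed, and the flag lets existing code keep compiling.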

It is your decision. But you would do the Haskell community a service if you did.

8 Likes

Well then. I have applied.

15 Likes

I think that some clarification is perhaps in order. The document contains the following:

Many commenters have rightly focused on the fact that reducing the amount of work required for a release is important (e.g. by automating as much as possible). However, I think that this text did not communicate an important aspect of the work necessary for each release, which is that backports of bug fixes are time-consuming and difficult. Right now, there are four branches: the main development branch, the 9.6 pre-release branch, the 9.4 maintenance branch, and the 9.2 maintenance branch. Keeping track of which bugs apply to which branch, and ensuring that fixes are correctly backported even when the internals have changed a fair bit, is difficult and expensive, and that doesn’t seem to be easily solvable by release automation.

Of course, none of these caveats apply to pre-releases and nightlies - those would be much easier to do more frequently and more prominently, and together with head.hackage, they can also serve as a basis for evaluating language proposals against large non-open-source codebases. But making that feasible for most people will certainly require some tool development and UX improvements.

5 Likes

Indeed, I think nightlies will let us cover a lot of the requirements from folks who desire more frequent releases.

I should have a discussion with GHC developers about this topic and figure out if we can flesh out a proposal. Maybe not in the short term, but sometime not too far away.

Maybe that will make some of the discussed ideas about changing the release cycle obsolete and allow us to satisfy both parties.

3 Likes

Perhaps a silly suggestion, but would it be more tractable to put less energy into backporting and more energy into automating upgrades with retrie?

In a fantasy world, someone on 9.0 who wants a fix that landed in GHC 9.4 would be able to seamlessly upgrade to 9.4.

That is perhaps an even more difficult problem, and maybe not possible at all, but there is room for adjusting the levers.

As an example without too much thought behind it: reducing how many versions receive backports in exchange for ensuring upgrades between major versions are 90% automated.
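To sketch what such automation might look like: retrie applies equational rewrite rules across a whole codebase. The rule below is a hypothetical migration (a ‘return’-to-‘pure’ rewrite of my own invention, not an actual GHC upgrade step), just to show the mechanism:

```haskell
-- Suppose a new release deprecated 'return' in favour of 'pure'.
-- A single ad-hoc retrie rule, run from the project root,
--
--     retrie --adhoc "forall x. return x = pure x"
--
-- would mechanically rewrite every matching call site in place.

example :: IO Int
example = do
  let x = 42 :: Int
  return x -- retrie would rewrite this to: pure x

main :: IO ()
main = example >>= print
```

The hard part, as a reply below points out, is that real upgrades are rarely pure syntactic renames.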

However, my memory of the stability working group’s goals (if it’s still a thing) tells me this wouldn’t be a very popular idea.

1 Like

…you mean something like:

Also, since this is a fantasy: the idea of “seamlessly upgrading” GHC tends to suggest that its documentation would be upgraded in similar fashion - maybe one can adapt the ol’ Commentator for use with Glasgow Haskell…

I don’t think so. The fact that GHC and base are tightly coupled right now, combined with the fact that some libraries couple compatibility with new versions of base to their own major releases, means that automating upgrades is probably infeasible in general unless those underlying conditions change. And code compatibility is not everything - a performance regression in a new compiler, or support for a platform that exists in an older version but was dropped in a new version, can make an older branch attractive for a while. Bugfixes on those branches will continue to be useful, I suspect.

It’s definitely still a thing! We’re interested in any technique that has promise for making maintenance of code easier over time.

You are exactly right… In fact I’ve personally been bitten by this in the past. It just wasn’t top of mind when I wrote the comment :slight_smile:

  • It’s definitely still a thing! We’re interested in any technique that has promise for making maintenance of code easier over time.

Great to hear!

There’s a real danger in underestimating how serious regular breaking changes between minor versions are.

LLVM was used as an example elsewhere, but it’s a flawed comparison. For a start, it’s not a programming language or a compiler, but most importantly, it is not a direct dependency of language ecosystems. The associated languages are relatively stable, to the point that I can run 20-year-old Java code and C libraries without modification.

I don’t know enough about the conflicts between stability and progress or bug fixes, but I’m struggling to see how it’s good to aim for a volatile spec/standard/compiler behaviour.

I can understand occasional breaking changes to fix fundamental flaws or bugs. Still, if they are regular occurrences, then that suggests that far too many experimental features were treated as stabilized features.

I suspect this might be easier to resolve if it’s valued enough.

As a newcomer, this outlook is alarming as I just can’t see how this could be even remotely viable for a production language.

I’d like to think I could write a library today and not need to update it every year to support the latest changes.

Or maybe I’ve misunderstood something (I am new after all).

7 Likes

Really good overview, thanks. Would it be possible to pin such a summary in an easy-to-find place, either in this forum or on the Haskell.org website? Maybe even include long-term goals and an overview of what is being worked on?

2 Likes