GHC Medium-Term Priorities

Well, I can at least say that I did not suggest this. I think I’ve outlined many times that improving release quality consists of:

  • better release coordination
  • better communication with stakeholders
  • sourcing opinions and providing pre-releases that can be widely tested
  • improving the bus factor of GHC/tooling development

GHC developers in fact told me that slowing down releases would be more work for them.

What I wished for is higher-quality releases and fewer of them. There may be arguments for why fewer releases could improve release quality, but that also depends on many factors.

The main reason some of us wish for fewer releases is not so much the quality, but the toll they take on tooling. The end user can simply ignore a couple of broken point releases (there are quite a few in the 8.10 series).

From the outside, these things (tooling) may not look like a lot of work, or it may seem that all of it can easily be automated in some way. But that is not the case.

So, I guess we agree we want to improve release quality.

There are ways to solve this: Nightlies. These will have only rudimentary tooling support (no curated bindists, no prebuilt HLS binaries, and so on).

In an alternative fantasy world… this would have been communicated to the community earlier, because GHC developers knew it was a (non-trivial) breaking change. Feedback from a couple of key people would have been enough to get a “no, don’t”. It did not have to be exposed to the entire “world” to figure out it wasn’t a good idea.

This is what I’ve tried to describe earlier in this thread: community management, involvement of relevant stakeholders (during development and release), managing expectations, etc.

All of this takes time and effort of course, but that’s the cost you pay for improving release quality.

I’m worried there may even be less communication and more isolation with a higher release frequency, because you’ll get feedback anyway post-release and can just go and revert.

Do we really need to expose the code to ALL users to get feedback? I don’t think so.

Nightlies are a great way to solve this balancing act of differing requirements.

So what’s the main problem with Nightlies? I guess it’s the fact that old code stops compiling with newer compilers all the time, so people can’t reasonably test them on a real-world project.
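As a concrete historical example of the kind of breakage meant here (the code is my own minimal sketch, but the underlying change is real): base 4.11, which shipped with GHC 8.4, made Semigroup a superclass of Monoid, so instances written for earlier compilers stopped building until new code was added:

```haskell
-- Compiled fine with GHC 8.2 and earlier; rejected by GHC 8.4+,
-- where Semigroup became a superclass of Monoid.
newtype Log = Log [String]

instance Monoid Log where
  mempty = Log []
  mappend (Log a) (Log b) = Log (a ++ b)

-- The fix required adding an instance that older code never needed:
-- instance Semigroup Log where
--   Log a <> Log b = Log (a ++ b)
```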

So we’re back to the stability and backwards compatibility discussion.

2 Likes

FWIW I was not referring to you.

So, I guess we agree we want to improve release quality.

Yes, I think so.

Nightlies are a great way to solve this balancing act of differing requirements.

Yes, I think so too, as long as there’s a way to consume them that is not significantly harder than actual releases; otherwise very few people will do it.

Anyway, I take your point of view seriously. You are an expert in distribution and release management, and I am not.

1 Like

That is the idea behind the GHC Steering Committee, whose members are meant to represent various interests: education, research, industry. And simplified subsumption was accepted (#287), so apparently the key people you are referring to did not, for one reason or another, nominate themselves to join the committee, where their voices would have been heard.

This mistake can be corrected. The committee seeks new members regularly.

3 Likes

Thanks @tomjaguarpaw and @hasufell for your detailed thoughts as always!

On the second point:

  • Currently, are GHC releases held back at the release-candidate stage until they have tooling support (for, say, HLS), notwithstanding issues like 9.2.5? Is this something that could be done?

In my view, likely a minority view [edit: maybe not], a working HLS is a prerequisite for a functioning ecosystem to be built around a version of GHC. I agree that it will often (always?) require code changes to HLS to support updated GHCs. However, if that can be done in the space of days, it seems wise from an ecosystem perspective to leave GHC at RC stage (or maybe with some “bleeding-edge” label on it) for a few days until those code changes have been made.

6 Likes

FWIW, I share this view.

4 Likes

That’s also my opinion.

I find this statement a bit sly: there are plenty of people in the Haskell community (me included) who have no interest in programming language design, dependent types or type systems in general, or endless syntactic-sugar debates. Such people would not find themselves comfortable within the GHC Steering Committee and will necessarily be underrepresented. The public outcry over simplified subsumption was a clear indicator of how detached the GHC Steering Committee is and how little involvement the general public has in its proceedings. And this is not a fault of the public, because the community is never wrong.

Upd.: Detachment from users is an expected property of a committee (e.g., I’m not saying that the CLC is any better). This is unlikely to be something that can be truly fixed; it is just something to bear in mind and acknowledge, not dismiss under the pretense that anyone can be elected.

9 Likes

It is an invitation, not a dismissal. The committee is running a call for nominations as we speak (until February 11th), see the announcement. EDIT: I got confused about the date, that was in 2022.

The community is not uniform. Some people are fine with breaking changes, some are not. The public outcry comes from the latter group, and I am trying to offer a solution. My apologies that the solution isn’t all rainbows and unicorns. I, for one, am perfectly fine with breaking changes such as simplified subsumption, but people who aren’t need to speak up during the decision-making process, not after.

1 Like

I’m very much with @Bodigrim here. If I were, hypothetically, on the steering committee, I would not care much about any of the topics discussed. For every proposal I would ask only: Does this break existing code? Does it stop GHC accepting code it accepted before? If the answer is yes, I would demand that it be put behind a language flag, never enabled by default.

Applying to the steering committee as someone who has nearly no interest in new language features in GHC seems … wrong? Most of the time, trying to read a proposal would be a chore I would not enjoy at all, and I would likely not fully understand it or its implications. And again, I wouldn’t really care about the proposal beyond “does it make the compiler stop accepting code it accepted before?”, and if so, demanding it be put behind a language flag.

So the Steering Committee might need to consider that there is a strong selection bias.

6 Likes

So I guess I should just apply to make sure my voice is heard?

However, the announcement explicitly tells me I should not apply, as I most likely wouldn’t be able to:

  • understand such language extension proposals

And I also do not

  • have expertise in language design and implementation, in either Haskell or related languages, which they can share with us.

And as I mentioned before my contribution to each and every proposal would be identical: does it break existing code? If so, it must be behind a language pragma.

@int-index would you suggest that I apply?

2 Likes

It takes effort to understand proposals that are outside of your area of expertise. For example, it took me a while to figure out what was going on in the proposal about exception backtraces. But don’t sell yourself short (I’ve seen your GHC contributions). Besides, the onus of writing a well-motivated proposal with a clear specification lies with the proposal author.

That is one of the possible qualifying properties; the announcement lists others. It’s ||, not &&.

Apparently, this is exactly what had to be said to prevent the situation with simplified subsumption. The old behaviour has since been restored behind the DeepSubsumption extension. So, yes, it would have been valuable if you had said it and insisted on it.
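For concreteness, here is a minimal sketch (the names are mine) of the shape of program that simplified subsumption rejected, together with the eta-expansion workaround:

```haskell
{-# LANGUAGE RankNTypes #-}
-- On GHC >= 9.2.4, adding {-# LANGUAGE DeepSubsumption #-} restores
-- the old behaviour and makes 'broken' compile again.

-- Expects an argument whose foralls are nested to the right:
takesPoly :: (forall a. a -> forall b. b -> b) -> Int
takesPoly f = f () 42

-- Equivalent only up to deep subsumption; both variables quantified up front:
constId :: forall a b. a -> b -> b
constId _ y = y

broken :: Int
broken = takesPoly constId               -- accepted by GHC 8.10, rejected by 9.0/9.2

fixed :: Int
fixed = takesPoly (\x y -> constId x y)  -- eta-expansion makes it typecheck again
```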

It is your decision. But you would do the Haskell community a service if you did.

8 Likes

Well then. I have applied.

15 Likes

I think that some clarification is perhaps in order. The document contains the following:

Many commenters have rightly focused on the fact that reducing the amount of work required for a release is important (e.g. by automating as much as possible). However, I think that this text did not communicate an important aspect of the work necessary for each release, which is that backports of bug fixes are time-consuming and difficult. Right now, there are four branches: the main development branch, the 9.6 pre-release branch, the 9.4 maintenance branch, and the 9.2 maintenance branch. Keeping track of which bugs apply to which, and ensuring that fixes are correctly backported even when the internals have changed a fair bit, is difficult and expensive, and that doesn’t seem to be easily solvable by release automation.

Of course, none of these caveats apply to pre-releases and nightlies - those would be much easier to do more frequently and more prominently, and together with head.hackage, they can also serve as a basis for evaluating language proposals against large non-open-source codebases. But making that feasible for most people will certainly require some tool development and UX improvements.

5 Likes

I think we will be able to cover a lot of the requirements from folks who desire more frequent releases via nightlies, indeed.

I should have a discussion with GHC developers about this topic and figure out whether we can flesh out a proposal. Maybe not in the short term, but sometime not too far away.

Maybe that will make some of the discussed ideas about changing the release cycle obsolete and allow us to satisfy both parties.

3 Likes

Perhaps a silly suggestion, but would it be more tractable to put less energy into backporting and more energy into automating upgrades with retrie?

In a fantasy world, someone on 9.0 who wants a fix that landed in GHC 9.4 would be able to seamlessly upgrade to 9.4.

That is perhaps an even more difficult problem, and maybe not even possible, but there is room for adjusting the levers.

As an example without too much thought behind it: reducing how many versions receive backports in exchange for ensuring upgrades between major versions are 90% automated.
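For a sense of what that automation could look like, retrie can be driven as a library. The sketch below is adapted from my memory of retrie’s scripting interface, so the exact API may differ, and fooOld/fooNew are hypothetical names standing in for a renamed function:

```haskell
module Main where

import Retrie

-- Rewrite every call site of a renamed function across a codebase.
main :: IO ()
main = runScript $ \opts -> do
  [rewrite] <- parseRewrites opts [Adhoc "forall x. fooOld x = fooNew x"]
  return (apply [rewrite])
```

Whether GHC-to-GHC upgrades can be expressed as such equational rewrites is, of course, the hard part.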

However, my memory of the stability working group’s goals (if it’s still a thing) tells me this wouldn’t be a very popular idea.

1 Like

…you mean something like:

Also, and since this is a fantasy, to “seamlessly upgrade” GHC tends to suggest that its documentation will be upgraded in similar fashion - maybe one can adapt the ol’ Commentator for use with Glasgow Haskell…

I don’t think so. The fact that GHC and base are tightly coupled right now, combined with the fact that some libraries couple compatibility with new versions of base to their own major releases, means that automating upgrades is probably infeasible in general unless those underlying conditions change. And code compatibility is not everything - a performance regression in a new compiler, or support for a platform that exists in an older version but was dropped in a new version, can make an older branch attractive for a while. Bugfixes on those branches will continue to be useful, I suspect.
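To make that coupling concrete, here is the kind of hand-written, version-gated shim that is common across the ecosystem and that any upgrade automation would have to understand (MIN_VERSION_base is a macro supplied by Cabal; the exact version bound below is illustrative):

```haskell
{-# LANGUAGE CPP #-}
module Compat (listSingleton) where

#if MIN_VERSION_base(4,15,0)
import Data.List (singleton)

-- Newer base provides Data.List.singleton directly.
listSingleton :: a -> [a]
listSingleton = singleton
#else
-- Older base versions lack it, so define it locally.
listSingleton :: a -> [a]
listSingleton x = [x]
#endif
```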

It’s definitely still a thing! We’re interested in any technique that has promise for making maintenance of code easier over time.

You are exactly right… In fact, I’ve personally been bitten by this in the past. It just wasn’t top of mind when I wrote the comment :)

It’s definitely still a thing! We’re interested in any technique that has promise for making maintenance of code easier over time.

Great to hear!

There’s a real danger in underestimating how serious regular breaking changes between minor versions are.

LLVM was used as an example elsewhere, but it’s a flawed comparison. For a start, it’s not a programming language or a compiler, but most importantly, it is not a direct dependency of language ecosystems. The associated languages are relatively stable, to the point that I can run 20-year-old Java code and C libraries without modification.

I don’t know enough about the conflicts between stability and progress or bug fixes, but I’m struggling to see how it’s good to aim for a volatile spec/standard/compiler behaviour.

I can understand occasional breaking changes to fix fundamental flaws or bugs. Still, if they are regular occurrences then that would suggest that far too many experimental features were treated as stabilized features.

I suspect this might be easier to resolve if it’s valued enough.

As a newcomer, this outlook is alarming as I just can’t see how this could be even remotely viable for a production language.

I’d like to think I could write a library today and not need to update it every year to support the latest changes.

Or maybe I’ve misunderstood something (I am new after all).

7 Likes

Really good overview, thanks. Would it be possible to pin such a summary in an easy-to-find place, either on this forum or on the Haskell.org website? Maybe even include long-term goals and an overview of what is being worked on?

2 Likes