The evolution of GHC

…perhaps there should be a “Pre-Pre-Pre-HFTP: Decoupling Haskell and GHC” - back when Haskell 98 was the standard, you had choices:

  • got an idea for a Haskell extension? GHC would be the implementation for you.
  • interested in teaching Haskell? Hugs98 was the simplest implementation, followed by NHC and HBC (of course, your opinion may differ ;-).
  • wanting to use Haskell in production? You probably used and maintained your own private installation of GHC.

Out of those and other Haskell implementations, it was GHC alone which successfully transitioned to Haskell 2010. That singular status, combined with its active use as a platform for research, meant that for most people, “GHC” was synonymous with “Haskell”.
(and yes: I know people can use GHC’s -XHaskell98 and -XHaskell2010 options - our continual presence here and on other forums suggests that doing so isn’t a viable solution).

So here we are, expecting a project historically driven by research and evolution to now also provide stability for practical use and study. To me, it seems like expecting a working laboratory to simultaneously be capable of mass production - if you know of, or can think of, a way to make that situation work well while keeping everyone happy and civil, please contact the Haskell Foundation today :-D.

3 Likes

I see no contradiction here. It is a natural evolution of any successful project from being driven by state-of-the-art research to being useful in a practical setting. There were 10x-100x fewer professional Haskell developers back in 2010 and the shape of the language was much less defined, so it is no surprise that rapid evolution had a better ROI. But no matter which technical solutions you come up with (e.g., decoupling base and GHC, or a better head.hackage, or a better matrix builder), nowadays there is a huge cost associated with breaking things, and that cost is fundamentally social.

Haskell developers rarely complain about the technical complexity of fixing breaking changes. Indeed, Haskell is marvellous for refactoring. What bothers people is that their code is simply broken again and again. That’s why calls to make rapid evolution “easier” from a technical perspective are likely to have an overall negative effect on the community, because they make breaking changes feel like a less discouraged activity.

2 Likes

…apart from people being bothered by their code being broken again and again because that state-of-the-art research continues at speed. You want GHC to evolve slower - I suspect most other people would see that as a call for maturity (e.g. like GCC rather than LLVM). But @Ericson2314 wants to maintain the pace of that state-of-the-art research, and therein lies the contradiction - and the predicament GHC now finds itself in.

But if what you wrote is correct:

There were 10x-100x less professional Haskell developers back in 2010 [than now]

…then maybe some of that 10x-100x increase since 2010 can be directed towards an alternate Haskell implementation which can be allowed to mature (evolve slowly) - GHC can then go back to being the platform for state-of-the-art research, breakages and all.

But if people really are bothered about their code being repeatedly broken then:

  • they’re either not being bothered enough,
  • or there aren’t enough of them being bothered

…to the point of starting a new project of some variety (it doesn’t have to be an alternate Haskell implementation; that’s just my opinion) - has there really been a 10x-100x increase in Haskell devs since 2010?

This has already happened in the past, right? stack was born out of such frustration. Do we really want to push a similar schism further, up to a fork of GHC?

  • Like I said, that presumes an alternative Haskell implementation is the only solution - if someone has a better idea, now is definitely the time to hear about it!

  • If it was as simple as “just forking GHC”, I suspect it would have already happened - as a project, GHC is now just too large to be successfully forked.

  • Is the current project to add Rust to GCC a “schism” for Rust? I’m guessing that depends on how many Rust devs there are now, as compared to 2015 (the year Rust 1.0 was released).

So no: it seems more likely that an alternate Haskell implementation would start afresh.

1 Like

This is a bit more heterodox of me, but I have a theory that the non-modularity is also increasingly bad for research, too. As GHC gets more complex without being more modular, the cost of doing experiments rises dramatically. The idea with a really modular GHC is that it is cheaper to do experiments, run temporary forks, etc., so we get more research and more stability! Decoupling base and GHC is just one part of this effort.

5 Likes

…I’m thinking more “confinement” than “stability”:

  • right now, a breaking change in base affects almost everyone;
  • if base was split into e.g. base-alpha, base-beta, and base-gamma, then the same breaking change would probably only affect, let’s say, base-beta: users of base-alpha and base-gamma would be unaffected (see the sketch after this list);
  • all going well, base-alpha, base-beta, and base-gamma will each be approximately one-third the size of base: as you say, that will almost certainly encourage more research in each package, and presumably more breakage.
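
To make the confinement point concrete, here’s a minimal sketch of what a downstream package could look like if those hypothetical base-alpha/base-beta/base-gamma packages existed (the names and the module split are made up, of course):

    library
      build-depends:
        base-alpha ^>=1.0,  -- say, Prelude and Data.*
        base-gamma ^>=1.0   -- say, System.* and Foreign.*
      -- no dependency on base-beta, so a breaking base-beta-2.0
      -- release never forces any change here

A package like this never even sees a base-beta major bump, whereas today the equivalent change to the monolithic base reaches it unconditionally.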

So which would you prefer:

  • base being broken,
  • or base-alpha, base-beta, and base-gamma being broken much more often, quite possibly at the same time?

I think both our preferences would be the same: none of the above.

It doesn’t really matter if it’s one “large” break, or multiple frequent “small” breaks - either way, users would still be exposed to breakage. That each individual break is confined to a smaller package will be of little comfort to them.

A break-up of base into smaller packages only enhances confinement. As for stability, most of those packages would have to remain unchanged for long periods of time, with breakage only occurring in a few of them at any particular time.

So what would be the chances of that actually happening? That is an interesting question…

  • or base-alpha, base-beta, and base-gamma being broken much more often, quite possibly at the same time?

I agree confinement is the most important thing we get. I also agree the most-used parts of base change the least. But to the extent one can mix old and new versions of base-* packages, “at the same time” starts to mean less.

In other words, even if we break parts of GHC and base at roughly the same time, users being able to pick and choose which breakage they want to deal with first still makes things better!
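
Concretely (same caveat as before: the split packages are hypothetical), a user could stagger the two migrations with ordinary version constraints:

    -- cabal.project
    constraints:
      base-alpha ==2.0.*,  -- take this breaking release now
      base-beta  ==1.4.*   -- stay on the old major series; migrate later

With today’s monolithic base there is exactly one version number, so both breakages arrive in the same upgrade whether you like it or not.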

1 Like

AFAICT, the issue here isn’t about breakage directly - the problem isn’t that breaking changes exist, nor is it about academic vs production use of the language, nor the rate at which those “groups” wish to see change. The problem is WRT how we roll out breaking changes, and the effect that has as it ripples out across the ecosystem. More precisely, the issue is that the existing arrangement leads to a more painful experience, and that improvements to the process of releasing changes are constrained by the arrangement of the packages/projects.

EDIT: my comments were in the context of the original discussion on splitting base/ghc.

1 Like

FWIW, while I am not one to consider forking GHC, very highly-respected and otherwise competent Haskell people I’ve had the pleasure of working with have discussed forking GHC many times over the last 5 years or so. They would get to the point of discussing a fork out of frustration with GHC HQ as a project led by people who did not share the same values or goals. They got to that point having exhausted the other avenues of possibility. However it would not happen, as they would acknowledge that it was too much work, and would likely be an uphill battle in the community, while also not addressing the cultural/organizational issues in the community of volunteers who maintain Haskell’s infrastructure. At least some of these highly-respected Haskellers have given up and moved on, which is really bad for the ecosystem at large (though most of us haven’t yet figured out that’s the case).

IMHO, for the infrastructure that supports Haskell, the only way forward is to fix our organizational and structural issues, both as a community of people and in our codebases.

EDIT: my comments were in the context of the original discussion on splitting base/ghc.

1 Like

There were 10x-100x less professional Haskell developers back in 2010 [than now]

I find this claim extremely dubious. Are there real, reputable and representative statistics backing it up? It seems more likely that the Haskell community has shrunk, not grown. Indeed, that decline should have been one of the rationales for the appearance of the HF.

Would it be possible to move this conversation out to its own thread?

…and continuing on from there.


Thank you, @jaror.

2 Likes

(I wasn’t going to comment on this thread, which looks like too much navel-gazing; but I couldn’t help noticing some claims that just aren’t true.)

The Haskell 2010 standard differs very little from H98. There was no ‘transition’ needed; and Hugs supports pretty much all of the standard (introducing variables in guards is the main exception).
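
For anyone who hasn’t met that feature: it’s H2010’s pattern guards, which let a guard bind variables. A quick illustration of my own:

    lookupDefault :: Eq k => k -> v -> [(k, v)] -> v
    lookupDefault k def kvs
      | Just v <- lookup k kvs = v   -- the guard binds v
      | otherwise              = def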

Furthermore Hugs supports most of the beyond-H2010 features GHC had at the time: MultiParamTypeClasses, FunDeps, Overlapping Instances. (GADTs being the main exception.)

Hugs features an OK-ish records system (Trex), which GHC still hasn’t got round to. And indeed for some people, H2010/GHC’s so-called records system is embarrassing enough to abandon Haskell (for Purescript, say, with its row polymorphism and extensible records).

So for me “GHC” is not synonymous with “Haskell”. For me, a key attraction was that Haskell is a ‘small’ but powerful language. GHC today with all its features is not small, and frankly very little more powerful than Hugs/GHC extensions as at 2006. Indeed GHC is now so bloated, and the single namespace/anti-punning proposals so far-reaching, that it is no longer Haskell - in the same way that ALGOL 68 is not ALGOL and C++ is not C.

I think you’re pulling that claim out of your behind. My impression is there are fewer people active in the Haskell community than 10 years ago. If they exist, these professional developers are being remarkably quiet about it.

There are squads of CS students who’ve touched Haskell, asked newbie questions, then finished their semester having ‘done FP’ and just disappeared. (Probably wondering what that out-of-body experience was all about. They’ve maybe gone on to be professional developers, but not in Haskell.)

eh? The only shape of the language that’s defined is the H98/H2010 Reports (which include the standard libraries). For everything beyond that you need to read the docos for each extension piecemeal. For the FTP library changes I haven’t found any definition: I continually fall over surprises.
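
A sample of the sort of surprise I mean - these are real GHCi answers post-FTP, because the Foldable instance for pairs folds over the second component only:

    ghci> length (1, 2)
    1
    ghci> minimum (4, 3)
    3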

But there hasn’t been evolution. The urgent concerns of the active community as of 10 years ago are still outstanding. There still isn’t a records system; Overlapping (orphan) instances are still hazardous; the combo of Overlapping+FunDeps was shown to be hugely powerful by Kiselyov et al 2004, but the difficulties they ran into are still a running sore. (There’s a whole clump of tickets. I’ve found ways to fix those in Hugs, so I’ve implemented all of Kiselyov et al, but avoided the incoherence they had to battle with.)
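
To give a flavour of the Overlapping+FunDeps combo, here’s my own minimal sketch in the style of Kiselyov et al’s HList (not their exact code) - type-level type equality. It works, but whether instance selection behaves as you’d hope in all contexts is exactly the running sore:

    {-# LANGUAGE MultiParamTypeClasses, FunctionalDependencies,
                 FlexibleInstances, UndecidableInstances, TypeFamilies #-}

    data HTrue
    data HFalse

    -- r is HTrue exactly when a and b are the same type
    class TypeEq a b r | a b -> r
    instance {-# OVERLAPPING #-}                TypeEq a a HTrue
    instance {-# OVERLAPPABLE #-} r ~ HFalse => TypeEq a b r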

The active contributors who had high hopes for GHC as of 10 years ago are still active in other FP languages. They’ve just given up with Haskell. I don’t think it was the breaking changes or the shuffling of deckchairs in the libraries that dried up their interest.

I still prefer to program in Hugs – for the occasional Haskeller/recreational mathematician, it’s just so much more fun. And looking into current and proposed GHC features, I see only more bloatware. So I’ve un-installed GHC 9. I see no likelihood I’ll be coming back – about which I’m sad.

1 Like

@AntC2:
I am sorry for bikeshedding, but I want to ask: where do you think people go after abandoning Haskell? I do not see a simple FP language to go to.
Tbh, in the entire FP scene, only (complex) Scala seems alive, and even it is in a dying state. IIRC Microsoft no longer cares about F#.
(I hope it is not something like Python, but the trend nowadays does suggest that it might be.)

It is surprising to hear that Hugs is still alive. I wonder why it comes up so rarely nowadays.

Professional developers will go to whatever language they can get a job in. Not necessarily FP. OTOH other languages keep stealing Haskell/FP features – it’s debatable whether they need an FP mindset.

I said ‘small’, not ‘simple’. Although MPTCs/FlexibleInstances/FlexibleContexts have a small syntactic footprint extending H2010, I wouldn’t claim the semantics is ‘simple’. In contrast, Type Families add a whole ’nother syntactic construct (sketch below) while you still need MPTCs/etc; and ScopedTypeVariables and AmbiguousTypes seem to be sleepwalking into ‘unnecessary complexity’.
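
Roughly what I mean by the difference in footprint (my own sketch): the same “container determines element” relationship, first as a fundep - H2010 class syntax plus one annotation - then as an associated type family, a whole new declaration form:

    {-# LANGUAGE MultiParamTypeClasses, FunctionalDependencies,
                 FlexibleInstances, TypeFamilies #-}

    -- fundep version
    class Collects c e | c -> e where
      insert :: e -> c -> c

    instance Collects [e] e where
      insert = (:)

    -- type-family version of the same idea
    class Collects' c where
      type Elem c
      insert' :: Elem c -> c -> c

    instance Collects' [e] where
      type Elem [e] = e
      insert' = (:)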

SML is still going strong. Scala seemed to be a strong contender as of a few years ago, but seems to be falling back.

There are several experimental/research Haskell-like languages. I mentioned Purescript above. Key members of the Hugs development team went on to Habit – whose ‘Instance Chains’ and tweaks to FunDeps are one way to address the concerns I mentioned. (The Habit team have collaborated with Richard Eisenberg on a few papers.)

It’s no longer supported – indeed its distro has bitrotted. I wouldn’t have gone back to it, but for me the GHC experience is just getting more awful. And I wouldn’t have tried hacking inside Hugs (I’m no sort of compiler/language expert), but it seemed the only way to make progress, since GHC is not addressing anything of what I want.

And I suppose when it comes to records systems, lenses are what everybody gets excited about. Lenses are exactly the bloated nightmare, the opposite of what I want from a records system. YMMV. Hugs’ Trex is close to what I want; I’ve hacked it to get a bit closer. I know what I want to do next (along the lines of the last/speculative section in the Gaster+Jones paper); I fear it’s beyond my hacking skills.
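
For anyone who hasn’t seen Trex, a small taste from memory (so treat the details as approximate) - records are anonymous, and functions over them can be row-polymorphic:

    -- Hugs with Trex enabled (the -98 flag)
    type Point = Rec (x :: Double, y :: Double)

    origin :: Point
    origin = (x = 0.0, y = 0.0)

    -- works on any record with an x field, whatever else it carries;
    -- the constraint r\x says that row r lacks the label x
    getX :: (r\x) => Rec (x :: Double | r) -> Double
    getX p = #x p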

In some ways, it doesn’t matter if we lost them as a contributor.

Oh meh, I wanted to say ‘small’ rather than ‘simple’.

I’ve never heard of SML over here where I live, though, so I highly doubt the “going strong” thing. Racket might be great, tho I do not know much about it. (Also, IIRC Racket is dynamically typed, so a distinct design space.)

Purescript is indeed a great candidate, tho considering its age and how little-known it remains… I don’t think it will get anywhere in the next 5~10 years.

Perhaps typed FP is dying, and rightfully so.

Btw, I think lens solves a different problem to a records system - notably, updating nested records. Perhaps purity is just bad nonsense, as others say, and it is better to allow mutation in this case.

You know, if anything, what happens in the future will likely be a wave of untyped easy-to-learn languages and no-code frameworks. The cost reduction is simply irresistible for companies.

  • Adoption of FP features is a combination of trying to grow languages (so that contenders won’t follow? lol) + FOMO anyway. Seen the Python thread regarding pattern matching? XD

Yea, it also doesn’t matter if FP as a whole dies out as people slowly move on from it. Who cares if the remaining ppl dedicated to the paradigm get unemployed and removed from the software scene?

Sorry, I misunderstood you and made such a harsh remark…

One of the issues up-thread seems to be a lack of volunteers. Your point being that windbags like me, who criticise but don’t develop in GHC or maintain libraries, are exactly the sort of (non-)‘contributors’ GHC wants to lose? Then GHC is achieving a win-win: I don’t want GHC any more; GHC doesn’t want me any more.

1 Like

I don’t think FP is dying. I’d say in some forms it’s getting more popular, certainly going mainstream.

I do think Agda/Idris/Lean took away some of the “cool dank language for smart people” reputation among the young hotshots on one hand, and Rust took away the “please just give me some language to use at work that doesn’t suck!” crowd on the other. I know people who would mess with both of those and find Haskell an “awkward middle ground” between beauty and practicality. (To be clear, I personally don’t think Rust is more or less practical than Haskell.)

None of the above I am really mad/sad/worried about. I welcome the competition, and think it will push us in good directions.

2 Likes