The evolution of GHC

…I’m thinking more “confinement” than “stability”:

  • right now, a breaking change in base affects almost everyone;
  • if base were split into e.g. base-alpha, base-beta, and base-gamma, then the same breaking change would probably only affect, let’s say, base-beta: users of base-alpha and base-gamma would be unaffected;
  • all going well, base-alpha, base-beta, and base-gamma would each be approximately one-third the size of base: as you say, that would almost certainly encourage more research in each package, and presumably more breakage.

So which would you prefer:

  • base being broken,
  • or base-alpha, base-beta, and base-gamma being broken much more often, quite possibly at the same time?

I think both our preferences would be the same: none of the above.

It doesn’t really matter if it’s one “large” break or many frequent “small” breaks; either way, users would still be exposed to breakage. That each individual break is confined to a smaller package will be of little comfort to them.

A break-up of base into smaller packages only enhances confinement. As for stability, most of those packages would have to remain unchanged for long periods of time, with breakage only occurring in a few of them at any particular time.

So what would be the chances of that actually happening? That is an interesting question…

  • or base-alpha, base-beta, and base-gamma being broken much more often, quite possibly at the same time?

I agree confinement is the most important thing we get. I also agree the most-used parts of base change the least. But to the extent one can mix old and new versions of base-* packages, “at the same time” starts to mean less.

In other words, even if we break parts of GHC and base at roughly the same time, letting users pick and choose which breakage they want to deal with first still makes things better!


AFAICT, the issue here isn’t about breakage directly; the problem isn’t that breaking changes are a thing, nor is it about academic vs production use of the language, nor the rate at which those “groups” wish to see change. The problem is how we roll out breaking changes, and the effect that has as it ripples out across the ecosystem. More precisely, the issue is that the existing arrangement leads to a more painful experience, and that improvements to the process of releasing changes are constrained by the arrangement of the packages/projects.

EDIT: my comments were in the context of the original discussion on splitting base/ghc.


FWIW, while I am not one to consider forking GHC, very highly respected and otherwise competent Haskell people I’ve had the pleasure of working with have discussed forking GHC many times over the last 5 years or so. They would get to the point of discussing a fork out of frustration with GHC HQ as a project led by people who did not share the same values or goals, and only after having exhausted the other avenues of possibility. However, it would never happen: they would acknowledge that it was too much work, would likely be an uphill battle in the community, and would not address the cultural/organizational issues in the community of volunteers who maintain Haskell’s infrastructure anyway. At least some of these highly respected Haskellers have given up and moved on, which is really bad for the ecosystem at large (though most of us haven’t yet figured out that’s the case).

IMHO, for the infrastructure that supports Haskell, the only way forward is to fix our organizational and structural issues, both as a community of people and in our codebases.

EDIT: my comments were in the context of the original discussion on splitting base/ghc.


There were 10x-100x less professional Haskell developers back in 2010 [than now]

I find this claim extremely dubious. Are there real, reputable, and representative statistics backing it up? It seems more likely that Haskell’s community has shrunk, not grown. That fact should have been one of the rationales for the appearance of the HF.

Would it be possible to move this conversation out to its own thread?

…and continuing on from there.


Thank you, @jaror.


(I wasn’t going to comment on this thread, which looks like too much navel-gazing; but I couldn’t help noticing some claims that just aren’t true.)

The Haskell 2010 standard is very little different from H98. There was no ‘transition’ needed; and Hugs supports pretty much all of the standard (introducing variables in guards, i.e. pattern guards, is the main exception).

Furthermore, Hugs supports most of the beyond-H2010 features GHC had at that time: multi-parameter type classes, FunDeps, overlapping instances (GADTs being the main exception).
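
For concreteness, here is a minimal sketch of the kind of code in question, a multi-parameter type class with a functional dependency. The Collection class is made up for illustration, not taken from any library; the pragmas are GHC-style (IIRC Hugs enables the equivalent extensions with its -98 mode):

```haskell
{-# LANGUAGE MultiParamTypeClasses, FunctionalDependencies, FlexibleInstances #-}

-- A multi-parameter class: c is a collection type, e its element type.
-- The functional dependency c -> e says the collection type determines
-- the element type, which is what keeps type inference tractable.
class Collection c e | c -> e where
  empty  :: c
  insert :: e -> c -> c

instance Collection [a] a where
  empty  = []
  insert = (:)
```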

Hugs features an OK-ish records system (Trex), something GHC still hasn’t got round to. And indeed for some people (PureScript, with its row polymorphism and extensible records, for example) H2010/GHC’s so-called records system is embarrassing enough to abandon Haskell over.

So for me “GHC” is not synonymous with “Haskell”. For me, a key attraction was that Haskell is a ‘small’ but powerful language. GHC today, with all its features, is not small, and frankly is very little more powerful than Hugs/GHC extensions as of 2006. Indeed GHC is now so bloated, and the single-namespace/anti-punning proposals so far-reaching, that it is no longer Haskell, in the same way that ALGOL 68 is not ALGOL and C++ is not C.

I think you’re pulling that claim out of your behind. My impression is there are fewer people active in the Haskell community than 10 years ago. If they exist, these professional developers are being remarkably quiet about it.

There are squads of CS students who’ve touched Haskell, asked newbie questions, then finished their semester having ‘done FP’ and just disappeared. (Probably wondering what that out-of-body experience was all about. They’ve maybe gone on to be professional developers, but not in Haskell.)

Eh? The only shape of the language that’s defined is the H98/H2010 Reports (which include the standard libraries). For everything beyond that you need to read the docs for each extension piecemeal. For the FTP library changes I haven’t found any definition: I continually fall over surprises.

But there hasn’t been evolution. The urgent concerns of the active community as of 10 years ago are still outstanding. There still isn’t a records system; overlapping (and orphan) instances are still hazardous; the combo of Overlapping+FunDeps was proved to be hugely powerful by Kiselyov et al in 2004, but the difficulties they ran into are still a running sore. (There’s a whole clump of tickets. I’ve found ways to fix those in Hugs, so I’ve implemented all of Kiselyov et al, but I’ve avoided the incoherence they had to battle with.)
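
To give a flavour of both the power and the hazard, here is a minimal sketch in the style of that work, type-level equality decided by Overlapping+FunDeps; this is a reconstruction in modern GHC pragma syntax for illustration, not their exact code:

```haskell
{-# LANGUAGE MultiParamTypeClasses, FunctionalDependencies, FlexibleInstances,
             FlexibleContexts, UndecidableInstances, TypeFamilies #-}

-- Type-level booleans.
data HTrue
data HFalse

-- Type-level equality: the functional dependency computes b from x and y.
class TypeEq x y b | x y -> b
instance {-# OVERLAPPING #-}                 TypeEq x x HTrue
instance {-# OVERLAPPABLE #-} b ~ HFalse => TypeEq x y b

-- The hazard: if x or y is still an unconstrained type variable at the
-- use site, instance selection can silently fall through to the general
-- case; that is the sort of incoherence the paper had to battle with.
```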

The active contributors who had high hopes for GHC as of 10 years ago are still active in other FP languages. They’ve just given up on Haskell. I don’t think it was the breaking changes or the shuffling of deckchairs in the libraries that dried up their interest.

I still prefer to program in Hugs: for the occasional Haskeller/recreational mathematician, it’s just so much more fun. And looking into current and proposed GHC features, I see only more bloatware. So I’ve uninstalled GHC 9. I see no likelihood I’ll be coming back, about which I’m sad.


AntC2:
I am sorry for bikeshedding, but I want to ask: where do you think people go after abandoning Haskell? I do not see a simple FP language for them to go to.
Tbh, in the entire FP scene, only (complex) Scala seems alive, and even it is in a dying state. IIRC Microsoft no longer cares about F#.
(I hope it is not something like Python, but the trend nowadays does suggest that it might be.)

It is surprising to hear that Hugs is still alive. I wonder why it comes up less often nowadays.

Professional developers will go to whatever language they can get a job in, not necessarily FP. OTOH, other languages keep stealing Haskell/FP features; it’s debatable whether they need an FP mindset.

I said ‘small’, not “simple”. Although MPTCs/FlexibleInstances/FlexibleContexts have a small syntactic footprint extending H2010, I wouldn’t claim the semantics are “simple”. In contrast, Type Families add a whole ’nother syntactic construct (and you still need MPTCs etc.), while ScopedTypeVariables and AmbiguousTypes seem to be sleepwalking into ‘unnecessary complexity’.
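
To make the contrast concrete, here is the made-up Collection sketch from my earlier post restated with an associated type family instead of a FunDep; the same information, but a whole extra syntactic construct:

```haskell
{-# LANGUAGE TypeFamilies #-}

-- The "collection determines its element type" idea again, this time
-- with an associated type family rather than a functional dependency.
class Collection c where
  type Elem c
  empty  :: c
  insert :: Elem c -> c -> c

instance Collection [a] where
  type Elem [a] = a
  empty  = []
  insert = (:)
```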

SML is still going strong. Scala seemed to be a strong contender as of a few years ago, but seems to be falling back.

There are several experimental/research Haskell-like languages. I mentioned PureScript above. Key members of the Hugs development team went on to Habit, whose ‘instance chains’ and tweaks to FunDeps are one way to address the concerns I mentioned. (The Habit team have collaborated with Richard Eisenberg on a few papers.)

It’s no longer supported; indeed, its distro has bitrotted. I wouldn’t have gone back to it, but for me the GHC experience is just getting more awful. And I wouldn’t have tried hacking inside Hugs (I’m no sort of compiler/language expert), but it seemed the only way to make progress, since GHC is not addressing anything of what I want.

And I suppose when it comes to records systems, lenses are what everybody gets excited about. Lenses are exactly the bloated nightmare, the opposite of what I want for a records system. YMMV. Hugs’ Trex is close to what I want; I’ve hacked it to get a bit closer. I know what I want to do next (along the lines of the last/speculative section in the Gaster+Jones paper); I fear it’s beyond my hacking skills.
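
For anyone who hasn’t seen Trex, here is roughly what it looks like; a sketch from memory, so the details may be slightly off, and it runs only under a Trex-enabled Hugs, not GHC:

```haskell
-- Anonymous, extensible records: no datatype declaration needed.
origin :: Rec (x :: Double, y :: Double)
origin = (x = 0.0, y = 0.0)

-- Row polymorphism: the constraint r\x reads "row r lacks field x",
-- so getX works on any record with an x field, whatever else it has.
getX :: (r\x) => Rec (x :: Double | r) -> Double
getX = #x

-- Update by decomposing the record and rebuilding it.
shiftX :: (r\x) => Double -> Rec (x :: Double | r) -> Rec (x :: Double | r)
shiftX d (x = v | r) = (x = v + d | r)
```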

In some ways, it doesn’t matter if we lost them as a contributor.

Oh meh, I wanted to say small rather than simple.

I never heard of SML over here where I live, though, so I highly doubt the “going strong” thing. Racket might be great, though I do not know much about it. (Also, IIRC Racket is dynamically typed, so it’s a distinct design space.)

PureScript is indeed a great candidate, though considering its age and how little-known it remains… I don’t think it will get anywhere in the next 5~10 years.

Perhaps typed FP is dying, and rightfully so.

Btw, I think lens solves a different problem to a records system, notably updating inner records. Perhaps purity is just bad nonsense, as others suggest, and it is better to allow mutation in this case.
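
As a concrete illustration of that point, here is a minimal sketch using the lens library, with made-up record types: a pure update of an inner field, with no hand-unpacking of each layer and no actual mutation:

```haskell
{-# LANGUAGE TemplateHaskell #-}
import Control.Lens

-- Hypothetical nested records, purely for illustration.
newtype Address = Address { _city :: String } deriving Show
data Person = Person { _name :: String, _home :: Address } deriving Show
makeLenses ''Address
makeLenses ''Person

-- Composing lenses reaches through the outer record into the inner
-- one; 'set' returns a new Person, so nothing is mutated in place.
relocate :: String -> Person -> Person
relocate c = set (home . city) c
```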

You know, if anything, what happens in the future will likely be a wave of untyped, easy-to-learn languages and no-code frameworks. The cost reduction is simply irresistible for companies.

  • Adoption of FP features is a combination of trying to grow languages (so that contenders won’t follow? lol) plus FOMO anyway. Have you seen the Python thread regarding pattern matching? XD

Yea, it also doesn’t matter if FP as a whole dies out as people slowly move on from it. Who cares if the remaining people dedicated to the paradigm end up unemployed and removed from the software scene?

Sorry, I misunderstood you and made such a harsh remark…

One of the issues up-thread seems to be lack of volunteers. Your point being that windbags like me who criticise but don’t develop in GHC/don’t maintain libraries are exactly the sort of (non-)‘contributors’ who GHC wants to lose? Then GHC is achieving win-win: I don’t want GHC any more; GHC doesn’t want me any more.


I don’t think FP is dying. I’d say in some forms it’s getting more popular, certainly going mainstream.

I do think Agda/Idris/Lean took away some “cool dank language for smart people” reputation for the young hotshots on one hand, and Rust took away the “please just give me some language to use at work that doesn’t suck!” crowd on the other. I know people who would mess with both those and find Haskell an “awkward middle ground” between beauty and practicality. (To be clear, I personally don’t think Rust is more or less practical than Haskell.)

None of the above I am really mad/sad/worried about. I welcome the competition, and think it will push us in good directions.


Do you have concrete data on this claim? It seems certain to me that FP languages are trending downward, with only traces of their features being adopted into the OOP languages.

The trend toward simpler, easier-to-learn languages is also quite visible. Perhaps not in the central US scene, but it has already dominated the fringes.

If anything… the struggles of the various FP languages seem to be an effort to reverse their dwindling position in the software scene.

I do not have any data.

People have been trying to do no-code/low-code things forever. I think it eventually will catch on, but I don’t see what makes this time better. Frankly, I have a hard time understanding the industry’s latest trends. Cloud is hugely overpriced, microservices are terrible, etc. etc. (I could go on about why I think everything is so screwy right now, but that would be veering off-topic.)

Rust, TypeScript, and Swift; pattern matching in Python; generics in Go; official static type checkers for Python and Ruby… our ideas are clearly catching on in the mainstream in some form. People have said for years “Haskell’s ideas will win, but not Haskell itself”. But I fail to see why compromise is inevitable; perhaps a sample of good things in more popular languages will just cultivate people’s tastes and make them yearn for more.


I see, though one thing I’ve noticed is that it is becoming more pervasive. There are already several no-code frameworks in my small country, the write-once, manage-never solutions, most of which were developed in the last 5 years.

With the advent of AI, more no-code solutions will become possible. In the meantime, languages like Python could dominate the scene.

Well, me too. However, I do think we need to keep up with it. In the end, capitalism says that what is more efficient in terms of cost is better. Do not forget that people are often irrational, and that should be reflected in the efficiency analysis.

According to many users of those languages, pattern matching and static type checkers in Python and elsewhere are FOMO and a desperate attempt to ward off competition. I do not see how they show a growing popularity of FP.

… Perhaps I did too much of a rant… I guess I wanted to get this frustration off my chest.
I dislike when the world does not align with my directions, but that is likely part of my stubborn nature.

That’s not exactly how I would describe what I had in mind. More like: if someone was interested in Haskell and then came to the conclusion that it was not worth their time, the Haskell community loses out.

According to many users of those languages, pattern matching and static type checkers in Python and elsewhere are FOMO and a desperate attempt to ward off competition. I do not see how they show a growing popularity of FP.

Ah, but if they are feeling FOMO, then surely we’re doing things right!

In the end, capitalism says that what is more efficient in terms of cost is better.

Hehehe, if only. I would argue we are in a speculative asset bubble such that many price signals are being drowned out. This makes me skeptical that what I see around me are enduring trends rather than passing fads.

A lot of low-code stuff feels like lipstick on a pig to me, when the real productivity killer is not that coding is hard, but the mountain of tech debt we work upon. Or at least, we could do a better job of making coding easier if all the bullshit was cleared away first.

I can’t help but hope the economic conditions will change dramatically. I do the work I do assuming that hope will be borne out, and that the tortoise will pull ahead of the hare.


Hm, I thought FOMO meant “fear of missing out, even when it is simply an unworthy passing trend”, often used along with “fad”. Anyway, let’s hope we are right: features like pattern matching are indeed useful and important.

Interesting, does market action affect the entire economy to such a huge degree? If so… I hate the market now. (The stock market, right?)