Single-compiler vs. multi-compiler (or: implementation-defined vs. standardized) languages

Yes, but the reason an alternative on-par compiler doesn’t exist is not lack of resources.

I’m not sure SCB has a lot of interest in the new language features of GHC 9.x (at least I don’t, as a Mu developer). Mu doesn’t even support all of GHC 8.10.7’s language extensions.

So if such a GHC 8.10.7 fork existed and were alive and well, maybe that would be a much more interesting base/target (maintaining patches or backends, or depending on the GHC API, is much less work if the compiler prioritizes stability… ask the HLS devs).

That’s ofc my opinion.

2 Likes

I had the same thoughts before. But GHC 9.2, at least, has RTS features so powerful that they make earlier compilers look quite modest:

  • Info table profiling: this can, for example, point at the exact line of code responsible for a space leak. It’s now impossible to imagine how we lived before it. See the blog post for more info.
  • The -Fd RTS option for returning memory to the OS. Before it, Haskell programs could consume 4x (!!!) more memory than they actually used. Again, there’s a blog post with more explanation. (A small sketch of how to try both features follows after this list.)
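To make that concrete, here is a minimal sketch of how one might try both features. The toy program and file name are mine, and the flag names are from my reading of the GHC 9.2 user’s guide rather than from the blog posts above:

```haskell
-- Leak.hs: a deliberately leaky toy program (my own example, for illustration).
--
-- Compile with info-table mapping so the RTS can attribute heap residency
-- to source locations:
--   ghc -O0 -finfo-table-map -fdistinct-constructor-tables -rtsopts Leak.hs
--
-- Run with an info-table heap profile written to the eventlog (view it with
-- eventlog2html):
--   ./Leak +RTS -hi -l -RTS
--
-- Tune how eagerly the RTS hands memory back to the OS (I believe the
-- default is -Fd4):
--   ./Leak +RTS -Fd1 -RTS
module Main where

-- foldl builds a long chain of thunks; the info-table profile should point
-- at this definition as the source of the residency.
leakySum :: [Int] -> Int
leakySum = foldl (+) 0

main :: IO ()
main = print (leakySum [1 .. 10000000])
```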

Improvements to the RTS come along with breaking changes and language changes. Unfortunately, they’re not decoupled enough at this stage, so if you want to benefit from ground-breaking observability and performance improvements, you need to swallow a bitter pill and pay the upgrade cost.

On the brighter side, such runtime behaviour improvements justify the upgrade cost quite well.

6 Likes

…and a growing number of Haskell users have made the judgement that the “upgrade cost” is now too high, otherwise they probably wouldn’t still be using version 8.x.y - if a new version isn’t being used because it’s too expensive to adopt, is it really there? Furthermore, a variety of new features have already been backported to older GHC versions.

However, if there were another Haskell compiler, that process could go in both directions: patches that fix problems in the other compiler could also prove useful in GHC.

Is it really growing? I remember 7.10 remained quite popular for a long time after GHC 8 had already been released.

What breaking changes are really still blocking now that DeepSubsumption is back? I can only find:

  • 8.8 removes the fail method from the Monad class
  • 9.0 makes TH splices separation points for constraint solving passes
  • 9.0 simplifies subsumption, but the old behaviour can be restored with DeepSubsumption (available again since 9.2.4 / 9.4.1); see the sketch after this list
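To illustrate the first and last items, here is a minimal sketch - my own code, loosely modelled on the standard simplified-subsumption example, not taken from anyone in this thread:

```haskell
{-# LANGUAGE RankNTypes #-}
module Breakage where

-- GHC 8.8 item: with fail removed from Monad, a partial pattern match in
-- do-notation needs a MonadFail constraint; a plain Monad constraint no
-- longer suffices.
firstWord :: MonadFail m => m String -> m String
firstWord act = do
  (w:_) <- words <$> act   -- partial pattern => MonadFail
  pure w

-- Simplified-subsumption item: `g f` used to be accepted via deep
-- subsumption; now it needs either an eta-expansion or the DeepSubsumption
-- extension.
f :: forall a b. a -> b -> b
f _ y = y

g :: (forall p. p -> forall q. q -> q) -> Int
g _ = 0

worksEverywhere :: Int
worksEverywhere = g (\x y -> f x y)      -- explicit eta-expansion

-- worksOnlyWithDeepSubsumption :: Int
-- worksOnlyWithDeepSubsumption = g f    -- rejected without the extension
```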

All the other breaking changes seem unlikely to bite in practice. But I guess they could still show up somewhere in the transitive closure of your projects’ dependencies.

That is also worth noting: making the compiler stable is one thing, but you also need to deal with the ecosystem. Is it really possible to stabilize a large enough part of the ecosystem?

Also, while making that list of breaking changes, I thought it would be very helpful to have a comprehensive list of breaking changes for all GHC versions, and perhaps also for other popular libraries. Every breaking change could include example error messages and migration strategies. That sounds a lot like the error messages index to me. Maybe this is a good idea for HF @david-christiansen or the stability working group?

Forking at 8.10 would give us another Eta. A laudable initiative, but it’ll fall behind too fast, feature-wise, to sustain interest.

3 Likes

I was also thinking of Conway’s law, but for opposite reasons! I really don’t think the standardized-vs-implementation-defined barrier is what stands in the way of addressing cross-cutting concerns. It’s not causation, but mutual correlation. If you have one (funded) team seeking to develop a successful language, then you will get a tooling and support ecosystem that is integrated as part of a single effort. If you have different groups working on the compiler, the editor support, the build system, etc., then you will naturally get OS-level process boundaries as a way to enforce modularity between the various teams’ components.

It also so happens that if you have one funded team seeking to develop a successful language, you will tend to get an implementation-defined language, because that is the default; only in some fairly rare cases is there a compelling enough reason to drive a standardization process all the way through.

(And typically that happens best once a language is “settled”, i.e. when there is no longer a core team trying to drive it forward.)

There are plenty of mainly implementation-defined languages that have none of the tooling benefits we see from the integrated structure of Racket or Rust – e.g. Python, Perl, Ruby!

To be really provocative about it, I’d even posit that the evolution of Racket has been somewhat retconned. It was not first a language that came along with a development environment. Rather, it started as an IDE for Scheme that grew an integrated ecosystem to such a degree that it eventually became a language in its own right!

1 Like

A group of volunteers could just start backporting things into the ghc-8.10 branch of GHC. If they made significant progress and maintained the required level of quality, I imagine it would be possible to make more releases in the 8.10 series. This is orders of magnitude less effort than forking the compiler, and I still do not believe it will ever happen.

4 Likes

…yes: finding such a group of volunteers who haven’t already grown tired of constantly chasing after the “innovation inferno” that GHC has turned into would be difficult. It’s probably easier for them to e.g. just grab the “plus-sized” copy of Hugs, make the necessary changes, and build and install it instead (most likely for their own personal use) - 75% of the features for 75% less hassle…

But who knows: in the future, those HLS devs, also thoroughly fatigued by the status quo, may choose to first build an HLI (Haskell Language Interpreter) to investigate the potential for developing their own compiler.

The fact that Hugs was happily left to R.I.P. many years ago, without much attempt to resuscitate it, suggests that your characterization of GHC as an “innovation inferno” is very much wrong.

2 Likes

I believe improvements to the RTS are quite orthogonal to surface language features (also in terms of the GHC codebase), so I don’t see a problem with backporting them.

But GHC devs might know more.

My very personal estimate of GHC’s innovation: the only feature of real value added since Hugs was left behind (late 2006) is pattern synonyms - and even there, I think the idea is promising, but the actual implementation is unfriendly to those declaring the synonyms.
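For anyone who hasn’t written one, here is a minimal sketch of what declaring pattern synonyms looks like (the module and names are mine, purely illustrative):

```haskell
{-# LANGUAGE PatternSynonyms #-}
module PatSyn where

-- Implicitly bidirectional: one equation gives both the matcher and the
-- builder.
pattern Cons :: a -> [a] -> [a]
pattern Cons x xs = x : xs

-- Explicitly bidirectional: the match direction (<-) and the construction
-- direction (the `where` clause) have to be written out separately - the
-- kind of declaration overhead being complained about above.
pattern Singleton :: a -> [a]
pattern Singleton x <- [x]
  where
    Singleton x = [x]
```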

There were plenty of innovations as of 2006 that were left half-finished or even withdrawn: FunDeps, Overlapping Instances, pre-1999 Datatype Contexts, TRex – which PureScript has more or less implemented.

I agree with the distaste expressed above for GHC 9.0 onwards: of course performance improvements are always valuable, but the ‘features’ are making GHC’s syntax too hard to keep up with. I’m no longer even trying.

At work we are very happy after upgrading to 9.2. The upgrade was not difficult and we only saw benefits.

I would not like to live without RecordDot again.
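For anyone who hasn’t tried it, here is a minimal sketch of what OverloadedRecordDot (the extension behind “RecordDot”, new in 9.2) buys you; the record types are made up for illustration:

```haskell
{-# LANGUAGE OverloadedRecordDot #-}
module Dot where

-- Made-up records, just to show the syntax.
data Address = Address { city :: String, country :: String }
data Person  = Person  { name :: String, address :: Address }

greeting :: Person -> String
greeting p = "Hello, " ++ p.name ++ " from " ++ p.address.city
  -- p.name and p.address.city instead of (name p) and (city (address p));
  -- often paired with DuplicateRecordFields so different record types can
  -- share field names.
```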

The negativity against 9.2 puzzles me.

11 Likes

I guess there are three kinds of people: Those who happily use 9.2, those who are working on upgrading to it, and those who gave up and spend more time on the forums :wink:

10 Likes

You forgot (at least) three other kinds of people:

  • Those who think Haskell has ended up being a language for “propeller/egg heads” or “elitists”, and have quietly gone elsewhere;

  • Those who are no longer interested in the “busy-work” of rewriting legacy code to keep up with the latest fads (“innovations”) from Glasgow Haskell Central, and have quietly gone elsewhere;

  • Those who have seen other single-implementation languages arrive and depart:

    […] since the compiler was written in LML [with only one implementation remaining] it was more or less doomed to dwindle.

    …and are understandably cautious about being too dependent on another one: Haskell.

  • (…there are probably others ;-)

If that were true, why would Hugs have been abandoned since 2006? If your longing for Hugs were more than sweet nostalgia, someone would be keeping the lights on.

2 Likes

…actually, some people still do (even if it is just essential maintenance). And who knows what could happen when the second Rust compiler arrives:

At least Hugs is written in a standardised language with at least two major implementations ;-)

Sorry, but even Miranda gets more updates :wink:

1 Like

Miranda(R), like Erlang, is a language with “commercial backing” rather than just being sponsored (to varying levels).

But it’s nice to know it’s still getting some attention after all these decades… :-)

(…wait: what language is Miranda(R) implemented in again?)

    Miranda(R), like Erlang, is a language with “commercial backing” rather than just being sponsored [link to Haskell Foundation website] (to varying levels).

“just being sponsored” is also a weird way to say “has over $500k/year in funding from commercial backers”.

Haskell is financially backed. There are multiple companies that add features to GHC constantly, including:

  • Well-Typed (backed by Hasura, IOG, etc.) has, from a cursory look, at least 6 employees who commit to GHC regularly
  • A bunch of companies funded HLS and donated employee hours
  • IOG is merging GHCJS upstream
  • Tweag has done a whole bunch of stuff on GHC
  • Serokell funds a bunch of work on GHC
  • The list goes on: Obsidian, Meta, Mercury…
1 Like