Single-compiler vs. multi-compiler (or: implementation-defined vs. standardized) languages

A lot of alternative compilers have already been mentioned, like Hugs, uhc, nhc, Helium, etc. But I want to mention one additional compiler: haskell-src-exts. Well, technically it isn’t a compiler, it is “only” a parser, but it was the foundation of a lot of great compiler-like Haskell tooling. We had renamers, mutation testing frameworks, source-to-source supercompilers, linters, refactoring tools, etc. (The list of reverse dependencies of haskell-src-exts is here.) One of the ideas was to have a suite of libraries that tooling could build on, the haskell-suite.

Of all the tooling that was developed, most became unmaintained; some, like hlint, invested the effort and switched to ghc-lib-parser; but none of the tools still relying on haskell-src-exts work reliably on all modern GHC-Haskell code.

In my opinion, we need a good replacement for the haskell-src-exts ecosystem more than we need another “Haskell source code to machine instructions” compiler. Nowadays it is already possible to use ghc-lib-parser directly, but it is still much more inconvenient than what haskell-src-exts provided. And writing tooling that relies on any of the later AST representations, like renamed or typechecked Haskell code, by going directly through the GHC API, is both extremely difficult to do and difficult to maintain.
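For readers who never used it, here is a minimal sketch (assuming the haskell-src-exts package is installed; the module text is made up) of the parse/inspect/pretty-print round trip that much of this tooling was built on:

```haskell
-- Parse a Haskell module, then regenerate source text from the AST.
-- This is the kind of self-contained workflow haskell-src-exts made easy.
import Language.Haskell.Exts (ParseResult (..), parseModule, prettyPrint)

main :: IO ()
main =
  case parseModule "module Demo where\n\nanswer :: Int\nanswer = 42\n" of
    ParseOk ast       -> putStrLn (prettyPrint ast)  -- source rebuilt from the AST
    ParseFailed loc e -> putStrLn ("Parse error at " ++ show loc ++ ": " ++ e)
```

Doing the same thing through ghc-lib-parser is possible today, but requires setting up DynFlags and a parser state by hand, which is part of the inconvenience mentioned above.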

If we don’t have the necessary engineering power to maintain “just” another Haskell parsing library, I don’t see how another Haskell compiler, sufficiently feature-rich to support modern Haskell libraries, could be sustained.

But to end on a very optimistic note: I am looking with great enthusiasm at a lot of the effort that people are spending in making GHC more modular. Similarly, the splitting of cabal into separate libraries is making it easier to develop tooling for cabal files. I really liked David’s Haskell Symposium keynote about focusing on tooling in the next decade, and I think a more library-oriented approach to our compiling infrastructure is the way to get there.

5 Likes

Facebook is running a GHC fork afaik (or heavily patched one).

Standard Chartered has its own (incompatible) Mu compiler.

Industry has enough resources, if they really want to. To me it just seems there are no combined efforts and so they either do their own thing or stick to GHC and accept the shortcomings wrt their priorities.

A company that employs 50+ Haskell devs should do everything it can to improve compilation time, because it directly correlates with developer productivity. Interactivity and short feedback loops are immensely underprioritized, in my opinion.

I think HF could very well contact major industry users and get feedback about their priorities wrt the Haskell compiler.

Otherwise we’re just discussing this ad hoc for the 10th time with no interesting data.

1 Like

Interestingly, they are trying to switch to GHC for their front end (parsing & type checking) because they don’t want to maintain a full compiler stack themselves.

6 Likes

Yes, but the reason an alternative on-par compiler doesn’t exist is not lack of resources.

I’m not sure SCB has a lot of interest in the new language features of GHC 9.x (at least I don’t as a Mu developer). Mu doesn’t even support all of 8.10.7 language extensions.

So if such a GHC 8.10.7 fork existed and was alive and well, maybe that would be a much more interesting base/target (maintaining patches, backends, or depending on the GHC API is much less work if the compiler prioritizes stability… ask the HLS devs).

That’s ofc my opinion.

2 Likes

I had the same thoughts before. But GHC 9.2 has RTS features powerful enough to make pre-9.2 compilers look quite modest:

  • Info table profiling: this e.g. pinpoints the specific line of code that contains a space leak. It’s now impossible to imagine how we lived without it. See the blog post for more info.
  • The -Fd option for returning memory to the OS. Before that option, Haskell programs could consume 4x (!!!) more memory than they actually use. See also the blog post with more explanation.
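For the record, a hypothetical invocation showing how both features are switched on (flag names are from GHC ≥ 9.2; the program name is made up):

```shell
# Compile with info-table provenance so heap profiles map back to source:
#   -finfo-table-map            attach source locations to info tables
#   -fdistinct-constructor-tables  disambiguate per-constructor allocations
#   -eventlog                   link the eventlog-capable RTS (pre-9.4)
ghc -O2 -finfo-table-map -fdistinct-constructor-tables -eventlog Main.hs

# Run with an info-table heap profile written to the eventlog (-hi -l),
# and set the memory-return factor (-Fd, default 4; lower returns more
# aggressively) so unused heap goes back to the OS.
./Main +RTS -hi -l -Fd4 -RTS
```

The resulting eventlog can then be rendered with tools such as eventlog2html.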

Improvements to the RTS come along with breaking changes and language changes. Unfortunately, they’re not decoupled enough at this stage, so if you want to benefit from ground-breaking observability and performance improvements, you need to swallow a bitter pill and pay the upgrade cost.

On the brighter side, such runtime behaviour improvements justify the upgrade cost quite well.

6 Likes

…and now a growing number of Haskell users have made the judgement that the “upgrade cost” is too high, otherwise they probably wouldn’t still be using version 8.x.y - if it’s not being used and it’s expensive, is it really there? Furthermore, a variety of new features have already been transferred to older GHC versions.

However if there was another Haskell compiler, that process could go in both directions: patches which mend problems in the other compiler could also prove useful in GHC.

Is it really growing? I remember 7.10 stayed quite popular for a long time after GHC 8 had already been released.

What are really the blocking breaking changes now that DeepSubsumption is back? I can only find:

  • 8.8 removes the fail method from the Monad class
  • 9.0 makes TH splices separation points for constraint solving passes
  • 9.2 simplifies subsumption, but old behavior can be restored by using DeepSubsumption
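To make the first item concrete, here is a small sketch (function name is made up; only base is needed) of the kind of code the MonadFail change affects:

```haskell
-- Before GHC 8.8 / base 4.13, `fail` was a method of Monad, so a
-- failable pattern in do-notation only required a Monad constraint.
-- Since the MonadFail proposal landed, such patterns desugar to a call
-- to MonadFail's fail instead, so the monad needs a MonadFail instance.
firstWord :: String -> Maybe String
firstWord s = do
  (w:_) <- Just (words s)  -- pattern can fail: desugars to fail on []
  Just w                   -- Maybe has a MonadFail instance, so this compiles

main :: IO ()
main = print (firstWord "hello world", firstWord "")
-- prints (Just "hello",Nothing)
```

Code that defined custom monads with a `fail` implementation, rather than merely using failable patterns in standard monads, is what actually breaks and needs a MonadFail instance added during migration.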

All other breaking changes seem unlikely to bite in practice. But I guess they could still occur somewhere in the transitive closure of all your projects’ dependencies.

That is also good to note: making the compiler stable is one thing, but you also need to deal with the ecosystem. Is it really possible to stabilize a large enough part of the ecosystem?

Also, while making that list of breaking changes, I thought it would be very helpful to have a comprehensive list of breaking changes for all GHC versions, and perhaps also for other popular libraries. Every breaking change could include example error messages and migration strategies. That sounds a lot like the error messages index to me. Maybe this is a good idea for HF @david-christiansen or the stability working group?

Forking at 8.10 would give us another Eta. A laudable initiative, but it would fall behind too fast, feature-wise, to sustain interest.

3 Likes

I was also thinking of Conway’s law, but for opposite reasons! I really don’t think the standardized-vs-implementation-defined barrier is what stands in the way of addressing cross-cutting concerns. It’s not causation, but mutual correlation. If you have one (funded) team seeking to develop a successful language, then you will get a tooling and support ecosystem that is integrated as part of a single effort. If you have different groups working on the compiler, the editor support, the build system, etc., then you will naturally get OS-level process boundaries as a way to enforce modularity between the respective teams’ components. It also so happens that if you have one funded team seeking to develop a successful language, you will tend to get an implementation-defined language, because that is the default; only in some fairly rare cases is there a compelling enough reason for a standardization process to fully take place.

(And typically that happens best once a language is “settled” – i.e. when there is not a core team trying to drive it forward any more).

There are plenty of mainly implementation-defined languages that have none of the tooling benefits we see from the integrated structure of Racket or Rust – e.g. Python, Perl, Ruby, etc.!

To be really provocative about it, I’d even posit that the evolution of Racket has been somewhat retconned. It was not first a language that later grew a development environment. Rather, it was first an IDE for Scheme that grew an integrated ecosystem to such a degree that it eventually became a language!

1 Like

A group of volunteers could just start backporting things into the ghc-8.10 branch of GHC. If they made significant progress and maintained the required level of quality, I imagine it would be possible to make more releases in the 8.10 series. This is orders of magnitude less effort than forking the compiler, and still I do not believe it will ever happen.

4 Likes

…yes: finding such a group of volunteers who haven’t already grown tired of constantly chasing the “innovation inferno” that GHC has turned into would be difficult. It’s probably easier for them to e.g. just grab the “plus-sized” copy of Hugs, make the necessary changes, and build and install it instead (most likely for their own personal use) - 75% of the features for 75% less hassle…

But who knows: in the future, those HLS devs, also thoroughly-fatigued with the status quo, may choose to first build the HLI (Haskell Language Interpreter), to investigate the potential for developing their own compiler.

The fact that Hugs was happily left to R.I.P. many years ago, without much attempt to resuscitate it, suggests that your “innovation inferno” estimate of GHC is very much wrong.

2 Likes

I believe improvements to the RTS are quite orthogonal to surface language features (also in terms of the GHC codebase), so I don’t see a problem with backporting them.

But GHC devs might know more.

My very personal estimate of GHC’s innovation: the only feature of value added since Hugs was left behind (late 2006) is Pattern Synonyms - and even there, I think the idea is promising but the detailed implementation is unfriendly to those declaring the synonyms.

There were plenty of innovations as of 2006 that were left half-finished or even withdrawn: FunDeps, Overlapping Instances, pre-1999 Datatype Contexts, TREX – which PureScript has more-or-less implemented.

I agree with the distaste expressed above for GHC 9.0 onwards: performance improvements are of course always valuable; the ‘features’ just make GHC’s syntax too hard to keep up with. I’m no longer even trying.

At work we are very happy after upgrading to 9.2. The upgrade was not difficult and we only saw benefits.

I would not like to live without RecordDot again.

The negativity against 9.2 puzzles me.

11 Likes

I guess there are three kinds of people: Those who happily use 9.2, those who are working on upgrading to it, and those who gave up and spend more time on the forums :wink:

10 Likes

You forgot (at least) three other kinds of people:

  • Those who think Haskell has ended up being a language for “propeller/egg heads” or “elitists” and have quietly gone elsewhere;

  • Those who are no longer interested in the “busy-work” of rewriting legacy code to keep up with the latest ~~fads~~ “innovations” from Glasgow Haskell Central and have quietly gone elsewhere;

  • Those who have seen other single-implementation languages arrive and depart:

    […] since the compiler was written in LML [with only one implementation remaining] it was more or less doomed to dwindle.

    …and are understandably cautious about being too dependent on another one: Haskell.

  • (…there are probably others ;-)

If that were true, why would Hugs have been abandoned since 2006? If your longing for Hugs were not just sweet nostalgia, someone would be keeping the lights on.

2 Likes

…actually, some people still do (even if it is just essential maintenance). And who knows what could happen when the second Rust compiler arrives:

At least Hugs is written in a standardised language with at least two major implementations ;-)