8 months of OCaml after 8 years of Haskell in production

Please, can we not start this Neo thing again :slight_smile:


But wouldn’t a successor to (Glasgow) Haskell be the ultimate innovation? If not, another option is a new Haskell compiler…

Basically, you think Haskell adoption (especially in industry) would increase if there were less friction. Unfortunately, it works the other way around: that friction would diminish overnight if one of the big companies decided to use Haskell.

The problem, if I am right, is that it means big companies don’t believe in Haskell (even though we know they are aware of its existence), and that we are wrong: Haskell is not the silver bullet we like to think it is.

1 Like

I think the friction is more philosophical than technical tbh. Better tools won’t resolve that.

But we don’t need anyone and everyone to adopt Haskell! There’s a saying: “Marketing is deciding who not to sell to” :grin:

1 Like
  • hiring is hard
  • talented Haskell juniors will stay 1-2 years max in your company, before they transition to an employer you can’t compete with

So it’s both hard for companies to hire and not that hard for developers to find a job (which many people seem to have an issue with)?

Excellent points about the other aspects. For me, the main fun of Haskell is how elegantly one can solve problems in it. Basically, how amazingly good a good software design looks in Haskell.

Unfortunately, people rarely write about simple Haskell that does the work well as it’s not that interesting. But people love to write about crazy experiments. Which is great and pushes progress forward, but it leads others to think that this is Haskell. Then people start using experimental effect systems or lenses everywhere, and we get:

  • onboarding new Haskellers is always challenging: people are into different corners of the language, and what seems like an easy-to-understand codebase to some is a nightmare for others

I think Haskell lacks a body of industrial usage wisdom. Everyone knows that C++ template metaprogramming leads to overcomplication, but not everyone knows which Haskell features/approaches should or should not be used in production systems.

Haskell’s expressiveness is a double-edged sword: one can implement both an amazingly simple and an amazingly complicated solution. And this is something that should be covered more in the Haskell world (“software engineering fun”?).

One can jump to OCaml in search of simplicity, but that’s like throwing the baby out with the bathwater.


Heh, I wrote a little (over-simplified) gist about the tension that arises from this in general (not just for juniors) years ago: PLs-misaligned-incentives.md · GitHub

1 Like

That friction would diminish overnight if one of the big companies decided to use Haskell

IIUC, the current OCaml splash is mostly thanks to Jane Street backing a lot of new features.

It’s quite funny that about 15 years ago OCaml was basically a dead language. I could have written a similar “4 months of Haskell after 4 years of OCaml” when I made the opposite switch back then:

  • Haskell has a parallel runtime,
  • more features,
  • more libraries,
  • more tools,
  • bigger and more active community,
  • nicer syntax,
  • type classes, lazy evaluation, …
  • slightly slower in some sequential programs, but we could make up for it with parallelism

Only the parallelism aspect has changed recently (with OCaml 5’s multicore runtime).

What’s interesting is that more features lead to unnecessary over-complication, and that’s the point where I completely agree with @ChShersh. I’m not yet sure what to do about this.


Fair enough, but in the case of Haskell, it is easy to limit features in an enterprise context by limiting the authorized extensions.
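For instance, a team could pin the allowed extension set in the project’s build configuration and lint against everything else. A sketch (the allow-list below is purely illustrative):

```cabal
-- In the .cabal file: the only extensions enabled project-wide.
library
  default-language:   Haskell2010
  default-extensions: OverloadedStrings
                      LambdaCase
  ghc-options:        -Wall -Werror
```

Since `default-extensions` doesn’t stop anyone from adding a per-file `{-# LANGUAGE #-}` pragma, hlint can flag pragmas outside the allow-list:

```yaml
# In .hlint.yaml: ban all extensions by default, whitelist a few.
- extensions:
  - default: false
  - name: [OverloadedStrings, LambdaCase]
```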


…just remember to get ear-plugs or noise-cancelling ear/head-phones when the bikeshedding begins at each code review: "this would be so much easier if we could just use $LANG_EXTENSION…"

1 Like

Unfortunately, it’s possible to abuse even basic features like type classes and lazy evaluation (monad transformers and lenses do not require many extensions).
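Lazy evaluation in particular bites even in tiny programs; the classic example (a sketch, not from this thread) is a lazy left fold accumulating unevaluated thunks:

```haskell
import Data.List (foldl')

-- Lazy foldl defers every (+), building a chain of thunks
-- proportional to the list length before anything is forced:
sumLazy :: [Int] -> Int
sumLazy = foldl (+) 0    -- risks a space leak on large inputs

-- foldl' forces the accumulator at each step, so the fold
-- runs in constant space:
sumStrict :: [Int] -> Int
sumStrict = foldl' (+) 0
```

Both produce the same result; only the evaluation behaviour (and memory profile) differs.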

Even more concise syntax can be abused. OCaml uses @@ instead of $ and doesn’t have a standard function composition operator, so one needs to be either more explicit/wordy or not abstract some things at all.

Haskell allows a much better function composition style, but one needs to find a balance between making code cleaner and making it point-free garbage.
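As a small illustration (hypothetical function names, not from the thread), the same pipeline can be written explicitly, with tasteful composition, or in opaque point-free style:

```haskell
-- Explicit: every intermediate step is spelled out.
countLongWords :: String -> Int
countLongWords s = length (filter isLong (words s))
  where isLong w = length w > 4

-- Composed: still reads right-to-left as a clean pipeline.
countLongWords' :: String -> Int
countLongWords' = length . filter ((> 4) . length) . words

-- Over-abstracted point-free style: equally correct, but the
-- data flow is no longer visible at a glance.
countLongWords'' :: String -> Int
countLongWords'' = foldr ((+) . fromEnum . (> 4) . length) 0 . words
```

All three compute the same count; the middle version is usually the sweet spot the post is describing.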


No language, natural or artificial, will make it unnecessary to learn and practice eloquence.


There it is:

…it wasn’t always like this:

So Haskell wasn’t always the centre of innovation it’s seen as now, with some of that occurring in a smaller Haskell-style language with its own (also smaller) implementation. Based on that observation, here’s a simple idea:

  • bring back Gofer for the purpose of innovation and exploration,

  • leaving Haskell to stabilise for use in production and education.

Ideally Haskell would then be a subset of Gofer, with new language features being migrated from Gofer to Haskell once they have proven to be useful for more than just (re)implementing Gofer. So there would be a need for Gofer to be accessible to the wider community, but in the manifest knowledge that Gofer will always be “subject to change with little, if any notice” - that would just be “life on the cutting-edge”


I don’t think either direction is correct. Real-world complex systems evolve through small changes. In principle it’s possible that a large industrial user base could trigger a tipping point that eliminates friction in the language; on the other hand, it’s possible that a large user base could be wiped out by too-high friction, especially once an alternative with less friction arises. Nonetheless, magically creating a large industrial user base is not one of the possibilities before us, so the only question that’s really important in this regard is “will the benefits of reducing friction be worth the costs?”. I suspect they will.


True, but 20 years ago Haskell tooling was on par with, if not better than, other languages’.
For C++, there was nothing if you were working on Unix (which was my case at work).
You had to write your makefiles manually, download external libraries yourself, there was no REPL, etc.
Same for Python (pip was released in 2011). Cabal and GHCi were actually cutting edge. Writing a cabal file was a breeze compared to a makefile.
Having a REPL for a compiled language was also really good.
This “advance” in tooling didn’t boost Haskell adoption, nor did C++’s lack of tooling deter industry from adopting it.

I believe that, even though the tooling could of course be better, it’s good enough, and the people who would give up on Haskell because of the tooling would have given up anyway.


I’m not exactly sure what your claim is, so it’s hard to know what exactly we disagree on. Mine is this: I can’t imagine Haskell becoming a widely-used language in industry if it doesn’t fix many of the frictions in its ecosystem first. This does not imply that those frictions wouldn’t be tolerated if Haskell was already widely used (although I think they wouldn’t) nor that fixing those frictions is the only thing that needs to be resolved nor even that Haskell can or should ever become widely-used at all – just that given from where we’re starting, I don’t see how we could proceed to wider use without fixing those frictions.


What I mean is, reduce the frictions for the current users (to make OUR lives easier) but don’t expect to gain new users from it.

1 Like

The problem with that curious reasoning is that most “current users”:

  • learned to use Haskell when it was a much smaller and simpler language;

  • and, as @f-a noted elsewhere, have since acquired more Haskell experience and knowledge.

Because of that, only making things easier for “current users” would indeed fulfil your expectation to not “gain new users from it” (apart from a few determined individuals) - having seen such a comment, why would a new user even bother?

You are misinterpreting what I said (or maybe I didn’t express myself well).
What I mean is that I doubt reducing friction will increase the user base.
So improving the tooling is good; just don’t be disappointed if it doesn’t result in more adoption.

Tooling is like the manners of a language. It’s the first impression you get when interacting with it. And it certainly can drive new (and existing) users away. Others won’t mind, because they focus on the value underneath.

But it most definitely is also a value considered in industry when weighing language/technology options, because tooling can and does improve productivity (and fun).

Why is C++ still used in industry? Because there hasn’t been a good alternative with better tooling. But that seems to be changing too.


This post on Reddit:

“At this point I’m convinced that Monads aren’t really a thing in programming. It’s just a buzz word Haskell programmers throw out to make themselves sound smart.”

is instructive on how Haskell can be perceived in the outside world (though it might not be representative…).

Warning: it’s not a nice read.

1 Like