8 months of OCaml after 8 years of Haskell in production

Fair enough, I think your “concerns” about Haskell are totally valid and valuable feedback.


I’m familiar with the downsides of this approach too :disappointed:
Still, trade-offs are everywhere, and in the end I tend to follow my preference while staying aware of all the pitfalls. I guess I got bitten more by the lean-standard-library approach :sweat_smile:

Regarding error messages, I take your point that “this is just one example (and most likely not the best one)” so here’s an attempt at an example that is fairer, because it doesn’t involve typeclasses or overloading of operators and numeric literals (that OCaml doesn’t support anyway). OCaml’s message is still better though.

True, I’m still a bit sour about this particularly confusing GHC behaviour :sweat_smile:

Actually, GHC error messages are even better now, as you get a stable unique error code which you can read more about online with examples and suggestions on how to fix them!

GHCi, version 9.8.1: https://www.haskell.org/ghc/  :? for help

ghci> True && [False, False, True]

<interactive>:1:9: error: [GHC-83865]
    • Couldn't match expected type ‘Bool’ with actual type ‘[Bool]’
    • In the second argument of ‘(&&)’, namely ‘[False, False, True]’
      In the expression: True && [False, False, True]
      In an equation for ‘it’: it = True && [False, False, True]

The ideal language for me would be a set of languages that are easily interoperable, where I can pick the complexity level I get exposed to based on the problem at hand.

C#, F# and F* somewhat go into that direction, but it’s still a long way.

Haskell seems to go the “let’s retrofit whatever we can” route. I’m not a fan of that approach.


Actually, in this case Haskell’s message is still better than OCaml’s:

# true && [false; false; true];;
Error: This variant expression is expected to have type bool
There is no constructor :: within type bool

We can thank OCaml’s ‘type-directed disambiguation’ of constructors for this confusing error.


In less nebulous terms? The first step to fixing problems is defining them clearly.


But this is how the feature shoot-out trade-off table should really end:

| lang | research potential | stability (∴ production use, comfort zone) |
| --- | --- | --- |
| haskell | more | less |
| ocaml | less | more |
| cobol (sorry, couldn’t resist :laughing:) | none | infinite |

Lately I’ve been working a lot with Standard ML (so, basically, OCaml–) and one thing I absolutely miss is generic deriving. For this codebase I basically need to define a map function for every datatype, but without typeclasses and generic deriving I have to write all of that error-prone boilerplate myself.
Related to this are newtypes, which I miss a lot. IMO deriving (via)/newtypes is by far the best Haskell feature.

What I like in OCaml is that `let rec` is explicit, so I can easily shadow variables and the earlier binding goes out of scope in the rest of the function.
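As a small sketch of that shadowing pattern (the function name `normalize` is just an illustrative choice):

```ocaml
(* Without [rec], each [let line = ...] on the right-hand side refers
   to the previous binding, and then shadows it for the rest of the
   function body. *)
let normalize line =
  let line = String.trim line in
  let line = String.lowercase_ascii line in
  line

let () = assert (normalize "  Hello World " = "hello world")
```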


I’ve done so on various occasions. Yes, there’s progress on some fronts, but I believe that progress often rests on specific individuals (e.g. SPJ or Moritz pushing the stability efforts).

The problem is that it’s not primarily a technical issue, but one of aligning perception and defining priorities. So, software engineering approaches don’t work that well.

Some examples:

All these issues talk about perception, goals and priorities. These are in flux depending on the members of committees, the maintainers of tools and libraries, as well as the funding and clientele of some companies.

The end user experience is always an afterthought.


This is a great case in point of why I believe the situation is improving. I hope this doesn’t come across as impolite, because that is not my intention, but this is my perception: a couple of years ago stability was not on SPJ’s radar. He didn’t think about it and he didn’t deeply appreciate how important stability is to a language ecosystem. Now he does, moreover he is one of the biggest cheerleaders for stability! That is the outcome of a lot of challenging values-aligning work by several people (particularly the stability working group).


And I don’t want to sound like a groupie, but SPJ is special, not just because he’s the father of Haskell, but because he keeps surprising us with his way of reasoning, collaborating and listening to other people’s arguments.

But I’m not convinced that shows a shift in community perception. I think some of the tension we experience is intrinsic, due to the roots of Haskell, and will never truly cease to exist. A language like Go does not have much of this tension, partly because its community is focused almost uniformly on “getting things done” and has a low appetite for experiments.

Haskell will always attract people with high appetite for experimentation and that will keep causing churn, tension and debate about stability, goals, complexity etc.

Yes, I believe the GHC SC stability document is a major step forward, which is why I’ve actively participated in those discussions.

And yet, my feeling is always that these things depend way too much on the support of individual people, and the fact that we only got here after 33 years of Haskell says a lot about (past) perception and priorities.

These things are nebulous and you really only experience them once you want to change something non-trivial. That is why I keep trying to push towards a “think about the end user first” approach. And I don’t think I’ve been particularly successful… it’s one thing to improve a library or tool yourself and a completely separate thing trying to change perception. You can’t work against the community perception for a long period of time.

So I guess my outlook is more pessimistic than yours.


In OCaml, code generation is enabled by preprocessor tools called PPXs (one example is ppx_deriving). It’s not as powerful as Template Haskell (it doesn’t have access to types, only to syntax, so it’s more like Rust macros) but it works well in practice :relieved:

Instead of being type-directed, it relies more on naming conventions and encapsulation, but I find this trade-off reasonable.
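For instance, a minimal sketch with ppx_deriving, assuming the `ppx_deriving.show` and `ppx_deriving.eq` plugins are enabled in the build (e.g. via dune’s `(preprocess (pps ...))`):

```ocaml
type shape =
  | Circle of float
  | Rect of float * float
[@@deriving show, eq]
(* The preprocessor generates, by naming convention:
   [show_shape : shape -> string], [pp_shape],
   and [equal_shape : shape -> shape -> bool]. *)

let () =
  print_endline (show_shape (Circle 1.0));
  assert (equal_shape (Rect (2., 3.)) (Rect (2., 3.)))
```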

Related to this are newtypes, which I miss a lot. IMO deriving (via)/newtypes is by far the best Haskell feature.

OCaml actually has newtypes as well. You can define them like this:

type size = Size of int [@@unboxed]

I think that ppx_deriving could leverage this to derive things differently. I’m not sure whether that has been done already, but I like that it can be handled at the library and ecosystem level, keeping the compiler from becoming a bottleneck in the pipeline.
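For what it’s worth, a tiny sketch of how such a newtype behaves in use (the `add` function is just an illustrative example):

```ocaml
type size = Size of int [@@unboxed]
(* [@@unboxed] makes [size] a zero-cost wrapper: at runtime a [size]
   is represented exactly like the [int] it wraps, while the type
   checker still keeps [size] and [int] distinct. *)

let add (Size a) (Size b) = Size (a + b)

let () = assert (add (Size 2) (Size 3) = Size 5)
```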


To summarize how things appear to me:

We both believe that “culture eats strategy for breakfast”. Substantial elements of Haskell’s culture make it difficult to implement sustainable improvements to user experience. That makes you pessimistic, because you know that culture is hard to change. On the other hand I’m optimistic because I believe the culture will change.

Does that sound like a fair summary?


What makes you think that?


How many years does Haskell have left? This thread revived my worry that Haskell is becoming basically dead.

How many years does Haskell have left?

In the continuing absence of an all-new general-purpose non-strict programming language - many.


My concern is whether that would be enough to keep enough people invested in Haskell. What if non-strict evaluation comes to be considered ‘legacy’?

More points for lazy evaluation (2011) provides an informative example of what would be lost. If that isn’t convincing enough, read section 1, “Conventional Programming Languages: Fat and Flabby”, of Can Programming Be Liberated from the von Neumann Style? A Functional Style and Its Algebra of Programs (1978) by John Backus to better appreciate just how dystopian a programming future devoid of non-strict semantics would be…


Yeah, I mean, every other part of the world is heading straight into dystopia. What would prevent programming from following the trend?


Welp, maybe it fizzles out, maybe it gets huge traction. It all depends on what people do, so there’s no predicting.

If it helps, I’ve been involved in a little game called Garry’s Mod, which has existed since 2005 and was released for purchase in 2006. It was just $10 at the time. We’re in 2024 now and that game still has a huge community. People are still making awesome things in the Lua it supports. It’s not as old as Haskell, but it has defied all my expectations.

The world of politics may be bleak, but I am convinced that Haskell will live a long and happy life yet. Disagreements about direction aside, I think we have a beautiful community with wonderful people. Everyone is brilliant in their own way. The Haskell Foundation is also doing amazing work.

There’s no predicting the future, but right now I think Haskell, and its people, are doing quite alright.


I don’t mean to draw parallels… but if it helps, look at PHP? :slight_smile:

Conspicuous failure to die, and actually some significant and sustained improvement of long-standing weaknesses. It’s possible if enough people want it.