Introducing NeoHaskell: A beacon of joy in a greyed tech world

Thank you @cookE (and @f-a for reaching out), yes I think that’s sensible.

Better to put your efforts into moving from ‘thinkware’ (nobody here said ‘vapourware’) to some prototyping or deliverables, so we can understand better what we might get excited about.



I recommend reading the first few sections and the conclusion of “A History of Haskell” - it may help you to understand why some of those solutions weren’t chosen for Haskell, and possibly why those new to Haskell (but having prior experience with more-imperative languages) can find the transition to be challenging.

I also recommend reading “Why Functional Programming Matters” to better understand why we, as its users, believe that learning Haskell is worth the challenge.


The main thing I see here is people having strong opinions on:

  • how to engage contributors
  • how to market your ideas
  • whether the ideas are beneficial or harmful to the existing ecosystem (both are possible, and that must be openly discussed)

As such, it is important to share this feedback as early as possible. But it’s hard to share concrete disagreements, since these are still just ideas, and they even seem to be in flux.

So far, the project has caused excitement, but also confusion.

As an analogy: if you announced on Discourse that you were going to fork GHC, possibly with proper funding, a lot of people would be anxious and a lot of people excited. You’re essentially creating an event horizon. This has happened before in our community and the results were mixed. Or: if you write a CLC proposal to ditch base for an alternative Prelude, you’ll probably get very mixed responses as well.

So, if you come up with grand and novel ideas, you should be prepared to be challenged. If you’re not, why are you going public with it?

In the end, this is a tech community.

It is ok to have strong opinions (as a proposer and as a challenger). It shows engagement. While we should value that people spend so much time on thinking about Haskell, we should also be aware that it doesn’t guarantee good decisions.

In my experience, sourcing opinions from the wider Haskell community is a tricky thing. In the past I was a big fan. Today I’m not, unless it’s about something very specific, like use cases for a feature. The noise and misunderstanding are often more draining than the actual valuable criticism.

I can’t say whether this project has merit or not. I find it hard to judge at this point. So I’ll just keep an eye out for it. Maybe it’ll take off at some point and surprise us all.

But if the original author thought this would be a walk in the park, free of controversy, I think they too will have to adjust their expectations.

While we should promote respectful speech, we should also be very careful to not discourage each other from voicing strong opinions, unless we’re ok to lose diversity of thought.


I too find very strange the caricature of Haskellers that is presented in the linked post. None of the Haskellers I interact with on a daily basis in my volunteer work improving the Haskell ecosystem are like that at all. They are all very focused on making Haskell a more practical and accessible language. Similarly, none of the Haskellers I’ve come across in my day jobs are like that either. They are all very interested in getting practical things done to generate good outcomes for their companies.

However, I think we have to accept that the Haskell community can be perceived to be elitist and snobbish and that perception is worth countering. Given what I said above, hopefully that shouldn’t be too hard!


That’s true. I think this Discourse has been very helpful for finding out the latest news and happenings of the community, but I’ll admit I’ve still found it difficult to discover what the current efforts are towards improving the Haskell ecosystem. Thanks to this post, I found some Haskell proposals where important discussions were taking place regarding ghc-lib-parser. I’m learning that the various GHC proposal GitHub repositories are where these discussions take place, but perhaps there’s more that can be done to really broadcast these efforts.

I will say that the consistent updates the various teams have been posting (like ghc releases, Serokell dependent Haskell updates, CI updates) have done a lot to improve transparency.

I’ve also heard others in the general programmer community say that Haskell devs only care about academics. Perhaps as the ecosystem improves, this perception will change over time.


Just a short clarification on this. IOG does not engage in the dependent Haskell work; that is the good folks at Serokell. IOG does engage in the JavaScript backend work, and in general performance and bug-fixing work, as well as all the haskell.nix stuff. :wink:


I find this really hard to understand. Do they think that work on ghcup, cabal, Stack and Stackage, GHCJS, The JavaScript backend work that @angerman mentions, Nix support, Yesod, IHP … is for academics?


> I find this really hard to understand. Do they think that work on ghcup, cabal, Stack and Stackage, GHCJS, The JavaScript backend work that @angerman mentions, Nix support, Yesod, IHP … is for academics?

For what it’s worth, my perspective is that for a large portion of the programming community, work like this is quite invisible; i.e. it’s just expected to be there in a moderately large language. At least, speaking for myself, it took me quite a while to be aware of such efforts, and to appreciate what the challenges were. That’s, of course, not to say the massive efforts of this group of people aren’t appreciated; indeed, they are and they are essential!

I can at least offer some other data points to the general feeling out in the programming world of Haskell being “academic”; but I do think we as a group are making lots of good progress on that front, and I expect/hope a lot more to happen over the coming years :slight_smile: (That’s not to say I think it’s a particularly constructive criticism; personally I’d prefer to hear things that are a bit more actionable, i.e. a concrete list of things we’re missing or not focusing on, etc, etc, etc.)


Ah right! My bad. I should’ve verified that before saying it. I’m just gonna run back and fix that…

This is hard to know. Perhaps at one point I might have found myself with similar beliefs. The only insight I can offer is that those who aren’t plugged into the community don’t accurately understand what’s being worked on, and they perceive the pain points in the UX and the language, plus the slow progress before any change is visible, as confirmation that these things aren’t being worked on.


Yes, it definitely seems that programming languages get stereotyped in a way that is very difficult to shift.

And that comment is made even more ironic by the existence of Miranda(R), Clean and R (not to mention Prolog!) - there aren’t a great many complaints about those language communities being “primarily focused on resolving theoretical academic and mathematical challenges, often overlooking pragmatic solutions” along with being “overwhelming for newcomers”.

Surely by now Haskell receives more contributions than any of those languages, so this perceived lack of pragmatism is curious…which has me wondering even more if it’s certain choices made back in 1987 that are being questioned, hence my earlier post.


My two cents… probably worth less. I’ll usually classify this work into two camps: engineering, and programming language research/computer science (bring out the pitchforks! :smiley:)

A lot of the engineering work is fairly mundane grunt work. It’s not very flashy, not very shiny, and what you have to show for it is: something that was expected to work, but didn’t for some arcane reason, now works. So … ok. You can end up writing lengthy posts about why something didn’t work and why it’s now working, but you won’t be writing much about the hours and days you spent actually trying to figure out what was wrong.

The PL research/comp sci side of things often has people being much more flashy about their achievements. They’ve got some great new idea, and can show something fun and interesting.

I could write about how I recently found yet another null pointer exception in the Windows linker in the RTS. But is that really worth writing about? It’s, after all, “just a bug fix”. It does not feel as fancy or insightful as the latest “Let me show you what I read about category theory in this book, and how you can apply it in Haskell!”, or “If we use these features of a dependently typed language and model the following problem with it, we get these benefits (in our toy example)”. Who would read: “I’ve spent two days cursing my computer, staring at log messages, to find a missing if(NULL == x) return”? Or: “After tearing out my hair, I found out that we try to load clang++ into the linker because we didn’t check whether the files we were loading were actually libraries or not”?


I find this quite readable:


There is something here that puzzles me. Haskell is supposed to be especially productive for writing compilers.

Haskell (especially its type system) is supposed to be expressive enough to give type-safety guarantees even for a compiler of a higher-order-typed language, thus avoiding much of the tedious cross-checking with compiler development.
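As a small aside, that expressiveness claim can be made concrete with the classic GADT-based typed interpreter (a minimal sketch of my own, not something from this thread): the host type system rejects ill-typed object-language programs at compile time, so the evaluator needs no runtime tag checks.

```haskell
{-# LANGUAGE GADTs #-}

-- A toy typed expression language. The index 'a' records the
-- object-language type, so GHC itself rules out nonsense terms.
data Expr a where
  LitI :: Int  -> Expr Int
  LitB :: Bool -> Expr Bool
  Add  :: Expr Int  -> Expr Int -> Expr Int
  If   :: Expr Bool -> Expr a   -> Expr a -> Expr a

-- Total, tag-free evaluation: the types guarantee each branch
-- receives values of the right Haskell type.
eval :: Expr a -> a
eval (LitI n)   = n
eval (LitB b)   = b
eval (Add x y)  = eval x + eval y
eval (If c t e) = if eval c then eval t else eval e

main :: IO ()
main = print (eval (If (LitB True) (Add (LitI 1) (LitI 2)) (LitI 0)))
-- prints 3
```

Trying to construct something like `Add (LitB True) (LitI 1)` is rejected by GHC before the program ever runs; scaling that idea up to a full compiler’s intermediate representations is where the engineering gets hard.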

GHC is a Haskell compiler written in Haskell. The language GHC supports is “being worked on” continually. Indeed the demands for development inside GHC often motivate bringing in new features to the language.

And yet if I compare the rate of new features appearing in GHC in (say) the late 1990s/early 2000s [**] vs the last 5 years, progress seems to be getting slower and slower. Does this mean all those fancy higher-order features GHC is using are getting in the way of productivity when writing a compiler?

Are there fewer people working on GHC today vs 20 years ago? Something else outside of the compiler code itself that would explain? GHC is supporting more target environments? Supporting backwards compatibility and regression testing uses up nearly all the people that could be developing the compiler?

Is Haskell 2023 (or whatever older/conservative dialect is used inside GHC) actually less productive for writing compilers than ~2006?

[**] OverlappingInstances, FunDeps, PatternGuards, GADTs, TypeFamilies, Flexiblexxx, ExplicitForAll, … a large proportion of and somewhat beyond what’s now GHC2021.


I suspect (pure guess) that the numerous extensions get in the way, because any new feature needs to be compatible with all combinations of extensions. I am in favour of cleaning up the extensions (and the code) so that, for example, Haskell 2010 becomes the default and all extensions included in it are removed entirely (I mean there is no need to be able to deactivate them). Alternatively, there could be a single extension to choose Haskell 98, but there is no need to allow all combinations between 98 and 2010.


If I were to guess the slowdown is caused by the weight of the backlog multiplied by all the new additions that are far more complex than any of those old ones (notably moves towards dependent and linear types).

As recently as GHC 8.10, small integers were represented as platform-sized ones (base-), text was in UTF-16, and writing strict low-level stuff still had to rely on lazy datatypes (UnliftedDatatypes only landed in GHC 9.2). You can still be on the “oh man, I wish this feature was there five years ago” train; you just gotta move from the type-level camp into the low-level one.


No, I don’t believe this is the case. I’m just speculating here, but if I had to guess why development is so slow, I’d say it’s because the language is developed by the community as a whole. Lots of people have lots of different input, and there are many different concerns we’re trying to address at the same time, in addition to trying to ensure that backwards compatibility isn’t lost.

I think this is very different from a language funded by an org and dictated by a few individuals who cater to their own agenda and don’t care about the concerns of its users. People would probably find an environment like that very off-putting, but it would probably also see features developed a lot more quickly.

This was probably the case at the beginning of a lot of newer languages like Rust and Go. I heard that Rust changed a lot in its early days, and people were upset that they broke things so often; then it became more of a community effort once the core was more complete.


@AntC2, it’s really none of your business to say what it is “better” for someone else to do. Can I implore you to please choose your words much more carefully so it is clear you are stating your own opinion and not making assumptions about other people?

EDIT: For example you could say

I would prefer you to put your efforts into some prototyping or deliverables, so I can understand better what I might get excited about.

Then Nick and everyone else is free to choose for themselves how they feel about your preference.


It’s far from clear to me that work on GHC is slowing in any regard. In fact, some of us are exhausted by how fast it moves and wish it would slow down a bit! Perhaps there are fewer new type-system features of the kind you’re particularly interested in? But when you look at the new features across the type system, the RTS, and various usability improvements, plus the support across architectures, plus the fact that software systems absorb time and energy simply for maintenance, I don’t see any reason to think that “Haskell 2023” is less productive for writing compilers than the Haskell of 2006. Quite the opposite.


If a system has N binary (on/off) inputs, then there are 2^N combinations of inputs.

# ghc --show-options | grep '\<XNo' | wc -l
132
# ghci
GHCi, version 9.4.4:  :? for help
ghci> 2^132
5444517870735015415413993718908291383296

It should already be happening…