Introducing NeoHaskell: A beacon of joy in a greyed tech world

That sounds fairly dismissive of the work that has happened over the last ~4 years.

With VS Code, you pretty much have an all-in-one experience now. There are still rough edges, but the concept is firm: it figures out your project's GHC version, then figures out the latest HLS version that supports it, and then installs everything in one go.

The rough edges are not conceptual issues. They are all low-level issues that don't need grandiose ideas, just people constantly fixing small things that sound boring to do.


This is sadly very true. We need lots of folks interested in the low-level, mundane parts of the compiler, willing to help us do unglamorous work towards better stability. That includes making more packages reinstallable; fixing our compiler story on Windows (we have people who know how, but simply lack the time, yet would be happy to mentor); looking at the size of built artifacts (why are our binaries comparatively so large?); the many quality-of-life improvements waiting to be contributed to stack and cabal; making Haskell on mobile (natively), which is possible but not really trivial, easier; improving iserv; and helping with CI-related topics across the whole ecosystem. I'm sure GHCup would like to see more maintainers too.

It’s not like we have too many people actually implementing these things.

We have a vibrant community with lots of interesting and grandiose ideas (NeoHaskell included). Getting them to the POC stage, and later to implementation, is a lot of work. Polishing them and making them great is even more. Pareto seems to mostly hold true for most parts :frowning:


…and now for today’s dumb question: is the Prolog community having similar difficulties?

To me, Haskell and Prolog have at least one thing in common: they both require programmers coming from imperative languages to “unlearn” old habits - Haskell with immutability/constancy of expressions and laziness; Prolog with unification, et al. So both languages ought to be having approximately the same problems with regard to adoption and support… are they?


IME, Haskell’s strength lies in managing complexity - but this is not something you can easily demonstrate to someone who isn’t familiar with the language already. Most examples of any programming language are of an “introductory” nature, and hence, there is not a lot of complexity to them. A Fibonacci function, a guess-the-number game, or even a To-Do-List webapp, are not complex enough to drive home where Haskell actually shines, and trying to do so tends to come across as thoroughly underwhelming to anyone who doesn’t already know what’s beyond the immediately visible horizon. “Hello world” in Haskell is simpler than its Java equivalent, but compared to Python, it’s still baroque, so if you judge Haskell based on that, then, well, Haskell is going to lose.
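To make that comparison concrete - a minimal sketch, nothing more:

```haskell
-- "Hello world" in Haskell: only two lines, but the type signature
-- and the IO type already hint at concepts a newcomer has to absorb.
-- (Python gets by with a single  print("Hello, world!")  and no
-- ceremony; Java needs a class and a main method around it.)
main :: IO ()
main = putStrLn "Hello, world!"
```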

And there, I think, lies a more fundamental problem with making Haskell more attractive to newcomers. Of course there is some accidental complexity with Haskell, of course there are the usual warts and quirks, but I think the real issue runs much deeper. Languages like Python and Go sacrifice the “top end” in favor of the “bottom end”, that is, they give up a lot of advanced sound abstraction features in order to keep the initial learning curve shallow, and avoid overwhelming new learners with a truckload of new abstract concepts to learn. This, however, is inevitable - I don’t think you can have a language that so heavily relies on sound abstractions to boost developer productivity, and at the same time keep “abstract abstractions” to an absolute minimum.

One has to experience that power in the flesh, by working on a nontrivial project over a longer period of time, in order to fully understand the benefits.

What Haskell buys you over Python is not “I can reduce these 12 lines of Python to 9 lines of Haskell”, nor “look, if I pass a string to this 4-line function instead of an integer, I get a compiler error” - it’s rather things like making refactorings in huge codebases while being reasonably sure you’re not breaking anything, things like swapping information about the problem domain out from your brain and into the code, and swapping it back in just as easily, things like keeping code workable by having solid, compiler-enforced abstraction and scope boundaries, and of course also the type system reducing the search space for your informal and formal reasoning endeavors to a fraction of a unityped universe. You don’t get to capture these benefits in a 20-line code sample.


Sorry, can’t resist, but you can always enable UnicodeSyntax to reduce the amount of ASCII in your code…
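For anyone who hasn't seen it, a small sketch of what the extension buys you (the `compose` function is just an arbitrary example):

```haskell
{-# LANGUAGE UnicodeSyntax #-}
{-# LANGUAGE ExplicitForAll #-}

-- With UnicodeSyntax enabled, ∷, →, ⇒ and ∀ can stand in for
-- ::, ->, => and forall; the ASCII forms remain valid alongside them.
compose ∷ ∀ a b c. (b → c) → (a → b) → (a → c)
compose f g = f . g
```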


I hope, and truly believe, that in this forum we can accommodate a diverse range of respectful communication styles whilst pushing back against disrespect, insults and rudeness. As far as I know this forum doesn’t explicitly adhere to the Haskell committee guidelines for respectful communication but I think they can serve as a useful standard for individuals to refer to if they want guidance in their own communication style, and to get a sense of what might be reasonable to expect from others.

People from different backgrounds have differences in communication style. People also differ in the way they receive communication. There is no one perfect way of communicating and I think that, in terms of respect, the intention with which a message is sent is more important than its precise content. I would ask everyone to bear in mind that on the other end of the messages they send is a human being with feelings, and to ensure that their intention is always to respect the feelings of the recipient.


As others have stated on this thread, there are a lot of misconceptions and branding issues with Haskell. Old problems and old attitudes still shape widely-held beliefs.

Therefore, the list you write here seems like perfect material to use when suggesting changes to the homepage. If anything on the list isn’t actually a problem, or can be listed as a strength, then maybe we just need to communicate it better. Maybe Haskell does need a rebranding (a slow, thoughtful one – please don’t start from scratch), and the homepage is just sitting there waiting for suggestions. :wink:


You don’t get to capture these benefits in a 20-line code sample.

While true, part of my point is that Haskell can abstract far more aggressively due to its functional nature. A barebones web-server in my mind is just a tree that picks an endpoint and some HTTP/2 serialization, that’s two separate libraries and every extra format/encoding/authentication/cookie handler can also be completely separate. Having a batteries-included library where you can strip away layers to get near-infinite customizability would be different enough from the norm to warrant showing off.
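A sketch of that “tree that picks an endpoint” idea using only plain functions - all names here are hypothetical, not any real library’s API:

```haskell
import Control.Applicative ((<|>))

-- A route is just a partial function from a request path to a response.
type Route = [String] -> Maybe String

-- Two completely independent endpoints...
hello, goodbye :: Route
hello ["hello", name] = Just ("Hello, " ++ name)
hello _               = Nothing
goodbye ["goodbye"]   = Just "Goodbye!"
goodbye _             = Nothing

-- ...composed into a tree with Alternative's (<|>). Serialization,
-- authentication, cookies, etc. could each wrap a Route the same way,
-- as separate, strippable layers.
app :: Route
app path = hello path <|> goodbye path
```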


Or maybe we just need to realize that Haskell is not the silver bullet we want it to be; it doesn’t suit everybody’s mindset, and there is no point in trying to evangelize. I know (lots of) people who “hate compilers” and will stay away from static typing no matter what.
Maybe we should focus on making the tools/ecosystem work better for us (the Haskell users) rather than trying to adapt them to an imaginary target audience.


But when the trying stops, that’s when you become Common Lisp or Smalltalk, a group of highly-skilled users with domain knowledge that has low relevance on the outside and has lost its capability for new research.

It’s a case where the social value of Haskell (a grouping of like-minded individuals with comparable skills and values) has eclipsed its value as a basis for research or production.

Here’s the fundamental problem.

Tooling isn’t going to magically sprout out of nowhere, and work on GHC requires not only people, but also money.

All of the work on the existing Haskell ecosystem requires labor, often from very talented people, and these very talented people have to be paid - or must effectively pay, by volunteering their time.

If you decide that Haskell is a niche language whose main value is social, i.e., a place for Haskellers to hang around with others of common interests and values, you’ll face a resource crunch for work on GHC and Cabal, because much of that work is financed by the Haskell Foundation, and the Haskell Foundation is funded substantially by production users.

What’s worse, a lot of these costs are exponential; e.g., the original typeclass concept was implemented in a relatively short span of time, while adding ergonomic dependent types to GHC has taken years so far and is still not done.

If pushing into production is no longer emphasized, Haskell Foundation’s donors will gradually deplete (whether they move to new languages, get bought out and pushed off Haskell, or go under), and who will then pay for work on GHC?

And then there’s the fact that other languages are free to copy features from Haskell as they wish; what we saw with Rust was an ML-style type system grafted onto immutable-by-default variables in an imperative language.

The gap between Haskell and other languages continuously diminishes and makes it harder for Haskell to be viable in production, until ultimately, what you have is a Smalltalk; all the good ideas have been stripped off and there’s just a community of die-hard devotees. You may say the Haskell community is better for you than any other, but then it’s no longer a living language.


About the homepage, it might sound trivial, but I think the pretty landing pages of Rust and other ecosystems helped them a lot.

Would it make sense to outsource design work to build a more modern and thought-out website for Haskell? The websites we have are great and functional, but it shows that they were built by engineers rather than by UX or landing-page copywriting experts. It might help a bit with the branding and public image.


Yes, I think there is merit to that idea, although I should point out that people have attempted broad rewrites in the past and it was met with a lot of resistance. But thoughtful application of good UX (including understanding who the audience is and what should and should not be highlighted) to incremental changes seems like something that could be successful. It would be worth exploring the idea as a Technical Working Group proposal.

(Well, I’m not sure if that’s the right place to suggest this kind of proposal, but they’ll know where to point you if not.)


And therein lies the conundrum: as @maxigit noted earlier, if we’re not careful we’ll end up pursuing a target audience which isn’t really there.


…perhaps most notably being the arrival of type classes in C++! More importantly though, designers of other languages can pick and choose what new innovations have appeared in Haskell. Therefore any advantage granted to Haskell by way of continual innovation also diminishes over time. But the same cannot be said about the cost of innovation.

As I noted earlier, each new language-extending innovation has to exist alongside all prior ones, which leads to an exponential number of language-extension combinations. In the absence of some wondrous advance in managing the ensuing complexity…the cost of continual language innovation will eventually be overwhelming. Any notion of “a balanced approach” to innovation must take that cost into consideration.

For the foreseeable future, the only way to manage that cost is to limit the total number of language extensions, by:

  1. consolidation - widely-used extensions are brought into new Haskell standards,

  2. or elimination - the least-used extensions are eventually dropped.


There is an answer to that, which is to remove the ability to disable extensions which have been included in the language (Haskell2010, Haskell202?). It’s 2023; there is no need to keep supporting Haskell98, especially its more permissive extensions.

For example, ImportQualifiedPost has been included in GHC2021. It’s harmless, so what is the need to be able to disable it? Why not just say that, going forward, that is valid Haskell? (OK, I guess the problem there is that Haskell2010 doesn’t really bring anything, and GHC2021 is not Haskell2021.)
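For readers who haven't met it, this is all the extension does (the `shout` function is just a throwaway example):

```haskell
{-# LANGUAGE ImportQualifiedPost #-}

-- With ImportQualifiedPost, the qualifier may follow the module name,
-- so qualified imports line up; the old style still works alongside it.
import Data.Char qualified as Char
import Data.List qualified as List

shout :: String -> String
shout = List.map Char.toUpper
```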

If people really need pure Haskell98 they can revert to an old version of GHC.
By the way, it might actually be easier to keep alive a few different versions of GHC (one for each “language”) rather than dealing with this exponential complexity.


You would need more than just separate compilers - all other parts of the Haskell realm would need to be replicated, possibly all the way to websites:


etc. Then the issue of ongoing support arises:

  • should that realm be based on an interpreter rather than a compiler?
  • should it only have CLI-based tools instead of GUI-based ones?
  • when should support for realms stop?

…and on and on and on it goes! It seems to resemble the ongoing development of an open-source operating system (sans the bootloaders and device drivers).


I believe NeoHaskell might be focusing on the wrong challenges. To bolster Haskell’s adoption, what’s crucial is a thriving ecosystem of libraries tailored for business applications. More importantly, these libraries need consistent maintenance.

The economic dynamics of software development dictate that technology decision-makers often seek a balance between adopting the best technologies and avoiding risky choices that could potentially hinder business objectives. Why the emphasis on business applications? They constitute the bulk of SaaS and enterprise applications. Achieving a significant number of success stories in this realm will naturally lead companies to choose Haskell. Once these companies grow profitable and sizable, they are more likely to give back to the community.

To further underscore my point, I offer two illustrative examples:

  1. ReasonML: NeoHaskell’s objectives resemble those of ReasonML. ReasonML aimed to make OCaml more user-friendly. It boasted a distinct advantage: it could compile to JavaScript and had first-class support for React from the outset. Furthermore, it could tap into the expansive JavaScript ecosystem. But even with these benefits, ReasonML didn’t gain the expected traction. This suggests that neither ease of use nor the learning curve were the real barriers. The developers of ReasonML perhaps missed this insight, as they later rebranded to Rescript, distancing themselves from OCaml, functional programming, and any terminology unfamiliar to the average developer. Even this change didn’t have the desired impact, leading to some core Rescript developers leaving the project.
  2. Rust: Contrasting with the above, Rust, despite not being the simplest language to master, has witnessed significant adoption. This can be attributed to Rust’s early emphasis on showcasing its capability to develop applications across different domains. Initiatives like “are we web yet?” effectively communicated to potential adopters that Rust was a dependable choice. Once the job market for Rust developers increased, developers felt more compelled to learn the language despite its steep learning curve.

I had a moderately unpopular post to that effect on Reddit, but what we really need are more and better frameworks for common tasks. In my experience with using Servant and Monomer, often, with the right framework design, you can achieve all the necessary functionality without using any code in IO or some IO code generator (monad transformers, effect systems, free monads, etc) other than the framework itself.

There have been some posts to that effect by others; that there have been no “killer frameworks” in Haskell and that was what was holding it back.

As I’ve said before, the problem with Yesod was that it somehow presupposed knowledge of Haskell, whereas it should be possible to teach the framework first and then only as much Haskell as needed to operate the framework. Some of the publicity for Yesod suggests that Michael Snoyman realized that toward the end, and IHP selling itself as a framework for non-Haskellers was a significant step in that direction.

A framework-centric approach, combined with high-quality frameworks tailored for the target audience, might be what we really need.

The problem with this is that Haskellers don’t seem to like overly opinionated frameworks, but this kind of product is probably what the average programmer wants; no anxiety about choices or wondering about ReaderT vs MTL vs Free Monads vs Three Layer Cake vs Effectful, just follow the path on the HUD.

PS, also, three thoughts come to mind. If Haskell has achieved partially negative brand equity among a sector of tech leaders, the natural idea might be to rebrand, but that’s complex and difficult.

Instead of trying to turn Haskell branding around, sell IHP, sell Yesod, sell Hydra, sell whatever framework you can come up with and only mention it’s Haskell-based at the very end.

The framework-oriented approach also ties into Gabriella Gonzalez’s proposal in How to Market Haskell to the Mainstream Programmer; she identifies Haskell as a language very suited for building eDSLs, and consider the overlap between eDSLs and frameworks.

Lastly, I’m not sure if you’ve seen the comedy skit comment on some thread in Reddit, wherein the new CTO comes to axe the Haskell infrastructure at their firm, upon which the resident Haskell guru informs them that they are using the Monad for Medium to Large Enterprises, causing the new CTO to be enlightened, rainbows and stars showering upon them, and profits rising.

This is also a way to get rid of all the burrito monad tutorials flying around; if the marketing association with monad is now “business framework” (vaguely correct based on the eDSL interpretation), do we really need to correct them?


That was probably the biggest time-sink when we adopted Haskell, followed closely by figuring out an error-handling strategy.


I think the biggest mistake adopting Haskell is trying to come up with standardization for such things right away. Especially when the team’s new.

Now, I get the “we need code consistency globally” argument (I guess). But if you commit to something early and force people to just go along with it, you end up not cultivating learning across your team.

If you’re working on some sort of SaaS-y web service-y thing, you can actually afford to allow people to experiment with error handling and the whole monad thing. You can start by standardizing on a config file format + a type to parse it. And a standard way to hook into main via IO or something.
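A minimal sketch of what that shared baseline could look like - the names, the record fields, and the two-line “format” here are all hypothetical, just whatever your team happens to pick:

```haskell
-- One agreed-upon config type for the whole team.
data Config = Config
  { port    :: Int
  , logPath :: FilePath
  } deriving (Show, Eq)

-- One agreed-upon parser for the team's (toy, two-line) config format.
parseConfig :: String -> Either String Config
parseConfig raw = case lines raw of
  [p, lp] | [(n, "")] <- reads p -> Right (Config n lp)
  _                              -> Left "expected two lines: port, log path"

-- One agreed-upon way to hook into main; everything past this point
-- (error handling, effects, monad stacks) is left to experimentation.
runWithConfig :: (Config -> IO ()) -> IO ()
runWithConfig app = do
  raw <- readFile "app.config"
  either fail app (parseConfig raw)
```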

But it’s fine if you have people working in different ways. Opportunities to unify things and reap dividends can only come after imo.

That’s just my 2c having seen a lot of these codebases grow in line count + engineers over the years. It’s a mistake I’ve seen often, and in the places where there were no standards from the get-go, the outcomes were better.