Commercial Haskell should go after Python / Julia, not Rust

At the end of the day, going after researchers on their own is a lost cause, because Haskell’s ecosystem has no hope of matching Python’s in the foreseeable future. It’s use cases like LaurentRDC’s that are more interesting; i.e., domain experts working in a larger firm who are not professionally trained developers, where Haskell helps improve code quality on the engineering side of development.


A lot of this keeps reminding me of Gabriella Gonzalez’s video on marketing Haskell, in which she argues that Haskell’s niche might be building eDSLs. The catch is that many people take issue with FP’s tendency toward eDSLs, which tightly binds developers to a codebase and makes onboarding hard.

But a lighter version might work: wrap mature, applicable libraries, provide a handful of convenience functions that are easy to use, and teach only enough Haskell to use the resulting eDSL.
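
As a minimal sketch of that idea, assuming http-conduit as the wrapped library (the module and function names are hypothetical):

```haskell
-- Hypothetical wrapper module: hide Network.HTTP.Simple behind a single
-- monomorphic helper so newcomers never see the underlying types.
module Company.Http (getUrl) where

import qualified Data.ByteString.Char8 as BS
import Network.HTTP.Simple (getResponseBody, httpBS, parseRequest)

-- | Fetch a URL and hand back the body as a plain String.
getUrl :: String -> IO String
getUrl url = do
  request  <- parseRequest url
  response <- httpBS request
  pure (BS.unpack (getResponseBody response))
```

A newcomer only needs `do`, `<-`, and `IO String` to be productive with something like this, while the full power of the underlying library stays available once they grow into it.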

We agree entirely that auto-users are not good candidates for Haskell evangelization, but I think I disagree as to the reason.

I don’t think Haskell is so complex that it cannot appeal to auto-users. Rather, for auto-users the ecosystem is the most important part, Haskell’s ecosystem is inferior in most of the use cases where auto-users predominate, and we are unlikely to make much headway on it.


As for web developers, I think that was Michael Snoyman’s idea, wasn’t it? The ecosystem and tooling were just outrageously immature at the time, and while Michael Snoyman did a lot of work to make Haskell viable for that purpose (and even if you dislike Stack, as I do, you have to admit that Stack was a great revolution, and it is still the best tool for certain use cases), Haskell still has very poor market share in that field.

But I wouldn’t say that Haskell is completely dead there: IHP seems to be making good progress; versus Ruby, Haskell is substantially more performant; and versus Node, Node lacks performance and is itself only a minority choice.


As for going after Python, though: while Haskell cannot compete in the vast majority of use cases because of its dearth of ecosystem (and the Codon compiler suggests that Haskell might eventually lose on performance as well), I think the basic direction is sound, if only because going after Python forces Haskell to address two key weaknesses.

The two main areas where Haskell loses out to other languages are ease of learning and ecosystem.

But ease of learning does not have to be a major problem. Moving toward dialects, as with C++, helps simplify onboarding, and Haskell already has a strong tendency toward eDSLs, so it is going to splinter into dialects anyway. Moreover, we’re getting a lot more pragmatic books on Haskell, although I think my wish for an update of Real World Haskell will likely never come to pass.

Ecosystem-wise, let’s just admit it: Haskell’s ecosystem needs substantial improvement and, more importantly, careful guardianship and directed development. There are packages that used to be useful but have bitrotted. Haskell is considered a “general purpose language”, but there are lacunae in its ecosystem that make that more of a technical term. And among the libraries that do exist, API design can be lacking, and, as LaurentRDC has pointed out, documentation, both “hard” and “soft” (tutorials, examples, prototypes), could see improvement.

If I were to choose a domain for Haskell to make a push, it would probably be web frontend programming:

  • Work is under way for GHC to officially support compiling to JS/WASM. It may appear that there isn’t much enthusiasm for “Haskell on the frontend” because GHCJS has existed for a while (along with a few other Haskell-to-JS experiments) and there isn’t much commercial use of it. But an “experimental” fork of GHC that lags behind master and official support are, in my opinion, two entirely different things. I would not personally consider Haskell for the frontend today for a “serious” project, but I may very well do so soon, once GHC support is solid.
  • Languages like Elm/Purescript/ReasonML have shown that there is significant appetite for functional programming on the frontend.
  • The area where Haskell has seen the most adoption is arguably the web backend, which is closely related to the web frontend, so there is some “network effect” potential here. Furthermore, the idea of “isomorphic” apps, or simply code reuse between backend and frontend, is very attractive. Code reuse is probably the only reason JavaScript was ever even considered for backend programming, despite all the ways in which it’s not obviously suitable for the backend.

And web frontend programming is a huge space. Making inroads in that domain would make Haskell much more visible and well known.

8 Likes

Ugh, how concrete is the evidence? I mean, Haskell is well-known for being difficult, so I believe the burden of proof showing otherwise is on you. I think you need some kind of credible research to disprove the general consensus.

1 Like

https://news.ycombinator.com/item?id=23621930

Google’s experience with Haskell

https://news.ycombinator.com/item?id=21916629

Max Gabriel at Mercury claims the time to become productive is 4-6 weeks.

https://news.ycombinator.com/item?id=35442409

A claim of a one-week time to first commit for multiple people, but unsubstantiated.

https://news.ycombinator.com/item?id=14148786

One person claims he got a simple Haskell app up in 1-2 weeks of what seems to be self-directed learning.

There’s also a C++ programmer who hung around the FP Discord and decided to speed-run Haskell, building a compiler as his first project. This… didn’t go well: he managed to get a fairly advanced Haskell project working in 3 months, but he burnt out.

IIRC, there’s someone on Hacker News (I’m still searching for the comment) who can onboard non-Haskellers in 2-4 weeks.


Compare this to, say, Ying Wang blowing 3 months to go through HPFFP: A Pythonista's Review of Haskell > Ying Wang


I mean, it’s not exactly good, and any hope of getting Haskell taught as quickly as Python is delusional. But if we are obsessed with pure pragmatism and are willing to constrain the dialect, the time needed to train someone in a productive, production-ready idiom of Haskell is not as high as you think it is.

Moreover:

It’s just good practice to learn Haskell by doing and refactoring; I consoled the disappointed C++er by pointing out that he was basically trying to do the equivalent of learning Template Metaprogramming before writing his hello world.

2 Likes

maxigit: to cut a long response short, what do you think about the following tutorial?

Of course, I’d consider this tutorial not really “teaching” Haskell, but we’ve discussed Pythonistas, and honestly, it seems that quite a lot of them don’t really know Python either.


The story I’ve heard is that when Juspay adopted Purescript, there was an utter dearth of useful learning resources, so people just looked for phrasebooks and modified code samples, and somehow the project worked. Presumably, the developers had gotten better by the time they adopted Haskell.

In the same vein, if you wish to put Haskell into production use, there is no need for your developers to truly learn Haskell, only a limited dialect selected by competent Haskellers to make onboarding relatively easy.

3 Likes

Which libraries are needed? Which libraries need to be documented? What do you mean by «API design can be lacking»? Can you draw a roadmap from here to success?

4 Likes

That ties more into my criticism of the library ecosystem, no? The documentation often expects a level of skill or experience beyond the basic level needed to be productive in Haskell; i.e., it’s typically geared toward intermediate or advanced Haskellers, and is thus newbie-hostile.

But I can see what you mean: if you define the level of Haskell needed to be productive as being able to work independently with the libraries, there is a considerable skill requirement and Haskell is “hard”. On the other hand, for production purposes, if there is hand-holding or there are internal guides on how to use the libraries employed, then this needn’t be a problem.

@kindaro

This is just my opinion on libraries that are needed, but:

Accelerate, as far as I can tell, has not substantially improved from its condition last year, when it was bound by default to older 8.x GHCs. The GitHub version seems to be updated, and I’m trying to get that working right now.

Brick and vty are key libraries for TUIs, but while their maintainer put out a request for assistance back in the 2010s, work on Windows support is only now getting under way.

There’s the issue with Haskell TensorFlow (Google seems to have abandoned it) and the slow maturation of Hasktorch (there’s a disclaimer on the GitHub page saying that use of the Hackage version is discouraged).


As for necessary libraries in general: many libraries need simpler APIs that are more suitable for newbies, because while many people can get through HPFFP or other Haskell learning materials, when they reach the actual Haskell ecosystem they have maxigit’s experience, which can drive people to give up.

Haskell libraries tend to be designed for a very high level of power and composability; that’s a good thing in itself, but it is often very newbie-hostile.

I clearly remember Aeson failing to impress an acquaintance, for instance: while the type structure ensured greater safety, it was, well, less ergonomic, and at the time I couldn’t figure out how to work directly with the Value type.
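
For what it’s worth, here is a minimal sketch of the kind of direct Value manipulation I was after back then (assuming aeson >= 2.0; the JSON payload and field name are made up):

```haskell
{-# LANGUAGE OverloadedStrings #-}

import Data.Aeson (Value (..), decode)
import qualified Data.Aeson.KeyMap as KM
import qualified Data.ByteString.Lazy.Char8 as BL

-- Pull one field out of a JSON object without writing a FromJSON instance;
-- any pattern-match failure inside the Maybe block simply yields Nothing.
userName :: BL.ByteString -> Maybe Value
userName bytes = do
  Object obj <- decode bytes
  KM.lookup "name" obj

main :: IO ()
main = print (userName "{\"name\": \"Ada\", \"age\": 36}")
```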

1 Like

I love it. May have to steal that format for other things!

2 Likes


If I may digress a bit and offer another way of looking at it:

I have noticed a subtle (developer) cultural thing: people slapping “rust-based” onto their projects with pride. It’s like fashion; you want to be “in”, in the cool kids’ camp.

I think it would be out of character for Haskell to be the “cool kid” again; it would be like a middle-aged person going to a rave party.

Through this lens, I think what Haskell could go after is finding a cool-kid spokesman to revamp the image of Haskell, iff we think staying culturally relevant is the key to surviving as a project.

3 Likes

To offer a different perspective on the Rust vs Haskell debate: I personally moved the other way around.
I started with Rust and ended up using it heavily in startups for various backend services.

After a lot of frustration with runtime errors in JavaScript (and, to some extent, TypeScript), I was looking for a language and ecosystem actually designed with static typing in mind, and Rust felt like a huge step up.
With no prior Haskell experience, two things made me look for a Rust replacement a couple of years in:

  • For the type of backend services I’m usually writing (live data-stream processing, web APIs, …), Rust feels a bit too low level. I certainly got used to the borrow checker, to the point where I wasn’t really noticing it anymore, but make no mistake: you will spend a significant amount of time thinking about lifetimes and what kind of references to use here and there. This becomes especially apparent later on, as things need to be refactored. Since what I’m doing isn’t particularly performance sensitive, I felt like I was wasting my time thinking about all these things, seemingly for no benefit.
  • There’s a clear divide between the sync and async ecosystems in Rust. This is further exacerbated by the fact that there are multiple async runtimes that behave slightly differently, so library authors have to choose between sync, async, and which async runtime to target. The async ecosystem is still maturing: there’s currently no async Drop or async traits out of the box, and things usually work differently than in the synchronous world.

In contrast, for concurrent applications, Haskell seems much simpler: there’s only one way to do IO (if you want concurrency, just fork a thread), so some of the footguns are gone and the ecosystem isn’t split the way Rust’s is. The language is obviously higher level, so you don’t need to think in terms of lifetimes and references either, but it is still compiled and GHC generates pretty fast code.
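
A minimal sketch of that model, using the async package (a thin layer over forkIO); the worker actions are stand-ins:

```haskell
import Control.Concurrent.Async (concurrently)

-- Stand-ins for blocking calls (a database query, an HTTP request, ...).
fetchUsers :: IO [String]
fetchUsers = pure ["ada", "grace"]

fetchOrders :: IO Int
fetchOrders = pure 42

main :: IO ()
main = do
  -- Both actions run on lightweight green threads; the same code works
  -- whether they are CPU-bound or blocked on IO, with no separate async
  -- dialect, no runtime to choose, and no function colouring.
  (users, orders) <- concurrently fetchUsers fetchOrders
  print (users, orders)
```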

All in all, I sort of agree that it doesn’t necessarily make sense to “compete” with Rust for performance-sensitive applications.
… But at the same time, because Rust is “newer and cooler”, I feel like it is used a lot in situations where Haskell would do great, namely your average concurrent application.

Personally, from a purely technical standpoint, I would default to Haskell for concurrent applications.
And since both languages are FFI-friendly, Rust is actually a great pick to complement Haskell in the performance-sensitive corners of a codebase!
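
As a minimal sketch of that split, here is the Haskell side of a binding to a hypothetical Rust function exported with `#[no_mangle] pub extern "C" fn hot_path(x: u64) -> u64` and compiled as a C-compatible static library:

```haskell
{-# LANGUAGE ForeignFunctionInterface #-}

import Data.Word (Word64)

-- Bind the hypothetical Rust symbol; GHC links against it like any other
-- C library, so the hot loop lives in Rust and everything else in Haskell.
foreign import ccall unsafe "hot_path"
  hotPath :: Word64 -> Word64

main :: IO ()
main = print (hotPath 12345)
```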

16 Likes

Did you also consider using Go? That seems to have most of the same benefits of Haskell that you mention, and it is a more mainstream option.

3 Likes

Having used Go for years in production, I wouldn’t recommend it if Haskell is already an option for you - unless you desperately need a faster hamster wheel (compile times).

4 Likes

I appreciate languages that (try to) make it easier to write correct programs. Go has nullable pointers, no sum types, no pattern matching, and non-existent error handling, and at the time it didn’t even have generics or a package manager.

It doesn’t really provide anything I’m looking for in a programming language.
It comes with a nice concurrent runtime, but so does Haskell ¯\_(ツ)_/¯

7 Likes

That is really the only thing it has going for it… (I worked with Go professionally too)

4 Likes

That is really the only thing it has going for it… (I worked with Go professionally too)

It’s also very easy to make fully statically linked binaries, and it’s painless to cross-compile for other operating systems and architectures. As much as I’m not a fan of the language itself, within an hour I was able to write an application with a GUI on my aarch64 Linux machine, compile it for x86_64 Windows, and send it to the relevant person, and when they went to use it, it worked perfectly.

2 Likes

In Haskell as well: GHCup works on Alpine Linux, and you can just pass --ghc-options="-optl-static" and be done.

Alpine aarch64 Linux is in aports, but there are no upstream GHC bindists yet. That may follow.

I just learned about this from the “Certainty by Construction” book:

By induction, the only programmers in a position to see all the differences in power between the various languages are those who understand the most powerful one. (This is probably what Eric Raymond meant about Lisp making you a better programmer.) You can’t trust the opinions of the others, because of the Blub paradox: they’re satisfied with whatever language they happen to use, because it dictates the way they think about programs.

http://www.paulgraham.com/avg.html

So, where is Haskell in the world of Blub? (Hey, is that why some people are defecting to Agda?)

4 Likes

@slack1256

I’ve found it: after 42m07s he explains why state management is the next fight of Moore’s law, and after 43m51s Haskell is mentioned as one of the languages that have “suddenly become important because of this multicore problem”.

Pros: The guy is a mentor and argues that only FP can succeed in exploiting multiple cores

Cons: it’s a 2014 talk and we still don’t have 256 cores on every standard laptop - and yes, Rust can also shine at FP

Thanks for the GC argument

@kamek-pf Very detailed and opinionated answer, have my +1 :slight_smile:

1 Like