Haskell evangelism

My counter-perspective is that the traditional markets for Haskvangelism are both tapped out (in the sense that everyone who wants to learn Haskell, under the present pedagogy, has already done so) and losing interest. Haskell as a language used to have tremendous competitive advantages over existing languages, but as time goes on, Haskell’s unique selling points (USPs, in marketing slang) have become ubiquitous. This wouldn’t be such a problem if Haskell could reliably remain on the cutting edge, but Idris and Agda have the advantage of dependent types, Julia has a multiple dispatch paradigm that yields massive performance advantages (among other benefits), and Haskell has not been able to profitably clone either feature of late.

In my view, Haskell’s biggest asset is the prestige associated with it. It is not in fact that hard to learn, and the fact that it is still practical, combined with its cultural connotations and associations with elite computer scientists, makes it extremely sellable to people you wouldn’t normally think of as potential Haskellers. A Cambridge graduate student on #Haskell, for instance, has been teaching people Haskell as a first language for quite some time.

The problem is, given that Haskell is in relative stagnancy (where are the dependent types? where is the performance-enhancing multiple dispatch paradigm?), and that the traditional recruitment bases for Haskell have been tapped out, it should be imperative to actively seek new Haskellers where they are not normally sought.

Certain characteristics of Haskell, incidentally, make it quite useful for amateur programmers as well as new programmers. The fact that Haskell is conceptually near the cutting edge makes it an ideal language to learn with (and Haskell isn’t as wedded to the pure, statically-typed FP paradigm as people might think: there are, for instance, dynamic types available via a library, stateful computations via monads, and composable objects via a library; see the sketch below). It is also an ideal language for amateur programmers simply because Haskell code can be written more quickly (if coding is your hobby, you don’t want to spend months writing and debugging your hobby) and often with a high guarantee of both correctness and architectural sophistication; there are quite a few start-ups built by hobbyists that can’t sell because their codebase is spaghetti code.
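(A minimal sketch of the flexibility I mean, assuming only Data.Dynamic from base and Control.Monad.State from the mtl package: dynamically typed values and stateful code sit comfortably in an ordinary Haskell program.)

```haskell
import Data.Dynamic (Dynamic, toDyn, fromDynamic)   -- dynamic typing, from base
import Control.Monad.State (execState, modify)      -- stateful code, from mtl

-- A heterogeneous, "dynamically typed" bag of values.
mixedBag :: [Dynamic]
mixedBag = [toDyn (42 :: Int), toDyn "hello", toDyn True]

-- Recover only the Ints, ignoring values of any other type.
justInts :: [Dynamic] -> [Int]
justInts = concatMap (maybe [] pure . fromDynamic)

-- A small stateful computation: count items by bumping a counter.
countItems :: [a] -> Int
countItems xs = execState (mapM_ (\_ -> modify (+ 1)) xs) 0

main :: IO ()
main = do
  print (justInts mixedBag)   -- [42]
  print (countItems mixedBag) -- 3
```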

This is basically a supply-side approach. Haskell, right now, seems to have an exceptionally high library-contributor-to-userbase ratio. But other languages have built excellent ecosystems using the opposite approach; Python, for instance, is used because it’s used. What would happen if a supply-side approach to Haskell could generate a much larger userbase, even if the contributor-to-user ratio dropped? Could we see much better developed libraries and ecosystems?

As an example, certain people working on critical Haskell resources have been in a bit of trouble (this is probably an overstatement) because they can’t find the relevant Windows specialists to adapt their software for the Windows Store. If the userbase were allowed to dumb itself down, the same problem wouldn’t exist, because Windows specialists are a commodity in other circles.

But to accommodate a lower-quality, but larger, userbase, you need a few things. First, you need improved pedagogy; it’s disappointing to me that Haskell does not have its own equivalent of K&R’s The C Programming Language or Structure and Interpretation of Computer Programs (HaskellBook doesn’t count; it’s way too long). There seems to be a dearth of Haskell books aimed at letting utter neophytes to computer science produce useful programs in Haskell, in part because that would require using Haskell libraries to do many practical things, and newbie books (barring Will Kurt’s Get Programming with Haskell) tend to eschew libraries; even there, the libraries only appear at the end. Overall, there seem to be a lot of problems with Haskell pedagogy, something the Haskell Foundation might wish to take charge of (I know Hecate is moving funds to typeclasses.com and Julie Moronuki).

Second, certain libraries and tools are too painful to use, or are obsolete. GUI programming is not a strength of Haskell, for instance; for accessing the Windows filesystem via a GUI, tinyfiledialogs is obsolete but still crucial. Neophyte coders, likewise, are unlikely to take kindly to cabal’s or stack’s config-file-based setups.

Third, Windows support is an utter mess. getLine has been broken for years, and I’ve resorted to using Haskeline just to get around getLine (winio seems to have broken getChar as well in the latest version). The last time I tried to contact the Haskell Foundation about moving toward newbies and amateurs as a target audience, I got the line that the tooling and ecosystem simply weren’t mature enough, and I have to agree. The VS Code + HLS experience is pretty much easy mode for Windows users (hell, type suggestions make the type system feel practically dynamic), but Haskelly, which offers quite a few useful GUI additions (load GHCi, run file), is obsolete and requires hacking if you don’t want to work with Stack.
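(For the record, the Haskeline workaround I mean looks roughly like this; a sketch assuming the haskeline package, which simply swaps getLine for getInputLine.)

```haskell
import System.Console.Haskeline (defaultSettings, getInputLine, outputStrLn, runInputT)

-- Read a line through Haskeline instead of the broken getLine;
-- this behaves far better in Windows consoles.
main :: IO ()
main = runInputT defaultSettings $ do
  mline <- getInputLine "name> "
  case mline of
    Nothing   -> outputStrLn "no input (EOF)"
    Just name -> outputStrLn ("Hello, " ++ name)
```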

===

I’m just quite delighted to meet a Haskell Foundation ED who seems to share some of my views regarding how the Haskell community can move forward. I apologize for the rant, but I saw an opportunity to reiterate my views. Some of the people I’ve talked to seem enthusiastic about the same approach, but it’s incredible to find an HF ED in partial accord.

5 Likes

Could you possibly clarify, as an aside, what performance advantage you are talking about here, for the reader who is not familiar with Julia? Maybe some reference?

In Haskell we generally compile type classes as dictionary passing. That is very elegant and flexible, but it leads to many calls to functions which are not known at compile time (which is slow and limits optimizations).
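To make that concrete, here is a rough sketch of what dictionary passing amounts to (GHC’s actual Core is more involved, but the shape is the same): the constraint becomes an extra record-of-functions argument, and the class method becomes an indirect call through that record.

```haskell
-- What the programmer writes: an overloaded function.
sumOf3 :: Num a => a -> a -> a -> a
sumOf3 x y z = x + y + z

-- Roughly what it compiles to: the Num constraint turns into an explicit
-- "dictionary" argument, and (+) turns into a field selection followed by
-- a call to a function that is not statically known.
data NumDict a = NumDict { plus :: a -> a -> a, times :: a -> a -> a }

sumOf3' :: NumDict a -> a -> a -> a -> a
sumOf3' d x y z = plus d (plus d x y) z

-- One dictionary per instance, built once and passed around.
numDictInt :: NumDict Int
numDictInt = NumDict { plus = (+), times = (*) }
```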

An alternative is to specialize each function that uses a type class to all possible instances of that type class. You can tell GHC to do that by using -fexpose-all-unfoldings -fspecialize-aggressively. The problem with this approach is that it can lead to enormous binary sizes and very long compile times (try compiling GitHub’s semantic that way).
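There is also a per-function middle ground: a SPECIALIZE pragma asks GHC to generate dictionary-free copies only for the instances you list, which keeps binary size under control. A sketch:

```haskell
-- Overloaded code is compiled with dictionary passing by default...
sumTo :: (Num a, Ord a) => a -> a
sumTo n = go 0 1
  where
    go acc i
      | i > n     = acc
      | otherwise = go (acc + i) (i + 1)

-- ...but these pragmas make GHC emit specialised, dictionary-free copies
-- for exactly these two types.
{-# SPECIALIZE sumTo :: Int -> Int #-}
{-# SPECIALIZE sumTo :: Double -> Double #-}
```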

Julia solves this issue by compiling each function with ad-hoc polymorphism only when it is actually called. That means you only have to generate specialized functions for the types that are actually used. That does mean that the compiler needs to be present at run time.

I don’t think this reflects the data:

  1. Haskell is becoming increasingly popular in industry
  2. Industry users very much care about stability and maturity… and this has been pointed out as a weak point of Haskell many times
  3. You won’t find any language with fancier features than Haskell in the top 20 of the TIOBE index: Go is far more popular than Haskell, yet is pretty boring feature-wise. Rust has somewhat stopped evolving and is focusing on finally becoming robust and serious, and will likely soon enter the top 20.

Yet Haskell has a history of more radical experimentation, and we would do well to keep the researchers we have engaged and to leave room open for new ideas. In all the GHC/stability threads, we’ve discussed this many times, and I believe there are some approaches that could work (e.g. a language-standard switch like C compilers have, GHC LTS versions, etc.).

However, I question whether squeezing out the last bits of advanced type-level features to re-attract the small number of Agda/Idris devs who left us (as you indicated) will make any real impact on popularity or industry adoption. Those are goals independent of popularity, catered towards advanced users of the language, not the masses.

I’m excited about how the new ED will manage this with the Haskell Foundation. It certainly is a tricky task.

8 Likes

I cannot agree more with hasufell’s view.
Moreover, while hanging around Haskell communities, what I have learned is that the lack of job opportunities is what turned many people away. Heck, even a blog post in my locale pointed at the lack of profitable applications as the thing that put them off going through the hard journey.
For most people, the prospect of getting hired is one of the highest priorities. So they just learn languages they can get hired with, which is why I think Go/JS are becoming dominant in the software industry: because industry loves them. Meanwhile, Haskell is disliked by industry, so it is hard to get hired with it, causing people to eventually leave.

2 Likes

I’m criticizing the approach of relying on Haskell’s current strengths; i.e., it has traditionally been the cutting-edge language, but I’m pointing out that it is no longer as cutting-edge as it used to be, and possibly cannot retain that status.

I am not deriding attempts to keep Haskell on the cutting edge; that is orthogonal to my goals. I am just emphasizing that there is a lot of marginal profit to be gained from working in other directions.

Re: Julia: another advantage Julia’s proponents emphasize is that their multiple dispatch paradigm permits considerable concision.

Another thing to point out, by the way, is that ghci is only about 250 kb and ghc is only 100 mb. It might be possible to clone MDP through a JIT engine for MDP functions, calling the compiler when necessary and sticking to the binary when it’s not.

Julia is honestly a bit of a bugbear for me; every new Haskeller usually goes through a phase where they think Haskell is the best, the one language to rule them all, etc. Julia pretty much demonstrates the opposite: Julia has Haskell-like terseness, Python-like learnability, and C-like performance, even if it doesn’t match any of those languages on their home turf. I was originally trying to see whether Haskell pedagogy could be refined to the point that Haskell could compete with Python as a “hobbyist”, “amateur”, or “teaching” language. Julia’s substantial performance advantages, and probably learnability advantages, mean that Julia is better set up to take bites out of Python’s market share. And it shows: in most indices, Julia shows growth where Haskell shows stagnation, or, if Haskell is improving, Julia is improving faster.

1 Like

Thank you Jaro. I follow you until this:

How do you mean, «compiler needs to be present at run time»? This sounds like «dry water» to me. Do you mean that, during a single run of a program, once an overloaded function is called at a given type for the first time, a new specialized instance of that function is created and made available for the remaining lifetime of that process, as if it had been compiled ahead of time?

1 Like

Yeah, I think you’ve got it. Julia calls it “just-ahead-of-time” compilation, a combination of ahead-of-time and just-in-time compilation. I’m not sure what kind of caching strategy they use, though; presumably the generated machine code does get cached at some point.

1 Like

Well, doesn’t Julia cater to a different audience than Haskell? To me it sounds like it is designed for numerical calculations. The lack of true static types would certainly bite in complex programs. Of course, it can still beat Python. IMO the audience Haskell could take is that of Java, although I don’t know how to make that happen.

Was Haskell’s terseness that much of an attraction, btw? I’ve never seen people come to Haskell because of terseness; rather, I’ve seen many who were put off by it.

Also, I think Haskell still has a cutting edge on one front: its pervasive laziness. Perhaps what we’ve seen is that laziness is relatively impractical, yet it would still attract research-minded people.

For myself, and presumably other amateurs, it’s Haskell’s terseness that makes it beautiful. A C++ program (the language I was studying before Haskell) can be really long, difficult to understand, and full of vocabulary and structures one might be unfamiliar with. A Haskell program, on the other hand, just gets to the point, sometimes with boilerplate, but often with a certain elegance that makes it a pleasure to read.

I guess for more professional programmers, the terseness is just something they’re not used to, and when facing a language they’re unfamiliar with, the boilerplate often hints at what’s going on, but the terseness means you get it or you don’t.

But it’s why I love Haskell, tbh. Reading Haskell code, writing Haskell code, then optimizing the Haskell code for terseness (if not performance), for me, it’s a great pleasure. Remember, in literature and philosophy, we admire the laconic, but it’s rare that the density and verbosity of a text contributes to its aesthetic value.

IIRC, re: laziness, SPJ admitted that if Haskell had a successor, it’d be strict. What do we get out of laziness anyway? Infinite data structures? Some phantasmal performance improvement that’s too difficult to actually achieve? It’s an absolutely fascinating idea, but it seems incredibly hard to make it pay off.
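(The canonical exhibit for the infinite-data-structures point, for what it’s worth: a self-referential stream where laziness means you only ever pay for the prefix you actually demand.)

```haskell
-- An infinite list defined in terms of itself; under laziness, only the
-- elements we actually inspect are ever computed.
fibs :: [Integer]
fibs = 0 : 1 : zipWith (+) fibs (tail fibs)

main :: IO ()
main = print (take 10 fibs)  -- [0,1,1,2,3,5,8,13,21,34]
```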

2 Likes

I see that coming from C++, it would look great. However, I do think Lisp variants are much better in terseness, as you don’t need to specify types.

Indeed, concision is artistic. However, as is often the case, you have to explain what is going on to others for practical reasons. Not to mention that software engineering is more engineering than art, and the two divorced long ago.

True, I was simply commenting that laziness could steadily attract researchers until the next lazy language comes out. Plus, purity wouldn’t even exist if it were not for laziness; why would you go for purity if you adopted strict evaluation?

I also want to mention: this is sort of a chicken-and-egg problem, isn’t it? Businesses don’t use Haskell because they can’t get a sufficient supply of Haskell coders who deliver good value (i.e., Haskell can’t be that well-paid relative to value delivered), and Haskell developers can’t get Haskell jobs because businesses don’t use Haskell.

What I’ve heard is that the Haskell Foundation was attempting to handle things from the demand side, i.e., to upgrade intermediate Haskell devs into something more marketable in order to satisfy demand, which is intrinsically limited. What I’m proposing, on the other hand, is a supply-side bomb: hobbyists, learners, etc. don’t necessarily care whether a language is employable, because if they were serious about employment, they’d likely pick up multiple languages anyway. They just care about whether the language is interesting to use, and whether it’s “productive” for their needs.

It’s a different way to address the phenomenon you’re describing, i.e., people who learn Haskell eventually give up because they find out it’s hard to get employed writing Haskell. You can either solve their problem, which HF was seemingly intent on doing, or you can solve your problem, which is to find people who learn Haskell and don’t care that it’s hard to get employed writing Haskell. I’m not saying one approach is better or worse than the other; I’m just pointing out options. The Pandoc developer, for instance, definitely doesn’t care that he’s not going to be hired as a developer, given that, if I recall correctly, he’s a tenured philosophy professor.

And this, arguably, is one of the reasons why seeking new user markets, as HF ED Christiansen seems to imply, is worthwhile: the existing user markets will complain that they can’t get a job doing Haskell. User markets that don’t need a job doing Haskell (hobbyists, learners) have the advantage that they don’t get upset when they find they’ve sunk a year of effort into something that, at best, looks good on a CV but isn’t directly applicable.

2 Likes

Indeed, like many other things, it does have a chicken-and-egg aspect. However, I want to attribute it more to usability in business and to the hostility of management towards Haskell. I was told in many places that it is unfit for production use. Just as often, I could hear managers groan about FP languages. There were not as many complaints about the lack of a manpower supply; such complaints, where they existed, were more of an excuse. In the end, they think what they have now is good enough, so they usually stay there. They do not worry much about the supply side, especially if they can employ cheap labor using a commodity language anyway. So yes, we need some main driver towards Haskell in niche usages. IMHO, at the least, we should think about how to temper such hatred from management towards Haskell. Stability would also be crucial for a business to stand on Haskell.

I see; however, I’ve seen quite a lot of hobbyists doing Haskell, more hobbyists than employed personnel, in fact. Even I am a hobbyist. From my perspective, Haskell has always had an abundance of hobbyists, but they dare not attempt a job with it because next to no employers offer such positions. Plus, are these hobbyists going to be hired as cheaply as, e.g., JS/Python developers? I doubt it.
I think Haskell does provide an artistic aspect (as you mentioned), which is often great for many hobbyists. More libraries and tooling could certainly help, but that would go much better if there were more business interest.

This recent discussion would seem to belong in a separate thread. Better to discuss things properly there rather than piling onto a few polite remarks from an ED who hasn’t even started yet.

3 Likes

Agreed. Apologies for the sidetrack.

Discourse moderators can split these comments into a new topic, if any moderators are around.

2 Likes