The evolution of GHC

FOMO just means “fear of missing out”. It is indeed usually used disparagingly, in the sense that the person feeling the fear needs more backbone to just ignore the ad; but presumably the person feeling the FOMO, rather than the person using the pejorative “FOMO”, might not necessarily think in those terms :slight_smile:

I imagine the Pythonistas who think pattern matching is stupid shit are saying Python has developed a case of “FP envy” and that pattern matching is a passing fad. But presumably the people pushing the proposal (unless they are super cynical) think pattern matching is actually good!

That is a complicated question that is best taken elsewhere. But I invite you to join me down the Post-Keynesian rabbit hole :).

1 Like

Yeah, I hope it works out well for them. At least it seems that one of the Python influencers, Guido van Rossum (I heard he is the creator of Python, though I don’t know if that is true), was behind that big change.

There are a couple of misconceptions in the last few posts:

  • Haskell has never aimed to be “popular” in the sense of numbers of users/numbers of commercial applications built on it. (“Avoid success at all costs.”)
  • Market uptake/return on investment of cash/‘capitalist’ metrics have never been part of the equation. GHC and Hugs were/are academic projects. They rely on enough Educational Institutions giving enough funding (typically through academics’/researchers’ salaries). These days there are some commercial sponsors (and big thanks to them), but they’re not ‘investors’ looking to push GHC into an IPO. (Haskell is not Miranda™.)

So if you’re here making suggestions with the aim of widening Haskell’s appeal, GHC HQ will not be persuaded. From early on in the thread:

That “cost” is not monetary – or not that GHC HQ can see. It’s “social”, in that the community might get grumpy. OTOH some might be delighted a wart has been removed and replaced by the ‘proper’ way to achieve the functionality. Should the current ScopedTypeVariables or AllowAmbiguousTypes be preserved in perpetuity for fear of breaking somebody’s crufty code? They’re so yeuchy I’ve always avoided them. I’d want them gone – that is, if I cared any more. The FTP library changes I objected to not because they broke stuff, but because they brought in a design worse than what they were trying to ‘fix’ – as the instigator would have been told if they’d incurred the ‘social cost’ of actually consulting.
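
To illustrate the kind of wart I mean, a minimal sketch: ScopedTypeVariables only brings a signature’s type variables into scope over the body if you write an explicit forall in the signature, which is exactly the part many people find unintuitive.

    {-# LANGUAGE ScopedTypeVariables #-}

    -- Without the explicit 'forall', the 'a' from the signature would not be in
    -- scope in the body, so the annotation on 'ys' would introduce a fresh,
    -- unrelated type variable and the definition would not compile.
    reverseTwice :: forall a. [a] -> [a]
    reverseTwice xs = reverse ys
      where
        ys :: [a]   -- the same 'a' as in the signature, thanks to the forall
        ys = reverse xs

    main :: IO ()
    main = print (reverseTwice [1, 2, 3 :: Int])   -- [1,2,3]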

I agree about the problems, especially with the extensions, but as someone who came after the FTP change I do not get why FTP was problematic.
What problem did FTP intend to fix, and why was its fix worse than the problem? To me, the changes always seemed more like niceties.

It might have been like this in the past, but now a large part of the work on GHC is done by employees of Well-Typed, Tweag, Galois, and IOHK. And in fact, it is exactly these industrial users (mainly Tweag) who are pushing complex features like linear and dependent types. So, perhaps if one wanted to keep the language simple they should have pushed for avoiding success even more.

1 Like

Haskell usage in industry is easily 10x bigger than it was 10 years ago. Anyone who was trying to get a Haskell job at that time would find that hard to dispute. I got my first Haskell job 9 years ago, and back then I found a couple of job ads a year suitable for a developer on the junior/senior boundary; now there are a few of those a month.

5 Likes

hmm? Is it Tweag corporate pushing for those features? Or is it that the people pushing for those features happen to work for Tweag?

It’s less than clear to me why an industrial user would put up with an awful records ‘system’ and prioritise Dependent Types. Lack of (polymorphic, extensible) records is why I don’t promote Haskell to my industrial/commercial clients.
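
For anyone unfamiliar with that complaint, the usual minimal illustration: two records in the same module cannot even share a field name in plain Haskell, because each field generates a top-level selector function.

    -- Each record field generates a top-level selector function, so these two
    -- declarations cannot coexist in one module:
    data Person  = Person  { name :: String, age :: Int }

    data Company = Company { name :: String, ceo :: Person }
    -- error: Multiple declarations of ‘name’
    -- DuplicateRecordFields lets the declarations coexist, but that is still a
    -- long way from polymorphic or extensible records.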

Again, I’m not claiming there’s a Haskell that’s “simple”. I want ‘small’. It’s not clear to me that conflating namespaces is essential for Dependent Types; even if it is, it’s clear that conflating namespaces has made the syntax a whole lot bigger. (Maybe it’s trying to remain backwards-compatible that has bloated it. Perhaps Tweag should start afresh, throw out the baggage and sponsor a Haskell 2030 or some such.)

Hmm, as I asked: where are all these people? Do they not care/not want to volunteer an opinion on where Haskell is going? Perhaps their employers are sticking at very old releases, because keeping up with the churn is a cost with little benefit?

As a point of comparison, look at the number of people with an opinion about the FTP/AMP library changes (and note the couple of 'stepping back’s towards the end of that month). How many of those are still active in Haskell?

1 Like

See @rae: Update on Dependent Haskell - YouTube (around 8:22). @rae mentions that companies like Serokell and Obsidian Systems are paying their consultants to work on dependent Haskell.

1 Like

Imho this claim requires much more concrete data. The ‘statistic’ might only apply to your local neighborhood.

Let’s not stir sleeping dogs. ‘FTP changes’ is a shorthand term for a bunch of stuff that was “technically a separate proposal” (including the AMP reorg), none of which was clearly signalled in advance; and that wiki substantially underestimates what was actually coming. (For example, the wiki doesn’t mention that the changes introduced a bunch of instances, which might have conflicted with instances other libraries had already written.)
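
For those who came after the change, roughly what it looked like in code (assuming base ≥ 4.8 / GHC ≥ 7.10): Prelude functions that used to be list-only became Foldable/Traversable-polymorphic, so they now also accept Maybe, Either, pairs and so on, which is convenient but occasionally surprising.

    import Data.Foldable (toList)

    main :: IO ()
    main = do
      print (length [1, 2, 3 :: Int])                -- 3, as before the change
      print (length (Just 'x'))                      -- 1: Maybe is Foldable
      print (length ('a', 'b'))                      -- 1: the pair instance folds
                                                     --    over the second component only
      print (sum (Just 4))                           -- 4
      print (toList (Right 5 :: Either String Int))  -- [5]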

Here might be a place to understand the sorry mess. (And SPJ’s attempt to ‘move forward’ a few messages later.) There’s a mix of people surprised at the suddenness/lack of notice; surprised at the decisions (some eventually persuaded they were ‘logical’); surprised at the scope/impact of what appeared in one release; as well as plain disagreeing with the decisions.

How many are no longer active? Searching for a few names + Haskell yields recent results (2020 or later) for every name I’ve tried. Update: I haven’t found anything recent about Bryan O’Sullivan.

I was searching for jobs globally then and I’m talking about global availability of jobs now.

2 Likes

Oh, so you were talking about the process and, naturally, the details of the breakage? I did not know it was that serious. I like the streamlined F-A-M (Functor-Applicative-Monad) hierarchy that came out of it, but I might well not have appreciated the change if I had been involved with Haskell beforehand.
I wonder how it served the researchers. Was it in their best interest? Surely it was, right? Otherwise why would such a change have happened?

Well, imho ‘global’ does not add much to the data point. It is still a single data point, and it likely comes from a particular slice of the market. One possibility is that the overall pool of businesses using Haskell has been shrinking while the remaining Haskell businesses have been clustering in one particular group, which would give the illusion that there are more Haskell businesses than before.

I suspect they are happily earning a living, paying little attention to how the language and ecosystem evolve because they are sufficiently happy with how it is now and sufficiently happy with where it is going.

Granted, we can do much better on many things! But I don’t share your arch-pessimism; in fact, I’m not pessimistic at all about the future of Haskell.

2 Likes

The notion that commercial Haskell has grown by less than 10x in 10 years just doesn’t ring true based on my experience. You’re welcome to interpret this anecdotal evidence as you see fit.

4 Likes

I mean, it is hard to believe when concrete statistics point in the other direction.

There are tons of statistics indicating similar trends.

That’s a relative ranking. The languages that have overtaken Haskell have presumably grown even more than 10x. Swift didn’t even exist 10 years ago.

3 Likes

Please share, particularly if you can produce a summary analysis. I would love to see it!

1 Like