My talk "Functional Programming: Failed Successfully" is now available!

So now that we’re back on-topic: why didn’t pragmatism work in 1978? Because if it did work, John Backus’s ACM Turing-Award lecture would surely have been about (some aspect of?) Fortran, rather than a “clarion call” for radical (at the time) change…

I might well be wrong, of course. But do people expect everyone in the community to be right all the time? Or maybe the expectation is not to say anything unless there is strong evidence backing the claim. If so, natural conversation would become too difficult, but I can accept that.

I don’t, however, see how the technical arguments work here. Backward compatibility was not that big a concern: it breaks constantly with new versions of GHC. It was not a big concern when introducing the significantly more severe “simplified subsumption” change (which was made opt-in, not opt-out, only after the fact), and the story of backward compatibility is known to be controversial. If the problem is the dot itself, that could be overcome by choosing a different syntax. I don’t mean it should have been done that way – the dot itself is valuable – but I don’t see how this is such a big problem. So I don’t find this specific case any different in technical terms, and that is what makes me look for other reasons. But maybe I’m not an attentive reader and interpreted that epic discussion on GitHub wrongly.

1 Like

This is true for me too, but with a big caveat: I think the group of people who want to use Haskell is much bigger than the group of people who use Haskell (maybe 10x, to take a very rough guess) and the number of people who would want to use Haskell, if they were introduced to it in a way that appealed to them, is bigger still (maybe 100x, again extremely roughly).

If we made an effort to make Haskell more accessible to those groups then we could help them benefit from Haskell, and their contributions in turn would help Haskell! Such an effort would involve improvements to tooling and documentation, and fostering a software engineering culture within Haskell that is accessible to a wider range of people.

I think I agree with OP in this regard, although I differ in how I think the matter is best expressed and discussed.

6 Likes

(I realised after posting that I’m replying to Tom, but this is really a reply to the whole thread.)

IMO, language adoption / “perceived practicality” is not really correlated with one language being “better” than another. It is almost always a function of the language’s ecosystem, though: the number of available libraries for a given task, how well they are maintained, and how well they are documented.

I recently attended a talk by Shriram Krishnamurthi, who said that it’s all about education. In his view, a successful ecosystem must be good at education as well.

Hence I would try to take Haskell-the-language out of the equation. There is no way Lang X can succeed if its proponents are bad at teaching others about it, or think that time is better spent explaining why “Lang X is better than Lang Y” or that “you should really just try it to see” – because ultimately it is all about good teaching materials for Lang X and its best-in-class libraries.

8 Likes

Hmmmmm. It’s not as if anybody’s stopping people from trying out Haskell. Maybe ~20 years ago Haskell was in obscurity, but these days I don’t think there are any curious programmers who haven’t heard of Haskell.

A lot of programmers aren’t going to benefit from Haskell: if you’re churning out database-to-screen-to-keyboard-to-database applications in a rigid employer-dictated framework, on a 15-year-old application, what can Haskell teach you? Or the benefit will be ‘recreational programming’/hobby projects – in which case see paragraph 1.

I don’t see the point in talking about “contributions”. The continuing problem is there’s tiny capacity to make any changes to GHC or its tooling. Piling more people into the demand side will just make for more frustration. I note the people who are competent to make actual contributions (not me) are too busy to hang out on Discourse.

Some of those people would massively benefit from Haskell. I know because I was one of them. And it’s not about what Haskell will teach (perhaps this is one of the mental blockers: “the benefit of Haskell is that it helps you reach enlightenment”) it’s that Haskell is a massively more pleasant language for churning out database-to-screen-to-keyboard-to-database applications in a rigid employer-dictated framework, on a 15-year-old application.

I’m not sure what this means. It seems obvious to me that more competent people in the ecosystem will lead to more progress on the things we all benefit from.

Could you elaborate? @sgraf is posting above you, for example. @mpilgrem maintains stack and participated in this thread. Many people who work on GHC, Cabal, stack, GHCup and HLS are here regularly. So I must be misinterpreting something in your comment.

3 Likes

Hugely disagree with this point and this logic.

First, why are we deciding for people we don’t know what will or will not be good for them? We should treat them as adults who can make decisions for themselves. If we are being rational, we simply cannot make decisions of the form “there are people who would not benefit, so we should not do anything”. The best we can and should do is create more opportunities for everyone; those who need them will use them.

And yes, the more people there are in Haskell, the better it is for all Haskellers.

4 Likes

So what is something that has “mass appeal”, something that can be read from the side of a box at a computer shop?

…because these days, who doesn’t want to “get more stuff done” simultaneously? Moreover:

Exactly.

The choice of non-strict semantics by default, and consequently purity, now places Haskell at an advantage over most other “step-by-step-by-step” languages, where historically parallelism and concurrency have been thoroughly muddled up. But as the existence of ParaSail shows, nothing stops new parallel languages from appearing. So for Haskell to be synonymous with parallelism means taking the necessary measures now, while (most of) the rest of the competition is still deciding how to “add on” purity, e.g.:

(…let alone non-strict semantics ;-)
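
To make that concrete, here is a minimal sketch of what purity buys you – assuming the parallel package and its Control.Parallel.Strategies module, with expensive standing in for any pure, CPU-heavy function:

import Control.Parallel.Strategies (parMap, rdeepseq)

-- A stand-in for some pure, CPU-heavy computation.
expensive :: Int -> Int
expensive n = sum [1 .. n * 100000]

-- Because expensive is pure, evaluating its calls in parallel cannot
-- change the answer; compile with -threaded and run with +RTS -N.
main :: IO ()
main = print (sum (parMap rdeepseq expensive [1 .. 100]))

Swapping map for parMap here is safe by construction – there is no hidden state for the parallel evaluations to trample on.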

EDIT: Sorry, the following was meant to be a reply to this post by @graninas; I must have clicked on the wrong Reply.

I’ve tried to read or, at least, get an impression of, the almost 540 comments on the RecordDotSyntax GHC language extension proposal between its being made on 11 October 2019 and the announcement of the GHC Steering Committee’s conclusion on 3 April 2020. I also read the public emails of the GHC Steering Committee on the proposal between 9 December 2019 and 3 May 2020.

Based on my own impression of that process, I would say that it is not a good example of insufficient pragmatism as a perceived Haskell community value (recognising that the community is diverse; I’m not saying everybody involved in that process always took a ‘pragmatic’ approach; and also recognising that ‘principles’ are important too).

However, near the conclusion of that process, there was a passing comment by Simon Peyton Jones that, for me, did chime with your thesis. He wrote: “… We have waited a long time already – I have been engaged in debate about this topic for over two decades – and I think it’s time to decide something. …”. By ‘two decades’, I understand (EDIT: from this email, preparing for a vote) him to refer to a paper that he had written with Mark P. Jones in 1999 entitled Lightweight Extensible Records for Haskell.

That said, a significant part of those two decades would have fallen before ‘Haskell escaped from the Ivory Tower’.

No, I wasn’t “deciding for” anybody. I’m making a prediction of what decisions they’ll make “for themselves” after they’ve played with Haskell – based on having worked amongst programmers and commercial applications for decades. (Commercial applications that typically take the user row-by-row through the database, so will show no benefit from parallelism, pace @atravers’ claim for a “mass appeal”. [**])

Yes, that’s what I said Haskell is already doing in paragraph 1. You can lead a horse to water, but you can’t force it to drink.

I think this discussion has got to the point of repeating itself.

[**] Just what proportion of the industry is turning out PC games? And don’t they and the players have something useful to do with their lives?

Phew! You’ve gone through all that discussion? Epic!

Yeah thanks, but no: earlier than that. Haskell 98 records were very much seen at the time as a stopgap, because they had to put something into the standard. Hugs already had Trex in 1996 [3 below]/[5 in the 1999 paper], and Trex continued to be developed in Hugs up until ~2004. The 1999 paper essentially proposed abandoning H98 records and adopting Trex. That would have been a majorly breaking change – but it was an era when Haskell users were a tiny ‘ivory tower’, and much more tolerant of breakage.

Errm, coming into a thread titled ‘Failed Successfully’, I’m afraid I’ve lost track of the double/triple negatives going on here. Who was pragmatic? Who was purist? Who could have been more pragmatic?

@graninas’ claim was that one side of the debate wanted to implement . to be like OOP; the other side wanted not to implement . because it would be dumbly aping OOP – and only for that reason, as if Haskell must keep itself aloof from other languages. My memory (without going through the whole thread) is that nobody was particularly suffering from envy or jealousy of OOP. Rather it was: Haskell has got itself in a mess with . [**]; can we find a compromise syntax/lexing that allows all existing usages to co-exist (backwards compatibility) and also this new syntactically-specific usage?

@graninas suggested they could have proposed a different operator for field access – except what? Any symbol from user-space might already be taken, so again breaking backwards compatibility. (The compromise that was eventually accepted is sketched after the footnotes below.)

[3] B. R. Gaster and M. P. Jones. A polymorphic type system for extensible records and variants. Technical Report NOTTCS-TR-96-3, Computer Science, University of Nottingham, November 1996.

[**] because . is just a terrible symbol to use for such a common operator in lambda calculus as compose °; and because . is used for all sorts of purposes, including as a decimal separator and for module namespacing.
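
For anyone who didn’t follow the proposal through to the end: the design that was eventually accepted (what shipped as OverloadedRecordDot in GHC 9.2, if I have the history right) resolves the clash by whitespace-sensitive lexing – a dot with no surrounding space is field selection, a spaced dot is still composition. A minimal sketch, with Person just as an illustration:

{-# LANGUAGE OverloadedRecordDot #-}
import Data.Char (toUpper)

data Person = Person { name :: String, age :: Int }

-- p.name (no spaces) is field selection under OverloadedRecordDot...
greet :: Person -> String
greet p = "Hello, " ++ p.name

-- ...while a spaced dot is still ordinary function composition.
shout :: Person -> String
shout = map toUpper . greet

So all existing uses of . keep working, at the price of making whitespace around the dot significant.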

…because modifying a shared resource does require concurrency, which would help to explain why ParaSail has no global variables. Interestingly, this appeared in one of my searches yesterday:

Lazy Evaluation of Transactions in Database Systems (2014)

…so given the appropriate circumstances, maybe “read-only” transactions can occur in parallel.

But more generally…unless someone discovers a way to use e.g. tungsten carbide as a semiconductor, the future is multicore/threaded, and “straight-track” sequential programs will have to be adapted accordingly.
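
On a smaller scale, Haskell’s own STM already behaves that way: transactions that only read never invalidate one another, so they can proceed in parallel. A sketch, assuming the stm and async packages:

import Control.Concurrent.Async (mapConcurrently)
import Control.Concurrent.STM (atomically, newTVarIO, readTVar)

main :: IO ()
main = do
  stock <- newTVarIO (100 :: Int)
  -- Ten concurrent transactions that only read; with no writers around,
  -- none of them ever has to be re-run.
  readings <- mapConcurrently (\_ -> atomically (readTVar stock)) [1 .. 10 :: Int]
  print readings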


I’ve mentioned it before: Unicode didn’t exist back in 1987…

Of course I get that. . doesn’t even look much like °. But that’s the degree sign (as I used above), which isn’t quite right – it should be at mid-height (I pasted it from Wikipedia; I’ve no idea how to get it on my keyboard). Then @ at least has a circley thing at mid-height, and is a kind-of reserved symbol in Haskell, but it can’t currently appear in expressions.

. is used in math to mean multiply and also dot-product (at mid-height, which again we don’t have in ASCII), or to denote some arbitrary binary operation. It’s already too overloaded and too precious to use as a vanilla operator.

Yeah. The database is the global variable – and not just global to your program/all its threads, but global to every other user/program on the network. And updates to it need to be under commitment control, to avoid any other session seeing it in a half-updated state.

Sure, that makes sense where most activity is enquiries, with optimistic locking for updates. We then need all sorts of double-check and rollback strategies in case the database has changed after my user looked at it but before they entered their update. Did somebody above say “There is nothing “simple” about reassigning state”?

…yes, I do vaguely recall something being written to that effect - now what did I write again:

So a ParaSail library which does access a shared resource such as a network-wide database cannot use global references/variables for that purpose - it would need to work differently. But however it did work, concurrency would still be required because:

There is nothing “simple” about reassigning state.

…when it’s shared so vastly as a resource on a network, or just shared within a program.

Interesting idea. Let me tell you how it works in the real world: the fastest-moving product at the busiest store is also the data point that gets the most enquiries – so under this lazy update strategy it will suffer the worst read-latency.

A more realistic strategy is to remove as many of those data integrity ‘promises’ as possible. Typically, that means allowing the stock-on-hand balance to go negative (even though that makes no sense physically), in the expectation that delayed transactions will make up for it. Of course this then needs a human follow-up procedure to enquire on negative on-hands and go physically examine what’s on the shelves.

Amazon, for example, will happily sell you some dead trees and take your money with no idea whether it can ship them to you – either within the promised delivery window or ever. They’ll then take their time using your money before giving it back because the product was undeliverable.

In this scenario, enquiries on the fastest-moving product at the busiest store are next to useless. The system might as well make up a number as force all that read-latency. (What I’d record is the date/time of latest reasonably confident stock level; plus a metric for how fast-moving; then guess a number by applying the decay metric over the intervening time interval.)
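
Something along these lines – a back-of-envelope sketch only, with made-up record and field names:

import Data.Time.Clock (UTCTime, diffUTCTime)

-- Hypothetical: the last confident reading plus a "how fast it moves" rate.
data StockSnapshot = StockSnapshot
  { lastConfidentLevel :: Double   -- units on hand at the last confident read
  , takenAt            :: UTCTime  -- when that reading was taken
  , unitsPerHour       :: Double   -- roughly how fast the product sells
  }

-- Guess the current level by decaying the last reading over the elapsed
-- time, instead of forcing a read on the hottest row in the database.
estimateNow :: UTCTime -> StockSnapshot -> Double
estimateNow now s =
  let hoursElapsed = realToFrac (diffUTCTime now (takenAt s)) / 3600
  in  max 0 (lastConfidentLevel s - unitsPerHour s * hoursElapsed)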

So a lazy database update might work. But you’ll have to redesign the whole user-facing business logic and manage user expectations. The programs themselves – sequential or otherwise – are not really the place to tackle it. I don’t see Haskell vs (say) COBOL having any real bearing here – explaining which is why I’m going so deep into the weeds on this thread. And I wouldn’t bet my stock control on an application infrastructure for which there’s only a handful of (rather too purist) programmers in the country. That’s why I say

Perhaps I mean: a lot of employers of programmers aren’t going to benefit from their employees grokking Haskell. Especially not if it makes their employees as argumentative as us lot round here (myself included :wink:).

1 Like

…which could also explain the allegedly-slow uptake of Haskell commercially. But this also raises a question about the use of multi-paradigm languages: how do employers of programmers prevent their employees from arguing over which paradigm to use and when?

  • if there’s some generic strategy, perhaps that could be reused in a commercial Haskell setting;

  • if there isn’t…then using a multi-paradigm language risks employees being equally argumentative, but about paradigms.

(Hmm…this could be another reason why FP was expelled from Python: to avoid this form of distraction.)

For the record: I don’t oppose OO. I just think it’s overused, maybe.

Record dot: My only gripe (and even then only a soft one) is the dot. I would have preferred # (which has been in OCaml for decades). But I concede that, as you pointed out, any other symbol like # has the same ambiguity issue, and if the parser gets it to work then the case is closed. Really, even if one opposes OO, any record syntax cannot have OO meaning unless you also semantically have…

Extensible records: I actually would love that. The one thing in Hugs that I wish GHC had is extensible records. (Then again, I guess the fact that it did not gain traction in the Haskell community is evidence for what you said about the community.)

Existential types: I actually oppose, and cringe, when people cite the “existential type anti-pattern” link. Please stop calling it an anti-pattern. OK, if beginners reach for it out of instinct (see what’s wrong with intuition?) for very easy scenarios, then I oppose it and recommend instead simpler ways (in this case low tech and low setup) with just functions and algebraic types. But I am OK if people choose it after knowing their options.
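
To show what I mean by “low tech and low setup”, here is a toy sketch (the shape example is mine, not from that article):

{-# LANGUAGE ExistentialQuantification #-}

-- What beginners often reach for: a class plus an existential wrapper,
-- just so different shapes can live in one list...
class HasArea a where area :: a -> Double
data AnyShape = forall s. HasArea s => AnyShape s

-- ...when, if all we ever do is ask for the area, a plain data type is
-- enough: each "instance" is just a value, no class or wrapper needed.
newtype Shape = Shape { shapeArea :: Double }

circle, square :: Double -> Shape
circle r = Shape (pi * r * r)
square s = Shape (s * s)

totalArea :: [Shape] -> Double
totalArea = sum . map shapeArea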

And even that “simpler, low tech, low setup” point about functions and algebraic types – I only call it true in Haskell (and SML, Idris, …). Did I say something bad about Python? And yet:

One time I was grading an assignment that asked students to “code up boolean expressions and an evaluator” in 3 languages: Prolog, Haskell, Python. One student liked Haskell so much that they translated their Haskell into Python, i.e.,

if isinstance(e, And):
  ...
elif isinstance(e, Or):
  ...
elif ...

I gave it a very low mark. If you use Haskell, play to the strength of Haskell. If you use Python, play to the strength of Python.
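
For contrast, the Haskell version that plays to Haskell’s strength is just an algebraic type and a pattern-matching evaluator – a sketch, with the constructors being my guess at what the assignment asked for:

data Expr = Lit Bool | And Expr Expr | Or Expr Expr | Not Expr

eval :: Expr -> Bool
eval (Lit b)   = b
eval (And l r) = eval l && eval r
eval (Or  l r) = eval l || eval r
eval (Not e)   = not (eval e)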

I don’t oppose OO. But I oppose the mainstream dogma that OO is the only way to be extensible. The Expression Problem implies that that dogma is false. It is another reason why I’m so skeptical about the mainstream.

A good rule of thumb when a Haskell blog says “never do X”: ignore the directive to never do X. It’s not hard to assess your programs as-is…you don’t need hard rules like that.

A good example of this is the often-linked “What’s wrong with Implicit Params?”

The example given to illustrate the problem is pathological and definitely not a strong enough argument to not use -XImplicitParams.

And the conceptual argument about incoherence feels almost like a non sequitur…incoherence is right there on the tin, and it’s not inherently a bad thing even if it sounds like a bad word.

If I had read this advice and just filed away “-XImplicitParams = bad” in my brain, I would have become a worse Haskeller. There are plenty of fun and useful ways to use IPs that I would’ve missed out on.
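
For instance, threading a setting through call sites without touching every intermediate signature – a small sketch, with ?verbose just as an example:

{-# LANGUAGE ImplicitParams #-}

-- An implicit parameter shows up as a constraint, so functions in the
-- middle don't need an extra argument just to pass it along.
logMsg :: (?verbose :: Bool) => String -> IO ()
logMsg msg = if ?verbose then putStrLn msg else pure ()

main :: IO ()
main = let ?verbose = True in logMsg "starting up"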

hah, i actually loaded up the “counterexample” in ghci and toggled NoMonomorphismRestriction as requested, and as expected -Wtype-defaults called the code out as suspicious.

horrific-implicit-params-example.hs:7:24: error: [GHC-18042] [-Wtype-defaults, Werror=type-defaults]
    • Defaulting the type variable ‘t0’ to type ‘Integer’ in the following constraint
        Num t0 arising from the literal ‘456’
    • In the expression: 456
      In the expression: let ?myparam = 456 in result
      In the expression: (result, let ?myparam = 456 in result)

so that is definitely Bad Code but not due to -XImplicitParams.

like i said - evaluate code on its own merits instead of trying to find global rules handed down by blog posts :stuck_out_tongue_winking_eye: