The future of Dependent Haskell

This argument makes no sense to me - don’t basically all universities teach C? Even assuming a university teaches Haskell, what would prevent them from teaching only a subset of syntax/code equivalent to Haskell 2010 or GHC2021?

3 Likes

(sigh) I should have been more specific - this is the closing remark from the final page of that MicroHs paper:

(But thanks for the YT link - I’ll have to watch that one later.)

There’s a world-wide standard for C; there’s also a world-wide standard for Prolog. So educators would only have to update their learning resources for those languages every few years or so.


[…] assuming a university teaches Haskell, what would prevent them from teaching only a subset of syntax/code equivalent to Haskell 2010 or GHC2021?

Apart from (re)writing the necessary libraries in Haskell 2010 or GHC 2021? Not much, I suppose (but I’m not an educator - you would have to ask them).

Good grief, there is so much the-sky-is-falling on this thread. Firstly:

No, don’t fork anything. There are barely enough resources to support the one stream of GHC development we have. Splitting it would halve the resourcing for each fork (or, more realistically, starve one of them).

We presume the DH developers are following GHC practice and disciplines, and hiding all their work behind extensions. If you don’t want DH features, don’t switch any on. If DH turns out not to deliver on its promise, they’ll become dead features, like … SafeHaskell.

The debate at the time of committing to DH was pretty clear: we’re [‘we’ = the Steering Committee] not sure enough to ‘prove’ anything yet; we won’t know until we try. This is not unreasonable for a language that supports experimentation – except Haskell is no longer an experimental language in the way it was before (say) 2010. What I didn’t see in that debate:

  • Any definition of what it is for DH to be ‘finished’.
  • Any timescale for delivery [**]. So getting @int-index’s ‘task breakdown’ will be of little use unless it comes with delivery dates.
  • Any enumeration of non-DH features/fixes or extensions that would be paused or for how long. So in effect there was a whole backlog of accepted enhancement requests that were just nuked.
  • (More subtly) how much more complex GHC’s internals would get, such that any later attempts to revive those features would turn out to be greater effort [**] than if GHC had never done DH.

[**] So for example, the early 9-series work on record fields/HasField proceeded without anticipating DH, and ended up using FunDeps (and potentially overlapping instances) – which are problematic, and in the bucket of features/fixes on which no progress is being made (and which I suspect will be more difficult to engineer in DH-riddled GHC).
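For reference, here is a sketch of my own mirroring the shape of the HasField class in GHC.Records that this footnote refers to (see the GHC user’s guide for the authoritative definition); the functional dependency `x r -> a` is the one at issue.

```haskell
{-# LANGUAGE PolyKinds, MultiParamTypeClasses, FunctionalDependencies,
             DataKinds, FlexibleInstances #-}

-- A local mirror of the shape of GHC.Records.HasField, written out so this
-- module is self-contained: the fundep says the field name and the record
-- type together determine the field type.
class HasField (x :: k) r a | x r -> a where
  getField :: r -> a

data Person = Person { name :: String, age :: Int }

-- For the real GHC.Records.HasField, GHC solves such constraints for
-- ordinary records automatically; this hand-written instance only
-- illustrates the shape.
instance HasField "name" Person String where
  getField = name
```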

Helpful for whom/for what? I’d be very reluctant to slow the snail’s pace of DH progress by getting diverted into making more noise, just to give the illusion of community engagement. You’re dreaming if you think “other Haskellers” would be anything more than a distraction.

Frankly, @atravers, if you’re sure you don’t want a language with Dependent Typing, now is the time to get off the bus [***] and go looking for some other language. It’ll only get worse the longer you hesitate. If you’re committed to Haskell (perhaps mandated by your employer) then suck it up. I programmed in COBOL/SQL/RPG for decades. My opinion on them as languages was not invited.

[***] As I have done, in effect: I’m not using the GHC 9 series at all. I’m applying the features/fixes and extensions I value to another Haskell.

1 Like

What is so bad about GHC 9 compared to 8 exactly? I feel like I can code Haskell the same as I did in 2016 no problem…

It seems MicroHs is mostly an exercise for Lennart’s own interest – all well and good that he had fun producing it. But why is it relevant to this thread? H2010 was always a useless standard – indeed even H98 was a dead letter: all Haskell systems of that time already supported all sorts of extensions. I’d go so far as to say H2010 is expressively inadequate. I’m not surprised there are few packages that can restrict themselves to H2010. So Lennart could have spent his time better by producing a (Micro-)compiler that supported all the features in the Cabal/toolset. More relevant for this thread:

What are these libraries using TH for? Does this suggest there’s a gaping hole in GHC functionality, for which TH is the only (fiendish) solution? Is addressing that gaping hole (that is, supporting that functionality within the language) what GHC should be prioritising instead of/before addressing DH? Or is DH’s main ‘selling point’ indeed exactly that it will render lots of TH code unnecessary? (If so, I didn’t see that mentioned during any of the debate.)

The lens library relies on TH. Is that what we’re talking about? Lenses are, imho, fiendishness on top of TH fiendishness. So the gaping hole is that Haskell doesn’t have a records system? Should that be the priority to address? There’s a Haskell that does support a records system, and that does integrate reasonably well with the de facto language+extensions as of … oh, 2006. It does need a Kind system; it doesn’t need type-in-type; it does need MPTCs+FunDeps+Overlaps. Is it anywhere on DH’s radar? (Again, I didn’t see any mention during the debate.)
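For concreteness, here is a minimal sketch (my own, not from the thread) of the TH reliance in question: lens’s makeLenses splice generates the record accessors at compile time.

```haskell
{-# LANGUAGE TemplateHaskell #-}
module Point where

import Control.Lens (makeLenses, (^.), (&), (.~))

data Point = Point { _x :: Double, _y :: Double } deriving Show

-- TH splice: generates lenses `x` and `y` from the underscore-prefixed fields.
makeLenses ''Point

-- Using the generated lenses to read and update a field.
shiftRight :: Point -> Point
shiftRight p = p & x .~ (p ^. x + 1)
```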

P.S. What does Lennart think is wrong with needing ghc-prim? I’d expect crucial packages for a MicroHs environment to need microHS-prim.

  • It’s int-index who will be publishing the DT task-breakdown, not me.

  • Here’s a post from a Haskeller who would be interested in that breakdown:

    Serokell’s Work on GHC: Dependent Types, Part 4

    (there may be “others”).

  • If enough of those subtasks can be done by others besides int-index and sand-witch, that ought to increase the speed of progress towards a full implementation of dependent types for Haskell (beyond the current “snail’s pace”).


If you’re committed to Haskell (perhaps mandated by your employer) then suck it up.

That only makes sense for more experienced Haskellers like us - we should not expect that “crawl over cut razor-wire and landmines” attitude from new Haskellers.

But since you mentioned employers:

…perhaps they got tired of having to “suck it up”.


I programmed in COBOL/SQL/RPG for decades. My opinion on them as languages was not invited.

Neither was this thread - it was a reaction to the hiding of this other post of mine in the original thread:

before it was moved here.


[***] I’m not using the GHC 9 series at all. I’m applying the features/fixes and extensions I value to another Haskell.

…much like how MicroHs has various Haskell extensions as permanent features: “you get what you get”.

Because there have been suggestions by others that it’s still possible to program as normal in Haskell 98 or 2010. But as this closing remark from the MicroHs paper shows:

the ecosystem has moved on: there are very few packages/libraries left which could be used by a Haskell 98/2010-only program or library.

Fair question. It’s very much my personal decision; I’m not advocating it for anybody else.

In one sentence: I hate all the features in the 9 series so far (and those planned); I see no circumstances in which I’d use any of them.

Furthermore, all the infrastructure around a release – including especially the compilation error messages – assumes you are (or might be) using all its features. I complained about this and put in a proposal – specifically about the error messaging suggesting AllowScopedTypeVariables; it got accepted; and it was amongst the backlog that got nuked by the DH decision. I think that is an execrable extension which should never have been allowed in that form, which certainly shouldn’t be included in GHC2021 in that form, and which suggests the (snail’s pace) stampede towards DH will be similarly unhelpful to learners. (The how, not the why.)

BTW it’s not that GHC 8.10’s features are so great; it’s more that GHC makes extra efforts to keep the last release of a series stable. And I do want some of the extensions from earlier in the 8 series, specifically PatternSynonyms.

Exactly: it’s int-index’s time I want devoted to working on DH, not diverted into being nice to people. Consumers will get “that breakdown” in release notes. Why do they need it earlier? Of course they’re “interested in” free (for them) stuff.

New Haskellers might encounter fewer such hazards using GHC 8.10 (and without AllowScopedTypeVariables). Either way, they need more hand-holding as to which extensions to favour and which to avoid; and to treat all suggestions in compilation messages with deep suspicion.

From int-index’s post:

  • I’ve been working on a task breakdown that I intend to publish by the end of this year.

  • As it stands, my draft for that page has 18 completed milestones, 5 ready for implementation, and 20 that are yet to be discussed and refined.

Therefore if more of those listed subtasks can be done by other Haskellers, then the “snail’s pace” towards dependent types for Haskell ought to improve.

I see two options here:

  • wait for that task breakdown to be published and observe what happens next,
  • or continue to speculate about its potential effects (good or poor).

New Haskellers might encounter fewer such hazards using GHC 8.10. Either way, they need more [assistance].

Unfortunately the Haskell ecosystem keeps moving on and, as with Haskell 98 and 2010, will eventually make the use of GHC 8.10 impractical, even for educational purposes. To provide the guidance you suggest in the long term, it seems the only option is, well, new options:

  • -XSimpleHaskell, to provide a “pedagogical” set of extensions and options,
  • -fno-error-suggestions being one such option.

There is a distinction between using a library as a dependency and a tool compiling it. The paper you link is concerned with compiling packages from Hackage, in which case, yes, the compiler needs to understand quite a few extensions, since it needs to build them.

But that does not mean users need to enable or even understand those extensions in order to use those same libraries in their own code. The async library, for instance, enables both ExistentialQuantification and RankNTypes, though you certainly do not need to understand them in order to use it.

It is a common GHC principle that extensions should only be required at the definition site, not the use site, though there are some counter-examples.
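As a minimal sketch of that principle (my own example, assuming the async package as a dependency): the use site below is plain Haskell 2010 with no extensions enabled, even though async itself is compiled with ExistentialQuantification and RankNTypes.

```haskell
module Main (main) where

-- No LANGUAGE pragmas needed here: the extensions live at the definition
-- site inside the async library, not at this use site.
import Control.Concurrent.Async (concurrently)

main :: IO ()
main = do
  (xs, greeting) <- concurrently (pure [1 .. 10 :: Int]) (pure "hello")
  print (length xs, greeting)
```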

It is quite feasible to write “simple” Haskell today, and there are plenty of accommodating packages.

5 Likes

It is quite feasible to write “simple” Haskell today, and there are plenty of accommodating packages.

Presumably in the same way the ecosystem would still be there if people aren’t interested in using dependent types:

As time passes, the list of “simple” (to use) packages will diminish, just as it has for Haskell 98 and 2010.

Any definition of what it is for DH to be ‘finished’.

Π-types and function promotion. We get that and we’re in dependent types land.
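As a minimal sketch (my own, not from the thread) of the duplication that function promotion is meant to remove: today a term-level function has to be rewritten as a type family before it can be used in types, whereas with promotion the single `plus` below would serve both levels.

```haskell
{-# LANGUAGE DataKinds, TypeFamilies, KindSignatures #-}

data Nat = Z | S Nat

-- Term-level addition.
plus :: Nat -> Nat -> Nat
plus Z     m = m
plus (S n) m = S (plus n m)

-- The same logic duplicated at the type level, because `plus` itself
-- cannot (yet) appear in a type.
type family Plus (n :: Nat) (m :: Nat) :: Nat where
  Plus 'Z     m = m
  Plus ('S n) m = 'S (Plus n m)
```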

Any timescale for delivery.

No timescale and no promises. We’re doing it as a series of incremental improvements, so if development stops one day, it’s still a net positive for the language.

Any enumeration of non-DH features/fixes or extensions that would be paused or for how long.

None. Zero. I don’t know where you got the idea that DH interferes with the development of other features. This is not a resource allocation problem where I either work on DH or some other Haskell feature. I’m interested in DH specifically, so it’s either that or nothing at all.

So in effect there was a whole backlog of accepted enhancement requests that were just nuked.

That never happened. If there’s some accepted GHC proposal that you’d like implemented, the way forward is to step up and submit a patch.

The GHC Steering Committee does not hand out tasks to contributors; it simply approves or rejects language changes that volunteers want to implement. No volunteers – no implementation.

how much more complex GHC’s internals would get, such that any later attempts to revive those features would turn out to be greater effort than if GHC had never done DH

What makes you believe the internals are getting more complex? They’re getting simpler. There’s already a plan to merge types and terms, at least syntactically – that’s less code duplication, more uniformity.

and which I suspect will be more difficult to engineer in DH-riddled GHC

The work I do on DH goes through many rounds of code review. We’re not getting a “DH-riddled GHC”; the engineering quality is as high as ever. Rather, we’re getting a “DH-enhanced GHC”, and it doesn’t preclude work on any other feature, even very ambitious features like LinearTypes.

if you’re sure you don’t want a language with Dependent Typing, now is the time to get off the bus

The opt-in principle is enshrined in #378, and we follow it. I challenge you to show a program that you can no longer write due to the work on DH done so far, or that you imagine will be invalidated by future work.

we should not expect that “crawl over cut razor-wire and landmines” attitude of new Haskellers

That’s what we have today. The razor wire beginners have to crawl over is well-documented in the Hasochism paper (2013), and it took me a detour to Agda before I figured out how to use GADTs and TypeFamilies effectively. The entire idea of DH is to sidestep all of that with a proper implementation of dependent types.
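To make the razor wire concrete, here is a sketch of my own (in the spirit of the Hasochism paper, not taken from it) of the encoding a beginner currently runs into: a length-indexed vector needs a promoted Nat, a GADT, and a hand-rolled singleton just to write a safe replicate.

```haskell
{-# LANGUAGE DataKinds, GADTs, KindSignatures #-}

data Nat = Z | S Nat

-- A vector indexed by its length.
data Vec (n :: Nat) a where
  VNil  :: Vec 'Z a
  VCons :: a -> Vec n a -> Vec ('S n) a

-- A "singleton" mirroring Nat at the term level, so the length can be
-- passed as an ordinary argument while still appearing in the type.
data SNat (n :: Nat) where
  SZ :: SNat 'Z
  SS :: SNat n -> SNat ('S n)

vreplicate :: SNat n -> a -> Vec n a
vreplicate SZ     _ = VNil
vreplicate (SS n) x = VCons x (vreplicate n x)
```

With a proper implementation of dependent types, the SNat mirror (and the duplication it represents) is exactly the kind of encoding DH aims to make unnecessary.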

I repeat: the work on DH is conducted as a series of incremental improvements. Worst case scenario is that the language is improved moderately instead of improved greatly.

specifically the error messaging suggesting AllowScopedTypeVariables

There’s no such extension. You either mean AllowAmbiguousTypes or ScopedTypeVariables, and neither extension is part of DH. You can nuke both of them from orbit, or at least remove them from error message suggestions, and it won’t affect DH in the slightest.


Any other concerns I could address?

31 Likes

The razor wire beginners have to crawl over is well-documented in the Hasochism paper (2013) […]

(While mentioned thrice in #378, this is the first time it’s been referenced here on this Discourse.)

The razor wire beginners have to crawl over is well-documented in the Hasochism paper […]

By the time a Haskeller has their first encounter with dependent types, they should no longer be a beginner. But as has happened with some other type-level entities, there is the risk of new (“What’s Haskell?”) Haskellers having that encounter at the very beginning, while they’re being introduced to other concepts such as:

  • call-by-need instead of call-by-value
  • binding instead of destructive assignment
  • functions free of side-effects, instead of imperative procedures
  • static types instead of dynamic types

and so forth. Hence my concern expressed here:

If they perceive “the learning curve” of a “Dependent Haskell” to be too steep, they will simply go elsewhere…

I don’t see how this is different from the current situation where beginners might be exposed to DataKinds, PolyKinds, GADTs, TypeFamilies, RankNTypes, etc.

In fact, you write:

as has happened with some other type-level entities

So is there a reason you want to single out dependent types? In a beginner’s mind, wouldn’t it all register as “advanced type-level Haskell” without fine-grained distinctions that would make DTs more intimidating than, say, type families or kind polymorphism?

7 Likes

In a beginner’s mind, wouldn’t it all register as “advanced type-level Haskell” […] ?

Only if the beginner doesn’t have an adverse encounter with dependent types while trying to understand the basics of declarative functional programming (in Haskell). Otherwise there’s the risk of Haskell being perceived as the programming-language equivalent of an IQ test.


So is there a reason you want to single out dependent types?

For the same reason that I’ve also been doubtful of functional dependencies, and of every type extension which has attempted (and failed) to replace them: the ability to program at the type level (which, in the case of functional dependencies, I have previously referred to as poor-man’s Prolog).
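To illustrate what “poor-man’s Prolog” refers to, here is a sketch of my own using the classic Peano example: type-level addition written as Horn-clause-like instances, with the functional dependency playing the role of a mode annotation.

```haskell
{-# LANGUAGE MultiParamTypeClasses, FunctionalDependencies,
             FlexibleInstances, UndecidableInstances #-}

data Z   = Z
data S n = S n

-- `a b -> c` says: given a and b, the "logic program" below computes c.
class Add a b c | a b -> c where
  add :: a -> b -> c

instance Add Z b b where
  add Z b = b

instance Add a b c => Add (S a) b (S c) where
  add (S a) b = S (add a b)
```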

…or for us:

  • Less code means less bugs.

So expecting future Haskellers to write more code in the form of proofs, propositions, and other artifacts of logic can only result in more bugs. Given the first-class status of both values and types in a future dependently-typed Haskell, just how smart will people need to be to debug that code?

Thanks @int-index for your long post, but I really really don’t want you to use up your time with the grumblers on Discourse (including me).

I disagree with nearly everything you claim. It’s pointless continuing; we’re merely repeating entrenched positions. Suffice to say, I am so disappointed with the direction of GHC, I’ve given up. It’s not just one or two topics (so not DH specifically, although I now realise there’s much that was groundwork for it); it’s pretty much everything GHC has addressed – and more importantly failed to address – since around 2016. I find Hugs (as modified by me) superior for my Haskelling purposes in every way to GHC 8.10, other than not providing PatternSynonyms – and even for that I’m lukewarm.

I feel I’ve volunteered heaps. It’s been thrown back in my face. So I’ve implemented them in Hugs.

To provide some balance here, I did say as my first response above:

That is: for those who are continuing to keep up to date with GHC, DH is not making the experience significantly worse, because it’s all behind extensions (as several have pointed out).

I think that reading code which uses DH is more counter-intuitive, but I totally agree with @int-index that:

  1. DH makes the inner magic visible and accessible to everyone
  2. DH, by merging types with terms, makes code more uniform, which allows us to get rid of code duplication

Hopefully DH does not block the development of alternative languages!

1 Like