The future of Dependent Haskell

If that task breakdown is as informative as the ones available in the weekly GCC-Rust status reports, that would be quite helpful - thank you.

Moreover, such a breakdown may help to improve this project’s “bus factor”:

“What If Linus Torvalds Gets Hit By A Bus?” - An Empirical Study

because other Haskellers could then carry out some of the subtasks - as I currently understand it, only sand-witch and yourself are its primary developers.

The work on Dependent Haskell has already made my Haskell-writing life better today. TypeAbstractions and RequiredTypeArguments are breaths of fresh air compared to the contortions you would otherwise have to go through. The consistency regarding wildcards (type A _ = ...) is a good example of how the DH work is improving overall language consistency, which I consider a huge win.
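For anyone who hasn’t tried these two extensions yet, here is a minimal sketch of the kind of code they enable, assuming GHC 9.10 or later; the names vshow and idv are just illustrative (vshow follows the pattern shown in the GHC user’s guide), and the point is that no Proxy or AllowAmbiguousTypes workarounds are needed:

```haskell
{-# LANGUAGE RequiredTypeArguments #-}
{-# LANGUAGE TypeAbstractions #-}

module Sketch where

-- RequiredTypeArguments: the caller passes the type itself,
-- instead of a Proxy value or a type application on an ambiguous signature.
vshow :: forall a -> Show a => a -> String
vshow t x = show (x :: t)

s1, s2 :: String
s1 = vshow Int 42        -- "42"
s2 = vshow Double 42     -- "42.0"

-- TypeAbstractions: bind the type variable right in the definition,
-- rather than recovering it via ScopedTypeVariables tricks.
idv :: forall a. a -> a
idv @a x = x :: a
```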

DH has also spurred efforts to improve the compiler’s internals, which should be important to anyone who cares about Haskell’s future.

Regardless of the “final destination” – to be clear, I still really want foreach and am confident about its future – I am grateful for all of the hard work that is being put into DH, since these incremental steps are wins in and of themselves.

14 Likes

The work on Dependent Haskell has already made my Haskell-writing life better today.

That’s all well and good for an established Haskeller today, but what about new Haskellers today, tomorrow, next week, next month, next year, or in the next decade? If they perceive “the learning curve” of a “Dependent Haskell” to be too steep, they will simply go elsewhere…

1 Like

They can just stick to Haskell 98 features (or GHC2021/24); it’s not going anywhere.

Haskell is a deep, feature-rich language with many ways to do the same thing. If beginners struggle, it’s because people don’t tell them that they don’t need to know it all to use it. I sure don’t know it all and I do great.

8 Likes

They can just stick to Haskell 98 [or 2010] features […]

MicroHs: A Small Compiler for Haskell (2024)

The Haskell 98/2010 ecosystem, which would conveniently allow that choice, has long since disappeared - will GHC2021 be next?

I think most of the bread-and-butter libraries on Hackage don’t require many extensions at all. I use them the same way I used them a decade ago, just fine. So the ecosystem is there if you don’t want to go full DH.

And if a theoretical future library does require DH in its API…then either don’t use it or do? Or find a library for your needs that doesn’t lean on DH? If it doesn’t exist, go write it :face_with_hand_over_mouth:

[…] the ecosystem is there if you don’t want to go full DH.

For now. And clearly your experience differs from that of one L. Augustsson…


[…] find a library for your needs that doesn’t lean on DH? If it doesn’t exist, go write it.

In addition to the many strict, unboxed and linear versions of already existing libraries? Ouch…

I had this exact theoretical debate about -XLinearTypes back in the day. “Oh, if we do this then it’ll infect the ecosystem. And LTs are half-baked, so that’d be bad!”

Like…if there are libraries that use these new features, it’s because those features are useful. If it gets so popular that you can’t avoid it, that’s a gigantic success and we are living in a new frontier of ergonomic, production-ready dependent types.

It probably won’t be like that, though :cowboy_hat_face: but I’m trying to impress upon you that this isn’t a real concern.

5 Likes

Would it? Like, Cabal will work with it somehow? HLS? Nix stuff? Random libraries? I’m sorry, but that doesn’t strike me as obvious at all. In fact, I don’t think this has ever happened in PL history, with C++ “inheriting” C being, maybe, the closest case.

And those alternatives are not even close to the detailed plan you wish DH to have. I’m not holding my breath for DH, and I share some of your skepticism, but at least the work is being done here and the ecosystem already exists. That is a huge practical difference compared to any “what if” argument. Count me in on the Lazy Lean or Agile Agda hype-train ride, but so far it simply doesn’t exist, not even close.

3 Likes

I’m not really grasping the point. Certainly, if you’re writing a Haskell compiler and you want to test it out on existing packages, or you want it to be self-hosting while still relying on existing packages, you might be annoyed that people are using LambdaCase, ScopedTypeVariables, DeriveFunctor, or any of the other popular extensions that you can’t compile. That’s quite reasonable. It feels like the one imaginable use case where other people’s Haskell-writing habits actually matter.

But I’m not seeing how that connects to the notion that new Haskellers can’t just write regular old Haskell. Unless they are trying to write production-ready parsers for other people’s Haskell - but if you try to do that as a new Haskeller, it’s a self-inflicted wound, I’d say.

If it gets so popular that you can’t avoid it, that’s a gigantic success and we are living in a new frontier of ergonomic, production-ready dependent types.

“If”, “might” - that seems rather “partial”. So what happens if there’s a Loss b? What’s the backup plan?

Having mentioned that, int-index’s listing of subtasks that would help to arrive at a “dependent Haskell” more quickly could be enough - if dependent types in Haskell turn out to be a loss, it will be much easier to revert to the most recent non-dependent version of Haskell (because it won’t be quite as old).

We already emulate dependent types in Haskell in many ways. This is not new. In that way, DH isn’t some completely greenfield thing. So I think you’re phrasing what DH is incorrectly. It isn’t some bugbear getting strapped onto GHC. It’s a smooth staircase and it isn’t one thing.
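As a minimal illustration of that point, here is the standard length-indexed vector that people already write today with DataKinds and GADTs - a textbook sketch, not anything DH-specific; the Nat/Vec/vhead names are just the usual example:

```haskell
{-# LANGUAGE DataKinds, GADTs, KindSignatures #-}

module Vec where

-- Type-level natural numbers, promoted to the kind level by DataKinds.
data Nat = Z | S Nat

-- A length-indexed vector: the length is tracked in the type.
data Vec (n :: Nat) a where
  VNil  :: Vec 'Z a
  VCons :: a -> Vec n a -> Vec ('S n) a

-- A total head: calling it on an empty vector is a compile-time error,
-- not a runtime crash.
vhead :: Vec ('S n) a -> a
vhead (VCons x _) = x
```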

3 Likes

The difference is that those features (strict, unboxed, and linear) are all non-default. Type-level programming is also non-default. I don’t understand the concern that DH is going to “take over the world” and force everyone to engage with type-level programming. Precedent clearly shows the opposite, i.e. you generally have to go out of your way to use “fancy” features.

1 Like

Then once the new dependent Lean-based language reaches parity with non-dependent Haskell, it would inherit most of the existing Haskell ecosystem as a bonus!

Would it?

In the case where the Lean-derived language literally was Haskell with dependent types as an extra feature? Then it would work - how could it not? You would in effect have a replacement Haskell implementation that supports dependent types as an extra feature.

But that would be optimistic, with the decision to get “close enough to existing Haskell” being the more likely one, as the result of balancing:

  • the extra effort in achieving perfect parity with existing Haskell,
  • and the effort needed to migrate the existing ecosystem to use the new dependent types.

And those alternatives are not even close to the detailed plan you wish DH to have.

As I mentioned earlier, int-index’s soon-to-appear page listing all subtasks could change that.

1 Like

Because they probably wouldn’t be able to use the Haskell libraries that exist now with “regular old Haskell” (98 or 2010). Hence the closing remark in the MicroHs paper.

We’ll both find out when int-index publishes that task-breakdown page.

Actually, what would getting a Loss b for DH even mean? I find it astronomically unlikely that it’s anything worse than current Haskell + new DH extensions that no one uses, like arrow syntax or Safe Haskell.

1 Like

They may not be in “the default set”, but all those libraries must still be kept in working order. So they will all still have to be maintained - a great deal of extra work, just like all those “fancy” extensions that GHC still has to support.

Here’s one possibility: universities, colleges and other centres of education decide to switch their curricula away from a “dependent Haskell” and its perceived-as-too-steep “learning curve” to a combination of:

  • Prolog (logic, laziness);
  • Erlang (functional, Prolog-based syntax).

with “dependent Haskell” possibly being relegated to an advanced-theory course (but with education budgets getting ever-tighter…that isn’t guaranteed either).

but… that’s not what the closing remarks are about at all… He wants to parse stuff on Hackage but can’t, because most packages there require GHC-specific Haskell features to parse, not to use. It’s not that writing Haskell 98 or 2010 has grown more difficult, it’s that parsing, err, ‘colloquial’ Haskell has grown harder. What’s more, the biggest issue seems to be libraries relying on TH and ghc-prim. TH is fiendish, imo, but not a particularly new feature, and seemingly newcomers aren’t too put off by its widespread usage just yet.

See https://youtu.be/uMurx1a6Zck?t=3738 for his rant.