Maintaining Haskell programs

Psst… Don’t tell anyone, but see Preparing for vty multi-platform release · Issue #260 · jtdaugherty/vty · GitHub

8 Likes

Is it really “issues” as in buggy? My understanding is that it was a design trade-off, made because the Windows terminal is a whole can of worms. So I don’t see how that’s really relevant to a discussion about Haskell ecosystem maintenance. People make software that is Linux-only all the time.

2 Likes

https://www.reddit.com/r/haskell/comments/7tutxa/vty_needs_your_help_supporting_windows/

5 years ago. I think I contacted the maintainer like a year or two ago and offered to see if I could get a fundraiser to get vty patched to support recent versions of Windows, but he told me then he had finally gotten some people to help out.

On the present timeline, maybe another 6 months? 12 months? Given that Mr./Ms. Daugherty had expressed interest years back, it’s sort of disappointing that it took 5 years just to get to this anticipatory point.

If people are not happy with the pace and they desperately need brick/vty on Windows, they can fork it. Otherwise they can sit tight and be grateful for the maintainership work the authors pour in.

The Windows terminal is a bit of a mess, and apart from the technical legwork there will be some difficult decisions to make (mainly on the interface: a uniform one that doesn’t take advantage of some Linux/macOS capabilities, or, vice versa, a composite one that doesn’t abstract cleanly over all platforms). I expect way longer than six months to see this completed.

3 Likes

There’s --offline, recently fixed.

1 Like

But wait.

As I see it, we have a typical coördination problem here. There are a bunch of people who want brick to work with the Windows terminal and are willing to pay, but none of them can afford the whole price by themselves. Here, of course, «price» can be metaphorical — it can be monetary, or it can be given in the form of work, or good will, or whatever else.

The wish of @Liamzy is to have a trusted coördinator (say the Haskell Foundation) pool the contributions and concentrate the effort. This seems reasonable to me.

It seems you are asserting that the only solution to this coördination problem is the «null solution» where everyone waits until some single person can afford the whole price. Is such pessimism truly warranted?

For example, maybe we can deploy a smart contract on some public blockchain that will collect contributions and send them to a designated address (perhaps an address whose keys are given to the maintainers of brick or vty) as soon as the amount reaches a certain threshold. This does not even require trust in a central authority.
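For concreteness, the threshold logic could look something like the sketch below — plain Haskell modelling the idea, not an actual on-chain contract language; all names, amounts, and the settlement rule are made up for illustration:

```haskell
import qualified Data.Map.Strict as Map

type Address = String

-- A pool of pledges that only pays out once a target amount is reached.
data Escrow = Escrow
  { pledges   :: Map.Map Address Integer  -- contributor -> amount pledged
  , threshold :: Integer                  -- release target
  , payee     :: Address                  -- e.g. the maintainers' address
  } deriving Show

-- Record a contribution, adding to any earlier pledge from the same address.
pledge :: Address -> Integer -> Escrow -> Escrow
pledge who amt e = e { pledges = Map.insertWith (+) who amt (pledges e) }

total :: Escrow -> Integer
total = sum . Map.elems . pledges

-- Release the pooled funds only once the threshold is met; otherwise every
-- contributor keeps a claim to a refund of exactly what they put in.
settle :: Escrow -> Either [(Address, Integer)] (Address, Integer)
settle e
  | total e >= threshold e = Right (payee e, total e)
  | otherwise              = Left (Map.toList (pledges e))
```

The point of the `settle` rule is the trust property from the post: contributors risk nothing below the threshold, and the coördinator never holds discretionary power over the funds.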

Considerations such as this make me think that there are solutions other than the null solution.

6 Likes

If you’re interested in maintaining a project, there’s the official process of Taking over a package on Hackage.


Otherwise, it’s OSS. Often there’s not much you can do to influence external projects. Maintainers appear and disappear. Huge contributions might be ignored for a very long time, or even rejected.

What I do in my OSS packages is write a DISCLAIMER like the one below, so at least contributors are aware of what to expect from me.

3 Likes

What I’d really want is a task force nominally under the supervision of the Haskell Foundation, willing to pitch in when vulnerable and key parts of the ecosystem (I’d count Accelerate as such, given that it was featured in Simon Marlow’s book) are in danger of regression or simply becoming unusable.

A library maintainer can request the task force’s help, perhaps offering repayment in kind (developer hours for some other project), simply to keep the library maintained.

To an extent, I’d think it’d help keep library maintainers involved because instead of sitting on their egg and guarding it, which can get boring (you’re not extending your library, you’re just keeping it from bitrotting as GHC evolves), they are trading favors and fixing others’ libraries, which can be more novel and provide opportunity for growth as a software developer.


One of the important things, I think, is Lisp, and more specifically, the Lisp Curse:

http://www.winestockwebdesign.com/Essays/Lisp_Curse.html

Haskell is not a Lisp. Lisps are about achieving ultimate expressive power through metaprogramming and unrestricted effects. Haskell’s macros, by contrast, are substantially less ergonomic, its type system constrains the possible programs, and you can’t throw effects just anywhere; so while Haskell’s purely functional programming is extremely expressive, it still trades raw power for safety, equational reasoning, and maintainability.

I’d consider this a good thing, but only if Haskellers acknowledge this to be true. Haskellers are extremely smart, extremely talented, but are often not capable of doing everything on their own, and require a mild level of collaboration to actually keep things afloat.

I hope that Haskellers, when they have a good library that is important to the ecosystem, immediately recognize the need for collaboration and partners, perhaps through open volunteer teams, to help keep the library maintained, to help keep the library moving toward future goals and updates, or just to provide a failsafe should they lose interest, fall ill, or worse.

Remember, Haskell itself is a design-by-committee language, albeit instead of being designed by a corporate bureaucracy like certain languages we know of, it was designed by very smart academics working together. Why not retain the same attitude for the Haskell ecosystem, albeit on a smaller scale?

4 Likes

The problem is that many packages compile only with a certain (narrow) subset of GHC versions. Besides, they often have too-narrow upper bounds, and their authors are often too free with introducing breaking API changes.

As a result, it’s relatively easy to end up with an application that won’t build with any GHC anymore, because one dependency requires an older compiler and another a newer one.

So, the only “guaranteed” maintainability in this ecosystem is the ability to patch your own code while staying on the versions of all the dependencies and toolchain that were current when you wrote your app. But if you want to upgrade a dependency — maybe because there was a security bug in it — it’s a coin-toss whether the newer version will fit the rest of your dependency tree. Which does not mean “easy to maintain”.
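Concretely, that kind of pinning is what a cabal freeze file captures — a sketch, with package names and versions that are only illustrative:

```
-- cabal.project.freeze, produced by running `cabal freeze`
constraints: any.base ==4.17.2.0,
             any.aeson ==2.1.2.1,
             any.vty ==5.38
```

As long as that file is checked in, `cabal build` keeps resolving to exactly these versions; loosening a single pin is where the coin-toss begins.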

3 Likes

Understanding old Haskell code as a human is pretty reasonable, but the accumulation of minor breakage over time in terms of the compiler and core libraries is a tar pit. Old C++ code that you can’t understand but that still compiles and runs today may be more valuable than old Haskell that you can better understand but requires some unknown amount of fixing to compile. Which situation is better will vary.

I think part of the challenge that GHC hasn’t conquered (no fault here, it might be an impossible problem) is how to make changes such that new code doesn’t carry historical baggage while old code can still compile. For instance, if you had a package that was more or less unmaintained and didn’t compile with the latest and greatest GHC out of the box, but could still be built in a Haskell2010 compiler mode, could you call into that package from new code written in a GHC2021 mode? One enormous challenge is writing the compiler such that it preserves those old paths so can compile both kinds of code, and another enormous challenge is making what is essentially an FFI between those language versions as usable as possible.
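GHC already has a small piece of this story: each module can pin its own language edition with a pragma, so a legacy module can declare Haskell2010 while the rest of the project is compiled under GHC2021 and imports it like any other module. A minimal sketch (the function itself is made up):

```haskell
{-# LANGUAGE Haskell2010 #-}
-- Pinning this module to the Haskell2010 edition: GHC compiles it with the
-- 2010 report's language regardless of the compiler's default edition, and
-- modules built under GHC2021 can still call into it directly.

legacySum :: [Int] -> Int
legacySum = foldr (+) 0
```

Of course this only freezes the surface language, not base or the rest of the toolchain, which is exactly where the “FFI between language versions” problem above remains open.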

6 Likes

I think code would break much less often if people really just used Haskell2010. A large part of the problem is that so much of the ecosystem uses various extensions which are at different levels of stability, or worse, depends on the internals of GHC itself.

Can anyone name one breaking change that breaks Haskell2010 code? I can only think of base library changes, but you should be able to get a Haskell2010 base library which matches the specification exactly.

1 Like

Everything would be simpler if people didn’t use extensions, but I don’t think it would be the same GHC! Part of GHC’s appeal (for some of its audience) has always been its willingness to experiment, and extensions are a plausible mechanism to do so. Base library changes are a great example of why interop between versions is so fraught.

What I mean to say is that I think you can only have two of these three points:

  • bleeding edge experimental features
  • a guarantee that all your programs will still compile in X years time
  • a simple compiler that can be maintained by a low-budget team

Of course all three points have varying gradations, so it might be possible to strike a better balance than we do now.

6 Likes

That’s a great summary!

I cannot think of any breakage in my code because of the use of extensions.

On the other hand I can think of one (minor) breakage that involved no extensions: the split of Monoid into two classes. But this was a very trivial change in my code.
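For reference, that was presumably the change in base-4.11 where Semigroup became a superclass of Monoid: an instance that defined only mempty and mappend stopped compiling until a Semigroup instance was added. A sketch of the “very trivial change”, using a made-up type:

```haskell
-- Before base-4.11 a Monoid instance could stand alone:
--
--   instance Monoid MyLog where
--     mempty  = MyLog []
--     mappend (MyLog a) (MyLog b) = MyLog (a ++ b)
--
-- After the superclass split, the fix is to move the combining
-- operation into a Semigroup instance:

newtype MyLog = MyLog [String] deriving (Eq, Show)

instance Semigroup MyLog where
  MyLog a <> MyLog b = MyLog (a ++ b)

instance Monoid MyLog where
  mempty = MyLog []
  -- mappend now defaults to (<>), so no definition is needed here
```

Notably, this is pure Haskell2010 code — the breakage came from the library, not from any extension.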

1 Like

I am curious to know which extensions have actually broken packages. My experience with broken packages is 99% due to the lower bound of base needing to be bumped, or TemplateHaskell code being broken by some internal change in GHC.

4 Likes

Respectfully disagree - to me it sounds like “maintainability would be much better if we all stick with GHC-8.10.7 (or older - whatever’s the latest GHC-7) and whatever Cabal/Stack that accompanies it”.

Sure, if toolchains did not change, and existing packages’ API didn’t change - maintaining Haskell apps would be a breeze. Unfortunately, that’s not the case. “And if wishes were wings - . . .”

39 years of software development make me think that language extensions are a likely source of maintainability issues. It’s analogous to the maintainability issues for C++ (and other “large” languages) for teams above a certain size. Small teams can decide on, and police, the set of language extensions they use. But this doesn’t work so well in large teams, especially if there is a lot of turnover over time.

2 Likes

What about language extensions does this exactly? As a manager or whatever, I guess I see how it can run afoul of the “engineers are cogs who should be able to get run over by buses” philosophy. But from a technical standpoint, I don’t get how a module being able to turn on LambdaCase, ImplicitParams, LinearTypes, DataKinds, TypeFamilies, etc. is gonna be any worse than a project having a messy dependency graph or being untestable (far more common problems, and language-agnostic).
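It’s worth noting that extensions are scoped per module, so each file declares exactly which language it is written in — e.g. a module opting into LambdaCase (the example code is made up):

```haskell
{-# LANGUAGE LambdaCase #-}
-- The pragma above enables LambdaCase for this module only; files that
-- omit it are compiled without the extension.

describe :: Maybe Int -> String
describe = \case
  Nothing -> "nothing"
  Just n  -> "got " ++ show n
```

So at least the “which dialect am I reading?” question is answered locally, at the top of every file, rather than project-wide.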

4 Likes

I agree that messy dependency graphs and untestable code are also likely to make Haskell code unmaintainable, but, as you point out, these are language agnostic. Language extensions, especially if many of them are used inconsistently across a large project, could greatly increase the difficulty of understanding and modifying the code. I suspect that as more and more extensions come into existence, it might even be possible for a large project to become arbitrarily unmaintainable. Lsmor pointed out some specific difficulties.

1 Like