Just to note, I’m also from Novosibirsk, Russia (though I don’t live in Russia anymore). I don’t think it’s valid to use ad-hominem arguments here.
Thanks everyone for all the thoughtful responses! I’m possibly starting a green-field project at my company, and have been a little on the fence about whether I should use Haskell or not. In my gut, Haskell is what I want to use, but public opinion (from friends and the internet) was causing me a little bit of doubt. But seeing experienced practitioners chiming in with a lot of positives helps me feel a lot more confident in my decision. I really appreciate everyone taking the time!
My glualint hobby project is also close to a decade old (it’s from 2014/2015 IIRC), and still quite nice to maintain. It’s rock-solid software that I manage to compile for Windows, macOS and two Linux architectures. I still love maintaining it.
In recent years I’ve been refactoring some things. I originally wrote it while I was doing my master’s, and let’s just say it’s code I wouldn’t write that way today. I recently switched it over from “IO everywhere lol” to a proper effects system. I found a bunch of bugs in the process, and only accidentally introduced a few backwards incompatibilities. Otherwise, it was a great experience to refactor.
It runs on 9.4, and is only one broken dependency away from working on 9.6. Definitely a success story imo.
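The “IO everywhere” to effects-system move above can be sketched in miniature. This is a hedged illustration, not glualint’s actual code: the `Log` class, `lintFile`, and the messages are all invented, and it uses plain mtl-style classes rather than whichever effects library the project adopted.

```haskell
{-# LANGUAGE FlexibleContexts #-}
{-# LANGUAGE FlexibleInstances #-}
-- Hypothetical sketch: instead of calling IO directly, the linter logic
-- asks for an abstract logging effect, so it can run purely in tests.
import Control.Monad.State

-- The only effect this piece of logic actually needs:
class Monad m => Log m where
  logMsg :: String -> m ()

-- Production interpretation: plain IO.
instance Log IO where
  logMsg = putStrLn

-- Test interpretation: collect messages in a State monad, no IO at all.
instance Log (State [String]) where
  logMsg msg = modify (++ [msg])

-- Business logic is now polymorphic in the effect monad.
lintFile :: Log m => String -> m Int
lintFile name = do
  logMsg ("linting " ++ name)
  pure (length name)  -- stand-in for a real warning count

main :: IO ()
main = do
  n <- lintFile "example.lua"                    -- runs in IO, prints the log
  print n
  print (runState (lintFile "example.lua") [])   -- pure run, nothing printed
```

Making the effect explicit like this is also how refactors surface latent bugs: every place the old code reached for IO now has to say what it actually needs.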
Just want to point out that there’s survivorship bias here, since everyone here likes Haskell in one form or another.
Well, I mean, it’s a necessary evil, I think, lol. Only people who have stuck it out long enough to maintain a Haskell project in the long term can actually comment on how easy or hard it is to maintain.
But we still do learn something useful here, even with the bias: The survivors and enthusiasts find Haskell code easy to maintain. If even the enthusiasts were to concede “It’s not as easy as you might believe”, or “It’s about the same as other languages”, then I might be worried!
I’m of the impression that those enthusiasts have already left for rust…
Tbh I hope that Haskell Foundation steps in and offers more protection for the Haskell ecosystem. It is very nice that we have all these libraries available for use, but often they bitrot or the maintainer vanishes.
Moreover, from the reports I’ve heard from various package maintainers, it often seems a bad idea to solo-maintain a Haskell package, and having a team or an organization in support to help keep packages current might be necessary, especially for important libraries.
There is survivorship bias indeed, but it is minimized by the fact that most Haskellers have experience with other languages, yet they chose to use Haskell.
Package maintenance is a problem in any language. I don’t think that expecting the Haskell foundation to ensure the continued life of big non-core libraries is a good plan.
Sometimes when big packages die out, new ones take their place. For most big things in Haskell (web servers, streaming, parsers, dynamic dispatch systems), there are several rather big libraries that compete for popularity. If hypothetically `conduit` were to die out, then eh, there are several alternatives. It’s a shame if your huge project depends on `conduit`, but that’s not an impossible hurdle to overcome. Worst case, you fork it or PR an update for the latest GHC and dependencies. That’s usually not too much effort.
I’m really not sure what role the Haskell foundation would have there. An iron grip on the ecosystem, or more of a focus on the language itself, GHC and the core libraries?
Speaking of maintenance, keeping stuff up to date is much easier with Haskell than e.g. Python in my opinion. By far, most errors are caught at compile time. There’s the occasional semantic change that surprises on production, but that’s not nearly as problematic as updating a Python code base.
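The “most errors are caught at compile time” point can be made concrete. A hedged illustration (the package, types and names are all invented): suppose a dependency’s v2 changed a parser from `String -> Maybe Config` to `String -> Either String Config`. In Haskell, every old call site becomes a type error the moment you bump the dependency, whereas in Python the mismatch would surface at runtime.

```haskell
-- All names here are hypothetical, for illustration only.
data Config = Config { port :: Int } deriving (Show, Eq)

-- The "new" signature after the imagined upgrade: errors carry a message.
parseConfig :: String -> Either String Config
parseConfig s = case reads s of
  [(p, "")] -> Right (Config p)
  _         -> Left ("not a port: " ++ s)

-- This call site had to be rewritten from its old Maybe-style pattern
-- match; the compiler pointed at it as soon as the dependency was bumped.
loadConfig :: String -> Config
loadConfig = either (const (Config 8080)) id . parseConfig

main :: IO ()
main = do
  print (loadConfig "9000")  -- parses successfully
  print (loadConfig "oops")  -- falls back to the default port
```

The semantic surprises mentioned above are exactly the changes this mechanism can’t see: same types, different behaviour.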
This is less of a problem in areas where the Haskell ecosystem is well-developed, and more of a problem where the Haskell ecosystem is relatively weak.
TensorFlow, for instance, is not being supported by Google anymore.
Accelerate’s Hackage release is stuck pre-9.x, Brick still doesn’t work on Windows because of issues with Vty, some libraries I think are promising are not being developed fast enough, etc.
That said, Accelerate has alternatives in Arrayfire bindings and Halide, but both of these are bindings to C++.
I think the big problem with Haskell’s ecosystem is that Haskellers like to pretend Haskell library maintenance can be run as though they were Lisps, where a single developer can keep a library active on their own. But GHC has a breaking-changes policy, and there have been quite a few library maintainers who’ve given up rather than keep up with GHC, so perhaps some kind of formal support infrastructure would be useful. At the very least, what happens if the library maintainer is run over by a truck? What about other unsavory accidents (afaik we have maintainers who seem to be living in an active war zone)? If these are maintainers of important libraries, it’d be best for people on standby to be able to take over smoothly.
Psst… Don’t tell anyone, but see Preparing for vty multi-platform release · Issue #260 · jtdaugherty/vty · GitHub
Is it really “issues” as in bugged? My understanding is it was a design trade-off made because the Windows terminal is a whole bag of worms. So I don’t see how that’s really relevant to a discussion about Haskell ecosystem maintenance. People make software that is Linux-only all the time.
https://www.reddit.com/r/haskell/comments/7tutxa/vty_needs_your_help_supporting_windows/
5 years ago. I think I contacted the maintainer like a year or two ago and offered to see if I could get a fundraiser to get vty patched to support recent versions of Windows, but he told me then he had finally gotten some people to help out.
At the present pace, maybe another 6 months? 12 months? Given that Mr/Ms Daugherty had expressed interest years back, it’s sort of disappointing that it took 5 years for us just to get to this anticipatory point.
If people are not happy with the pace and they desperately need `brick`/`vty` on Windows, they can fork it. Otherwise they can sit tight and be grateful for the maintainership work the authors pour in.
The Windows terminal is a bit of a mess, and apart from the technical legwork there will be some difficult decisions to make (mainly on the interface: a uniform one that will not take advantage of some Linux/macOS capabilities, or, vice versa, a composite one that will not abstract cleanly over all platforms). I expect it to take well over six months to see this completed.
But wait.
As I see it, we have a typical coördination problem here. There are a bunch of people who want `brick` to work with the Windows terminal, and they are willing to pay, but none of them can afford the whole price by themselves. Here, of course, «price» can be metaphorical: it can be monetary, or it can be given in the form of work, or good will, or whatever else.
The wish of @Liamzy is to have a trusted coördinator (say, the Haskell Foundation) pool the contributions and concentrate the effort. This seems reasonable to me.
It seems you are asserting that the only solution to this coördination problem is the «null solution» where everyone waits until some single person can afford the whole price. Is such pessimism truly warranted?
For example, maybe we can deploy a smart contract on some public blockchain that will collect contributions and send them to a designated address (perhaps one whose keys are given to the maintainers of `brick` or `vty`) as soon as the amount reaches a certain threshold. This does not even require trust in a central authority.
Considerations such as this make me think that there are solutions other than the null solution.
If you’re interested in maintaining a project, there’s the official process of Taking over a package on Hackage.
Otherwise, it’s OSS. Often, not much you can do to influence external projects. Maintainers appear and disappear. Huge contributions might be ignored for a very long time or even rejected.
What I do in my OSS packages is write a DISCLAIMER like the one below, so at least contributors are aware of what to expect from me.
What I’d really want is a task force nominally under the supervision of the Haskell Foundation, willing to pitch in when vulnerable and key parts of the ecosystem (I’d count Accelerate as such, given that it was featured in Simon Marlow’s book) are in danger of regression or simply becoming unusable.
A library maintainer can request the task force’s help, perhaps offering repayment in kind (developer hours for some other project), simply to keep the library maintained.
To an extent, I’d think it’d help keep library maintainers involved because instead of sitting on their egg and guarding it, which can get boring (you’re not extending your library, you’re just keeping it from bitrotting as GHC evolves), they are trading favors and fixing others’ libraries, which can be more novel and provide opportunity for growth as a software developer.
One of the important things, I think, is Lisp, and more specifically, the Lisp Curse:
http://www.winestockwebdesign.com/Essays/Lisp_Curse.html
Haskell is not a Lisp. Lisps are about trying to achieve ultimate expressive power through metaprogramming and unrestricted effects. Haskell’s macros are substantially less ergonomic, its type system constrains possible programs, and effects can’t be thrown around anywhere; so while Haskell’s purely functional programming is extremely expressive, it still ends up trading pure power for safety, equational reasoning, and maintainability.
I’d consider this a good thing, but only if Haskellers acknowledge this to be true. Haskellers are extremely smart, extremely talented, but are often not capable of doing everything on their own, and require a mild level of collaboration to actually keep things afloat.
I hope that Haskellers, when they have a good library that is important to the ecosystem, immediately recognize the need for collaboration and partners, perhaps through open volunteer teams, to help keep the library maintained, to help keep the library moving toward future goals and updates, or just to provide a failsafe should they lose interest, fall ill, or worse.
Remember, Haskell itself is a design-by-committee language, albeit instead of being designed by a corporate bureaucracy like certain languages we know of, it was designed by very smart academics working together. Why not retain the same attitude for the Haskell ecosystem, albeit on a smaller scale?
The problem is that many packages compile only with a certain (narrow) range of GHC versions. On top of that, they often have too-narrow upper bounds, and their authors are often too free with introducing breaking API changes.
As a result, it’s relatively easy to end up with an application that won’t build with any GHC anymore, because one dependency requires an older compiler, and another a newer one.
So the only “guaranteed” maintainability in this ecosystem is the ability to patch your own code while staying pinned to the versions of all dependencies and the toolchain that were current when you wrote your app. But if you want to upgrade a dependency, maybe because there was a security bug in it, it’s a coin-toss whether the newer version will fit the rest of your dependency tree. Which does not mean “easy to maintain”.
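A hypothetical `.cabal` fragment (all package names and bounds invented) sketches how such a dead end arises:

```cabal
-- Sketch of mutually unsatisfiable constraints, not a real package set:
build-depends:
    base        >=4.14 && <4.17   -- effectively pins you to GHC 9.0–9.2
  , old-parser  <1.1              -- abandoned; its last release only builds on GHC <= 8.10
  , shiny-codec >=2.0             -- its own bounds require base >= 4.18, i.e. GHC >= 9.6
```

No choice of GHC satisfies all three at once, so your options shrink to pinning everything or patching bounds yourself.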
Understanding old Haskell code as a human is pretty reasonable, but the accumulation of minor breakage over time in terms of the compiler and core libraries is a tar pit. Old C++ code that you can’t understand but that still compiles and runs today may be more valuable than old Haskell that you can better understand but requires some unknown amount of fixing to compile. Which situation is better will vary.
I think part of the challenge that GHC hasn’t conquered (no fault here, it might be an impossible problem) is how to make changes such that new code doesn’t carry historical baggage while old code can still compile. For instance, if you had a package that was more or less unmaintained and didn’t compile with the latest and greatest GHC out of the box, but could still be built in a `Haskell2010` compiler mode, could you call into that package from new code written in `GHC2021` mode? One enormous challenge is writing the compiler such that it preserves those old paths so it can compile both kinds of code, and another enormous challenge is making what is essentially an FFI between those language versions as usable as possible.
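For what it’s worth, GHC already does a version of this at module granularity: each module can pick its language edition with a pragma, and editions interoperate with no FFI-like barrier. A minimal sketch (single file for brevity, needs GHC ≥ 9.2; the “legacy” function stands in for something a `Haskell2010` module could export unchanged):

```haskell
{-# LANGUAGE GHC2021 #-}
-- This module opts into the GHC2021 edition; a module compiled with
-- {-# LANGUAGE Haskell2010 #-} in the same project links against it freely.
import Data.List qualified as L  -- ImportQualifiedPost: on by default in GHC2021

-- Plain Haskell2010-compatible code, callable from either edition:
legacySum :: [Int] -> Int
legacySum = foldr (+) 0

main :: IO ()
main = print (legacySum (L.sort [3, 1, 2]))
```

The unsolved part of the question above is coarser than this: keeping *old GHC-specific behaviour* (not just standard Haskell2010) compilable alongside new code, which per-module editions don’t fully address.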