That is indeed a rather different target audience from the big Haskell companies with lots of experts who already have most of it figured out. I wonder what the needs of that group are. Is stability really important to them? I think they will only have to deal with it after they’ve been using Haskell for a while. But maybe they’re already cautious.
I’ll throw in my two cents:
Do I feel like Haskell is dying? No.
Hobbist: someone who advocates for the rights of hobbits. See also, Gandalf, Haskell.
Sorry for all my rants; I was being too negative in this thread. I want to express my apologies.
I have been reading through this whole thread now, and I am a bit disappointed by the conclusions (are there any conclusions?). Is there anything we want to do about these negative feelings? Or maybe I missed something.
Personally, I really love writing Haskell.
I am disappointed/annoyed by:

1. The lag in tooling support (e.g., see Supported GHC versions — haskell-language-server 1.6.1.1 documentation). Maybe I have just grown too used to the excellent help during coding from `hls`, but it feels weird that my environment breaks all the time, for example, when I want to use the latest released GHC (one year after release).
2. Outdated and not-well-written (too complicated) documentation (especially the Wiki).
3. The lack of a comprehensive set of well-maintained core libraries. And here, I am not talking about bytestring etc. I am thinking about well-interacting statistics libraries, data science libraries, probably AI libraries, plotting libraries, and the list goes on.
I know, it is always easy to complain and harder to fix things, but here I am :).
In conclusion, I really love the language, and I think that Haskell is developing well, and yet I don’t feel particularly happy about the state of affairs outside the actual `*.hs` files.
For 1, there is a recent proposal, [Pre-HFTP] GHC.X.hackage: A tool for easing migrations to new GHC versions, that should make it possible for HLS to support new GHC versions much faster. This seems really promising, so I still have a lot of hope that this will mostly be resolved in the near future.
For 2, there has been a proposal for the wiki in particular, [RFC] Evolution of wiki.haskell.org, but no clear plan has formed yet, I believe. I think it is difficult to reach consensus on what the best course of action is. As for general documentation, I think the Haskell Foundation has set something in motion to improve it, but progress is still a bit slow. If you have concrete examples of bad documentation, please report them as mentioned earlier in this thread (thanks for making that repo @tomjaguarpaw!).
For 3, I consider all your examples to be in the data science domain. That’s a domain the Haskell ecosystem just isn’t very good at yet. As mentioned in this thread, there have been attempts to improve this before, but they haven’t been successful yet.
I advocate for an authoritative, community-supported guide hosted on haskell.org: onboarding up to the intermediate level, plus recommended reading for becoming an expert. This is possible/feasible, but agreement among the various parties seems to keep blocking it or burning out the contributors who push in this direction.
Edit: I also advocate for migrating and archiving the wiki’s content, folding it into haskell.org if it should be kept, and moving it off wiki.haskell.org; atm the wiki competes too much in search engine results when you look for a guide or doc on some random Haskell topic.
I like this:
“…an appreciation for the craft of programming that is exceedingly rare in the wider world of programmers…”
I think there is a spectrum from coders, to programmers, … to actual software designers. The latter know a range of paradigms and approaches, which Haskell introduces to others in the imperative bubble. Bit by bit, other languages, created by knowledgeable experts who know the domain, are incorporating similar things - even Java’s streams with the triad of filter, map, reduce.
Nowhere near as simple and clean as in Haskell, but echoing similar ideas. QWERTY vs. Dvorak.
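For what it’s worth, here is a minimal Haskell sketch of that filter/map/reduce triad; the function name and the sample numbers are invented purely for illustration:

```haskell
-- A minimal sketch of the filter/map/reduce triad in plain Haskell.
-- The function name and the sample numbers are invented for illustration.
sumOfDoubledEvens :: [Int] -> Int
sumOfDoubledEvens = foldr (+) 0 . map (* 2) . filter even

main :: IO ()
main = print (sumOfDoubledEvens [1 .. 10]) -- prints 60
```

Java streams spell out the same pipeline with `.filter(...).map(...).reduce(...)`, just more verbosely.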
IMHO Haskell has a multi-tiered learning and usage curve: initially clean and simple, and then, with the many new language pragmas, it branches out into a tree of DSLs. The ability to define new operators and monadic operations is also powerful, but can lead to very localized code structure and usage.
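To make that last point concrete, here is a small hypothetical sketch of a user-defined operator and helper; the names `(|>)` and `ensure` are invented here and not taken from any library mentioned in this thread. The snippet is perfectly readable, but its conventions exist only in the module that defines them, which is the kind of localized dialect meant above:

```haskell
-- A hypothetical sketch: a custom pipeline operator plus a tiny Maybe-based
-- guard. Both names ((|>) and ensure) are invented for this example.
infixl 1 |>
(|>) :: a -> (a -> b) -> b
x |> f = f x

ensure :: (a -> Bool) -> a -> Maybe a
ensure p x = if p x then Just x else Nothing

main :: IO ()
main = print ([1 .. 10] |> filter even |> ensure (not . null))
-- Just [2,4,6,8,10]
```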
Even if there is no big company X using Haskell, there is much to learn from it for anyone who programs.
Folks tend to forget that Bluespec, an HDL which is Haskell (or Haskell with syntactic sugar), is doing well. Google uses it, so does DARPA, and so do startups. Its competitor, Chisel, uses Scala. There is no other mature HDL at this level of abstraction in commercial usage other than these two. And these startups are shipping RISC-V CPUs written in non-popular/dying languages and getting venture funded! Bluespec, which came from MIT, does have a successor at the research stage that uses a DSL embedded in Coq; it is not likely to get industry adoption anytime soon. Try getting a newbie programmer to shift to Coq! In the HDL world, a lot of the Haskell features are a must-have, and the “difficult to use” tag simply does not exist.
It seems like Haskell is not doing well regarding industry adoption, open-source activity, and community activity… I wish the research side could keep Haskell from dying completely. (Or… is research moving out as well?)
Honestly, when I learned Coq it did not feel that hard. The bigger problem was writing total functions with limited matching capabilities, but that might be mitigated in a DSL.
Are you subscribed to the Haskell Weekly Newsletter?
I’m begging the mods to please just lock this thread; there’s no reason to keep having this conversation here. Every substantive point has been and will continue to be made in other threads where the topics are relevant.