It consists of the common sense foundation of Haskell.
Good luck with defining the "common sense foundation".
- A core library is one that only depends on base and is declared as such from the maintainers
That's impossible without basically moving all of the interdependencies into base. I think that's the wrong approach. Base should become lighter and more decoupled from GHC. This would achieve the opposite, I think.
So why not have something like a pure architecture and decouple core libraries from each other such that each library only depends on the base library?
To be taken seriously, the proposal would need to do a slightly deeper analysis to propose a realistic organization for the libraries.
That said, it would probably be easier to come up with a single `stdlib`.
Given the way the Haskell ecosystem works, we could (probably even more easily) just make that `stdlib` wrap/re-export the others.
But regardless of those semantics and how it's organized and presented to the world, I think the thing we have learned over the last 10+ years is that the hardest thing to do here is agree on what should actually be in that `stdlib`, along with the refactorings/changes to get closer to the proposed ideal…
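To make the wrap/re-export idea concrete, here is a minimal sketch (the module name and the selection are purely hypothetical, not an existing package) of a facade that contains no code of its own and only re-exports a blessed selection from other libraries:

```haskell
-- Hypothetical "Stdlib" facade: nothing but re-exports from base, text
-- and containers. Users would typically import it with NoImplicitPrelude
-- or qualified, as with any alternative prelude.
module Stdlib
  ( module Control.Monad
  , module Data.Maybe
  , Text   -- only the type, to avoid clashing with list functions
  , Map
  ) where

import Control.Monad
import Data.Maybe
import Data.Text        (Text)
import Data.Map.Strict  (Map)
```

The wrapping itself is the easy part; as said above, agreeing on what belongs in that export list is where the last 10+ years of discussion have gone.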
Not necessarily. For example, the interdependencies of the packages Win32, filepath, process, directory, unix and time could be removed by grouping them together into a larger package called "system".
Also the interdependencies on template-haskell have only existed for around two years and could probably be removed again.
I see basically 3 ways to remove an interdependency between core libraries:
- Remove by refactoring the package that includes the dependency.
- Remove by grouping with the dependency package.
- Remove by moving the parts you need into base.
In my opinion, these three possibilities should be considered in this order; moving the dependency into base should be the last resort.
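Purely as an illustration of the grouping option (the package name and module list are invented, and whether grouping is desirable at all is exactly what is debated below), a thin umbrella package could even leave the existing packages and their maintainers untouched and simply re-export their modules via Cabal's `reexported-modules` field:

```cabal
-- Hypothetical fragment of a "system.cabal"; none of this exists today.
cabal-version: 2.4
name:          system
version:       0.1.0.0

library
  default-language: Haskell2010
  build-depends:
      base
    , directory
    , filepath
    , process
    , time
  -- Re-export the underlying packages' modules unchanged, so downstream
  -- code depends on one package while the originals stay independent.
  reexported-modules:
      System.Directory
    , System.FilePath
    , System.Process
    , Data.Time
```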
⌠the language has evolved a lot since Haskell2010, I would like to see an official language specification.
To be a little more precise:
- You can still program using just the Haskell2010 standard; and that's plenty powerful.
- As at 2010, all Haskell compilers supported extensions beyond the 2010 standard; there just wasn't enough agreement about which extensions were "standard enough".
- And I'm not sure even by now whether there's enough agreement: we might all agree the language should support multi-parameter type classes; but that alone is not enough. Should an "official language specification" as at today include Type Families, Associated Types, Closed Type Families and Functional Dependencies? Should the FunDeps be spec'd per the only "official" academic papers covering them, or per what the actual compilers support? Which compiler? GHC's implementation describes itself as "bogus". (A small example of what is at stake follows below.)
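As a tiny, self-contained illustration (my own example, not taken from any report or paper): even this small program needs three extensions beyond Haskell2010, and a new specification would have to pin down exactly how the `c -> e` dependency is resolved.

```haskell
{-# LANGUAGE MultiParamTypeClasses  #-}
{-# LANGUAGE FunctionalDependencies #-}
{-# LANGUAGE FlexibleInstances      #-}

-- A multi-parameter class whose functional dependency says the element
-- type e is determined by the collection type c.
class Collection c e | c -> e where
  empty  :: c
  insert :: e -> c -> c

instance Collection [a] a where
  empty  = []
  insert = (:)

main :: IO ()
main = print (insert (1 :: Int) (insert 2 empty) :: [Int])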
In the five years I've been using Haskell, I feel that things haven't really made progress in this regard, e.g. the failure of the Haskell2020 specification.
The Haskell2020 "standard" process did eventually result in a blessed set of extensions. Has anybody (or any major Haskelling shop) restricted themselves to that set? Was it a worthwhile exercise? (I know I've just ignored it.)
What good would a standard specification do now? Trying to produce it would cause a great deal of debate. Then everybody would ignore it and carry on using the extensions they prefer (and avoid the extensions they don't, even if "mandated" by the standard).
Also it's sad to see that all Haskell compilers except GHC died years ago.
You can still download and use Hugs. You can switch to purescript.
Hugs supports the whole `Prelude`/`base`; and is powerful enough to implement sophisticated applications: for example, you can build in it a compiler for a modern FP language such as … GHC.
Libraries
Then after your opening paragraph, the only thing you talk about is Libraries/dependencies. How would a "better" library structure (for some value of "better") make any difference to producing a Haskell2025 standard?
I tried to explain how guidelines for core libraries could naturally evolve the language standard. In my opinion the language standard could also be based on the things that are most often used by programmers, like the things in `deepseq`. Surely it would also be possible to invent a similar mechanism that could be used instead. How things are implemented, or whether some particular function is safe or not, is not my point; it should mainly be useful. My point is the lack of a language standard at all.
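For instance (a trivial example of my own, not from the thread), `deepseq` is the kind of everyday, de facto standard utility that no language report mentions but that most production code bases end up using:

```haskell
import Control.DeepSeq (deepseq)

main :: IO ()
main =
  let xs = map (* 2) [1 .. 1000 :: Int]
      -- Fully evaluate xs before continuing, e.g. to keep a large thunk
      -- from leaking into a benchmark or across a thread boundary.
  in xs `deepseq` print (sum xs)
```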
Realistically I can't produce any sophisticated software without the use of core libraries. I don't know when Hugs was last updated, but the date on the website is from 2003. I doubt that it can build all the core libraries. And purescript is mainly for web apps and doesn't have the libraries I need. I don't depend only on core libraries, but they make up a huge part of my dependencies. It would be nice to know that core libraries won't pull in dependencies from outer space, where I don't know the dependency graph and, as a result, build times and binary size grow uncontrolled.
I mainly program in Haskell2010, but I don't know if I can rely on old libraries and compilers in the future; changes in hardware architecture are possible, for example. So I have an interest in things going well.
Besides, Haskell is the basis of my work: I've been writing a commercial app for around four years. That means I can't easily switch to another language; that would be the end of my business. All in all I think it was the right decision to use Haskell. I come from Python and it's a huge difference: it's a lot easier to get things right in the first place, and refactoring, which I'm constantly doing, is also a lot easier.
Lastly, I've read things like: "Why would someone use Haskell for business, when everyone should know it's only an academic language?" That makes me somewhat sad, as there surely has been promotion for more adoption of Haskell in industry.
I think the barriers to your proposal are less technical ones and more about the social aspect. As a community, we can't really ever agree enough to get big things like this done, and there is no central authority to help us make those types of decisions. Still, good on you for pushing something like this forward and looking for ways to help better define/bless best practices.
The reason nobody should be using Haskell for business (that is, commonplace commercial applications: payroll, ERP, even a web-based store) is that there aren't enough Haskell programmers, and they're all cranky and opinionated (talking about myself). Making your business hostage to propeller-heads would take commercial IT back to the 1980s.
Whereas there are gazillions of (say) Java/JavaScript/Python/COBOL programmers: if one gets too precious about "their" beautiful code, sack them and get another easily. Producing code is not the purpose of the business: it's a service to the business's main purpose. (If you're working for a software company producing a package, then yes, producing code is your business; but very few users of a package know or care what technology underpins it.)
(BTW the last release of Hugs was Sept 2006. It can build all the core libraries in the H2010 standard. It predates the AMP/FTP changes to the libraries; but since those were mostly reshuffling existing stuff and introducing a bit of H2010-compliant polymorphism, I don't think re-compiling for those would be major. But I'm certainly not going to bother, because those AMP/FTP changes are wrong: I'm cranky and opinionated about it.)
Not necessarily. For example, the interdependencies of the packages Win32, filepath, process, directory, unix and time could be removed by grouping them together into a larger package called "system".
Please no.
These packages have different maintainers. It's already difficult enough collaborating with some of these packages.
It will slow down development of all of these, cause much more communication churn, lead to more PVP incompatibility when unrelated parts of the API break, and, depending on the quality of the communication, may cause people to accidentally introduce changes to code they don't understand.
I think the barriers to your proposal are less technical ones and more about the social aspect
You nailed it.
I think programmers tend to see a technical problem in everything. Collaboration and coordination are hard in a fragmented set of volunteers who don't get paid.
API boundaries are good. They don't just force you to think more carefully about the scope of your project, but also serve as a communication nexus. They are contracts: they express intent and hide implementation details. All of this goes out the window if you unify loosely related packages.
That's why everyone is dissatisfied with base, why we can't figure out how to decouple it from GHC, and why we need an entire committee to maintain it.
We need the opposite direction: clean, isolated APIs. Yes, they also bring challenges: sharing of types, avoiding circular dependencies, occasional inlining of other libraries' functions. But that's more manageable than a huge blob of core functions with a very high entry barrier for contributors.
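A toy example of what such an API boundary buys (invented for illustration, not taken from any of the packages discussed): the export list is the contract, the constructor stays hidden, so the representation can change without breaking downstream code.

```haskell
module Queue
  ( Queue   -- exported abstractly: the constructor stays hidden
  , empty
  , push
  , pop
  ) where

-- Internal representation: two stacks. Because the constructor is not
-- exported, this can change without affecting users of the module.
data Queue a = Queue [a] [a]

empty :: Queue a
empty = Queue [] []

push :: a -> Queue a -> Queue a
push x (Queue front back) = Queue front (x : back)

pop :: Queue a -> Maybe (a, Queue a)
pop (Queue [] [])       = Nothing
pop (Queue [] back)     = pop (Queue (reverse back) [])
pop (Queue (x:xs) back) = Just (x, Queue xs back)
```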
Yes, that's funny. What I had in mind was "allgemein anerkannte Grundlagen", which better translates to "generally accepted principles".
Yes, maybe I view a lot of things too technically. But in the end, Haskell is a tool for solving technical tasks.
This looks promising:
…it even has its own paper! So everyone who is dissatisfied with `base` can join the effort to make it better, based on the KISS generally accepted principles.
I don't think this addresses any of the points I raised.
Maybe for the packages themselves, I would agree… but for the general direction and overall cohesiveness of the lot of them, I think we actually need a group or person acting as a decisive force with the trust and power to set the direction for "what is blessed and recommended for production code". We already have too many decisive bodies and committees, so adding another one isn't necessarily better… and the ones that exist approach this topic with reluctance… but to me the problem stems from no one taking ownership over the experience and direction of "Quality Haskell for Production", and then also being trusted by the community of users, as well as the community of maintainers, and community of academics.
Proposals such as this one continue to come up and be disagreed on to no end, and then fizzle out. Individuals and groups of people who step up to lead and show the way end up fighting uphill and burning out, giving up, moving on, or staying in their own corner doing their own thing in frustration and wishing everyone else would get on board with something that works. Just look at how many "alternative preludes" we have.
All the while, users are left drifting, we look a bit foolish, and people wonder why there is this idea that Haskell isn't great for production business software, or that attrition is a problem.
A `stdlib` would be pretty cool, as would an authoritative guide to "building production software with Haskell", but they need to be in sync and harmony with the package ecosystem, navigate picking sides on how things are recommended to be done, be made relevant in the community, and be maintained on into the future as packages and "the best way to do X" evolve. It's not that we haven't or aren't trying, but we aren't holding it together or finding success.
I think we actually need a group or person acting as a decisive force with the trust and power to set the direction for "what is blessed and recommended for production code"
I don't believe that it's even a good goal.
There are many competing opinions about what's good in production code. Everyone is free to create their own guidelines or gather with other maintainers and projects to try to come up with a cool set of guidelines.
Making any of this "authoritative" seems quite questionable. The more expressive the language, the more opinions you have. And many of them are valid and have their arguments.
Something like this should be much more focused on education rather than being authoritative. It should help people make better decisions, not tell them what to decide. Even something seemingly simple like "how to handle effects" will make someone vehemently vote for `RIO` and others for `polysemy`.
I also find it questionable to tell volunteers, who maintain (core) libraries in their free time, what architecture to pick, what patterns to use. If you want to influence (core) library development, become a maintainer and write code.
Individuals and groups of people who step up to lead and show the way end up fighting uphill and burning out, giving up, moving on, or staying in their own corner doing their own thing in frustration and wishing everyone else would get on board with something that works.
Well, there certainly is some burnout rate in Haskell. But I'm not sure this is the reason.
Core library maintainers are volunteers. It's already difficult to get improvements into the ecosystem that touch more than one library (like AFPP, which I'm currently working on). The main issue here is lack of engagement and communication. I'm not sure if that's systemic or just a function of time.
The problem I see is when you come up with a proposal and expect other people to implement it. Are you offering your help or are you telling other people what to work on? That's a fine line. Being authoritative will likely not improve the situation, but make it worse and only leave a couple of maintainers that have very clear guidelines, but no one to implement them.
There's a reason the CLC isn't an authoritative body for core libraries (in fact, it only maintains `base`), but a support network. It didn't work.
What I think we should focus on is:
- making contributing a joy: quick reviews that don't take several years
- improving communication across libraries (in a non-authoritative way)
- building trust (that's what the CLC is currently trying to do)
If those things improve, we may eventually be in a position where deciding on over-arching principles is more feasible. I don't see that now, and even if it were, I believe those should be very light principles.
Nobody should be told how to implement a library. I think many of the maintainers of the core libraries are not the original authors; maybe they're not so opinionated about the design. It's also totally fine if someone doesn't have the time or the changes are just too complicated; maybe others are willing to help.
If someone doesn't want to change the design, that's perfectly fine too. But then one should accept that at some point the library can no longer be considered a core library.
These changes aren't very urgent; it's more of a shift over time.