Haskell Architecture Proposal

1 Like

It consists of the common sense foundation of Haskell.

Good luck with defining the “common sense foundation”.

4 Likes

This could be worth a look:

https://www.simplehaskell.org

2 Likes
  • A core library is one that only depends on base and is declared as such by its maintainers

That’s impossible without basically moving all of the interdependencies into base, which I think is the wrong approach. Base should become lighter and more decoupled from GHC; this would achieve the opposite.

4 Likes

So why not have something like a pure architecture and decouple the core libraries from each other, such that each library depends only on base?

To be taken seriously, the proposal would need to do a slightly deeper analysis to propose a realistic organization for the libraries.

That said, it would probably be easier to come up with a single stdlib.

Given the way the Haskell ecosystem works, we could (probably even more easily) just make that stdlib wrap/re-export the others.
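As a minimal sketch of that idea (the ‘stdlib’ package and module names here are hypothetical, not an actual proposal), such a wrapper could consist of little more than modules that re-export modules from existing core packages:

    -- A hypothetical wrapper module: it adds no code of its own and simply
    -- re-exports two modules from the filepath and directory packages.
    module Stdlib
      ( module System.FilePath
      , module System.Directory
      ) where

    import System.FilePath
    import System.Directory

A user would then depend only on the wrapper and track a single set of version bounds, while the underlying packages keep their own release cycles.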

But regardless of those semantics and how it’s organized and presented to the world, I think the thing we have learned over the last 10+ years is that the hardest thing to do here is to agree on what should actually be in that stdlib, along with the refactorings/changes needed to get closer to the proposed ideal…

Not necessarily. For example, the interdependencies of the packages Win32, filepath, process, directory, unix and time could be removed by grouping them together into a larger package called ‘system’.
Also, the dependencies on template-haskell have only existed for around two years and could probably be removed again.
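If such a grouping were ever attempted, one way to do it without merging any code would be a thin umbrella package using Cabal’s reexported-modules field. The sketch below is purely hypothetical (the ‘system’ name and the module selection are illustrative), and the underlying packages would keep their own maintainers and release cycles:

    cabal-version:      2.4
    -- system.cabal: a hypothetical umbrella package that only re-exports
    -- modules from existing core packages.
    name:               system
    version:            0.1.0.0
    build-type:         Simple

    library
      default-language:   Haskell2010
      build-depends:      base, filepath, directory, process, time
      reexported-modules: System.FilePath,
                          System.Directory,
                          System.Process,
                          Data.Time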

I see basically 3 ways to remove an interdependency between core libraries:

  • Remove by refactoring the package that includes the dependency.
  • Remove by grouping with the dependency package.
  • Remove by moving the parts you need into base.

In my opinion, these three possibilities should be considered in that order; moving the dependency into base should only be the last resort.

… the language has evolved a lot since Haskell2010; I would like to see an official language specification.

To be a little more precise:

  • You can still program using just the Haskell2010 standard; and that’s plenty powerful.
  • As at 2010, all Haskell compilers supported extensions beyond the 2010 standard – there just wasn’t enough agreement about which extensions were ‘standard enough’.
  • And I’m not sure even now whether there’s enough agreement: we might all agree the language should support multi-parameter type classes, but that alone is not enough. Should an ‘official language specification’ as at today also include Type Families, Associated Types, Closed Type Families and Functional Dependencies (see the sketch below)? Should the FunDeps be spec’d per the only ‘official’ academic papers covering them, or per what the actual compilers support? Which compiler? GHC’s implementation describes itself as ‘bogus’.
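To make the overlap concrete, here is a minimal sketch (class and function names are purely illustrative) of the same abstraction written once with a functional dependency and once with an associated type family; deciding which of these spellings a standard should bless is exactly the sticking point:

    {-# LANGUAGE MultiParamTypeClasses #-}
    {-# LANGUAGE FunctionalDependencies #-}
    {-# LANGUAGE TypeFamilies #-}
    {-# LANGUAGE FlexibleInstances #-}  -- even this toy example needs a fourth extension
    module CollectionSketch where

    -- 1. Multi-parameter type class with a functional dependency:
    class CollectionFD c e | c -> e where
      insertFD :: e -> c -> c

    instance CollectionFD [a] a where
      insertFD = (:)

    -- 2. Single-parameter class with an associated type family:
    class CollectionTF c where
      type Elem c
      insertTF :: Elem c -> c -> c

    instance CollectionTF [a] where
      type Elem [a] = a
      insertTF = (:)

The two features overlap substantially but have different trade-offs, which is part of why agreement is hard.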

In the five years I’ve been using Haskell, I feel that things haven’t really made progress in this regard, e.g. the failure of the Haskell2020 specification.

The Haskell2020 ‘standard’ process did eventually result in a blessed set of extensions. Has anybody (or any major Haskelling shop) restricted themselves to that set? Was it a worthwhile exercise? (I know I’ve just ignored it.)

What good would a standard specification do now? Trying to produce it would cause a great deal of debate. Then everybody would ignore it and carry on using the extensions they prefer (and avoid the extensions they don’t, even if ‘mandated’ by the standard).

Also it’s sad to see that all Haskell compilers except GHC died years ago.

You can still download and use Hugs. You can switch to purescript.

Hugs supports the whole Prelude/base; and is powerful enough to implement sophisticated applications: for example, you can build in it a compiler for a modern FP language such as … GHC.

Libraries

Then after your opening paragraph, the only thing you talk about is Libraries/dependencies. How would a ‘better’ library structure (for some value of ‘better’) make any difference to producing a Haskell2025 standard?

I tried to explain how guidelines for core libraries could naturally drive the evolution of the language standard. In my opinion, the language standard could also be based on the things that programmers use most often, like the things in deepseq. Surely it would also be possible to invent a similar mechanism to use instead. How things are implemented, or whether some particular function is safe or not, is not my point; it should mainly be useful. My point is the lack of any language standard at all.

Realistically, I can’t produce any sophisticated software without core libraries. I don’t know when Hugs was last updated, but the date on the website is from 2003, and I doubt that it can build all the core libraries. And purescript is mainly for web apps and doesn’t have the libraries I need. I don’t depend on just a few core libraries, but on a large part of them. It would be nice to know that core libraries won’t pull in dependencies from outer space, where I no longer know the dependency graph and, as a result, build times and binary size grow uncontrolled.

I mainly program in Haskell2010, but I don’t know whether I can rely on old libraries and an old compiler in the future; changes in hardware architecture are possible, for example. So I have an interest in things going well.

Besides, Haskell is the basis of my work: I’ve been writing a commercial app for around four years. That is, I can’t easily switch to another language; that would be the end of my business. All in all, I think it was the right decision to use Haskell. I come from python and it’s a huge difference: it’s a lot easier to get things right in the first place, and refactoring, which I’m constantly doing, is also a lot easier.

Lastly, I’ve read something like: “Why does someone use Haskell for business, when one should know it’s only an academic language?” That makes me somewhat sad, as there surely has been promotion of wider adoption of Haskell in industry.

2 Likes

I think the barriers to your proposal are less technical ones and more about the social aspect. As a community, we can’t really ever agree enough to get big things like this done, and there is no central authority to help us make those types of decisions. Still, good on you for pushing something like this forward and looking for ways to help better define/bless best practices.

3 Likes

The reason nobody should be using Haskell for business – that is, commonplace commercial applications: payroll, ERP, even a web-based store – is that there aren’t enough Haskell programmers, and they’re all cranky and opinionated (talking about myself). Making your business hostage to propeller-heads would take commercial IT back to the 1980s.

Whereas there are gazillions of (say) Java/Javascript/python/COBOL programmers: if one gets too precious about ‘their’ beautiful code, sack them and easily get another. Producing code is not the purpose of the business: it’s a service to the business’s main purpose. (If you’re working for a software company producing a package, then yes, producing code is your business; but very few users of a package know or care what technology underpins it.)

(BTW the last release of Hugs was Sept 2006. It can build all the core libraries in the H2010 standard. It predates the AMP/FTP changes to the libraries; but since those were mostly reshuffling existing stuff and introducing a bit of H2010-compliant polymorphism, I don’t think re-compiling for those would be major. But I’m certainly not going to bother, because those AMP/FTP changes are wrong: I’m cranky and opinionated about it.)

1 Like

Not necessarily. For example, the interdependencies of the packages Win32, filepath, process, directory, unix and time could be removed by grouping them together into a larger package called ‘system’.

Please no.

These packages have different maintainers. It’s already difficult enough collaborating with some of these packages.

It will slow down development of all of them, cause much more communication churn, lead to more PVP incompatibility when unrelated parts of the API break, and, depending on the quality of the communication, may cause people to accidentally introduce changes to code they don’t understand.

5 Likes

I think the barriers to your proposal are less technical ones and more about the social aspect

You nailed it.

I think programmers tend to see a technical problem in everything. Collaboration and coordination are hard in a fragmented set of volunteers who don’t get paid.

API boundaries are good. They don’t just force you to think more carefully about the scope of your project; they also serve as a communication nexus. They are contracts: they express intent and hide implementation details. All of this goes out the window if you unify loosely related packages.
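For concreteness, here is a minimal sketch of such a boundary (the Counter module and its functions are purely illustrative): the export list is the contract, and because the constructor stays hidden, the representation can change without breaking anyone downstream.

    module Counter
      ( Counter  -- exported abstractly: the constructor is deliberately hidden
      , new
      , tick
      , count
      ) where

    -- Implementation detail; callers cannot pattern match on it.
    newtype Counter = Counter Int

    new :: Counter
    new = Counter 0

    tick :: Counter -> Counter
    tick (Counter n) = Counter (n + 1)

    count :: Counter -> Int
    count (Counter n) = n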

That’s why everyone is dissatisfied with base, why we can’t figure out how to decouple it from GHC and why we need an entire committee to maintain it.

We need the opposite direction: clean, isolated APIs. Yes, they also bring challenges: sharing of types, avoiding circular dependencies, occasional inlining of other libraries’ functions. But that’s more manageable than a huge blob of core functions with a very high entry barrier for contributors.

1 Like

Yes, that’s funny. What I had in mind was “allgemein anerkannte Grundlagen”, which better translates to “generally accepted principles”.

1 Like

Yes, maybe I see a lot of things too technically. But in the end, Haskell is a tool for solving technical tasks.

This looks promising:

…it even has its own paper! So everyone who is dissatisfied with base can join the effort to make it better, based on KISS and those generally accepted principles.

1 Like

I don’t think this addresses any of the points I raised.

2 Likes

Maybe for the packages themselves, I would agree… but for the general direction and overall cohesiveness of the lot of them, I think we actually need a group or person acting as a decisive force with the trust and power to set the direction for “what is blessed and recommended for production code”. We already have too many decisive bodies and committees, so adding another one isn’t necessarily better… and the ones that exist approach this topic with reluctance… but to me the problem stems from no one taking ownership over the experience and direction of “Quality Haskell for Production”, and then also being trusted by the community of users, as well as the community of maintainers, and community of academics.

Proposals such as this one continue to come up and be disagreed on to no end, and then fizzle out. Individuals and groups of people who step up to lead and show the way end up fighting uphill and burning out, giving up, moving on, or staying in their own corner doing their own thing in frustration and wishing everyone else would get on board with something that works. Just look at how many “alternative preludes” we have.

All the while, users are left drifting, we look a bit foolish, and people wonder why there is this idea that Haskell isn’t great for production business software, or that attrition is a problem.

A stdlib would be pretty cool, and so would an authoritative guide to “building production software with haskell”, but they need to be in sync and harmony with the package ecosystem, navigate picking sides on how things should be done, be made relevant in the community, and be maintained into the future as packages and “the best way to do X” evolve. It’s not that we haven’t tried or aren’t trying, but we aren’t holding it together or finding success.

2 Likes

I think we actually need a group or person acting as a decisive force with the trust and power to set the direction for “what is blessed and recommended for production code”

I don’t believe that it’s even a good goal.

There are many competing opinions about what’s good in production code. Everyone is free to create their own guidelines or gather with other maintainers and projects to try to come up with a cool set of guidelines.

Making any of this “authoritative” seems quite questionable. The more expressive the language, the more opinions you have. And many of them are valid and have their arguments.

Something like this should be much more focused on education rather than being authoritative. It should help people make better decisions, not tell them what to decide. Even something seemingly simple like “how to handle effects” will make someone vehemently vote for RIO and others for polysemy.

I also find it questionable to tell volunteers, who maintain (core) libraries in their free time, what architecture to pick, what patterns to use. If you want to influence (core) library development, become a maintainer and write code.

Individuals and groups of people who step up to lead and show the way end up fighting uphill and burning out, giving up, moving on, or staying in their own corner doing their own thing in frustration and wishing everyone else would get on board with something that works.

Well, there certainly is some burnout rate in Haskell. But I’m not sure this is the reason.

Core library maintainers are volunteers. It’s already difficult to get improvements into the ecosystem that touch more than one library (like AFPP, which I’m currently working on). The main issue here is lack of engagement and communication. I’m not sure if that’s systemic or just a function of time.

The problem I see is when you come up with a proposal and expect other people to implement it. Are you offering your help or are you telling other people what to work on? That’s a fine line. Being authoritative will likely not improve the situation, but make it worse and only leave a couple of maintainers that have very clear guidelines, but no one to implement them.

There’s a reason the CLC isn’t an authoritative body for core libraries (in fact, it only maintains base), but a support network: the authoritative approach didn’t work.

What I think we should focus on is:

  1. making contributing a pleasure: quick reviews that don’t take several years
  2. improving communication across libraries (in a non-authoritative way)
  3. building trust (that’s what the CLC is currently trying to do)

If those things improve, we may eventually be in a position where deciding on over-arching principles is more feasible. I don’t see that now, and even if it were the case, I believe those should be very light principles.

5 Likes

Nobody should be told how to implement a library. I think many of the maintainers of the core libraries are not the original authors; maybe they’re not so opinionated about the design. It’s also totally fine if someone doesn’t have the time or the changes are just too complicated. Maybe others are willing to help.
If someone doesn’t want to change the design, that’s perfectly fine too. But then one should accept that at some point the library can no longer be considered a core library.
These changes aren’t very urgent; it’s more of a gradual shift over time.