Towards a prevalent alternative prelude?

@bgamari decoupling base from GHC sounds like an extremely worthy cause that can unlock a lot of action. If some foundational modules and typeclasses can’t be decoupled, that is fine. I hate to see perfect be the enemy of good, especially when remaining problems can be iterated on later.

@Tikhon’s point about availability is something worth considering. How can we decouple base while keeping it available by default?

4 Likes

In my opinion a completely new standard library with a new API would break too many existing codebases. A lot of people have come to terms with the existing functions in base. Personally I don’t have a problem with the String type, as I don’t do that much text processing. head is also fine for me, because I prefer to have a simple list as input; if I want more safety I can implement my own safeHead.
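For reference, this is the one-liner being alluded to. It is not in base under this name, although listToMaybe from Data.Maybe does the same job:

```haskell
-- A total alternative to head; equivalent to listToMaybe from Data.Maybe.
safeHead :: [a] -> Maybe a
safeHead []      = Nothing
safeHead (x : _) = Just x
```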

I think an evolutionary approach would be the better option. Maybe a starting point would be to pick some widely adopted core libraries, e.g. containers, bytestring, text, random, vector, process, time, and put them all together in one single core package. As a guideline for the improvement process, the core library maintainers should strive for each package to depend only on base. The content of base would define the Haskell language standard. Functions commonly wanted by the core library maintainers should go into base. Likewise, if language extensions are needed for the core libraries, they should be part of the new Haskell language standard. The bundled core package should be only loosely coupled internally, to make it easier for newbies.
As a consequence of this approach I would expect shorter build times, thanks to shorter dependency lists, and more low-level code in the core libraries, which tends to be more performant. Another benefit would be that Haskell compiler developers would know that their implementation can also build the core libraries.

Surely this would take several years to implement, but I think there is an urgent need to establish guidelines before the ecosystem becomes too frayed. The growing dependency lists of the core libraries are an increasing concern for me.

3 Likes

I find this a bit problematic. The learning resources I used pointed out the warts of Haskell, but ended up saying "it is what it is", because nothing can be done while it remains the current standard. Many people choose to accept the quirks, work around them, or just drop Haskell as an option entirely, because all the little annoyances compound into a bad experience, which matters a lot in the context of increasing Haskell’s industry adoption. The simplest example I know of is String vs Text: a large share of the resources I read recommended using Text over String. So I definitely found it strange that, if this is such a common recommendation, it is not the standard.

But I do understand why it’s a difficult decision to make: if the new standard library is not as great as we have all envisioned it to be, it breaks existing projects and still doesn’t convince potential adopters to use Haskell.

4 Likes

@sekun: I don’t recommend String over Text in general. Having options is beneficial, and there are cases where Text is surely the better pick. But I declined to use it for my project because I simply don’t know whether the library maintainers will at some point decide to take on additional dependencies that blow up the dependency tree. If there were an officially maintained core package with guidelines that I can trust, I would be more confident.

It’s true that it can be confusing for newcomers which package to pick. I had the same experience two years ago. But over many years I’ve learned not to jump on every new bandwagon. I remember that the lens package was very often proposed as the solution for easier record access; now there is RecordDotSyntax. The more high-level a technology is, the fewer maintainers it tends to have, and the more likely it is to become obsolete (exceptions prove the rule).

2 Likes

I resonate mostly with @snoyberg and @viktor. In my opinion the prelude is just a small part of the issues of base that need to be fixed. Just piling more stuff onto base will make the situation worse, not better.

As others said before me, base needs to be decoupled from GHC, but also still be available by default. A standard library that is not always readily available is useless.
This would mean:

  • Put all the primops in their own package that is coupled with GHC (AFAIK ghc-prim is already a thing)
  • Create base-5 that depends on the ghc-prim library via PVP bounds, so it can be used with more than one GHC version as long as there are no breaking changes to the primops.
  • Also have the typeclass hierarchy in base, and have the compiler support a PVP range of base-5. The only incompatibilities that could arise here are changes to the typeclasses that are not reflected by GHC’s deriving mechanism, and that could simply produce a compiler error when trying to use an incompatible version of base. I don’t expect this to happen very often, as the barrier for breaking changes should still be high.

For this new and decoupled base, we should fix the warts that are already there: no partial functions, no lazy IO, a builtin UTF-8 string type, a builtin (mutable) array type, a builtin simple stream, and the rest that others have already mentioned.
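To make the "simple stream" item concrete, here is a purely hypothetical sketch of what such a type could look like as a replacement for lazy IO; none of the names are an actual proposal from this thread.

```haskell
module StreamSketch where

-- A hypothetical effectful stream: each step either ends or yields a value
-- together with the action that produces the rest of the stream.
data Stream a
  = Done
  | Yield a (IO (Stream a))

-- Drain a stream, running each effect in order; unlike lazy IO, the effects
-- happen exactly when the consumer asks for the next element.
drain :: Stream a -> IO [a]
drain Done           = pure []
drain (Yield x rest) = (x :) <$> (drain =<< rest)
```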

At the moment, AFAICT, the main argument against folding text/vector/containers into base is that it is hard to release a new version of base with bug fixes/performance improvements/etc. because of the coupling. This wouldn’t be an issue any more.

TL;DR: We need a big breaking change that includes decoupling base and GHC and even after that we should not avoid breaking changes at all costs (again, just like GHC).

P.S.: My wishlist for the new base (in addition to the stuff that was already said):

  • PartialOrd superclass of Ord
  • Lawless Num class for overloading literals and a more fine-grained number hierarchy
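To illustrate the first wishlist item, here is a rough sketch of what a PartialOrd/Ord split could look like; the class and method names are made up for illustration and are not an agreed design.

```haskell
{-# LANGUAGE NoImplicitPrelude #-}
module PartialOrdSketch where

import Prelude (Double, Eq, Maybe (..), Ordering (..), isNaN, otherwise, (<), (>), (||))

-- A partial order only promises a comparison for *some* pairs of values.
class Eq a => PartialOrd a where
  tryCompare :: a -> a -> Maybe Ordering

-- A total order refines that: compare is defined for every pair.
class PartialOrd a => Ord a where
  compare :: a -> a -> Ordering

-- IEEE doubles are the classic motivation: NaN compares with nothing, so
-- Double gets a lawful PartialOrd instance but arguably should not get Ord.
instance PartialOrd Double where
  tryCompare x y
    | isNaN x || isNaN y = Nothing
    | x < y              = Just LT
    | x > y              = Just GT
    | otherwise          = Just EQ
```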
7 Likes

I wrote in my reply to the original thread that the only way to fix base’s issues is to break it up. We need finer-grained libraries to separate the uncontroversial/stable stuff from the controversial/unstable parts, avoid version churn, and give the ecosystem greater ability to experiment while also not breaking old code.

We can then build all the "batteries included" preludes/bases/package collections for end applications that we want, purely via re-exports, without the friction that arises today and without coarsening the underlying library ecosystem, which would be disastrous.
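As a toy illustration of the "purely via re-exports" point, a batteries-included prelude can be nothing more than a module that re-exports a curated selection; the module name below is made up, and text and containers are assumed to be available as dependencies.

```haskell
-- A hypothetical curated prelude assembled purely from re-exports.
module BatteriesPrelude
  ( module Prelude   -- everything the standard Prelude already provides
  , Text             -- plus a proper string type ...
  , Map              -- ... and a go-to container, re-exported unchanged
  ) where

import Prelude
import Data.Text (Text)
import Data.Map.Strict (Map)
```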

2 Likes

In my opinion a completely new standard library with a new API would break too many existing codebases. A lot of people have come to terms with the existing functions in base.

Yeah, and in a language like Haskell, which implements a lot of functionality at the library level, a standard library built from scratch could have as much impact as a new language.

But honestly, I think that may be a good thing, as long as:

  • the community provides an official, well-written, authoritative resource about its usage (as mentioned above)
  • developers provide an incremental path for switching to the new standard library (e.g. migration tools)
  • developers give existing users enough time to do so

An incremental approach sounds appealing, but honestly, there are so many things that could potentially be changed that such a transition could take years, frustrating users by regularly breaking stuff along the way.

Python 2 is a scary example of an old version of an ecosystem staying alive for years, though the situation feels slightly different for Haskell: even though it is an old language, a big part of its history consisted of mostly academic use, which, while interesting and important in its own way, isn’t usually about long-term support and probably won’t be strongly affected by such a change. When it comes to learning materials, as long as we have the official resources mentioned above, we can invest energy in pointing newcomers towards them; there are a few strategic places where we can put links to new materials.

In this light, given that we are doing breaking changes anyway, it may be a good idea to pair the release of the new standard library with a corresponding GHCXXXX extension set as the new default, so that newcomers hop straight into a modern setup without having to collect the extensions they want/need manually.

4 Likes

Trying to decouple the base release cycle from the GHC release cycle, along with many other approaches, has been discussed plenty of times in the past, and again for very good reason. But IMO it presupposes something that isn’t true: that GHC itself doesn’t tend to break code.

Yes, but I think part of the benefit of doing that is that the pain of breakage grows superlinearly with how much breaks at once:

  • small breakages in base and GHC’s own breakage land together
  • libraries are more likely to be impacted by the combination than by either alone, and if they just cut a new release from master rather than backporting fixes, they might release breaking changes of their own
  • the process cascades when downstream libraries are also forced to upgrade to support the new GHC
  • doing all our breaking changes in one big wave forces a bunch of busy work all at once

(We are getting this right now in 9.2 with bytestring-11, for example.)

Conversely, if we decouple things and amortize the cost of breakage, we get huge benefits even without doing less breakage overall:

  • Pure amortization benefits: less busy work happens at once, and less "ecosystem adapting" latency
  • Less cascading: little ripples of breakage are less likely to amplify themselves and cause downstream breakage. It might even work like lower transmission rates for infectious diseases, something we are all more familiar with.
  • Spreading the work out also frees up time to backport fixes to prior incompatible releases, further avoiding breakage cascades.
2 Likes

IMO, any new base library should not take the same tack we’ve been taking up until now, i.e. an "alternative prelude" that simply re-exports better things than Prelude. We should be looking at fixing many of the flaws you’ve already pointed out, like the inconsistencies between different "core" libraries.

That’s not to say relude, or rio, or others should be ignored. On the contrary: I think the lessons learned from alternative preludes should feed into the design process here. But I also think that if we limit ourselves to libraries that already exist, we’re going to be selling ourselves short on what a great standard library for Haskell could look like.

I agree, but to me this also means it’s a bit premature to be making the grand new library at all. I don’t subscribe to the "design by committee" memes that collaborative design doesn’t work, but I also think it is hard to sit down and design a bunch of radical changes that are a big imaginative leap from what we have without making mistakes. And yet making multiple whole-cloth new batteries-included standard libraries would be immense churn.

I am focused on breaking up base, making orphans good, etc., because I want to set up the foundation for more fine-grained experimentation, allowing many ideas to be explored in parallel with minimal breakage (thanks to the nice properties of good fine-grained dependency partial orders). If we do that, we can then start independently curating them into radical alternative preludes / Haskell platforms, and thereafter, a year or more later, come back to the table to finally deliberate on the one true HF-endorsed curation with actual evidence.

6 Likes

Another issue that we might be able to address is including strict versions of standard data types, e.g. strict Maybe and Either. But we should go further than that: it would be nice to have strict versions of MVar and TVar, whose use should be encouraged (apart from writing MonadFix instances, lazy mutable variables should not be that common). There is an advantage to strict data types over a strict interface (i.e. making putMVar strict). We did that at IOHK, and it has worked quite well for us (the library is not yet published on Hackage, but it will be released soon). In conjunction with the nothunks library, finding memory leaks was a matter of writing QuickCheck tests.
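As a minimal sketch of the distinction between the two approaches: a strict interface over the existing lazy MVar versus genuinely strict data types. The names below are illustrative, not taken from the IOHK library mentioned above.

```haskell
{-# LANGUAGE BangPatterns #-}
module StrictSketch where

import Control.Concurrent.MVar (MVar, putMVar)

-- Strict interface: the same lazy MVar, but the value is forced to WHNF
-- before being stored, so the variable never holds a top-level thunk.
strictPutMVar :: MVar a -> a -> IO ()
strictPutMVar var !x = putMVar var x

-- Strict data types: thunks cannot hide inside the constructors at all.
data StrictMaybe a = SNothing | SJust !a

data StrictEither a b = SLeft !a | SRight !b
```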

2 Likes

This is from a very practical perspective: I don’t fully understand all the Haskell concepts, but I can use a lot of them and have written quite a few popular and fully functional apps with them.

I enjoyed working with ClassyPrelude: it gave me access to almost all the common types I need, put Text in charge, and its use of type classes meant not needing to worry about which version of a function to import, which seemed like the ideal usage (close to interfaces in OOP).

I’ve since moved on to using RIO because (I think?) that’s what the ClassyPrelude folks moved on to next. I’m not such a fan of always having to import the type-specific versions of functions; it seems to miss what I thought type classes were for. But the conventions and such are all helpful.

2 Likes

I come from the world of web dev, specifically PHP and JS. They are quite different languages in that JS can essentially never be changed without breaking old sites, as the developer has no control over which version of JS is used for their site. As such, JS is full of weird issues that can never be fixed and are incredibly annoying (and a pain to teach). I often end up telling students the convoluted history of JS, as it’s the only way to explain some of the weird behaviours.

PHP, on the other hand, can make breaking changes. And they’ve started to use this to make the language better. As long as the developer can control the version of PHP that’s being used, they’re fully in control over whether they break their code or not.

Having worked with both languages a lot, I appreciate that PHP can improve and get rid of its baggage. It is gradually improving as a language in a way that JS never can.

It would be great if String wasn’t the default. It’s incredibly anti-newbie. It took me a year to realise I shouldn’t be using String. And then I had to spend a day updating my codebase to use Text.

So I’m all for Haskell X breaking things to make it easier to get started.

5 Likes

I have the feeling that it is difficult to reach agreement on a new standard for a language like Haskell that is developed in large part by the community. Past approaches in this direction have failed, e.g. the Haskell Platform and Haskell2020.

Why not start a competition in writing new core libraries, with the restriction that they may only depend on base? The authors would request the features they need from the base library maintainers. This restriction would naturally evolve the base library, which could define the new Haskell standard as an end result. This way, everybody who puts some effort into writing a library could gain some influence.

2 Likes

And then I had to spend a day updating my codebase to use Text.
@smallhadroncollider

This is exactly how I see these problems. The fear of breakage and the specter of Python 3 are unfounded. Yes, breaking changes create busy work. Yes, they can cause ecosystem ripples. However, the type system does so much work that the transition becomes mechanical.

The ecosystem has also been moving away from many of the historic issues in base. For example, network dropped String support in version 3. That had a ripple effect and it took time for downstream to catch up, but now the work is done.

7 Likes

Hi folks, thank you for working on Haskell improvements. As a relatively new user, I think the new GHC20x extensions and a new Prelude sound like great news.

I’ve considered using relude a few times because it has better defaults, such as re-exporting Control.Monad, Data.Either, and Data.Maybe. It also exposes the Text type, but it seems like I still have to declare the text dependency to work with the values. As @ChShersh mentioned, "qualified re-exports" would be very useful, and they could provide standard namespace names for things like lazy bytestring (e.g. should it be LBS or BSL?).
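To make the naming question concrete, both aliases are in common use today, and a hypothetical "qualified re-export" mechanism in a standard prelude could pick one spelling for everyone (bytestring is assumed as a dependency):

```haskell
module AliasExample where

-- Two equally common conventions for the same module.
import qualified Data.ByteString.Lazy as LBS
import qualified Data.ByteString.Lazy as BSL

-- Both aliases name the same module, so these are interchangeable.
lazyLength :: LBS.ByteString -> Int
lazyLength = fromIntegral . BSL.length
```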

However, I’m sticking to base because it is "official" and somewhat easier to use, though I am looking forward to a more modern Prelude being blessed by the community so that I can build on that instead.

3 Likes

I think we can split this into a few separate issues:

  • We need to fix broken stuff
  • We need to expand what’s available by default
  • We want less pain with upgrades and dependencies

I think we can immediately move forward on "fixing broken stuff." base-5.0.0.0 can be a big breaking change, but even base-4.14 could have a lot of breaking changes too - it’s a major bump! I don’t see any disagreement on this point. No one is saying "Nah, let’s keep foldl around, it’s fine."

Some people want a more pared-down standard library, and some people want a beefier standard library. Both have compelling points. The "default" workflow had better be really good, and I think it makes sense to target it towards beginner-friendly applications (without making bad-for-production choices, either).

So, this suggests to me that we might want to figure out a better way to pick a standard library. base::Prelude is deeply privileged right now. Can we alter Haskell such that this privilege disappears, and it’s easy to have a Good Default + a minimal base for those that want it?
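For what it’s worth, individual modules can already swap out the implicit Prelude today; what’s missing is making a different default easy, official, and privilege-free. A minimal sketch, assuming relude has been added to the project’s dependencies (it stands in here for whatever the "Good Default" ends up being):

```haskell
{-# LANGUAGE NoImplicitPrelude #-}
{-# LANGUAGE OverloadedStrings #-}
module Main where

-- Opt out of base's Prelude and pull everything from an alternative one.
import Relude

main :: IO ()
main = putTextLn "Hello from an alternative Prelude"
```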

9 Likes

This is OK for end applications, but terrible for libraries. And end applications need libraries.

There are many reasons for this, but one good one is portability. GHCJS is used plenty in production (and I think the long-term plan of merging it into GHC will be a tremendous boon to Haskell in industry), but one wants very different batteries included when writing frontend code. Still, lots of the "abstract nonsense" fundamentals are meant to be shared, and that’s the whole point! It’s very important that containers, the kmettverse math stuff, and the like don’t accidentally import things that would make them less portable than they need to be, as that would defeat the whole purpose of GHCJS (sharing code between backend and frontend).

We need to fix broken stuff

I do think this is the most important one. The problem is less that stuff isn’t imported up front, and more about separating the wheat from the chaff. New users navigate a minefield, and there’s little way to help them. Once we have curated collections of stuff, users shouldn’t mind browsing an index to find what they need to import, as long as they can be confident that whatever turns up is worth using. That index could be as simple as the combined Haddocks of a library collection.

2 Likes

What would happen to packages that depend on base with bounds like base == 4.*? I am grepping index-00 and I see lots of hits (some quite popular too, like hspec).

There seems to be a lot of confusion about the relationship between Prelude and the dual roles of the base library. Let me try to lay out how things stand.

  1. Starting from the lowest level, there is base-of-GHC-primitives, a collection of low-level modules that are provided by GHC and cannot be implemented anywhere else.

  2. Then there is base-the-kitchen-sink, which is a collection of modules that you get as part of the library called base, guaranteed to always be available without fiddling with Cabal or Stack.

  3. Finally there’s Prelude, a special module that is always implicitly imported.

The current situation is that base-the-kitchen-sink includes both Prelude (as one of its exposed modules) and base-of-GHC-primitives (as a subset of its modules). This, however, need not be the case: a GHC installation can come with more than one primitive library, and base-the-kitchen-sink doesn’t need to expose everything from every primitive library.

The better way to think about this from the user’s point of view is:

  1. There is a small set of standard types and values that are in scope by default, with no need to import them. This is Prelude.
  2. There is a wider set of relatively well-known types and values that are always available with any GHC installation, but one has to import a module before using them. Importing these modules does not depend on a package manager, and there’s no question of which version of a module one gets.
  3. Finally there’s a universe of Haskell modules that come from packages, and you gotta use a package manager before you can import one.

Note I didn’t mention base in the second item, because it’s irrelevant where these modules come from. The only thing that matters is that they are available by default, at least until you want to define your own package and need to specify the dependency bounds.

4 Likes

I’ve seen this sentiment pop up a few times in this thread. I feel like this is already the case with the “wired in” packages that GHC provides. Am I missing something?

For example, if you install GHC 8.10.4 you can rely on these packages being available, which include many of the usual suspects: bytestring, containers, process, text, transformers, and so on.

And with respect to base and Prelude, my two cents are: Alternative preludes work great for applications but suck for libraries. As a library author I want my software to be approachable and usable. Unfortunately that means sticking with the lowest common denominator, which is base. Depending on wired in packages is fine, but on Hackage it’s still visually cluttered to depend on mtl, transformers, text, and bytestring even though they’re all provided “for free”.

2 Likes