Convenience in the Haskell ecosystem

I’m aware. It’s even been split out as project-template, which is very helpful!

2 Likes

In my early involvement with Haskell, somebody explained to me that the Haskell Tool Stack (Stack) did not follow the ‘Unix philosophy’, which I understood then to mean ‘a tool should do only one thing, and do it well’. Stack is indeed like a Swiss Army knife. The challenge is to somehow make sure that introductory users can find the main blade without needing to know (or be confused by) the fact that somewhere, tucked away in the handle, there is also a tool that will remove stones from a horse’s shoe.

8 Likes

Exactly.

But that doesn’t mean you can’t have high-level tools that follow the unix philosophy. Those tools are glue abstractions over existing tools: I consider the VSCode Haskell extension to adhere fairly well to the unix philosophy. It isn’t a lot of code, and its objective is to make things “just work”. It uses HLS, GHCup, and some minor custom logic to achieve that.

This is at the editor level… the editor is the closest we have to the “end user experience”, and we can make some rough assumptions about what kind of end user experience is expected (e.g. vim vs emacs vs VSCode).

The other angle is: sane defaults.

Those are the difficult decisions: which abstraction layer should provide what… and what are sane defaults.

We have to decouple the tools from the user interface… this allows us to build different user interfaces based on the same set of core tools. And this is only possible with the unix philosophy. Otherwise we end up re-implementing and rewriting, even though we just wanted a slightly different UX.

2 Likes

summoner exists, but I believe it has gone unmaintained. I understand people thought it was pretty handy, and it could be a good base to start from, or at least inspiration.

3 Likes

We had haskeleton too, which I understand has been merged into stack, and I’m pretty sure there have been a few other ones. All of them got unmaintained or forgotten for some reason. It might be interesting to know those reasons before creating a new one. My guesses are:

  • discoverability: you can’t use a tool if you are not aware of its existence (that is the limit of the unix philosophy)
  • difficulty to find the correct template
  • the correct template rarely exists (everybody has different needs)
  • it’s not worth the effort: it takes longer to find the right template than to roll it yourself.

An easy alternative would be to create a cabal file with the common packages commented, so the user just has to uncomment the desired packages.
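As a rough sketch (the package choices here are just examples), such a file could look like this; cabal comments are whole lines, so uncommenting a line simply re-enables that dependency:

    executable my-app
      main-is: Main.hs
      build-depends:
        base
        -- Uncomment the packages you need:
        -- , containers
        -- , text
        -- , bytestring
        -- , aeson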

Also, to guarantee that the cabal file matches the versions of GHC and HLS, maybe that should be a job for GHCup (rather than cabal): GHCup generates a cabal file for the version you selected.

2 Likes

GHCup is an installer, not a cabal file templating mechanism.

This is exactly why we need the unix philosophy: tool boundaries shouldn’t be violated just because an existing tool is popular.

A unix-compatible solution to this has already been proposed:

4 Likes

Just curious: how much would Cabal scripts / headers have helped in your use case? I’ve never used HLS with cabal headers, so I’m not sure if HLS would have been useful for your case.
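For anyone unfamiliar: a cabal script is a single Haskell file with its dependencies declared in a header block, runnable with recent cabal-install versions. A minimal example:

    #!/usr/bin/env cabal
    {- cabal:
    build-depends: base
    -}
    -- Run with `cabal run hello.hs`; dependencies come from the
    -- header, no .cabal file or project directory required.
    main :: IO ()
    main = putStrLn "hello from a cabal script"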

Indeed, this requires writing an exact-printing parser, which is difficult. As it happens, I believe @Liamzy is currently working on one for Cabal, but I’m not sure of its current status.

Stalled; I sort of wish I’d never got myself caught up in the NeoHaskell drama.


Also, there is yet another solution to the present problems.

One thing I’ve suggested a thousand times is just to have wrapper libraries around standard Haskell libraries; i.e., quite a few libraries aren’t newbie-friendly, but fixing that doesn’t require writing new code.

All you need to do is to wrap existing libraries, have somewhat sensible defaults, and there you go.

For your problem as well, it seems as though the reasonable way to deal with having to figure out what libraries you need would simply be a bunch of wrapper libraries on Hackage; i.e., you have a task, you want the task to be simple, and the wrapper libraries re-export the dependencies you want and come with suggested workflows.

If you don’t like the wrapper libraries or don’t like their defaults, just don’t use them and import their underlying dependencies instead. The wrapper libraries are just there for convenience.
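As a hedged sketch of what I mean (the module name here is made up), a wrapper can be nothing but re-exports:

    -- Hypothetical wrapper module: no new code, just a curated
    -- re-export of the 'text' package for newcomers.
    module Easy.Text
      ( module Data.Text
      , module Data.Text.IO
      ) where

    import Data.Text
    import Data.Text.IO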


This strategy easily gets around high maintenance requirements, if that’s what killed the Haskell Platform. Instead of aiming to be all-in-one convenience, it’s some-in-one; i.e., the wrappers would be targeted at only specific use cases. If one wrapper dies, it’s not the same as the entirety of the Haskell Platform dying.

if you give it a try, you’ll find vscode + haskell works wonderfully well.

Not always (try it in many kinds of project, eg multi-package ones; or on a resource-constrained machine) and not all the time (try using it heavily over an extended period). I would temper that claim a little because it is highly frustrating for users who trust that it’s true and then have a different experience [again].

7 Likes

sorry for the stack team

Why, maxigit? Stack has been stable from first release to the present, IME.

3 Likes

My mistake. I didn’t mean that Stack wasn’t stable, but I am not sure of its long-term future.

If you look at the contributors graph, you can see that Michael Snoyman worked on it for about 5 years. OK, @mpilgrem has taken over, and it might be for 5 years too. Then what?

Also, there is a possibility that in 5 years’ time Cabal or a new competitor will make Stack obsolete.
Don’t get me wrong: I’m a Stack user and I think the Stack team is doing a fantastic job!

1 Like

I think this is an important point. The only reason people use Stack templates is because they’re right there as soon as you look at Stack.

These are also important, which is why I suggested a slightly different approach above: to have a small number of highly curated and largely unopinionated templates, so that it’s immediately obvious which one to use. (‘Hmm, I’m doing a quick-and-dirty experiment, so let me select batteries-included-script’… that sort of thing.)

I disagree with this — it’s barely different from the current situation. It might help beginners with discoverability of packages, but little else.

I agree with this.

In response to @hasufell’s criticism of this point: I see it as a GHCup issue precisely because GHCup is an installer, and responsible for provisioning HLS. That being said, GHCup does provide an API, so it doesn’t necessarily need to be the responsibility of ghcup-the-tool… a templating program could easily call into the API instead.


On further reflection… I think the templates can be split up into several more or less orthogonal directions:

  • Basic info: name, date, license, etc.
  • Form: single Cabal script vs entire Cabal project
    • If the latter, its components: library and/or executable and/or test suite
  • Available packages: either base alone, or a selection of boot libraries, or a broader selection of popular libraries
  • The code: either as minimal as possible, or with some common imports, or with a sample program

Conceptually, this seems reasonably easy to integrate into the interactive cabal init prompt sequence. It also doesn’t seem too difficult to implement, since it doesn’t require parsing and extracting any template files — it should mostly be string concatenation; a sketch of what I mean is below. What does everyone else think of this idea?
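(All names here are hypothetical — this is not actual cabal-install code, just an illustration of the string-concatenation approach.)

    import Data.List (intercalate)

    -- Render a minimal .cabal file from the user's prompt answers.
    -- Pure string concatenation: no template files to parse.
    renderCabal :: String -> [String] -> String
    renderCabal name deps = unlines
      [ "cabal-version:   3.0"
      , "name:            " ++ name
      , "version:         0.1.0.0"
      , ""
      , "executable " ++ name
      , "  main-is:       Main.hs"
      , "  build-depends: " ++ intercalate ", " ("base" : deps)
      ]

    main :: IO ()
    main = putStr (renderCabal "hello" ["containers", "text"])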

1 Like

If we’re brainstorming around this problem area, I have a question: why does Haskell have packages?

‘Unit of code distribution and versioning,’ I know, sure. But Stackage package sets get pretty close to users not having to care about the package layer as far as versioning and distribution are concerned. There’s still a step where the user manually affirms that they want their project to depend on a package named whatever, but then the problem of figuring out what version of that package and how to get it is happily abstracted away. And the problem of figuring out what packages are needed based on modules imported is also automatable. And rather than have all of this tooling to paper over the existence of the package boundary, what if Haskell just didn’t have it?

As a user, at the level of the module, all I want to specify is which other modules this module depends on. At the level of the project, what I really want to specify is which entities I trust (in terms of both security and quality) to provide me with the libraries I use, and a choice of a vetted package set. What if library vendors published their packages to a Stackage-like service that kept package name and version as metadata attached to modules (in order to look up documentation or bug reports or what have you), but did not retain any concept of ‘package X depends on package Y’, and served modules back individually on demand in association with an entire package set instead of a particular package and version? And what if the build tool used that service to fetch any module imported on demand as long as that version was authored or endorsed by a trusted entity (such as when said entity pushed that module as part of publishing their package)?

Then the default Cabal template would just be, use latest stable package set, trust the Haskell Foundation, and nobody would need to care about whether they remembered to include containers. You can write a Yesod application by saying that you additionally trust Yesod Web, and that gets you whatever subset of yesod and shakespeare and persistent that you actually use. It’s the batteries-included solution but lazy about actually shipping the batteries—what’s more Haskell than that?

3 Likes

This is a pretty big thing though :grin:

1 Like

I think it was a bigger thing before package sets, is what I’m saying. (Probably still a big thing for anyone who has some sort of unconventional or throwback library sourcing model, I suppose, which is why I don’t really expect this idea to go anywhere.)

1 Like

I am not sure I understand what you are suggesting. Package sets are still made from packages, right? So you do need packages to start with.

1 Like

This should be fairly easy: ghcup list allows a machine-parsable output format and has a lot of options.

The VSCode Haskell extension uses that too: https://github.com/haskell/vscode-haskell/blob/f22634252b4918be044baf027247548da23d92c5/src/hlsBinaries.ts#L629
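For instance (flags per ghcup list --help; the exact output format may vary across GHCup versions):

    # list installed GHC versions in a raw, machine-parsable format
    ghcup list -t ghc -c installed -r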


However, the larger problem is the compatibility between GHC and HLS. There are two things that GHCup does for the end user here:

  • it ensures
    • that the recommended GHC works with the recommended HLS
    • that the latest GHC works with the latest HLS
    • that the recommended GHC works with the latest HLS
  • it informs you about incompatible GHC+HLS version combinations
    • prints a warning to stdout when you trigger such a combination via ghcup set
    • shows hls-powered for the currently set HLS in ghcup list and ghcup tui

As such, you can spawn a “virtual env” with a working set via ghcup run --ghc recommended --hls latest --install -- sh.

Once you have that, all you need is the base version (which you can query via ghcup list too). And then you lock that base version in your cabal file.
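For example (the version numbers are illustrative, for a GHC that ships base-4.17):

    -- base is tied to the GHC version, so pinning base also pins
    -- the range of compatible GHCs
    build-depends: base ^>=4.17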


What would be interesting is a better interface for saying “install the latest HLS that’s compatible with GHC X.Y.Z”. This is in fact implemented in the VSCode Haskell extension! It utilizes the following metadata for that: https://github.com/haskell/ghcup-metadata/blob/develop/hls-metadata-0.0.1.json. GHCup itself doesn’t use that file, currently for two reasons:

  • it’s an additional download (GHCup has been designed to be both very quick and up to date, so download times and even yaml parsing times are relevant)
  • I haven’t figured out a good generic design yet… it seems this might be a feature that should live in the TUI, with some interactive mechanisms, where an additional download in the background isn’t a problem anyway

Some related issues:


The third option that has been mentioned is to ship HLS with GHC, but that has large implications, including for the GHC devs and their resources. I can see some benefits, but I’m not sold yet.

4 Likes

Currently, a package set is a set of packages/versions that can all work together. Meaning, among other things, that you could use every module from every package at once, if you were so inclined—no module name collisions, no dependency conflicts. In my fantasy, a package set would just be that collection of modules—the concept of ‘package’ would be an implementation detail for publishers, not consumers. You wouldn’t download the entire package set, or individual packages; you’d ask the package set server for the modules that your project imports, and the package set server would get you their transitive closures.

If I’m the author of package foobar, publishing foobar entails submitting a request for the next package set to include a set of module files I provide. The important part is, I tag this set of module files with foobar primarily so that the assemblers of the package set know to remove the previous set of modules tagged foobar if they accept my new submission—consumers won’t access the module files I submit by their package name, only by their module names. (Package names and versions can still be presented to users so that they can report issues and whatnot—this is more about changing how modules are resolved, and making that a one-step package-set-to-module link instead of a two-step package-set-to-package-to-module link.)
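To make the shape of that concrete, here is a hedged sketch of the data model (entirely hypothetical; nothing like this exists today):

    import Data.ByteString (ByteString)
    import Data.Map (Map)

    -- A package set maps module names directly to module sources.
    -- The originating package survives only as metadata, for looking
    -- up documentation or filing bug reports.
    data ModuleEntry = ModuleEntry
      { sourcePackage :: (String, String)  -- (package name, version)
      , moduleSource  :: ByteString
      }

    type PackageSet = Map String ModuleEntry  -- keyed by module name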

1 Like

Thanks, this is an interesting vision.

1 Like

I think the problems I could foresee are:

  1. You still need packaging. Library authors leverage packages and the PVP, and package sets benefit from that work.
    • I guess this would only be useful for “applications”, but that kind of goes against the “everything is a library” mantra. Most Haskell applications also make themselves available as libraries.
  2. Every package set user ends up bringing in extra dependencies anyway. I guess they just become part of the global store?

Could be a cool feature though! I already like to use ghcWithPackages with Nix. This is the same in spirit, but lazily builds modules as needed.

1 Like

This is what it’s like using a monorepo. For my personal project, I build with shake: I declare a list of ("name", "Path/To/Main.hs") pairs and I can do mk build/opt/name. Whatever it imports, it imports. At work it’s similar, except it’s a hermetic build system, so you have to declare all the files and deps; there are package declarations, and they add the ability to have private modules.

So Haskell already doesn’t have to have packages, and the development experience is indeed better, but of course this only works in a system with global control. What if you want to share some modules with the outside world?

Authors are always breaking APIs and adding features, and users always want the new features from package A without the new breaking API from package B. A vetted snapshot of compatible packages like Stackage is always too old or too incomplete, so a patchwork of semi-consistent versions seems unavoidable. So I don’t really see a way around it in an uncoordinated world. But if you can make a little consistent island and live there, it’s nice. Making it is work though; the bigger it is, the more work!

But perhaps the Unison people have had some thoughts about how to do it. I see they have packages and versions; I haven’t actually used the language, so I don’t know how they assemble mutually compatible subsets of them.

2 Likes