New Hackage Server Features

@nomeata Great feature, didn’t know about it.
packdeps does not really advertise it, does it? At least not on the homepage https://packdeps.haskellers.com/ nor on the linked GitHub page (snoyberg/packdeps, “Web app to track lagging package dependencies”).
So I raised an issue: “Advertise the feed for one’s outdated dependencies” (snoyberg/packdeps#57).

2 Likes

7 posts were split to a new topic: How do you display images on Hackage README.md?

There is my article about them and a section in the cabal docs. PRs to improve the docs are welcome!

2 Likes

Currently, Stack does not support depending on public sublibraries. I consider it ‘No. 1’ in the list of priorities for Stack, but it has exceeded my own capabilities to date. Other people have started on the endeavor from time to time, but have not yet seen it through. I think the practical problem is that ‘a package has no more than one library’ is deep-rooted in Stack’s code, so a lot of work is required to change that: it is not a ‘bite-sized’ problem.

EDIT: To explain my prioritisation: my concern was that, since the Cabal package description format allows for public sublibraries (from cabal-version: 3.0) and Cabal (the library) supports them, people might start producing packages that Stack cannot handle, including packages that have a central role in the Haskell ‘ecosystem’.
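For reference, here is a minimal sketch of the feature being discussed; all names are invented and only the relevant fields are shown, assuming cabal-version: 3.0:

```cabal
cabal-version:   3.0
name:            example-pkg
version:         0.1.0.0
build-type:      Simple

-- The package's main (unnamed) library.
library
  exposed-modules:  Example
  build-depends:    base
  hs-source-dirs:   src
  default-language: Haskell2010

-- A named library; "visibility: public" (cabal-version 3.0 and later)
-- lets other packages depend on it as example-pkg:extras.
library extras
  visibility:       public
  exposed-modules:  Example.Extras
  build-depends:    base
  hs-source-dirs:   extras
  default-language: Haskell2010
```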

1 Like

Notice that cabal-install has issues with public sublibraries too: the solver just does not see them. See “Make the cabal-install solver multilibs-aware” (haskell/cabal#6039) and other issues linked there.
If boot packages start using public sublibraries, cabal-install will always recompile them and never use the pre-installed version.

1 Like

I’m trying to think of a use case that can’t be covered by a multi-project setup via cabal.project.
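For comparison, the multi-package alternative is roughly a cabal.project that lists several packages, each with its own .cabal file (paths are illustrative):

```cabal
-- cabal.project
packages: ./core
          ./extras
```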

There’s a minor use case of reducing code duplication across *.cabal files. Not worth the added complexity, imo.

Then there’s the use case of re-using modules in different libraries. Only interesting if they are unexposed… otherwise you can just factor out the common parts into a package and re-use them.

So, what’s the point?

2 Likes

The difference is for package authors. You share a package, not a project. So if you have multiple libraries you want to share “as one”, putting them in a single package can save some headaches. E.g. you can choose to use a single version number, rather than being forced to use a version number for each library.

2 Likes

That sounds more like a drawback to me. Now I can’t version internals independently. Proper projects seem more flexible.

2 Likes

I imagine multilibs were primarily designed with Backpack in mind, because that is where you end up with an insane number of libraries/signatures, which are tedious to split into separate Cabal files. But otherwise it’s a very niche feature and I’m not sure it is worth the incurred complexity on tooling.

1 Like

Here are three reasons why I think you might want to use multilibs:

  1. Multiple packages that are versioned together

One good test is indeed “does it make sense to version these packages independently?”. Often in a multi-package project you have a situation like this:

  • You only ever use and test the packages together at the exact same version (same commit). Therefore you never test other version combinations.
  • You have no interest in e.g. using CPP to ensure that the packages are compatible with a range of the other packages.
  • The packages are tightly coupled, e.g. each new version of A requires the features from the new version of B, etc.

In this case you often end up using the same version for everything and advancing them in lockstep because that reflects the reality of how the packages are supposed to be used. Merging them into a single package with multiple sublibs does the same thing much more cleanly.

Examples:

  • HLS and its plugins
  • amazonka
  • Many industrial codebases I’ve seen

As a side note, if you can version things separately then it’s tempting to try and actually do it properly. I have seen people drive themselves a bit insane trying to properly follow PVP for many packages in the same repo (this is part of why we switched to lockstep versioning in HLS recently).

  2. Extra functionality that incurs more dependencies

Sometimes you want to expose just a little extra thing from your library, but that incurs a new dependency that you don’t want in the main library.

Examples:

  • Typeclass instances (for scale, try searching Hackage for “instances”)
  • Test utility code
  • Just little extras, e.g. the lsp-types package now has an extra public library that exposes the model used by the code generator, which isn’t even a dependency of the main library

At the moment you have to put such things in separate packages, which is quite annoying. With multiple public libraries you can just have the test-utility code in an additional library with extra deps.
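A rough sketch of that shape (package and module names are invented): the extra library carries the extra dependency, so users of the main library never pick it up.

```cabal
-- mylib.cabal (cabal-version: 3.0)

library
  exposed-modules: MyLib
  build-depends:   base

-- The helpers carry the QuickCheck dependency,
-- so consumers of the main library never see it.
library test-utils
  visibility:      public
  exposed-modules: MyLib.TestUtils
  build-depends:   base, mylib, QuickCheck
```

A downstream package (itself on cabal-version 3.0 or later) can then opt in explicitly with something like build-depends: mylib, mylib:test-utils.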

Of course, in principle you might want to version these libraries separately. Maybe the Arbitrary instances don’t change very often and work with many versions of the base package. But: probably nobody cares or would gain anything from the finer-grained versioning.

  3. Discoverability

In theory, it should be easier to discover a family of related libraries if they are packaged together. Consider again e.g. amazonka. Today the Hackage UI doesn’t support this, but in the future it would be nice to be able to pull up a package and go “oh, there’s also a library for QuickCheck instances, that’s useful”.


I think some version of these conditions applies to a lot of packages.

9 Likes

Yes, but you can just use == x.y.z bounds too.

Right, something that you can do with real packages too.

It seems to me this is an issue with Hackage not having better grouping mechanisms, and it sounds like a rather odd motivation to introduce public sublibraries.

All in all, I don’t feel it solves anything significant. It seems merely a convenience feature (good), but at a high cost.

I would agree that in a strictly formal sense there’s no more expressiveness than with multiple packages that move in lockstep. I do think the other ergonomic benefits are quite nice, and hope this will be able to improve organization of various packages and codebases in the future…

3 Likes

I’m considering using public sublibraries for publishing .Internal modules.

.Internal modules clutter the package’s module listing in Hackage and make it more confusing to scan. It’s no big deal but I find it annoying (see the module list for text for example).

Also, putting them in a public sublibrary would make cabal build-depends: sections more informative, because it would be evident at a glance whether another library depends on the internals or not.
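A hypothetical layout along those lines (names invented; assuming cabal-version: 3.4, where the package-qualified mypkg:internal syntax is the way to refer to a sublibrary):

```cabal
-- mypkg.cabal (cabal-version: 3.4)

-- Internal machinery: released and versioned with the package,
-- but kept out of the curated module listing of the main library.
library internal
  visibility:      public
  exposed-modules: MyPkg.Internal
  build-depends:   base

-- The stable, public-facing API, built on top of the internals.
library
  exposed-modules: MyPkg
  build-depends:   base, mypkg:internal
```

Downstream, build-depends: mypkg versus build-depends: mypkg, mypkg:internal then shows at a glance who reaches into the internals.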

3 Likes

10 posts were split to a new topic: PVP Compliance of .Internal modules

@hasufell you can of course do all this with multiple packages. But it’s a pain.

We make all sorts of decisions about what should go in one package versus several. In the limit, we could package and version every module or function separately. That would give us the most flexibility… but it would be an enormous pain, and nobody would benefit from the flexibility. Multiple libraries just let people decide that more stuff should be versioned together, and there’s definitely demand for that :slight_smile:

Another example is executables. We’re very used to the existence of executables in cabal packages, but all the arguments against multiple public libraries apply to executables:

  • They’re standalone components that can have their dependencies solved for independently
  • They require cabal to have the concept that a package contains multiple things
  • Executables need not depend on the main library, so they can be completely unrelated to it

I think the current situation is much better than a hypothetical one-component-per-package world where you would need stylish-haskell and stylish-haskell-exe packages versioned in lockstep. I think the situation is very similar for public sublibraries.
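For illustration (names invented), this is the familiar pattern being referred to: one package, two components, each with its own dependency set, and the executable doesn’t even have to depend on the library:

```cabal
-- formatter.cabal
cabal-version: 2.4
name:          formatter
version:       1.0.0.0
build-type:    Simple

library
  exposed-modules:  Formatter
  build-depends:    base
  default-language: Haskell2010

executable formatter
  main-is:          Main.hs
  -- its own build-depends; nothing forces it to use the library above
  build-depends:    base, optparse-applicative
  default-language: Haskell2010
```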

5 Likes

Email notifications from Hackage don’t work for me at all. Does the feature actually work? If not, I’m happy to try and debug what’s going wrong :slight_smile:

It’s working very nicely for me. I get an email when docs are built and when a dependency has a bounds-breaking release.

2 Likes

I also get updates via email when a dependency gets an update that my package(s) don’t support :+1:

If you have an email address from an uncommon domain or provider, it could be that there’s something specific preventing delivery to it.

1 Like

As of late, the emails work for me again :tada: Very useful feature! Thanks for any magic you might have been applying to the server.