If you were able to do this, you would also have been able to vendor the library in and patch it.
Circumvent /= fix. I did reach out to the author of the library, they acknowledged that using the library in a certain way would result in a space leak, but were not interested in fixing the behavior.
I’d rather have that than have the author of a potentially useful library not publish it at all because they don’t want to bother maintaining backward compatibility for internal modules.
Hence it is perfectly reasonable for Hackage to have a higher curation standard and compliance requirements, and to be an opinionated package set, as long as the rules are clear and principled.
I will fully acknowledge that the "make two packages, one -internal" approach is the Right Solution. However, it’s also annoying; I probably won’t put in the extra effort, and will end up with .Internal modules whose breaking changes aren’t reflected in major bumps.
PVP is a policy which should evolve as things like this come up. It is not handed down to us by a supreme authority.
The PVP bounds depend, in part, on how you import and use a given module. To be perfectly PVP compliant, if you have an “open import” of any module then you need an upper bound on the minor version you depend on. Otherwise, a new identifier may be introduced which causes a compile-time ambiguous name conflict. So we already have precedent for “the way you use a library influences the bounds you must set for compliance.”
Altering the PVP and adding a policy that importing .Internal modules means you must specify a patch-level upper version bound to remain in compliance is a totally valid approach to the problem.
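To make that concrete, here is a sketch of what the two rules would look like in a build-depends stanza. The package names are hypothetical, and the patch-level rule is the *proposed* policy, not current PVP:

```cabal
build-depends:
    -- Open (unqualified, unrestricted) import of somelib's public API:
    -- cap at the next minor version, since even a minor release may
    -- introduce a name that clashes with something already in scope.
    somelib  >=1.2.3 && <1.2.4,
    -- Importing otherlib's .Internal modules: under the proposed rule,
    -- cap at the next patch level, since internals may change in any
    -- release. In practice this pins a single patch release.
    otherlib >=2.0.1.0 && <2.0.1.1
```

Note how restrictive the second bound is: it effectively says "I audited exactly this release," which matches the reality of depending on internals.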
That is a valid point. It makes sense to tie the upper bounds of dependencies to how you use a dependency. It’s useful when you’re setting the bounds as the maintainer of some package. After all, if a new version of the dependency is released, the version bound starts complaining, meaning that you need to check and fix your package.
The other perspective, though, is the end users of your library. The dependency version bound also starts complaining to them, and they will have to deal with it. This is often the situation I find myself in. Some packages have super strict version bounds, causing them to break every other nixpkgs update we do.
When I update dependencies, I often see package bound errors. It might sound like heresy, but my very first go-to then is the pkgs.haskell.lib.doJailbreak function. This function simply erases the version bounds of the package. This sounds reckless, but 9 times out of 10, the package builds and even functions perfectly fine. The tenth time I get a clear compiler error.
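For anyone who hasn’t seen it, this is roughly what that looks like in a nixpkgs overlay. The package name here is hypothetical; doJailbreak is the real nixpkgs function that strips the version bounds from a package’s .cabal file before building:

```nix
# Sketch of an overlay; "somepackage" stands in for whatever broke.
self: super: {
  haskellPackages = super.haskellPackages.override {
    overrides = hself: hsuper: {
      # doJailbreak removes the .cabal version bounds, letting the
      # package build against whatever versions nixpkgs provides.
      somepackage = super.haskell.lib.doJailbreak hsuper.somepackage;
    };
  };
}
```

The fact that this works 9 times out of 10 is exactly the "too tight" half of the tradeoff described below: the bounds were crying wolf.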
Intuitively I feel that the tightness of version bounds is a tradeoff: too loose, and you’re falsely promising to work with dependency updates forever; too tight, and your package reports incompatibilities frequently, and often wrongly.
It also depends on how closely your dependencies actually follow PVP. Any mistake or misinterpretation makes everything just so much more difficult.
One of the things I like about Haskell is the strong culture of designing for the “pit of success”: the idea that a system should be designed so that the default, easiest path is the correct path.
There are many aspects of Haskell that lend themselves to this notion, most prominently the focus on strong static typing and the separation of side effects from pure code.
If The Right Thing is such a pain that nobody wants to bother with it, the situation ought to be remedied. The solution of making an -internal library leaves a lot to be desired.
It’s a long discussion so I may have missed this, but one possible approach to improving the versioning situation would be to improve the ergonomics of cabal upload and related tooling. From the maintainer’s perspective, the trouble with splitting every .Internal module into a separate library is that it doubles the workload: whenever there’s a breaking change in the mylib-internals package I need to
1. bump that library’s major version
2. bump its upper dependency bound in mylib-stable
3. bump the stable library’s minor version
4. cabal sdist the mylib-internals package
5. cabal sdist the mylib-stable package
6. cabal upload both tarballs
I understand we don’t want to add multiple versions to the .cabal file specification, but we could automate some of the above drudgery. After all, the maintainer is likely to already have a cabal.project file that lists the two packages. We should be able to do just the first three steps above and run cabal upload against the project file – it would have enough information there to figure out what needs to be published.
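For reference, the manual tail end of that workflow today looks something like the following. The package names are hypothetical, and the glob paths assume cabal’s default dist-newstyle layout:

```sh
# From the project root containing cabal.project.
# Build source tarballs for both packages:
cabal sdist mylib-internals mylib-stable

# Then upload each resulting tarball separately:
cabal upload dist-newstyle/sdist/mylib-internals-*.tar.gz
cabal upload dist-newstyle/sdist/mylib-stable-*.tar.gz
```

A project-aware "cabal upload" as proposed above would collapse those last two commands (and the sdist step) into one.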
The intent of exposing internals is to let sophisticated users extend a library further without needing to first upstream their improvements. No matter how it’s structured, that is valuable.
If people are using internal APIs and not engaging with upstream, either to contribute those capabilities back or to proactively audit compatibility with new versions, they’re gonna have a bad day. It doesn’t matter what tooling is in play.
It’s important to remember that exposing .Internal modules is, for a library’s users, a way of addressing ye olde Expression Problem.