That’s not how internal libraries work. They share the same version; to give them different versions, you have to create separate packages.
Anyway. I’m planning to write a small program that allows you to:
- blacklist libraries that depend on another library
- apply a whitelist to those
That would allow you to e.g. blacklist all packages importing `ghc-internals`, but whitelist `base` and possibly other preludes.
I think this use case is general enough to warrant such a tool, but also specific enough that it doesn’t belong in cabal itself.
After all, there’s nothing special about `-internal` packages other than an informal social contract (or the lack thereof). So I don’t see why Cabal needs to treat them specially.
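To make that concrete, here is a minimal sketch of the core check over a plain dependency map. All names here (`offenders`, `DepGraph`, the example packages) are made up for illustration; a real tool would read the graph from the build plan, e.g. cabal’s `plan.json`.

```haskell
import qualified Data.Map.Strict as M
import qualified Data.Set as S

type Package  = String
type DepGraph = M.Map Package [Package]  -- package -> direct dependencies

-- Packages that directly depend on the blacklisted package and are not
-- explicitly whitelisted.
offenders :: Package -> S.Set Package -> DepGraph -> [Package]
offenders forbidden allowed graph =
  [ pkg
  | (pkg, deps) <- M.toList graph
  , pkg `S.notMember` allowed
  , forbidden `elem` deps
  ]

-- Example: flag everything that uses ghc-internals, except base.
example :: [Package]
example =
  offenders "ghc-internals" (S.fromList ["base"]) $
    M.fromList
      [ ("base", ["ghc-internals"])
      , ("acme", ["ghc-internals", "base"])
      , ("safe", ["base"])
      ]
  -- ==> ["acme"]
```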
Oh, then I misunderstood… I thought you and @tomjaguarpaw were talking about using a separate package! In that case, I agree that internal libraries are a bad solution
You had me at “prototype”. I think we agree. For a PoC once, I just copied an internal data type and used `unsafeCoerce`.
I also agree `.Internal` modules (if you really want to expose them) shouldn’t be exempt from proper versioning.
Let me first say thank you for bringing a concrete example. I wasn’t affected by that issue and had to read up on it, so my understanding of its impact might be limited.
So you did use `accursedUnutterablePerformIO`! (source)
I believe you can implement `isValidUtf8` without using any internals, either with `useAsCString` or with a fold. There are already a few options on Hackage for validating UTF-8. I admit this won’t be as performant but, as you agree, it’s a temporary trade-off.
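To illustrate the fold approach, here is a rough sketch using only the public `Data.ByteString` API. The name `isValidUtf8Fold` is made up, and the state machine is a straightforward strict UTF-8 check, not tuned for performance:

```haskell
import qualified Data.ByteString as B
import Data.Word (Word8)

-- Decoder state: between code points, in error, or still expecting n
-- continuation bytes, the next of which must fall in the given range.
data S = Valid | Invalid | Expect !Int !Word8 !Word8

isValidUtf8Fold :: B.ByteString -> Bool
isValidUtf8Fold bs = case B.foldl' step Valid bs of
  Valid -> True
  _     -> False   -- Invalid, or a truncated multi-byte sequence at the end
  where
    step Invalid _ = Invalid
    step Valid w
      | w <= 0x7F              = Valid                  -- ASCII
      | w >= 0xC2 && w <= 0xDF = Expect 1 0x80 0xBF     -- 2-byte sequence
      | w == 0xE0              = Expect 2 0xA0 0xBF     -- no overlong 3-byte
      | w >= 0xE1 && w <= 0xEC = Expect 2 0x80 0xBF
      | w == 0xED              = Expect 2 0x80 0x9F     -- no surrogates
      | w >= 0xEE && w <= 0xEF = Expect 2 0x80 0xBF
      | w == 0xF0              = Expect 3 0x90 0xBF     -- no overlong 4-byte
      | w >= 0xF1 && w <= 0xF3 = Expect 3 0x80 0xBF
      | w == 0xF4              = Expect 3 0x80 0x8F     -- <= U+10FFFF
      | otherwise              = Invalid                -- 0x80..0xC1, 0xF5..0xFF
    step (Expect n lo hi) w
      | w >= lo && w <= hi = if n == 1 then Valid else Expect (n - 1) 0x80 0xBF
      | otherwise          = Invalid
```

(If I remember correctly, recent bytestring releases also export `isValidUtf8` from `Data.ByteString` directly, which makes this moot once you can upgrade.)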
This is secondary but I don’t understand the reference to “the latest GHC”, since you can update bytestring independently from GHC. You mention “recompile GHC”, so maybe you link against GHC itself?
Again, I might not be understanding your situation, and I have no intention of claiming I could have done anything better than what you did, which I believe is a good solution.
I wish we could do this in Haskell. @ChShersh and @vrom911 made `policeman`, but I believe it doesn’t work past GHC 8.10.7. Maybe someone can help update it? I could help integrate it with cabal (didn’t I just say we shouldn’t ask cabal to do more?)
No accursed stuff. I just checked the implementation, and have to admit a mistake: we don’t use bytestring internals for this `decodeUtf8` function, but we do use `unsafePerformIO` and internals from text. The latter specifically for the `Text` constructor, as we build the array for it ourselves.
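For readers following along, a minimal sketch of what building the array yourself can look like with text >= 2.0 (where `Text` is UTF-8 internally). This is not our actual code; `Data.Text.Internal` and `Data.Text.Array` are internal modules whose API can change between releases, and `unsafeFromValidUtf8` is a made-up name:

```haskell
{-# LANGUAGE BangPatterns #-}
import qualified Data.ByteString as B
import qualified Data.Text.Array as A
import Data.Text.Internal (Text (..))

-- Wrap a ByteString that is already known to hold valid UTF-8 into a Text by
-- copying its bytes into a fresh text Array and applying the internal Text
-- constructor (offset 0, length in bytes).
unsafeFromValidUtf8 :: B.ByteString -> Text
unsafeFromValidUtf8 bs = Text arr 0 len
  where
    len = B.length bs
    arr = A.run $ do
      marr <- A.new len
      let go !i
            | i == len  = pure marr
            | otherwise = A.unsafeWrite marr i (B.index bs i) >> go (i + 1)
      go 0
```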
As for the recompiling, that is more of a Nix problem. Both bytestring and text are bootstrap packages, meaning that they are treated in a special way in nixpkgs. If you evaluate `haskellPackages.text` in a REPL, you’ll see that it’s null. Overriding those can be a bit of a challenge because of that bootstrap status. If I remember correctly, just creating an override in a Haskell overlay didn’t immediately work, and I had to replace it at a deeper point, at which point Nix sees that the dependencies for GHC itself have changed, leading to a rebuild.
I reckon with just cabal this probably wouldn’t be as much of a problem.
Haha yes, I have done the same!
My memory is that Python uses `_` and `__` prefixes by convention to indicate that something is private. (The difference being that `__` gets name-mangled, but I don’t immediately remember the details or why that’s useful.)
Suppose the body of a function has changed. That sometimes wants a major version bump and sometimes not, but `elm bump` surely can’t tell which is which. So I think it must either be too aggressive or too timid in updating, unless it gives the user some control.
Still, I can imagine this being better than trusting implementers to get versioning right.
The PVP says nothing about it. You can do a major bump on semantic changes that don’t touch types or type signatures, but you don’t have to.
So obviously, a tool will not consider function body changes. That’s an unsolvable problem.
https://pvp.haskell.org/ has a decision tree which says that if the behavior of any exported functions changed, you need a major version bump. That said, this situation isn’t mentioned anywhere else on the page that I can see. But since it’s a backwards-incompatible change, a major version bump seems correct according to the spirit, to me.
(Though for `elm bump` specifically, it doesn’t matter what the PVP says, since it’s going for semver. I think semver would also say a behavior change needs a major version bump, but I haven’t specifically looked.)
But in any case, “that’s an unsolvable problem” is what I was getting at. I think we agree that not all function body changes need a major bump, and we agree that some function body changes want a major bump (even if we disagree whether it’s “want” or “need”), and we agree that no tool can tell which is which. So any tool which tries to take full control of versioning is going to get it wrong, sometimes.
I think it is unclear, actually. IIRC the text of the policy does not mention behaviour at all, only symbols, types and instances.
I can’t find the link now, but I remember a recent discussion where it was agreed that a change in behaviour does not force a major bump. This actually makes a lot of sense, since any bug fix can be considered a change in behaviour.
I see, thanks. I see there’s discussion at Clarify bumping policy for bug fixes · Issue #49 · haskell/pvp · GitHub.
Yes, that’s it! Thanks for finding it.
I hope the following question is not off-topic. One of the exposed `.Internal` ‘do not use’ modules of `pantry` is for ‘testing’ only. Is there a recommended ‘pattern’ to adopt where you don’t want to expose the internals of a package’s library but you would like to bundle test suites with the package that test those internals?
You can use an internal sublibrary to expose those internals only to the components of the same package.
> Is there a recommended ‘pattern’ to adopt where you don’t want to expose the internals of a package’s library but you would like to bundle test suites with the package that test those internals?
As @jaror says, you can use an internal sublibrary to do this if you just want to share it with the test suites in the same package, but sometimes you also want to expose the internals to downstream packages for use in their tests (e.g. `Arbitrary` instances, where you don’t want a `QuickCheck` dep on the main library). These “test libraries” are IMO one of the best use cases for public sub-libraries.
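For reference, a minimal sketch of what that pattern can look like in the `.cabal` file (hypothetical package and module names; the sublibrary stanza needs cabal-version 2.0 or later, and the `pkg:lib` dependency syntax used here needs 3.0 or later):

```cabal
cabal-version:      3.0
name:               mypkg
version:            0.1.0.0

-- Package-private sublibrary: without a "visibility: public" field it is
-- only visible to the other components of this same package.
library mypkg-internal
  hs-source-dirs:   internal
  exposed-modules:  MyPkg.Internal
  build-depends:    base
  default-language: Haskell2010

library
  hs-source-dirs:   src
  exposed-modules:  MyPkg
  build-depends:    base, mypkg:mypkg-internal
  default-language: Haskell2010

-- The test suite can use MyPkg.Internal even though downstream packages can't.
test-suite mypkg-test
  type:             exitcode-stdio-1.0
  hs-source-dirs:   test
  main-is:          Main.hs
  build-depends:    base, mypkg, mypkg:mypkg-internal
  default-language: Haskell2010
```

Adding `visibility: public` to the sublibrary stanza turns it into the kind of public “test library” that downstream packages can depend on as `mypkg:mypkg-internal` (with a sufficiently recent cabal).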
I’d rather have incorrect bounds than an unexposed `Internal` module hiding something that I really need. I had this case where I needed to circumvent a space leak caused by a library, but couldn’t because the internals were not exported, so I had to write an entire small library from scratch.
To me:
- pulling internals into a separate library – perfect, but can be bothersome to the author of the library, so I don’t mind if they don’t do it
- breaking PVP for internal modules – fine, I know what I sign up for if I use an internal module
- not exporting internal modules at all to keep me “safe” – no, you didn’t consider all possible use cases for your library, and no, you can’t be 100% sure it behaves perfectly with no rough edges, so don’t completely forbid any “unapproved” ways of using the library in some weird totalitarian fashion. Please just stop doing that
Thinking you as the library author understand the needs of your users better than they do is paternalistic IMO. It’s very frustrating to be on the other end of. Exposing internals is just fine, so long as the library author makes it sufficiently clear that the interface is unsupported, subject to change without notice, and will give you nasal demons and void your warranty.
At that point the responsibility is on the users - having been supplied with such dire warnings, hopefully they’ll have the prudence to only use the interface if they really truly desperately need it. Often they legitimately do - library authors can’t be expected to foresee every possible way someone might want to use their code.
Ideally the users would be encouraged to then explain why the public interface is inadequate for their use case to the maintainers, so that a more principled, permanent solution can be arranged.
If you were able to do this, you would also have been able to vendor the library in and patch it.
That works OK, as long as you don’t want to publish your package to Hackage. Then it becomes a disaster.