PVP Compliance of .Internal modules

How is that going to work? A package tarball has one distinct version… on Hackage, in the cabal store and in the index (afaik), in Stack's Pantry, etc. Sounds like a very large breaking change.

2 Likes

In that case, I share your concern over how this will affect PVP compliance. I don’t feel too strongly about it though.

FWIW, I don’t think multiple public libraries really help with the internal module problem much, except in the way originally mentioned by @danidiaz : moving them from the main library to a separately named sub library can act as a signal to users.

In particular, since they share a version, a PVP compatible package with multiple public libraries should bump the major version if any of the libraries needs a major version bump. We might need to update the PVP page to discuss this!
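To make the shared-version point concrete, here is a hypothetical .cabal sketch (package and library names are made up). With multiple public libraries there is still only one `version` field, so a breaking change in either library forces a major bump for everything shipped under that version:

```cabal
cabal-version: 3.0
name:          acme-widgets
version:       2.0.0
-- ^ the one version shared by every library stanza below

-- The main, stable, PVP-governed library.
library
  exposed-modules:  Acme.Widgets
  build-depends:    base
  default-language: Haskell2010

-- A public sublibrary; downstream users depend on it as
-- "acme-widgets:internals". It still shares version 2.0.0, so a
-- breaking change here means a major bump for the whole package.
library internals
  visibility:       public
  exposed-modules:  Acme.Widgets.Internal
  build-depends:    base
  default-language: Haskell2010
```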

Well, I’m getting even less excited about public sublibraries now if people use them to violate PVP even more easily.

It’s not… easier? It’s exactly the same. Moving the code from the main library to a sub-library doesn’t change anything about the versioning going on, AFAICT. I think this whole discussion is just not really the point of public sublibs.

3 Likes

I feel like I’m missing something basic here. Isn’t the whole point of Internal modules that they aren’t compliant with the PVP? I thought they were supposed to signal that they may contain useful things, but they are not part of the stable API and therefore could change at any time.

Also I wanted to mention hackage-diff and cabal-diff, which are relevant to this discussion.

9 Likes

This has been practice, but is a mistake.

See Internal convention is a mistake – Functional programming debugs you

As part of “GHC's base libraries: Combining stability with innovation” (Pull Request #51 on haskellfoundation/tech-proposals, GitHub), CLC and GHC devs have agreed that it’s a bad strategy and GHC internals will be following PVP.

Likewise, some of the boot libraries will start following this pattern as well.

4 Likes

Can we at least try not to make the same mistake twice here:

  • GHC extensions are not regular Haskell features,

  • and “internal”/“low-level” libraries are not regular libraries.

…otherwise we are practically inviting everyone (again) to rely on their preferred set of such libraries as though they were regular ones (with a similar reaction to anyone contemplating major changes to them e.g. their deprecation).

Having 2^N combinations of GHC extensions is bad enough - do we really then want 2^M combinations of internal libraries pretending to be regular ones? Forget “explosion”, that would have to be a combinatorial “big bang!” I recall this thread:

…has anyone tried using this guardian system yet?

Sorry, I have no idea what you are talking about. What does this have to do with GHC extensions?

That is absolutely fine, because PVP is guaranteed, but major versions of internals might increase VERY rapidly and constantly break API. That’s what you get.

1 Like

Like internal libraries, GHC extensions aren’t normally intended for novices e.g. not all 2^N combinations of extensions work.

No - “what you get” is lots of unhappy users of those libraries, and therefore lots of unhappy maintainers left to deal with their complaints. If the solution was as simple as telling novices to "avoid using libraries labelled internals", then we could also just tell novices to "avoid using anything labelled unsafe or inline".

Or is Haskell now only intended for “power users”?

1 Like

I have still no idea what you mean. I’m not the GHC extension police.

So you’re suggesting to hide internals, so that power users (e.g. those that write alternative preludes) have an extra hard time achieving anything?

I’m not sure that’s a sensible thing to do.

We’re already discussing options to emit e.g. warnings when you depend on an internal package. Documentation for now will be enough.

2 Likes

…because telling novices to "avoid using anything labelled unsafe or inline" is working so well now. Based on that experience, finding a way to direct novices away from using internal libraries/packages first (e.g. with warnings) seems the more sensible option here.

1 Like

I wonder whether there’s a meaningful split in the use of Internal modules between published-library writers and application writers?

A study could probably be done on Hackage to determine use of Internal modules within published libraries.

If it’s substantial then the debate is the same. If it’s minimal, then perhaps we just need to make it really really trivial for application writers to use a patched version of a library or to use it with all modules exposed.

3 Likes

My POV is that we are all “novices”. I surely am, which is why I like to use Haskell to help me write correct code.

I believe it is easy, even for experts, to make wrong assumptions about the internal workings of a library and cause bugs (I have seen this happen a few times).

IMHO every exposed API should be properly versioned. The perceived need to access “internal” functions should perhaps lead to adding that functionality to the library, rather than “crossing the API border”.

If you want to discourage people from using Internal modules, put deprecation pragmas on them. People hate generating warning messages.
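For example (a sketch with a made-up module name): GHC supports module-level DEPRECATED pragmas, which emit a warning at every import site, so the message reaches exactly the people the module name alone fails to deter:

```haskell
-- Hypothetical internal module; the pragma text is free-form and is
-- printed verbatim wherever this module is imported.
module Acme.Widgets.Internal
  {-# DEPRECATED "Internal module: no PVP guarantees; import at your own risk." #-}
  ( widgetRep
  ) where

-- Internal representation, subject to change in any release.
widgetRep :: Int -> String
widgetRep = show
```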

More seriously, but not completely seriously, what do you think about functionality becoming exposed based on version constraint. Consider some examples like:

  • an instance is defined to exist @since 1.2.3 – if you depend on a lower version than that, you won’t see it even when building against a higher version.
  • If your version bound doesn’t lock down the 4th component of the version number you don’t see .Internal modules (or however they’re marked)
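For comparison with today's behaviour (not the proposal): an @since marker already exists as a Haddock annotation, but it is documentation only — nothing checks it against the version you depend on. A sketch with a made-up function:

```haskell
-- | Like 'foldr', but the function also receives the element's index.
--
-- Haddock renders the line below as "Since: 1.2.3"; the compiler and
-- solver ignore it entirely, which is the gap the proposal would close.
--
-- @since 1.2.3
ifoldr :: (Int -> a -> b -> b) -> b -> [a] -> b
ifoldr f z xs = foldr (\(i, x) acc -> f i x acc) z (zip [0 ..] xs)
```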

Is there any precedent for this kind of thing in other (even niche) languages?

Interesting proposition. This would help to identify broken lower bounds.

How about an {-# INTERNAL #-} pragma instead for generating an appropriate warning?
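No {-# INTERNAL #-} pragma exists in GHC today; the closest existing mechanism is a module-level WARNING pragma (same shape as DEPRECATED, but reported as an ordinary warning rather than a deprecation). A sketch, with a made-up module name:

```haskell
-- Today's approximation of a hypothetical {-# INTERNAL #-} pragma:
-- every importer of this module gets the warning below.
module Acme.Widgets.Internal
  {-# WARNING "Internal module: exempt from PVP guarantees." #-}
  ( widgetRep
  ) where

widgetRep :: Int -> String
widgetRep = show
```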

2 Likes

I knew this sounded eerily familiar. [ghc-steering-committee] Base library organisation

1 Like

(heh) …so something like {-# INTERNAL #-} has been contemplated before:

  • was it considered to be workable?
  • if so, was a way for it to work ever devised?

You can just use a freeze file that serves as a company-wide blacklist and has constraints like

```
constraints: ghc-internals < 0
          ,  filepath-internals < 0
```

I don’t see the problem. This is not the job of hackage or GHC devs. It’s a company policy. Other companies might have different policies or blacklist other packages (like foundation and its entire suite).

Those constraints won’t work. They’ll blacklist packages not only from being immediate deps, but also transitive deps, and obviously internals will be transitive deps of the packages they are the internals of.

4 Likes

Yes, that was the idea, but I didn’t think it through properly. It would only work for an internal package that isn’t used by anything else (unlikely to exist), so you effectively also blacklist filepath and base.

I’m guessing the solver doesn’t know the origin of a dependency? Would there be a way to express “reject (transitive) dependency, unless coming from packages x, y and z?”.


Otherwise I guess you’re left with identifying packages that abuse internals and blacklisting those. That still seems like something feasible to do.

Or… if Hackage has an API for querying reverse deps… you could query that for the internals packages, identify all non-core packages depending on them, and generate a blacklist.