Namespaces, imports, plugins

This is gonna be a somewhat rambling request for comment/ideas about imports in Haskell. It’s spurred by Announcing Imp, a GHC plugin for automatically importing modules - #16 by atravers but is something I’ve thought about for a while.

What I have been wondering about for a while is whether an import plugin introducing a new import syntax, with the following features, would be possible:

  • Allow importing in a hierarchy a la Rust
  • Reduce the syntax needed for the “type and qualified module” idiom that people (or at least I) often use, like importing the type Vector and also importing all of Data.Vector qualified under the name Vector
  • Support renaming
  • Explicit namespaces only

What I’d like is something like:

use Data.Text (type Text, module as Text, module Encoding as Text)

-- with renaming
use Data.Text.Lazy (type Text as LText, module as LText, module Encoding as LText)

to mean

import qualified Data.Text as Text
import qualified Data.Text.Encoding as Text
import Data.Text (Text)

-- renaming
import qualified Data.Text as LText
import qualified Data.Text.Encoding as LText
import qualified Data.Text.Lazy
-- something like `type LText = Data.Text.Lazy.Text`, except done by the renamer
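For what it’s worth, the payoff of that desugaring at call sites would look something like this (a usage sketch with a hypothetical `greet` function, using text and bytestring as shipped with GHC):

```haskell
import qualified Data.Text as Text
import qualified Data.Text.Encoding as Text
import Data.Text (Text)
import Data.ByteString (ByteString)

-- Data.Text and Data.Text.Encoding both sit behind the single
-- qualifier Text, while the Text type is in scope unqualified.
greet :: Text -> ByteString
greet name = Text.encodeUtf8 (Text.append (Text.pack "hello, ") name)
```

One qualifier, two modules, and the type name free of any prefix - which is the whole point of the idiom.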

It would not support unqualified-as imports at all - I just think that those are a mistake. I looked at GHC plugins briefly and it seems like extending what syntax import allows wouldn’t be possible, hence use. Can a renamer plugin do the renaming part? Is it possible at all?

There are a few things disheartening me though: (1) the required time and effort :slight_smile: (2) the fact that the entire Haskell module system just doesn’t work that well, so such a plugin would merely be papering over that fact, while actually changing the module system itself is fairly unfathomable. I sometimes wish we had “default”/“index” modules in directory hierarchies, inline-defined modules, and module exports / “qualified exports”, where e.g. Data.Text exports a namespace/module called Encoding, i.e. Data.Text.Encoding.

1 Like

…I hear you! There are some things I would like to work on, if it weren’t for some thoroughly-frustrating “normal space” matters.

(2) that the entire Haskell module system just doesn’t work that well

…perhaps the best way to think about the current module system is as an “exercise in minimalism”:

…and back in 2002, import lists probably were much shorter. So the current problems could simply be due to a matter of scale - Haskell programs are now larger, which usually means more modules and thus longer import lists in each module.

But others (who I want to think are) smarter than me have also contemplated alternative module systems for similar languages - e.g. see page 107 of 210 in Functions, Frames and Interactions (1998).


  • if it isn’t…end of story.

  • if it can be done…others may contemplate their own import plugins to enable imports in the style of e.g. Miranda(R), Erlang or perhaps even Standard ML! Now imagine you have the oh-so-wonderful job of trying to make libraries that each use a different style/plugin to all work together in some new program…

I would like the ability to treat modules and the identifiers they export (and indeed packages) as first class entities. Anything short of that is just papering over the cracks.

3 Likes

See page 171 of 210 in Functions, Frames and Interactions.

As a prototype, perhaps a pre-processor might be the way to go? Assuming you are merely introducing new syntax that can be translated into the old syntax, then a pre-processor could implement the translation and integrate fairly nicely into GHC with the -F and -pgmF options (5.11. Options related to a particular phase — Glasgow Haskell Compiler 9.8.1 User's Guide).
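For concreteness, a -pgmF preprocessor is just an executable that GHC invokes with three arguments: the original source name, an input file, and an output file to write. A minimal skeleton might look like this (the use-rewriting shown is toy logic handling only the bare `module as` form, purely for illustration):

```haskell
-- Minimal skeleton for a GHC -F / -pgmF preprocessor.
-- GHC invokes it as: prog <original-name> <input-file> <output-file>
module Main (main) where

import System.Environment (getArgs)

main :: IO ()
main = do
  [_originalName, inFile, outFile] <- getArgs
  contents <- readFile inFile
  writeFile outFile (unlines (map rewrite (lines contents)))

-- Rewrite "use M (module as Q)" to "import qualified M as Q";
-- pass every other line through unchanged. (Toy logic only -
-- a real implementation would want an actual parser.)
rewrite :: String -> String
rewrite line =
  case words line of
    ["use", m, "(module", "as", q] ->
      "import qualified " ++ m ++ " as " ++ takeWhile (/= ')') q
    _ -> line
```

A module would then opt in with `{-# OPTIONS_GHC -F -pgmF my-use-pp #-}` at the top, where my-use-pp is whatever the installed executable is called.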

Of course, using a pre-processor (alone) can pretty much only fiddle with syntax. And at this point it isn’t clear that merely tweaking syntax gives enough benefit to justify changing what exists already, even if what exists already is imperfect (osa1 - Some arguments against small syntax extensions in GHC).

Once you get beyond syntax extensions, the design space opens up rapidly, so there are many possibilities but it takes a lot of work to move forward (cf. the Local Modules proposal).

1 Like

I like your syntax, but I probably wouldn’t use a plugin just to reduce the import boilerplate. Did you consider submitting a GHC extension proposal?

1 Like

I’d like to give a contrary suggestion. Please make it a plugin! If it’s made into a GHC extension that means that only people who use GHC 9.12 or 9.14 (or whatever GHC it lands in) onwards will be able to use it, and its evolution will be tightly coupled to the GHC release cycle. As a plugin it will be available immediately to a wide range of GHCs and it will be possible to evolve it quickly according to user feedback and needs.

2 Likes

As for making it a GHC plugin - I don’t know if that’s possible. Such syntax would cause a fatal GHC parser error, so a source plugin can’t meaningfully run in the first place. I think it could work with something GHC initially parses as a TH splice, with syntax like use Data.Text (type_ Text, module_ as Text), but I’m not a fan of having to do that. A preprocessor, however, can surely do it and would work for basically any GHC :slight_smile:

1 Like

My impression is that back in 1990 the original Haskell designers felt they didn’t have anything innovative to say about a module system, and/or they were already biting off as much as they could chew in producing a thoroughgoing lazy functional language.

Is there anything specific about a (lazy) functional language that’s different vs a generic language X featuring separate compilation and strong type-checking across modules? IOW do we need to reinvent the wheel? Even if language X’s model is a little less than the theoretical ideal from starting with a clean sheet of paper, isn’t it better to follow some de facto best practice?

Haskell does feature some funky namespacing where usage sites can omit the module qualifier (modid) providing it’s unambiguous, and/or imports can introduce a new modid, and/or import the same module with all of unqualified, original qualifier, (multiple) new modids.
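For example, all of the following is legal in a single module today (a sketch using Data.Map from containers):

```haskell
import Data.Map (Map)            -- unqualified: just the type
import qualified Data.Map as Map -- a new modid
import qualified Data.Map as M   -- another modid for the same module

-- Both qualifiers resolve to the very same Data.Map definitions.
sizes :: Map String Int -> (Int, Int)
sizes m = (Map.size m, M.size m)
```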

I have a nasty feeling your “first class” won’t be the same as my “first class” won’t be the same as Haskell-programmer-with-a-background-in-language-XYZ’s idea of “first class”.

CPP is pretty rubbish for Haskell [**]. There have been efforts towards a more Haskell-friendly version of CPP, but they’ve fallen away, because Haskell modules are usually embedded in bigger systems that aren’t all written in Haskell, so pre-processing has to conform to the lowest common denominator.

Are modern compilation systems not able to be a bit more clever? For modules with a .hs suffix, use this pre-processor; for perl use that; etc?

[**] Specifically, CPP is unaware of the importance of linebreaks and indenting for Haskell syntax.
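A small illustration of that point: CPP joins backslash-continued lines before Haskell ever sees them, so a multi-line macro body lands on one line. The hypothetical macro below happens to survive because its body uses explicit semicolons, but anything layout-sensitive in a macro (a where clause, a do block spread over lines) gets flattened the same way and then fails to parse:

```haskell
{-# LANGUAGE CPP #-}
-- CPP expands BODY(m) to a single line:
--   case m of Just y -> y ; Nothing -> 0
-- The explicit ';' is what keeps this particular expansion parseable.
#define BODY(x) case x of \
  Just y  -> y ; \
  Nothing -> 0

fromMaybeZero :: Maybe Int -> Int
fromMaybeZero m = BODY(m)
```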

I’m thinking no:

The early non-strict language (Chalmers) Lazy ML relies on CPP for imports (with an #include for each '.t' interface). Amanda (another such language) seems to use a CPP-inspired mechanism (instead of #include, there’s an #import for each '.ama' module).

The only potential difficulty for having a “grand unified module system” for all programming languages would be exports:

  • I don’t know of any preprocessors that support export lists,
  • and a generic “module manager” would need some way to mark the language-specific (or mangled) symbols corresponding to those in an export list as being available for public use.

…it would make for an interesting project.