State monad - memory exhausted

Why is that?

It’s fair to expect GHC to produce warnings if doing so fits into its compilation pipeline. But above I sketched an entirely new static analysis that is not relevant to compilation in any way, yet requires its own pass (multiple, probably) over the whole program. There’s no reason to burden development and every run of the compiler with this overhead; rather, I’d expect some kind of static analysis tool to be run (perhaps nightly) by CI. That’s good: such a tool (hlint, stan or a Core plugin) is not subject to the same stability requirements as GHC.

GHC does have (semantic, hence non-trivial) analyses that are non-essential to compilation, such as pattern-match coverage checking. But that analysis fits quite neatly into the structure of the desugaring pass. Even then, for some complicated test inputs you can observe a significant drop in compilation performance due entirely to coverage checking. I suggest we do not add to that.

…amongst others, here and elsewhere! But as it happens, maybe something can be done about the weather:

If the observations made in those articles:

  1. can be verified for the halting problem,

  2. and then extended to Rice’s theorem,

then it could be possible to “have it all”. Otherwise:

…or revitalising Robert Ennals’ previous work:


So essentially performance reasons? Comparing to C: gcc’s -fanalyzer is expensive, but opt-in.

https://gcc.gnu.org/onlinedocs/gcc-13.2.0/gcc/Static-Analyzer-Options.html

Yes, perf and stability. Personal opinion: contributing to GHC, fulfilling as it might be, has lost quite a bit of momentum in recent years due to the maturity of the project, compounded by the churn introduced by such a large code base.

Edsko de Vries from Well-Typed has written the nothunks library, which can warn you if there are unexpected thunks in your data structures:
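Here is a minimal sketch of how it is typically used (Counter is a made-up type; this assumes the NoThunks class with its generic default and unsafeNoThunks from NoThunks.Class, and the exact output can vary with optimisation level):

{-# LANGUAGE DeriveGeneric #-}
{-# LANGUAGE DeriveAnyClass #-}

import Control.Exception (evaluate)
import GHC.Generics (Generic)
import NoThunks.Class (NoThunks, unsafeNoThunks)

-- a made-up record we want to keep free of thunks
data Counter = Counter Int Int
  deriving (Generic, NoThunks)

main :: IO ()
main = do
  c <- evaluate (Counter 0 (1 + 2))  -- WHNF only: the second field stays a thunk
  print (unsafeNoThunks c)           -- Just (ThunkInfo …): a thunk was found
  c' <- evaluate (Counter 0 3)
  print (unsafeNoThunks c')          -- Nothing: no thunks detected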

nothunks is great and I’m happy to have it, but IMHO it’s using the typeclass system to overcome a feature deficiency in GHC. If I could have my way, I would transform nothunks uses into Unlifted types. That way my types just feel cleaner, because a field Foo :: a no longer masquerades as something it isn’t: a lifted value (a thing that can be a thunk, and therefore includes ⊥) pretending to be a thing that cannot contain ⊥ as a value. So I find this approach cleaner because I have type-level witnesses instead of typeclass constraints that serve as witnesses. I guess I should help @jaror improve the ergonomics of Unlifted.
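For a rough idea of what that could look like with today’s -XUnliftedDatatypes (GHC ≥ 9.2; StrictBox and Counter are hypothetical names):

{-# LANGUAGE UnliftedDatatypes #-}
{-# LANGUAGE StandaloneKindSignatures #-}

import Data.Kind (Type)
import GHC.Exts (UnliftedType)

-- values of unlifted kind are never thunks, so the kind itself
-- witnesses thunk-freedom; the bang additionally forces the payload
type StrictBox :: Type -> UnliftedType
data StrictBox a = StrictBox !a

-- this field can never be an unevaluated thunk; the type says so,
-- no NoThunks constraint required
data Counter = MkCounter (StrictBox Int)

main :: IO ()
main = do
  -- 1 + 2 is evaluated before the MkCounter cell can be built
  let c = MkCounter (StrictBox (1 + 2))
  case c of MkCounter (StrictBox n) -> print n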

What about extending the use of strictness annotations to type signatures?

foo :: !T
foo = ...

Would you expect this to type-check?

xs :: [!Int]
xs = map (+ 1) [error "blah"]

If so, what code would you generate for Data.List.map?

Essentially, ! in a type signature is just syntactic sugar for a zero-cost coercion into the UnliftedType kind, and it is not entirely trivial to embrace that in our compilation pipeline.


[…] what code would you generate […]?

Something like the code that presumably would be generated for:

xs :: [Int]
xs = map (+ 1) [error "blah"]

when using -XStrict in GHC.


Well, you could try approaching the problem from the “other direction”:

Lazy Evaluation for the Lazy: Automatically Transforming Call-by-Value into Call-by-Need (2023)

But map has not been compiled with -XStrict, so it won’t evaluate the list cells it returns.
Hence [!Int] (which to me says “when you evaluate to (x:_) :: [!Int], then x is also evaluated”) would be very misleading, because that is not at all what is guaranteed by what map returns.
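To see that concretely: even compiling the call site with -XStrict leaves the element thunk in place, because map itself was not compiled that way. A small sketch:

{-# LANGUAGE Strict #-}

main :: IO ()
main = do
  -- -XStrict makes this binding strict, forcing xs to WHNF, i.e. the
  -- first cons cell; but map was not compiled with -XStrict, so the
  -- element inside that cell is still a thunk
  let xs = map (+ 1) [error "blah"] :: [Int]
  print (length xs)   -- fine: only the spine is demanded, prints 1
  print (head xs)     -- throws "blah": the element thunk finally fires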

The solution is that you need two versions of map: one that you call when the list cells are “lazy” (lifted) and one in which the list cells are “strict” (unlifted). With that in mind, map is actually pretty much an overloaded function. Of course, we wouldn’t want to pay for overloading, so we’d probably specialise every “levity polymorphic” function. But map has type forall a b. (a -> b) -> [a] -> [b] and we so far have only discussed levity polymorphism in b. What about levity polymorphism in a? That would lead to 4 specialisations for the same map function. Fortunately, it is OK to simply presume that a is lifted and insert an eval just in case (think of UnliftedType as a subtype of LiftedType with a zero-cost coercion), so 2 specialisations suffice, but that is not true in general and you can see why this doesn’t scale.
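To make the two specialisations concrete, here is a sketch of the “unlifted b” variant using -XUnliftedDatatypes (SList and mapS are made-up names; the lifted variant is just the ordinary Data.List.map):

{-# LANGUAGE UnliftedDatatypes #-}
{-# LANGUAGE StandaloneKindSignatures #-}

import Data.Kind (Type)
import GHC.Exts (UnliftedType)

-- strict-element list: the element field has unlifted kind, so a cons
-- cell can never store an element thunk, while the spine stays lazy
type SList :: UnliftedType -> Type
data SList b = SNil | SCons b (SList b)

-- the “unlifted b” specialisation of map: because SCons’s element
-- field is unlifted, f x must be evaluated before each cell is built
mapS :: (a -> b) -> [a] -> SList b
mapS _ []       = SNil
mapS f (x : xs) = SCons (f x) (mapS f xs)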

All that to say: It’s not as simple as “proposing” ! in types; that’s merely a piece of syntax without a specification of its non-compositional semantics.

Incidentally, we could make [] levity polymorphic today, e.g. [] :: forall (l::Levity). TYPE (BoxedRep l) -> LiftedType. If this l defaults to Lifted anywhere it can’t be inferred, most written code out there should keep compiling. So we could actually write [!Int] as [Strict Int], where Strict :: LiftedType -> UnliftedType, as in Data.Elevator. So !a could just be syntactic sugar, today, for Strict a. But that does not help, because we can’t reuse all the existing definitions, which work only on the lifted variant, such as map.

While some functions, such as foldr, can easily be made levity polymorphic in the list element type parameter without requiring separate specialisations (foldl' even in both type params, I think; see #15532: Relaxing Levity-Polymorphic Binder Check for Lifted vs Unlifted pointers · Issues · Glasgow Haskell Compiler / GHC · GitLab), in other cases such as map we can’t get around generating twice the amount of code (or suffering unknown calls to a dictionary carrying around the implementation of seq (unlifted) / flip const (lifted)). I argue that we’d require opt-in from the user to do so via a change in the type signature (map :: LevPoly l => forall a (b::TYPE (BoxedRep l)). (a -> b) -> [a] -> [b]).


But map has not been compiled with -XStrict, so it won’t evaluate the list cells it returns.

It shouldn’t need to: the call to map would be “strictness-lifted” by the implementation implicitly to provide that strict version of map, in a similar fashion to how e.g. the strict Complex a constructor (:+) is really a lazy constructor that has been “strictness-lifted” through extra calls that evaluate its components.
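That mechanism is observable today: constructing a Complex forces its components, while an ordinary lazy constructor does not. A quick check:

import Control.Exception (SomeException, evaluate, try)
import Data.Complex (Complex ((:+)))

main :: IO ()
main = do
  -- (:+) has strict fields, so the “strictness lifting” happens at
  -- construction time: building the value evaluates both components
  r <- try (evaluate (error "boom" :+ (0 :: Double)))
  print (either (\e -> "forced: " ++ show (e :: SomeException))
                (const "built lazily") r)
  -- a lazy constructor tucks the error away unevaluated instead
  l <- try (evaluate (error "boom" :: Int, 0 :: Double))
  print (either (\e -> "forced: " ++ show (e :: SomeException))
                (const "built lazily") l)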


All that to say: […]

…people are being annoyed by “type acrobatics”:

  • Monad, I Love You […] (2022)
    https://www.youtube.com/watch?v=2PxsyWqZ5dI

Wouldn’t that mean every function needs to have 2^args versions for each choice of levity downstream? Isn’t proper levity polymorphism support a way more straightforward solution at that point?

…not according to the OP here, and others:


Wouldn’t that mean every function needs to have 2^args versions for each choice of levity downstream?

That’s more of a “provide-it-all-now” solution. I’m thinking more “provide-only-as-needed”, where strictness annotations would be expanded as they are encountered by adding the extra calls needed to evaluate (sub)terms.

But calls to what? There is currently no strict version of Data.List.map (and no mechanism to select such versions either).

Or are you saying that it should insert calls to some standard evaluation function for each data type like deepseq before or after the map?

  • At this point in time: the (mis-named) Prelude.seq;
  • In the future: who knows; maybe strict patterns will be the basic mechanism, rather than calling a primitive definition…

Here’s another way to think about it: right now, the strictness propagator (of which strictness analysis is a crucial part) uses evidence gleaned from the program. The strictness annotation would just be a form of evidence provided directly by the programmer.

Ah, but how do you apply seq inside arbitrary data structures like lists? I think that’s what @sgraf was asking with this example:

The seq strategy could work for the top-level arguments of a function, but it seems more difficult to do efficiently for values that sit deeper inside other data structures.
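A quick demonstration of that gap, with deepseq (from the deepseq package) as the structure-traversing counterpart to seq:

import Control.DeepSeq (deepseq)

main :: IO ()
main = do
  let xs = [1, error "element thunk"] :: [Int]
  -- seq stops at the outermost constructor (the first cons cell),
  -- so the element thunks are never inspected
  xs `seq` putStrLn "seq: fine, elements untouched"
  -- deepseq traverses the whole structure and hits the error
  xs `deepseq` putStrLn "deepseq: never reached"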

I thought we were discussing matters pertaining to (ordinary) strictness, not hyper-strictness.

Ah, then I think this whole discussion has been one big misunderstanding. The answer to @sgraf’s question:

Is simply that you aren’t allowed to put the annotations nested in a data type like that.

You could only use it on function arguments like this:

tuple :: !Int -> !Int -> (Int, Int)
tuple x y = (x, y)

And perhaps on variables like:

x :: !Int
x = 1 + 2 + 3

But that is difficult if it is a top-level binding (I have worked on that problem during my internship with Well-Typed).
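For comparison, this is roughly the behaviour those annotations would pin down, written with today’s -XBangPatterns, where the strictness lives in the definitions and is invisible in the types:

{-# LANGUAGE BangPatterns #-}

-- proposed: tuple :: !Int -> !Int -> (Int, Int)
tuple :: Int -> Int -> (Int, Int)
tuple !x !y = (x, y)

main :: IO ()
main = do
  let !x = 1 + 2 + 3 :: Int   -- proposed: x :: !Int; a strict local
  print (tuple x 4)           -- binding works, but GHC rejects the
                              -- corresponding top-level !x today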

Lazier-ness in action:

ghci> map (const '\a') [let e = error e in e, let s = '\a':s in s]
"\a\a"
ghci> 

So map is indifferent to whether its arguments are fully evaluated, partially evaluated, or not evaluated at all. That behaviour would then be adjusted (“fine-tuned”) by strictness annotations (however they may be implemented), as needed to obtain the result the programmer desires.