Existential quantifier ForAny

I need some help, and everyone's comments are welcome!

But first, a bit of history: why I'm asking for help, and why I'm asking now.

History

The Haskell language sometimes is not capable of describing the right type, and some of those types could be described using an existential quantifier (exists, or ∃).

The most advanced proposal was made in First-class existentials by GoldFirere (Richard Eisenberg). Unfortunately, adding an existential quantifier requires adding most elements of the Calculus of Constructions (proofs, packing, unpacking, witnesses, and so on). So that work was frozen.

My idea was that it is possible to add an existential quantifier (maybe not as universal as in the Goldfire proposal) and to stay in System FC.
I even wrote my proposal Extractable Existentials, based on a simple idea:

data Box = forall a. MkBox a

fromBox :: Box -> exists a. a
fromBox (MkBox x)  = x
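
In today's Haskell, the closest we can get to this is a CPS-style eliminator for Box; the name withBox is mine and only for illustration:

-- Needs RankNTypes (and ExistentialQuantification for Box above).
-- The "exists a. a" result cannot be returned directly today; it can
-- only be consumed through a rank-2 continuation:
withBox :: Box -> (forall a. a -> r) -> r
withBox (MkBox x) k = k x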

But I was fairly asked, "So, what is your proposal for the Core language?"
And I was stuck! I spent almost a year on this question.
The most interesting articles for my understanding of the lambda calculus were
A Look at Typed Lambda Calculus by Nikolay Yakimov and
Haskell to Core: Understanding Haskell Features Through Their Desugaring by Vladislav Zavialov

Right now I understand the lambda calculus much better. But even now these judgments look to me like tautologies: "If we have x:t, then we have x:t".
And that's why I need your help to find and write those judgments.
But why am I asking right now?
Because I made a small discovery recently!

Discovery 1. Forany

I opened the article "Haskell to Core" for the 1001st time and looked at

data E =  forall a. MkE [a]

lenE :: E -> Int
lenE (MkE xs) =
  List.length xs

-- is desugaring to Core

lenE =
  \(e :: E) ->
  case e of
    MkE @a xs ->
      List.length @a xs

I was surprised :star_struck: to find that Haskell already has all the machinery and algorithms, but not the mathematical judgments.
I used a trick not from the lambda calculus but from category theory, which tells us that any such non-primitive function can be split in two:

f = f1 . f2

So, let's do it with the lenE function (types only):

lenE :: E -> Int
lenE = lenEa . unE

unE :: E -> forany a. [a]

lenEa :: forall a. [a] -> Int

Here I used the forany quantifier instead of exists, because this quantifier is not as universal as exists.
And before we split the function body, we add some "hidden" expressions:

lenE =
  \(e :: E) ->
  case e of
    MkE (createNewTyVar @(a::k) in @a) xs ->
      saveTyVar @(a::k) in
        loadTyVar @(a::k) in
          List.length @a xs

-- split function into 2

unE =
  \(e :: E) ->
  case e of
    MkE (createNewTyVar @(a::k) in @a) xs ->
      saveTyVar @(a::k) in xs

lenEa =
  \(a :: Type) ->
  \(xs :: [a]) ->
    List.length @a xs

lenE =
  lenEa (loadTyVar @(a::k) in @a) . unE

Huge success!!

As we see, the type bound by forany doesn't require any additional Core type; an ordinary type variable works just fine. This means that we add ForAny a. T to the Core types, and all the judgments for it are identical to those for ForAll a. T types (but we still need to find the additional judgments for forany).
But it requires some additional Core expressions (saveTyVar, loadTyVar, and the hidden createNewTyVar) and special rules for functions whose types contain the forany quantifier.

Discovery 2. Union types

Right now, forany is just an Extractable Existential. Could we do something more, but still remain in System FC? Yes, we can! We can add Dependent sum / Union types.

For this we need to extend the Core types with TUnion t1 t2 (but for simplicity I use TUnion [t1] here).
And we add just one special Core expression (caseDyn), which is a combination of the createNewTyVar and case expressions:

data T a where
  MkInt  :: T Int
  MkBool :: T Bool

serialise :: T a -> String
serialise MkInt  = "int"
serialise MkBool = "bool"

deserialise :: String -> forany a. T a
deserialise "int"  = MkInt
deserialise "bool" = MkBool

-- alternative typing
deserialise :: String -> forany (Int | Bool). T (Int | Bool)
-- alternative typing with coerce
deserialise :: String -> forany a. a ~ (Int | Bool) => T a

And the desugaring to Core looks like this:

deserialise =
  \(x :: String) ->
    createNewTyVarOfKind @(a :: Type) in
      caseDyn x of
        "int"  -> setTyVar @Int  in saveTyVar @a in MkInt
        "bool" -> setTyVar @Bool in saveTyVar @a in MkBool

Union types work like sets (a small term-level model of these rules is sketched after the list):

  • (a | b) ~ (b | a) - Commutativity rule
  • (a::k | b::k) - Same-Kind Rule for “sub”-types
  • (a | a) ~ a - Self-Reduction Rule
  • (a | Void) ~ a ~ (Void | a) - Void-Reduction Rule
  • a ⊆ (a | b) - Hierarchical Rule
  • (a | b) ⊆ ANY - Principal type Rule (if b !~ a)
  • (a !| a) ~ Void - Self-Annihilation Rule
  • (a !| b) ∈ ERROR if a, b !⊆ (c | d) (including ANY) and a, b !∈ TVAR - Ill-typed type Rule
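
The union-forming rules above can be prototyped as a small term-level normaliser, just to check that the algebra hangs together. This is only a toy model of the rules (the exclusion operator !| is left out), not a Core implementation, and all names are made up:

import Data.List (nub, sort)

-- A toy syntax of "types", just enough to model the rules above.
data Ty = TVar String | TCon String | TVoid | TAny | TUnion [Ty]
  deriving (Eq, Ord, Show)

-- Normalise a union: flatten nested unions, drop Void (Void-Reduction),
-- collapse duplicates (Self-Reduction), sort members (Commutativity).
normalise :: Ty -> Ty
normalise (TUnion ts) =
  case sort (nub (concatMap flatten ts)) of
    []  -> TVoid
    [t] -> t
    ts' -> TUnion ts'
  where
    flatten t = case normalise t of
      TUnion us -> us
      TVoid     -> []
      u         -> [u]
normalise t = t

-- Hierarchical and Principal-type rules as a subtyping check on normal forms.
subTy :: Ty -> Ty -> Bool
subTy a b = case (normalise a, normalise b) of
  (_,         TAny)      -> True                -- everything fits in ANY
  (TUnion as, TUnion bs) -> all (`elem` bs) as  -- (a | b) ⊆ (a | b | c)
  (a',        TUnion bs) -> a' `elem` bs        -- a ⊆ (a | b)
  (a',        b')        -> a' == b'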

Is ForAny as powerful as Exists?

Unfortunately, it looks like the forany quantifier is still weaker than exists.
Let's try to write the functions toVec and filterVec:

data Nat = Zero | Succ Nat

type Vec :: Nat -> Type -> Type
data Vec n a where
  VNil :: Vec Zero a
  (:>) :: a -> Vec n a -> Vec (Succ n) a
infixr 5 :>

toVec :: forall a. 
         [a] -> 
         forany (n::Nat). n ~ ( ?? | ?? ) => Vec n a
toVec []     = VNil
toVec (x:xs) = x :> toVec xs

--

filterVec :: forall a (n::Nat). 
          (a -> Bool) -> 
          Vec n a -> 
          forany (m::Nat). m ~ (?? | ?? | ??) => Vec m a
filterVec p VNil = VNil
filterVec p (x :> xs)
  | p x       = x :> filterVec p xs
  | otherwise = filterVec p xs

What should we write in the caseDyn expression? setTyVar @?? in .. ?
How do we typecheck it? How do we evaluate the types?
It is unclear.
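
For comparison, with the exists-style existential that we can already encode today through a wrapper data type, both functions are easy to write, because the wrapper says nothing about the shape of the hidden index. SomeVec and the function names are mine, only for illustration (reusing Nat and Vec from above):

data SomeVec a = forall n. SomeVec (Vec n a)

toVecSome :: [a] -> SomeVec a
toVecSome []     = SomeVec VNil
toVecSome (x:xs) =
  case toVecSome xs of
    SomeVec v -> SomeVec (x :> v)

filterVecSome :: (a -> Bool) -> Vec n a -> SomeVec a
filterVecSome _ VNil = SomeVec VNil
filterVecSome p (x :> xs) =
  case filterVecSome p xs of
    SomeVec v
      | p x       -> SomeVec (x :> v)
      | otherwise -> SomeVec v

The question is whether forany plus TUnion can express the same thing without hiding the index behind an opaque wrapper.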

Summary

I'm asking for help not with some homework, but with adding the ForAny quantifier to Haskell while still remaining in System FC.

Feedback is welcome!

1 Like

Maybe we could write toVec and filterVec with forany.
At least in theory:

toVec :: forall a. 
         [a] -> 
         forany (n::Nat) (m::Nat). n ~ ( Zero | Succ m ) => Vec n a
toVec []     = VNil
toVec (x:xs) = @(Succ m) x :> (toVec xs :: forany m. Vec m a)

--

filterVec :: forall a (n::Nat). 
          (a -> Bool) -> 
          Vec n a -> 
          forany (m::Nat) (r::Nat). m ~ (Zero | r | Succ r) => Vec m a
filterVec p VNil = VNil
filterVec p (x :> xs)
  | p x       = @(Succ r) x :> (filterVec p xs :: forany r. Vec r a)
  | otherwise = (filterVec p xs :: forany r. Vec r a)

But I'm sure some additional rules are required to write it this way.

1 Like

I don’t think the calculus of constructions is relevant as it does not support first class existentials and it thus does not have packing and unpacking of existentials (“proofs” and “witnesses” are very vague terms). But indeed, Richard Eisenberg stopped working on this since he felt that the (substantial) effort would be better spent on other aspects of the compiler.

2 Likes

The forany quantifier itself would be an additional type that needs to be added to Core.

I would recommend calling it something else than dependent sum because that is already a commonly used synonym of sigma types (which are not what you’re proposing here). Instead it seems you’re proposing union types. Union types are useful, but difficult to combine with type inference. I think adding union types is not easier than adding the first-class existentials that Richard Eisenberg proposed.

1 Like

I don’t think the calculus of constructions is relevant as it does not support first class existentials

I just use the Lambda cube - Wikipedia explanation: allowing types to depend on terms leads us from System FC to the Calculus of Constructions.
Sure, Eisenberg's system is different from the one used in Coq.
As I understand it, this is called "simply typed lambda calculus with existential types" - more here: [1907.11133] An Introduction to Logical Relations.
Or maybe I'm misunderstanding something ))

I would recommend calling it something else than dependent sum because that is already a commonly used synonym of sigma types (which are not what you’re proposing here). Instead it seems you’re proposing union types.

OK, thanks for the advice! I've changed it to Dependent sum / Union types and TUnion t1 t2.

I think adding union types is not easier than adding the first-class existentials that Richard Eisenberg proposed.

Maybe, but creating union types requires just one expression, caseDyn, and the rest is an ordinary existential type.

The forany quantifier itself would be an additional type that needs to be added to Core.

OK, maybe you are right. But right now the judgments for forall and for forany are identical.
And we must find the rest of the judgments.

I can barely make heads or tails of most of what you’ve been writing, @VitWW, but as long as this is the peanut gallery firing off half-baked, simpler proposals to get to existential types, I’m not at all clear on why this needs to involve any extensions to Core.

In GHC, type checking happens in Haskell syntax, not Core, so that’s a whole separate thing to do. But having undertaken that, it seems to me that we can encode the existential types in Core as it exists today, using CPS.

Here’s my pitch:

  1. Introduce an internal newtype Exists a = MkExists a with a runExists (Exists a) = a pseudo-accessor. Users never work with or see this type directly. It’s there to make the Core work without having to change anything fundamental about Core.
  2. Encode the surface syntax exists a b c. body as Exists (forall r. (forall a b c. body -> r) -> r). Exists is only ever introduced internally and so it will always match this template. (We can’t encode that requirement in the type of Exists, but for an internal type that’d be just a nice-to-have.)
  3. While type checking in the surface syntax, using whatever algorithm makes the smart folks happy, also do a contification pass in which any non-existential term t being checked against an existential type becomes MkExists (\k -> k t), and any existential term e being checked against a non-existential type has its context rewritten as a continuation k, which is then used in runExists e k. Ordinarily, this sort of thing is underspecified because the order in which continuations are introduced matters. But in this case, it won’t, because after…
  4. completing type checking and converting to Core per usual…
  5. we’ll then decontify during simplification, using something equivalent to this rule:
{-# RULES "decontify" forall x y. runExists x y = y (unsafeCoerce x id) #-}

That should get us back to something with operational semantics sufficiently close to the original expression. Don’t worry about the unsafeCoerce. All it’s doing is letting us provide existential type args to y without knowing exactly what they should be according to x. If the only places where runExists is ever used are on types corresponding to the Exists (forall r. (forall a b c. body -> r) -> r) convention, and if runExists x y type-checked successfully, then y is guaranteed to be parametrically polymorphic and can’t care that we’re coercing the ‘real’ existential type variable into a bogus one.

This is good enough to, for example, recover laziness in the filter example from the proposal. Here is a working prototype in which I’ve done the contification by hand, but the decontification is done with (incredibly hacky) rewrite rules and TH. See how it terminates despite starting with an infinite list (and if an infinite Natural-indexed vector makes you give this the side-eye, you should also be side-eyeing the original proposal).
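
To make the shape of the encoding concrete, here is a minimal hand-contified sketch of steps 2 and 3, specialised to the Vec/filter example from earlier in the thread. This is not the linked prototype: it omits the rewrite-rule decontification of steps 4-5, so it does not by itself recover the laziness mentioned above, and all names are illustrative.

-- Step 2, specialised: "exists (m :: Nat). Vec m a" becomes a CPS package.
-- (The generic "Exists a" of step 1 needs ImpredicativeTypes in source
-- Haskell; in Core that is not an issue.)
newtype ExVec a = ExVec (forall r. (forall m. Vec m a -> r) -> r)

runExVec :: ExVec a -> (forall m. Vec m a -> r) -> r
runExVec (ExVec f) = f

-- Step 3, producer side: a term of existential type is packed as a
-- continuation application, \k -> k t.
filterVecEx :: (a -> Bool) -> Vec n a -> ExVec a
filterVecEx _ VNil = ExVec (\k -> k VNil)
filterVecEx p (x :> xs) =
  ExVec (\k -> runExVec (filterVecEx p xs)
                 (\v -> if p x then k (x :> v) else k v))

-- Step 3, consumer side: the surrounding context becomes the continuation
-- handed to runExVec.
lengthEx :: ExVec a -> Int
lengthEx e = runExVec e go
  where
    go :: Vec m b -> Int
    go VNil      = 0
    go (_ :> vs) = 1 + go vs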

I’m sure there are devils lurking in the details of how the contification should happen (where should the boundary of the context be?) and other things that take place during type checking, but this should be enough of a proof of concept to demonstrate my thesis that Core doesn’t require extension and we don’t need to be discussing Core type judgments, provided we can stomach a looks-unsafe-but-isn’t cast being inserted during decontification. (Maybe that’s a big ‘provided’, though?)

It should go without saying, but this is all with the huge disclaimer that clearly I haven’t written a whole ICFP paper about this, and people who have are probably going to have thought about most of the ideas that might be better than theirs and found valid reasons to dismiss them.

6 Likes

… but as long as this is the peanut gallery firing off half-baked, simpler proposals to get to existential types, I’m not at all clear on why this needs to involve any extensions to Core.

In GHC, type checking happens in Haskell syntax, not Core, so that’s a whole separate thing to do. But having undertaken that, it seems to me that we can encode the existential types in Core as it exists today, using CPS.

Very interesting approach!!

This is good enough to, for example, recover laziness in the filter example from the proposal. Here is a working prototype in which I’ve done the contification by hand, but the decontification is done with (incredibly hacky) rewrite rules and TH. See how it terminates despite starting with an infinite list (and if an infinite Natural -indexed vector makes you give this the side-eye, you should also be side-eyeing the original proposal).

Unbelievable! And awesome! I've never seen anything like this before! Very cool! :star_struck:

It should go without saying, but this is all with the huge disclaimer that clearly I haven’t written a whole ICFP paper about this, and people who have are probably going to have thought about most of the ideas that might be better than theirs and found valid reasons to dismiss them.

The people who write ICFP papers are ordinary people, and maybe they have never written code similar to yours. At least I've never seen such code from them.
And maybe your approach deserves much more attention! )))

Right now your answer is very close to getting this topic marked as "Solved"!