How should the general type of a "lemma" function be understood?
Perhaps a stupid question. Here's a quote from the Hasochism paper:
One approach to resolving this issue is to encode lemmas, given by parameterised equations, as Haskell functions. In general, such lemmas may be encoded as functions of type:
    ∀ x1 ... xn. Natty x1 → ... → Natty xn → ((l ~ r) ⇒ t) → t
I thought I understood RankNTypes, but I can't make sense of the last part of this proposition. I'm reading it informally as "given a term that requires l ~ r, return that term". I'm sure this interpretation is wrong, because it seems to lead to a circularity: we don't know l ~ r until the conclusion of the proof itself, so how can I be expected to provide, as an assumption of the proof, a term that requires it?
I would have expected an equality proof to have a type more like this:
natty x1 → ... → natty xn → l :~: r informally, "given bunch of nattys, return proof propositions l , r equal" (using ghc's data.type.equality). makes far more sense me, , seems align you'd in other dependently typed systems. i'm guessing it's equivalent version in paper, i'm struggling mentally square away 2 versions.
In short, I'm confused. I feel like I'm missing a key insight. How should I read the type ((l ~ r) => t) -> t?
i'm reading "given term requires
l ~ r, return term"
it's "given term type contains l, return term ls being substituted rs in type" (or in other direction r -> l). it's neat trick, allows delegate cong, trans, subst , similar stuff ghc.
Here is an example:
    {-# LANGUAGE GADTs, DataKinds, PolyKinds, TypeFamilies, TypeOperators, RankNTypes #-}

    data Nat = Z | S Nat

    -- Singleton for type-level naturals.
    data Natty n where
        Zy :: Natty Z
        Sy :: Natty n -> Natty (S n)

    -- Length-indexed vectors.
    data Vec a n where
        Nil  :: Vec a Z
        Cons :: a -> Vec a n -> Vec a (S n)

    type family (n :: Nat) :+ (m :: Nat) :: Nat where
        Z     :+ m = m
        (S n) :+ m = S (n :+ m)

    -- The lemma: associativity of (:+), made available to the continuation t.
    assoc :: Natty n -> Natty m -> Natty p -> (((n :+ m) :+ p) ~ (n :+ (m :+ p)) => t) -> t
    assoc  Zy     my py t = t
    assoc (Sy ny) my py t = assoc ny my py t

    coerce :: Natty n -> Natty m -> Natty p -> Vec a ((n :+ m) :+ p) -> Vec a (n :+ (m :+ p))
    coerce ny my py xs = assoc ny my py xs
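To see the lemma doing real work, here is a small usage sketch of my own (not from the paper or the answer); app and vlength are assumed helpers I'm introducing for illustration:

    -- Append two length-indexed vectors; the result length is n :+ m.
    app :: Vec a n -> Vec a m -> Vec a (n :+ m)
    app Nil         ys = ys
    app (Cons x xs) ys = Cons x (app xs ys)

    -- Recover the length singleton from a vector.
    vlength :: Vec a n -> Natty n
    vlength Nil         = Zy
    vlength (Cons _ xs) = Sy (vlength xs)

    -- A left-nested append has index ((n :+ m) :+ p); coerce, backed by the
    -- assoc lemma, converts it to the right-nested index n :+ (m :+ p).
    reassociate :: Vec a n -> Vec a m -> Vec a p -> Vec a (n :+ (m :+ p))
    reassociate xs ys zs =
        coerce (vlength xs) (vlength ys) (vlength zs) (app (app xs ys) zs)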
Update

It's instructive to specialize assoc:
    assoc' :: Natty n -> Natty m -> Natty p
           -> (((n :+ m) :+ p) ~ (n :+ (m :+ p)) => Vec a (n :+ (m :+ p)))
           -> Vec a (n :+ (m :+ p))
    assoc'  Zy     my py t = t
    assoc' (Sy ny) my py t = assoc ny my py t

    coerce' :: Natty n -> Natty m -> Natty p -> Vec a ((n :+ m) :+ p) -> Vec a (n :+ (m :+ p))
    coerce' ny my py xs = assoc' ny my py xs

Daniel Wagner explained what's going on in the comments:
Or, to put it another way, you can read ((l ~ r) => t) -> t as, "given a term that was typed assuming l ~ r, return that same term in a context where we have proven l ~ r and discharged that assumption".
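As an aside, the two encodings the question mentions are interconvertible, which may help square them away. This is my own sketch, not from the paper or the answer, using Data.Type.Equality and the extensions already enabled above:

    import Data.Type.Equality ((:~:)(..))

    -- From a Refl-style proof to the constraint/CPS style: matching on Refl
    -- brings l ~ r into scope, so the constrained term can be returned.
    toCPS :: l :~: r -> ((l ~ r) => t) -> t
    toCPS Refl t = t

    -- And back: instantiate the continuation's result type at l :~: r and
    -- hand it Refl, which typechecks under the discharged l ~ r assumption.
    fromCPS :: (forall t. ((l ~ r) => t) -> t) -> l :~: r
    fromCPS k = k Refl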
Let's elaborate on the proving part.
In the assoc' Zy my py t = t case, n is equal to Z and hence we have
    ((Z :+ m) :+ p) ~ (Z :+ (m :+ p))

which reduces to
    (m :+ p) ~ (m :+ p)

This is an identity, and hence we can discharge the assumption and return t.
At each recursive step we maintain the
    ((n :+ m) :+ p) ~ (n :+ (m :+ p))

equation. In the assoc' (Sy ny) my py t = assoc ny my py t case the equation becomes
    ((S n :+ m) :+ p) ~ (S n :+ (m :+ p))

which reduces to
    S ((n :+ m) :+ p) ~ S (n :+ (m :+ p))

due to the definition of (:+). And since constructors are injective
    constructors_are_injective :: S n ~ S m => Vec a n -> Vec a m
    constructors_are_injective xs = xs

the equation becomes
    ((n :+ m) :+ p) ~ (n :+ (m :+ p))

and we can make the recursive call to assoc.
Finally, in the call of coerce' these two types get unified:
     1. ((n :+ m) :+ p) ~ (n :+ (m :+ p)) => Vec a (n :+ (m :+ p))
     2.                                      Vec a ((n :+ m) :+ p)

Clearly Vec a ((n :+ m) :+ p) is Vec a (n :+ (m :+ p)) under the assumption that ((n :+ m) :+ p) ~ (n :+ (m :+ p)).
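For comparison, here is a sketch (my own, not from the paper or the answer) of the same lemma written in the Data.Type.Equality style the question expected; gcastWith turns the Refl-style proof back into the ((l ~ r) => t) -> t shape:

    import Data.Type.Equality ((:~:)(..), gcastWith)

    -- Same induction on the first Natty, but returning explicit evidence.
    assocEq :: Natty n -> Natty m -> Natty p -> ((n :+ m) :+ p) :~: (n :+ (m :+ p))
    assocEq Zy      _  _  = Refl
    assocEq (Sy ny) my py = case assocEq ny my py of Refl -> Refl

    coerceEq :: Natty n -> Natty m -> Natty p -> Vec a ((n :+ m) :+ p) -> Vec a (n :+ (m :+ p))
    coerceEq ny my py xs = gcastWith (assocEq ny my py) xs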