From what you describe, you have types `b' :: * -> * -> *` for which you wish to constrain the applied `b' t :: * -> *` (for all `t`). As you surmise, you either need to deconstruct a type, which is your attempt here, starting from a `b :: * -> *` assumed to be the result of a type application `b = b' t`; or to enforce a constraint on a "more-applied" type instead, starting from a `b' :: * -> * -> *`.
Deconstructing a type is not possible, since the compiler does not know whether `b` can even be deconstructed. Indeed, it may not be: for example, I can write an instance `instance Bar Maybe`, but `Maybe` cannot be deconstructed into a type `b' :: * -> * -> *` applied to some type `t :: *`.
Starting instead from a type `b' :: * -> * -> *`, the constraints on an application of `b'` can be moved into the body of the class, where the variables are quantified:

class Bar (b :: * -> * -> *) where
  g :: (Foo (b q1), Foo (b q2)) => b q1 m -> b q2 m
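To see how this first formulation works in practice, here is a minimal compilable sketch; the two-parameter type `P` and the bare-bones `Foo` class are hypothetical stand-ins for the types in your question:

```haskell
{-# LANGUAGE KindSignatures #-}

-- Hypothetical stand-in for the Foo class in the question.
class Foo (f :: * -> *)

-- The class from above: q1 and q2 are quantified per-method,
-- so every application of b is constrained.
class Bar (b :: * -> * -> *) where
  g :: (Foo (b q1), Foo (b q2)) => b q1 m -> b q2 m

-- Hypothetical instance carrier.
newtype P q m = P m

instance Foo (P q)

instance Bar P where
  g (P m) = P m  -- q1 and q2 may change freely; no constraint on them
```

Note that the `P` instance places no requirements of its own on `q1` and `q2`, which is exactly the limitation addressed next.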
For your example there is one further wrinkle: `q1` and `q2` may have their own constraints; e.g., for the `D2` instance you require the `Integral` constraint. However, `Bar` fixes the constraints on `q1` and `q2` for all instances (in this case the empty constraint). A solution is to use "constraint-kinded type families", which allow each instance to specify its own constraints:
class Bar (b :: * -> * -> *) where
  type Constr b t :: Constraint
  g :: (Foo (b q1), Foo (b q2), Constr b q1, Constr b q2) => b q1 m -> b q2 m
(include `{-# LANGUAGE ConstraintKinds, TypeFamilies #-}` and import `Constraint` from `Data.Kind`; on older GHCs it lives in `GHC.Exts` or `GHC.Prim`)
Then you can write your `D2` instance:

instance Bar (D2 t) where
  type Constr (D2 t) q = Integral q
  g (D2 q) = D2 $ fromIntegral q
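Putting it all together, here is a self-contained sketch that compiles and runs; the `Foo` and `D2` definitions are hypothetical stand-ins, since your originals are not shown (note `D2` here stores the `q` value so that `fromIntegral` has something to convert):

```haskell
{-# LANGUAGE ConstraintKinds, TypeFamilies, KindSignatures #-}
module Main where

import Data.Kind (Constraint)

-- Hypothetical stand-in for the Foo class in the question.
class Foo (f :: * -> *) where
  describe :: f a -> String

-- Hypothetical stand-in for D2: a phantom t, a stored q, a phantom m.
newtype D2 t q m = D2 q

instance Foo (D2 t q) where
  describe _ = "D2"

-- The constraint-kinded-type-family version of Bar.
class Bar (b :: * -> * -> *) where
  type Constr b t :: Constraint
  g :: (Foo (b q1), Foo (b q2), Constr b q1, Constr b q2)
    => b q1 m -> b q2 m

-- The instance picks its own constraint: Integral.
instance Bar (D2 t) where
  type Constr (D2 t) q = Integral q
  g (D2 q) = D2 (fromIntegral q)

main :: IO ()
main = do
  let d  = D2 (3 :: Int)      :: D2 () Int Char
      d' = g d                :: D2 () Integer Char  -- Int -> Integer via g
  case d' of
    D2 q -> print q  -- prints 3
```

The key point is that `Constr (D2 t) q = Integral q` lets this one instance demand `Integral` without forcing that constraint on every other `Bar` instance.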