The problem is that

    data Trainable a b = forall n . Floating n => Trainable ([n] -> a -> b) (a -> b -> [n] -> n)

means that in `Trainable transfer cost` the type `n` used is lost. All that is known is that there is some type `Guessme` with a `Floating` instance such that

    transfer :: [Guessme] -> a -> b
    cost :: a -> b -> [Guessme] -> Guessme

You can build `Trainable`s with functions that only work for `Complex Float`, or only for `Double`, or ...
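For instance, both of the following typecheck, even though their hidden weight types differ. The names `onlyDouble` and `onlyComplex` and the deliberately trivial models are made up for illustration:

    {-# LANGUAGE ExistentialQuantification #-}

    import Data.Complex (Complex)

    -- The existential Trainable from above.
    data Trainable a b =
        forall n . Floating n => Trainable ([n] -> a -> b) (a -> b -> [n] -> n)

    -- Hidden weight type n is Double: transfer sums the weights,
    -- cost is the squared error of that sum against the target.
    onlyDouble :: Trainable () Double
    onlyDouble = Trainable (\ws () -> sum ws)
                           (\() target ws -> (sum ws - target) ^ (2 :: Int))

    -- Same shape, but the hidden weight type is Complex Float.
    onlyComplex :: Trainable () (Complex Float)
    onlyComplex = Trainable (\ws () -> sum ws)
                            (\() target ws -> (sum ws - target) ^ (2 :: Int))

Once either value is wrapped, nothing about its type records which numeric type was chosen.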
But in

    trainSgdFull :: (Floating n, Ord n) => Trainable a b -> [n] -> a -> b -> [[n]]
    trainSgdFull (Trainable _ cost) init input target = gradientDescent (cost input target) init

you are trying to use `cost` with whatever `Floating` type is supplied as an argument. The `Trainable` was built to work with some type `n0`, the user supplies a type `n1`, and those may or may not be the same. Thus the compiler can't deduce that they are.
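One fix is to make `n` an ordinary type parameter of `Trainable`, so that the wrapped functions and the caller's weight list are forced to agree. A minimal sketch (the `evalCost` helper is made up, just to show that the types now line up; note that if `gradientDescent` is the one from the `ad` package, it additionally wants a cost function polymorphic in its AD mode, so this version alone would not satisfy it):

    -- n as an ordinary type parameter instead of an existential.
    data Trainable n a b = Trainable ([n] -> a -> b) (a -> b -> [n] -> n)

    -- 'cost' now lives at the same n as the caller's weight list,
    -- so there is nothing left for the compiler to guess.
    evalCost :: Floating n => Trainable n a b -> a -> b -> [n] -> n
    evalCost (Trainable _ cost) input target ws = cost input target ws
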
If you don't want to make `n` a type parameter of `Trainable`, you need to make it wrap polymorphic functions that work with every `Floating` type the caller supplies:

    data Trainable a b
        = Trainable (forall n. Floating n => [n] -> a -> b)
                    (forall n. Floating n => a -> b -> [n] -> n)

(This needs `Rank2Types` or, since that extension is in the process of being deprecated, `RankNTypes`.)
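With the rank-2 wrapper, the original `trainSgdFull` compiles unchanged, because `cost input target` is now polymorphic and can be instantiated at whatever type `gradientDescent` uses internally. A self-contained sketch, assuming `gradientDescent` is the one from the `ad` package's `Numeric.AD` (as the question's code suggests); `toyModel` is a made-up example:

    {-# LANGUAGE RankNTypes #-}

    import Numeric.AD (gradientDescent)  -- assumption: the 'ad' package

    data Trainable a b
        = Trainable (forall n. Floating n => [n] -> a -> b)
                    (forall n. Floating n => a -> b -> [n] -> n)

    -- Compiles now: 'cost input target' is polymorphic in n, so it can
    -- be used at the AD type gradientDescent works with internally.
    trainSgdFull :: (Floating n, Ord n) => Trainable a b -> [n] -> a -> b -> [[n]]
    trainSgdFull (Trainable _ cost) init input target =
        gradientDescent (cost input target) init

    -- A made-up model: fit one weight w so that w * input ≈ target.
    -- 'transfer' can only touch the weights through Floating, so this
    -- toy version ignores them; the Doubles are lifted into the weight
    -- type with realToFrac.
    toyModel :: Trainable Double Double
    toyModel = Trainable
        (\_ws x -> x)
        (\x target ws -> (sum ws * realToFrac x - realToFrac target) ^ (2 :: Int))

    -- Successive approximations of the weights; converges towards [3].
    steps :: [[Double]]
    steps = trainSgdFull toyModel [0] 2 6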