I happened to see some strange behaviour while checking the size (minBound, maxBound) and the "length of the decimal representation" of different integral types.
Using GHCi:
Prelude> :{
Prelude| let mi = minBound
Prelude|     ma = maxBound
Prelude|     le = fromIntegral $ length $ show ma
Prelude| in [mi,ma,le] :: [Int]
Prelude| :}
[-9223372036854775808,9223372036854775807,2]
In the last place I would expect 19, not 2.
My first guess is that maxBound defaults to () and thus yields 2, but I don't understand that because ma should be an Int by the explicit type annotation (:: [Int]) - and by referential transparency all symbols named ma should be equal.
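(That guess would at least explain the 2, since show () is a two-character string:)

Prelude> show ()
"()"
Prelude> length (show ())
2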
If I put the statement above in a file and load it into GHCi, I get the correct result.
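For reference, the file version looks roughly like this (the file and binding names are just examples); after :load, evaluating result prints 19 in the last place:

-- Bounds.hs (file and binding names are just examples)
result =
  let mi = minBound
      ma = maxBound
      le = fromIntegral $ length $ show ma
  in  [mi, ma, le] :: [Int]

-- *Main> result
-- [-9223372036854775808,9223372036854775807,19]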
So why do I get a wrong result?
Confusingly, this is still the monomorphism restriction at play - or rather the lack thereof in GHCi. Since GHCi doesn't have the monomorphism restriction enabled, your definitions of mi and ma don't get specialized to Int as you think they will - instead they stay general, as mi, ma :: Bounded a => a, and the type variable a gets instantiated twice:

- at () in fromIntegral $ length $ show ma (as you observed, this is a default)
- at Int in [mi,ma,le] :: [Int]
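You can see this by asking GHCi for the inferred type (output shown for a recent GHC; exact formatting may vary):

Prelude> let ma = maxBound
Prelude> :type ma
ma :: Bounded a => a
Prelude> length (show ma)
2

The 2 in the last line is exactly the () instantiation: show ma ends up being show (), i.e. "()", whose length is 2.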
If you want mi and ma to actually be of type Int, annotate them as such directly, or turn on the monomorphism restriction manually in GHCi.
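For example, either of these gives the expected 19 (a sketch of a GHCi session; minor details may differ between GHC versions):

Prelude> :{
Prelude| let mi = minBound :: Int
Prelude|     ma = maxBound :: Int
Prelude|     le = fromIntegral $ length $ show ma
Prelude| in [mi,ma,le]
Prelude| :}
[-9223372036854775808,9223372036854775807,19]

Prelude> :set -XMonomorphismRestriction
Prelude> :{
Prelude| let mi = minBound
Prelude|     ma = maxBound
Prelude|     le = fromIntegral $ length $ show ma
Prelude| in [mi,ma,le] :: [Int]
Prelude| :}
[-9223372036854775808,9223372036854775807,19]

The first fixes the type at the binding site; the second restores the file-level behaviour inside GHCi, so the [Int] annotation determines the type of ma everywhere it is used.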