An incorrect warning about SPECIALIZE: "Forall'd type variable ‘k’ is not bound in RULE lhs"
Confirmed in GHC 9.4.6 and 9.8.1-alpha1; probably many others.
This SPECIALIZE pragma
generates the following warning, which seems wrong:
src/HordeAd/Core/Delta.hs:751:1: warning: [GHC-40548]
Forall'd type variable ‘k’ is not bound in RULE lhs
Orig bndrs: [k]
Orig lhs: let {
$dConvertTensor_a4lNs
:: ConvertTensor
(Flip @{Nat} @{Type} OR.Array) (Flip @{[Nat]} @{Type} OS.Array)
[LclId]
$dConvertTensor_a4lNs = $dConvertTensor_a4lKV } in
let {
$dShapedTensor_a4lNr
:: ShapedTensor (Flip @{[Nat]} @{Type} OS.Array)
[LclId]
$dShapedTensor_a4lNr = $dShapedTensor_a4lKU } in
let {
$dRankedTensor_a4lNq :: RankedTensor (Flip @{Nat} @{Type} OR.Array)
[LclId]
$dRankedTensor_a4lNq = $dRankedTensor_a4lKT } in
let {
$dGoodScalar_a4lNp :: GoodScalar Double
[LclId]
$dGoodScalar_a4lNp = $dGoodScalar_a4lKS } in
gradientFromDelta
@(Flip @{Nat} @{Type} OR.Array)
@(Flip @{[Nat]} @{Type} OS.Array)
@Double
$dGoodScalar_a4lNp
$dRankedTensor_a4lNq
$dShapedTensor_a4lNr
$dConvertTensor_a4lNs
optimised lhs: gradientFromDelta
@(Flip @{Nat} @{Type} OR.Array)
@(Flip @{[Nat]} @{Type} OS.Array)
@Double
$dGoodScalar_a4lNp
$dRankedTensor_a4lNq
$dShapedTensor_a4lNr
$dConvertTensor_a4lNs
|
751 | {-# SPECIALIZE gradientFromDelta
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^...
The "forall'd type variable ‘k’" mentioned in the warning is fully determined by the pragma: no type variable is left uninstantiated, so all kinds are determined as well. Moreover, the pragma just below this one in the source is equally fully determined, yet it does not produce such a warning.
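To make the shape of the situation concrete, here is a minimal, self-contained sketch (not the actual `gradientFromDelta`, whose full signature is not quoted above): a function whose signature quantifies a kind variable `k` and carries a class constraint, specialised at types that fix everything, including `k`.

```haskell
{-# LANGUAGE PolyKinds #-}
module Sketch where

import Data.Proxy (Proxy)

-- Hypothetical stand-in for gradientFromDelta: the signature
-- quantifies a kind variable k (via the poly-kinded 'a') and a
-- class constraint that becomes a dictionary argument in Core,
-- analogous to the $dGoodScalar/$dRankedTensor lets in the dump.
g :: forall k (a :: k) r. Show r => Proxy a -> r -> String
g _ = show

-- Choosing a ~ Int also fixes k ~ Type: nothing is left
-- undetermined, so a "forall'd type variable ‘k’ is not bound in
-- RULE lhs" warning for a pragma of this shape would be spurious.
{-# SPECIALIZE g :: Proxy Int -> Double -> String #-}
```
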
I haven't yet determined whether the function actually gets specialised and whether the pragma helps with that (GHC 9.4 often needs such pragmas despite -fexpose-all-unfoldings and -fspecialise-aggressively in the project's cabal file, while later GHCs cope much better). I also don't know whether the warning means the pragma is disabled; from the wording I'd understand that it is not.
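For reference, the two flags mentioned would be enabled in a cabal file roughly like this (a generic sketch, not the project's actual stanza):

```
library
  ghc-options: -fexpose-all-unfoldings -fspecialise-aggressively
```
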
A related commit from #22471 (closed) is already in GHC 9.6, so that commit evidently does not fix this problem. Another related ticket: #10698.