## Order-dependent type inference due to Note [Instance and Given overlap]

With some effort, I have produced

```
{-# LANGUAGE ScopedTypeVariables, AllowAmbiguousTypes, TypeApplications,
             TypeFamilies, PolyKinds, DataKinds, FlexibleInstances,
             MultiParamTypeClasses, FlexibleContexts, PartialTypeSignatures #-}

module Bug where

import Data.Proxy

class P a
class Q a
class R a b

newtype Tagged (t :: k) a = Tagged a

type family F a
type instance F (Tagged @Bool t a) = [a]

instance P x => Q [x]
instance (x ~ y) => R y [x]

wob :: forall a b. (Q [b], R b a) => a -> Int
wob = undefined

it'sABoolNow :: forall (t :: Bool). Int
it'sABoolNow = undefined

class HasBoolKind t
instance k ~ Bool => HasBoolKind (t :: k)

it'sABoolLater :: forall t. HasBoolKind t => Int
it'sABoolLater = undefined

g :: forall t a. Q (F (Tagged t a)) => Proxy t -> [a] -> _
g _ x = it'sABoolNow @t + wob x

g2 :: forall t a. Q (F (Tagged t a)) => Proxy t -> [a] -> _
g2 _ x = wob x + it'sABoolNow @t

g3 :: forall t a. Q (F (Tagged t a)) => Proxy t -> [a] -> _
g3 _ x = it'sABoolLater @t + wob x

g4 :: forall t a. Q (F (Tagged t a)) => Proxy t -> [a] -> _
g4 _ x = wob x + it'sABoolLater @t
```

All of `g`, `g2`, and `g3` are accepted; `g4` is rejected. Note that `g4` is identical to `g3` except for the order of the summands.

What's going on:

Here is an excerpt from `Note [Instance and Given overlap]`:

```
Example, from the OutsideIn(X) paper:
    instance P x => Q [x]
    instance (x ~ y) => R y [x]

    wob :: forall a b. (Q [b], R b a) => a -> Int

    g :: forall a. Q [a] => [a] -> Int
    g x = wob x

From 'g' we get the implication constraint:
    forall a. Q [a] => (Q [beta], R beta [a])

If we react (Q [beta]) with its top-level axiom, we end up with a
(P beta), which we have no way of discharging. On the other hand,
if we react R beta [a] with the top-level we get (beta ~ a), which
is solvable and can help us rewrite (Q [beta]) to (Q [a]) which is
now solvable by the given Q [a].
```

Here is the scenario in the Note, when type-checking the Note's `g`:

```
[G] Q [a]
[W] Q [beta]
[W] R beta [a]
```

When we process the first Wanted, we might be tempted to use the instance, simplifying the Wanted to `P beta`. Doing so leads to failure. What we'd prefer is to recognize that the Given might be helpful later, and simply leave the Wanted as-is. Then, processing `[W] R beta [a]` tells us that `beta ~ a`, and we go home happy.
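For concreteness, here is a standalone version of the Note's example, which GHC accepts today. This is only a sketch: in the paper `wob` is abstract, so I have given it a trivial body (and added an arbitrary `P Int` instance) purely so that the module can actually run.

```haskell
{-# LANGUAGE FlexibleInstances, FlexibleContexts, MultiParamTypeClasses,
             TypeFamilies, ScopedTypeVariables, AllowAmbiguousTypes #-}

class P a
class Q a
class R a b

instance P x => Q [x]
instance (x ~ y) => R y [x]

-- To solve `wob x` in `g` below, GHC must first use the R instance to
-- learn beta ~ a, and only then discharge Q [a] from g's Given -- rather
-- than eagerly rewriting Q [beta] with `instance P x => Q [x]`.
wob :: forall a b. (Q [b], R b a) => a -> Int
wob _ = 0  -- trivial body, just so the example is runnable

g :: forall a. Q [a] => [a] -> Int
g x = wob x

-- An arbitrary P instance so that g can actually be called:
instance P Int
```

Calling `g [1, 2, 3 :: Int]` then works, with the caller's `Q [Int]` discharged via the global instance and `P Int`.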

The scenario in this ticket is rather more complicated. The instances are unchanged, but now we have a type family `F` in the mix. It all looks like

```
t :: kappa
[G] Q (F (Tagged @kappa t a))
[W] Q [beta]
[W] R beta [a]
```

The question is: when processing the first Wanted, should the Given block GHC from using the global instance? (Sidenote: I've used partial type signatures and polykinds here as a sneaky way of getting a metavariable into a Given. The partial type signature stops GHC from kind-generalizing the type, so `t`'s kind remains `kappa` when processing the body.) If `kappa` is `Bool`, then the Given will eventually match and should block us from using the instance. If `kappa` is not `Bool`, then the Given will not apply. Yet in all of `g`..`g4`, `kappa` turns out to be `Bool`.
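As a sanity check on the Given's shape, here is a sketch reusing the ticket's `Tagged` and `F` (the `'True` tag and the `reduces` function are mine, for illustration): once the tag's kind is fixed to `Bool`, the family application is no longer stuck and reduces to a list type.

```haskell
{-# LANGUAGE TypeFamilies, PolyKinds, DataKinds, TypeApplications #-}

newtype Tagged (t :: k) a = Tagged a

type family F a
type instance F (Tagged @Bool t a) = [a]

-- With the kind known to be Bool, the instance fires:
-- F (Tagged @Bool 'True Int) reduces to [Int], so `id` type-checks.
reduces :: F (Tagged @Bool 'True Int) -> [Int]
reduces = id

-- By contrast, at an unknown kind kappa the application F (Tagged @kappa t a)
-- is stuck, and it is exactly such a stuck Given that may or may not block
-- the global Q instance.
```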

In `g`, we have `it'sABoolNow @t`, and the eager unifier sets `kappa := Bool` before the constraint solver gets a crack. The application of `F` then reduces, and the Given is recognized as relevant, blocking the use of the global instance.

In `g2`, the same happens.

In `g3`, we get a new `[W] HasBoolKind @kappa t`. This gets processed *before* the `Q [beta]` Wanted, and processing it sets `kappa := Bool`, leading the Given to reduce and block the use of the global instance.

In `g4`, however, the new `[W] HasBoolKind @kappa t` comes *after* the others. So we process `Q [beta]` *before* learning about `kappa`. The global instance is used, and we are left stuck with a `[W] P a` that cannot be solved.

This all boils down to the `not (isFskTyVar tv)` check in `GHC.Tc.Solver.Monad.mightMatchLater`, which (effectively) says that type family applications in Givens match only themselves, not any arbitrary type. Removing that check fixes this problem. But it introduces another. The following program is accepted today:

```
{-# LANGUAGE TypeFamilies, FlexibleInstances, FlexibleContexts #-}
module Bug2 where

class C a where
  meth :: a -> ()

instance C Int where
  meth _ = ()

type family F a

foo :: C (F a) => a -> Int -> ()
foo _ n = meth n
```

If we strike that check from `mightMatchLater`, we get

```
• Overlapping instances for C Int arising from a use of ‘meth’
  Matching instances:
    instance C Int -- Defined at CbvOverlap.hs:10:10
  There exists a (perhaps superclass) match:
    from the context: C (F a)
      bound by the type signature for:
                 foo :: forall a. C (F a) => a -> Int -> ()
      at CbvOverlap.hs:15:1-32
  (The choice depends on the instantiation of ‘’
   To pick the first instance above, use IncoherentInstances
   when compiling the other instance declarations)
• In the expression: meth n
  In an equation for ‘foo’: foo _ n = meth n
```

The (unrelated) `[G] C (F a)` blocks the useful `instance C Int`. Worse, the error message says that the overlap depends on the instantiation of <>.

This new problem is not just theoretical. In my !4149 (closed) patch, I essentially made this change unintentionally, and GHC's Parser.hs stopped compiling (in stage 2), due to its `DisambECP` shenanigans. It's easy for me to replicate the current behavior, so my branch can continue, but this bug will remain.

I'm not sure what the solution is. If we allow type family applications in Givens to match any potential type, we will unavoidably cause the second bug. So perhaps the best course of action is to let sleeping dogs lie. Yet I was uncomfortable reinstating the old behavior, because it smelled off... and I wanted to understand it better. Thus this ticket.