# In the test EtaExpandLevPoly, STG sees a levity-polymorphic function being applied to concrete arguments.

In this test we have this particular piece of code:

```
{-# LANGUAGE UnboxedTuples, MagicHash, GADTs,
             DataKinds, PolyKinds, ExplicitForAll #-}
module M where

import GHC.Exts

data H a where
  MkH :: H IntRep

-- tests that we don't push coercions that make args representation-polymorphic
-- see Simplify.simplCast
bar :: forall (r :: RuntimeRep) (a :: TYPE r). H r -> (a -> a -> (# a, a #)) -> a -> (# a, a #)
bar MkH = (\f x -> f x x) :: forall (b :: TYPE IntRep). (b -> b -> (# b, b #)) -> b -> (# b, b #)
```

Here matching on the `MkH` argument determines the representation of `a`. But by the time we hit codegen we no longer want to be representation-polymorphic, and this test is supposed to check for that.
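The refinement-by-pattern-match at work here can be illustrated with an ordinary boxed analogue (a sketch with made-up names `Tag`, `Ev` and `Interp`, not the test case itself): matching on the GADT evidence fixes the type-level index, so the right-hand side lambda can be checked at a concrete type, just as matching on `MkH` fixes `r` to `IntRep`.

```haskell
{-# LANGUAGE GADTs, DataKinds, TypeFamilies, KindSignatures #-}

-- A boxed stand-in for RuntimeRep.
data Tag = TInt | TBool

-- Evidence that fixes the index, like MkH fixes r to IntRep.
data Ev (t :: Tag) where
  EvInt  :: Ev 'TInt
  EvBool :: Ev 'TBool

type family Interp (t :: Tag) where
  Interp 'TInt  = Int
  Interp 'TBool = Bool

-- Before the match, Interp t is abstract; after matching the evidence,
-- the lambda is checked at a concrete instantiation.
dup :: Ev t -> (Interp t -> Interp t -> (Interp t, Interp t)) -> Interp t -> (Interp t, Interp t)
dup EvInt  = \f x -> f x x
dup EvBool = \f x -> f x x

main :: IO ()
main = print (dup EvInt (,) (3 :: Int))  -- prints (3,3)
```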

However, while working on linting kind mismatches, this test panicked.

Looking at the result of CorePrep we get:

```
-- RHS size: {terms: 13, types: 34, coercions: 67, joins: 0/1}
M.bar
  :: forall (r :: GHC.Types.RuntimeRep) (a :: TYPE r).
     M.H r -> (a -> a -> (# a, a #)) -> a -> (# a, a #)
[GblId,
 Arity=2,
 Str=<1!P(L)><1C1(C1(!P(L,L)))><L>,
 Unf=OtherCon []]
M.bar
  = \ (@(r_azB :: GHC.Types.RuntimeRep))
      (@(a_azC :: TYPE r_azB))
      (ds_sEk [Occ=Once1!] :: M.H r_azB)
      (eta_sEl [Occ=OnceL1] :: a_azC -> a_azC -> (# a_azC, a_azC #)) ->
      case ds_sEk of { M.MkH co_azF ->
      let {
        sat_sEo [Occ=Once1]
          :: (a_azC |> (TYPE co_azF)_N)
             -> (# (a_azC |> (TYPE co_azF)_N), (a_azC |> (TYPE co_azF)_N) #)
        [LclId]
        sat_sEo
          = \ (x_sEn :: (a_azC |> (TYPE co_azF)_N)) ->
              (eta_sEl
               `cast` <Co:37> :: (a_azC -> a_azC -> (# a_azC, a_azC #))
                                 ~R# ((a_azC |> (TYPE co_azF)_N)
                                      -> (a_azC |> (TYPE co_azF)_N)
                                      -> (# (a_azC |> (TYPE co_azF)_N),
                                            (a_azC |> (TYPE co_azF)_N) #)))
                x_sEn x_sEn } in
      sat_sEo
      `cast` <Co:30> :: ((a_azC |> (TYPE co_azF)_N)
                         -> (# (a_azC |> (TYPE co_azF)_N), (a_azC |> (TYPE co_azF)_N) #))
                        ~R# (a_azC -> (# a_azC, a_azC #))
      }
```

This results in the following final STG:

```
M.bar
  :: forall (r :: GHC.Types.RuntimeRep) (a :: TYPE r).
     M.H r -> (a -> a -> (# a, a #)) -> a -> (# a, a #)
[GblId,
 Arity=2,
 Str=<1!P(L)><1C1(C1(!P(L,L)))><L>,
 Unf=OtherCon []] =
    {} \r [ds_sF9 eta_sFa]
        case ds_sF9 of {
          M.MkH ->
              let {
                sat_sFd [Occ=Once1]
                  :: (a_azR |> (TYPE co_azU)_N)
                     -> (# (a_azR |> (TYPE co_azU)_N), (a_azR |> (TYPE co_azU)_N) #)
                [LclId] =
                    {eta_sFa} \r [x_sFc] eta_sFa x_sFc x_sFc;
              } in sat_sFd;
        };
```

I think the issue is that in Core we have a coercion of `eta_sEl` from `forall r (a :: TYPE r). a -> a -> a` to `forall (a :: TYPE IntRep). a -> a -> a`, so there everything is fine.

In the code generator, in Cmm, we never look into the type of the function for information about its arguments, so there we don't care either.

But in STG we just throw away the cast, so when inspecting the type of `eta_sEl` we see `forall r (a :: TYPE r). a -> a -> a`. If we then try to compare the runtime rep of the arguments in this type with the ones passed to the function, `typePrimRep` blows up.
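A toy model may make the failure mode concrete. All names below (`Ty`, `PrimRep`, `typePrimRep'`, `lintArg`) are illustrative inventions, not GHC's actual API: the point is only that a lint check which asks for the representation of a representation-polymorphic type has nothing it can return, so it panics, while the same check on the cast's target type would succeed.

```haskell
import Control.Exception (SomeException, evaluate, try)

data PrimRep = LiftedRep | IntRep' | WordRep
  deriving (Eq, Show)

-- A drastically simplified view of argument types as STG sees them.
data Ty
  = ConcreteTy PrimRep  -- a type whose representation is known
  | RepPolyTy           -- a type variable (a :: TYPE r) with r still abstract

-- Models typePrimRep: fine on concrete types, panics on
-- representation-polymorphic ones.
typePrimRep' :: Ty -> [PrimRep]
typePrimRep' (ConcreteTy r) = [r]
typePrimRep' RepPolyTy      = error "typePrimRep: rep-polymorphic type"

-- The lint check: does the rep expected by the function's argument type
-- match the rep of the argument actually passed?
lintArg :: Ty -> PrimRep -> Bool
lintArg argTy actualRep = typePrimRep' argTy == [actualRep]

main :: IO ()
main = do
  -- With the cast kept, the argument type is concrete: the check succeeds.
  print (lintArg (ConcreteTy IntRep') IntRep')
  -- With the cast thrown away, the argument type is rep-polymorphic: panic.
  r <- try (evaluate (lintArg RepPolyTy IntRep'))
         :: IO (Either SomeException Bool)
  putStrLn (either (const "panic in typePrimRep") show r)
```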

At this point, in my understanding, the type of `eta_sFa` is simply wrong at the end of the STG pipeline, because the runtime rep of the type should match its implementation when compiled. It just so happens that this doesn't matter currently.

I'm not sure how best to go about this. @sheaf You worked a little bit on taming levity polymorphism. Maybe you have suggestions?