GHC issues (https://gitlab.haskell.org/ghc/ghc/-/issues)

https://gitlab.haskell.org/ghc/ghc/-/issues/16368
Boot GHC's libffi installation path leaks into stage1 build (Sebastian Graf, 2019-07-07)

Starting yesterday, I'm having problems building GHC from a `nix-shell --pure https://github.com/alpmestan/ghc.nix/archive/master.tar.gz`.
Build comes to a halt after/while configuring the RTS, because it can't find libffi.h in a global nix store installation that I can't find anywhere in my `env`. That nix-store path certainly exists and has an installation of `libffi`, but without an include path (probably a recent change; I haven't had problems before).
After some debugging, I figured out that this path leaks in through `ghc-cabal` crawling through the Package DB of the host GHC. These are the libraries it finds out about this way (dumped by inserting `putStrLn (unlines (forDeps Installed.includeDirs))` at line 388 of `utils/ghc-cabal/Main.hs`):
```
/nix/store/7874h075nf8yikvr47642xqrwqwyv99s-ghc-8.6.3/lib/ghc-8.6.3/base-4.12.0.0/include
/nix/store/5c9pfgazxid22ik3smh8zi805cp1i03y-gmp-6.1.2-dev/include
/nix/store/7874h075nf8yikvr47642xqrwqwyv99s-ghc-8.6.3/lib/ghc-8.6.3/integer-gmp-1.0.2.0/include
/nix/store/7874h075nf8yikvr47642xqrwqwyv99s-ghc-8.6.3/lib/ghc-8.6.3/include
/nix/store/karxq4hlfmfj0c3yk4wv5mfaz06p70k8-libffi-3.2.1/include
```
They are then passed on to `DEP_INCLUDE_DIRS_SINGLE_QUOTED` and wreak havoc from there.
Which of these paths are vital? The `libffi` path at least seems to shadow the local tarballs for me. This concerns Hadrian and the Make-based build system.
For completeness, this is the error I'm eventually seeing:
```
FFI.hsc:9:10: fatal error: ffi.h: No such file or directory
compilation terminated.
compiling _build/stage0/libraries/ghci/build/GHCi/FFI_hsc_make.c failed (exit code 1)
command was: /nix/store/8zfm4i1aw4c3l5n6ay311ds6l8vd9983-gcc-wrapper-7.4.0/bin/cc -c _build/stage0/libraries/ghci/build/GHCi/FFI_hsc_make.c -o _build/stage0/libraries/ghci/build/GHCi/FFI_hsc_make.o -I/nix/store/891h83mar65k138156v41kzryc9ij0v3-ghc-build-environment/include -I_build/generated -I_build/stage0/libraries/ghci/build -I/nix/store/891h83mar65k138156v41kzryc9ij0v3-ghc-build-environment/include -I/nix/store/7874h075nf8yikvr47642xqrwqwyv99s-ghc-8.6.3/lib/ghc-8.6.3/unix-2.7.2.2/include -I/nix/store/7874h075nf8yikvr47642xqrwqwyv99s-ghc-8.6.3/lib/ghc-8.6.3/time-1.8.0.2/include -I/nix/store/7874h075nf8yikvr47642xqrwqwyv99s-ghc-8.6.3/lib/ghc-8.6.3/bytestring-0.10.8.2/include -I/nix/store/7874h075nf8yikvr47642xqrwqwyv99s-ghc-8.6.3/lib/ghc-8.6.3/base-4.12.0.0/include -I/nix/store/5c9pfgazxid22ik3smh8zi805cp1i03y-gmp-6.1.2-dev/include -I/nix/store/7874h075nf8yikvr47642xqrwqwyv99s-ghc-8.6.3/lib/ghc-8.6.3/integer-gmp-1.0.2.0/include -I/nix/store/7874h075nf8yikvr47642xqrwqwyv99s-ghc-8.6.3/lib/ghc-8.6.3/include -I/nix/store/karxq4hlfmfj0c3yk4wv5mfaz06p70k8-libffi-3.2.1/include -Wall -Werror=unused-but-set-variable -Wno-error=inline -include _build/stage0/libraries/ghci/build/autogen/cabal_macros.h -Dx86_64_HOST_ARCH=1 -Dlinux_HOST_OS=1 -D__GLASGOW_HASKELL__=806
```

https://gitlab.haskell.org/ghc/ghc/-/issues/16335
CPR Analysis is too conservative for certain inductive cases (Sebastian Graf, 2022-09-27)
While investigating the generated Core for a simple toy benchmark, I came up with the following minimal example:
```hs
data Expr
= Lit Int
| Plus Expr Expr
eval :: Expr -> Int
eval (Lit n) = n
eval (Plus a b) = eval a + eval b
```
resulting in the following Core:
```
eval
= \ (ds_d112 :: Expr) ->
case ds_d112 of {
Lit n_aXf -> n_aXf;
Plus a_aXg b_aXh ->
case eval a_aXg of { GHC.Types.I# x_a1bK ->
case eval b_aXh of { GHC.Types.I# y_a1bO ->
GHC.Types.I# (GHC.Prim.+# x_a1bK y_a1bO)
}
}
}
```
Note that this needlessly unboxes and boxes primitive Ints. Lifting this is precisely the job of CPR, but `eval` doesn't exactly have the CPR property: the `Lit` case doesn't return a product. But we're punishing ourselves for the base case when *even the function itself* recurses multiple times!
The code resulting from WWing here wouldn't even look bad in the `Lit` case:
```
Foo.$weval
= \ (w_s1cn :: Expr) ->
case w_s1cn of {
      Lit dt_d11b -> case dt_d11b of { I# ds_abcd -> ds_abcd };
Plus a_aXf b_aXg ->
case Foo.$weval a_aXf of ww_s1cq { __DEFAULT ->
case Foo.$weval b_aXg of ww1_X1dh { __DEFAULT ->
GHC.Prim.+# ww_s1cq ww1_X1dh
}
}
}
eval
= \ (w_s1cn :: Expr) ->
case Foo.$weval w_s1cn of ww_s1cq { __DEFAULT ->
GHC.Types.I# ww_s1cq
}
```
Granted, this is bad for the case where there is no recursion happening and we need the result of `eval` boxed, but that's a small price to pay IMO.
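For illustration, the worker/wrapper split above can also be written at the source level. This is a sketch with invented names (`evalW` stands in for `$weval`), not actual GHC output:

```haskell
{-# LANGUAGE MagicHash #-}
module Main where

import GHC.Exts (Int (I#), Int#, (+#))

data Expr = Lit Int | Plus Expr Expr

-- Hypothetical worker: returns an unboxed Int#, so the recursive
-- Plus case allocates no intermediate I# boxes.
evalW :: Expr -> Int#
evalW (Lit (I# n)) = n
evalW (Plus a b)   = evalW a +# evalW b

-- Wrapper: re-boxes the result exactly once, at the end.
eval :: Expr -> Int
eval e = I# (evalW e)

main :: IO ()
main = print (eval (Plus (Lit 1) (Plus (Lit 2) (Lit 3))))
```

As in the Core above, only the `Lit` case pays for the unboxing; the `Plus` spine works entirely on `Int#`.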
I begin to think of CPR more as the "dual" of SpecConstr than of Strictness Analysis. Return-pattern specialisation, so to speak.
-----
I'll add some more examples here when such a return-pattern specialisation might be useful.
- Nested CPR may not unbox if termination of nested components doesn't permit it to. We leave a -17% improvement in `fish` on the table [here](https://gitlab.haskell.org/ghc/ghc/-/merge_requests/1866#note_272983).
- #4267
<details><summary>Trac metadata</summary>
| Trac field | Value |
| ---------------------- | ------------ |
| Version | 8.6.3 |
| Type | Task |
| TypeOfFailure | OtherFailure |
| Priority | normal |
| Resolution | Unresolved |
| Component | Compiler |
| Test case | |
| Differential revisions | |
| BlockedBy | |
| Related | |
| Blocking | |
| CC | |
| Operating system | |
| Architecture | |
</details>
https://gitlab.haskell.org/ghc/ghc/-/issues/16284
Abortion of fixed-point iteration in Demand Analyser discards sound results (Sebastian Graf, 2019-07-24)
Consider the following program:
```hs
-- Extracted from T4903
module Foo where
data Tree = Bin Tree Tree
tree :: Tree
tree = Bin tree tree
eq :: Tree -> Bool
eq (Bin l r) = eq l && eq r
```
`eq` amounts to the following Core:
```hs
eq
= \ (ds_d20O :: Tree) ->
case ds_d20O of { Bin l_aY8 r_aY9 ->
case eq l_aY8 of {
False -> False;
True -> eq r_aY9
}
}
```
It clearly diverges. That's also what pure strictness/termination/CPR analysis would find out. But because usage analysis can't find out in finite time (and that's fine) that `eq` uses its argument completely, we abort fixed-point iteration \[1\] and return `nopSig`, discarding useful and sound strictness information we found out along the way, like the fact that it diverges.
This isn't hard to fix: just track which parts of the signature were still unsound during abortion and zap only them. I'm only recording this for posterity and as an argument in a discussion I'll maybe start in a few months' time...
\[1\] This is the ascending chain of demands on `ds_d20O` during fixed-point iteration:
```
H
U(1*H,1*H)
U(1*U(1*H,1*H),1*U(1*H,1*H))
...
```
<details><summary>Trac metadata</summary>
| Trac field | Value |
| ---------------------- | ------------ |
| Version | 8.6.3 |
| Type | Task |
| TypeOfFailure | OtherFailure |
| Priority | normal |
| Resolution | Unresolved |
| Component | Compiler |
| Test case | |
| Differential revisions | |
| BlockedBy | |
| Related | |
| Blocking | |
| CC | |
| Operating system | |
| Architecture | |
</details>
https://gitlab.haskell.org/ghc/ghc/-/issues/16106
Remove PowerPC OS X (Darwin) support (Peter Trommler <ptrommler@acm.org>, 2022-04-06)

OS X for PowerPC is not supported by Apple anymore. GHC's support has most likely bit-rotted and we have no test machine. Let's go ahead and remove support for PowerPC Darwin.
The following components contain PowerPC Darwin specific code:
- native code generator
- RTS
- driver
- build system
- testsuite

https://gitlab.haskell.org/ghc/ghc/-/issues/16065
Don't do stack squeezing during context switches in single-threaded programs to guarantee determinism in allocations (Sebastian Graf, 2020-01-16)

As #4450 and #8611 show, stack squeezing in the RTS makes allocation numbers between two different runs of the same binary non-deterministic, because its effect depends on when context switches happen to occur.
A short-term solution might be to deactivate stack squeezing for vulnerable benchmarks with ~~`+RTS -Z`~~ `+RTS -V0` like in [D5460](https://phabricator.haskell.org/D5460), but IMO a more elegant solution would be to deactivate stack squeezing only in `threadPause` calls that happen due to context switches. Would you agree?

https://gitlab.haskell.org/ghc/ghc/-/issues/16010
Broken Link in Data.OldList (Sven Tennie, 2019-07-07)

In `Data.OldList`:
```haskell
infix 5 \\ -- comment to fool cpp: https://www.haskell.org/ghc/docs/latest/html/users_guide/options-phases.html#cpp-string-gaps
```
Looks like the URL changed to: https://downloads.haskell.org/~ghc/latest/docs/html/users_guide/phases.html#cpp-and-string-gaps
<details><summary>Trac metadata</summary>
| Trac field | Value |
| ---------------------- | -------------- |
| Version | 8.7 |
| Type | Task |
| TypeOfFailure | OtherFailure |
| Priority | lowest |
| Resolution | Unresolved |
| Component | libraries/base |
| Test case | |
| Differential revisions | |
| BlockedBy | |
| Related | |
| Blocking | |
| CC | |
| Operating system | |
| Architecture | |
</details>
https://gitlab.haskell.org/ghc/ghc/-/issues/16007
Implement `-Os` (Sebastian Graf, 2019-07-07)

Popular C compilers like GCC and clang allow optimising for binary size.
Since there are multiple ways in which GHC trades code size for faster programs, it might make sense to follow suit. This ticket is for tracking which optimisations/hacks in the compiler might be affected.
Feel free to add items to the following list:
- potentially huge impact due to specialisation, liberate case and inlining
- potential of -0.9% due to a second run of common block elimination (#14226)
- +0.1% increase in code size due to known calls instead of re-using generic apply thunks in #16005
- -0.1% due to `-fstg-lift-lams` (#9476)

https://gitlab.haskell.org/ghc/ghc/-/issues/16003
Average and maximum residency numbers in nofib broken (Sebastian Graf, 2019-07-07)

Measuring average and maximum residency in nofib is currently broken:
```
<<ghc: 55717832 bytes, 32 GCs (29 + 3), 0/0 avg/max bytes residency (0 samples), 58859672 bytes GC work, ...
```
I think that [this line](https://github.com/ghc/nofib/blob/f87d446b4e361cc82f219cf78917db9681af69b3/runstdtest/runstdtest.prl#L403) in runstdtest.prl was responsible for matching on `-S` output, but since [this commit](https://phabricator.haskell.org/rNOFIB4e12b05c76e98aa8e32c9f867519c8187e69e12b), we only do `-s` for performance reasons. There's [this bit of commented-out code](https://github.com/ghc/nofib/blob/f87d446b4e361cc82f219cf78917db9681af69b3/runstdtest/runstdtest.prl#L417-L419) that probably does what we want.
It seems that residency numbers were the only metrics depending on `-S`. I don't think there's a way to recover average residency with `-s` alone.
<details><summary>Trac metadata</summary>
| Trac field | Value |
| ---------------------- | --------------------- |
| Version | 8.6.2 |
| Type | Bug |
| TypeOfFailure | OtherFailure |
| Priority | normal |
| Resolution | Unresolved |
| Component | NoFib benchmark suite |
| Test case | |
| Differential revisions | |
| BlockedBy | |
| Related | #5793 |
| Blocking | |
| CC | |
| Operating system | |
| Architecture | |
</details>
https://gitlab.haskell.org/ghc/ghc/-/issues/15999
Stabilise nofib runtime measurements (Sebastian Graf, 2020-10-13)

With [D4989](https://phabricator.haskell.org/D4989) (cf. #15357) having hit `nofib` master, there are still many benchmarks that are unstable in one way or another. I identified three causes of instability in #5793.
With system overhead mostly out of the equation, there are still two related tasks left:
1. Identify benchmarks with GC wibbles. Plan: Look at how productivity rate changes while increasing gen 0 heap size. A GC-sensitive benchmark should have a non-monotonic or discontinuous productivity-rate-over-nursery-size curve. Then fix these by iterating `main` often enough for the curve to become smooth and monotone.
1. Now, all benchmarks should have monotonically decreasing instruction counts for increasing nursery sizes. If not, maybe there's another class of benchmarks I didn't identify yet in #5793. Of these benchmarks, there are a few, like `real/eff/CS`, that still have highly code-layout-sensitive runtimes. Fix these 'microbenchmarks' by hiding them behind a flag.

https://gitlab.haskell.org/ghc/ghc/-/issues/15976
Can't run nofib in parallel (Sebastian Graf, 2019-07-07)

I was under the impression that `make -j$n` within `nofib` would run benchmarks in parallel, but it doesn't. The following warning is indicative:
```
make[1]: warning: -jN forced in submake: disabling jobserver mode.
```
Whenever `make` is called recursively, any `-j` flags will lead to this warning and consequently disable parallelisation (which would lead to a number of jobs exponential in the depth of recursive calls).
Parallel benchmarks are useful to get deterministic metrics such as allocations or counted instructions really fast.
<details><summary>Trac metadata</summary>
| Trac field | Value |
| ---------------------- | --------------------- |
| Version | |
| Type | Bug |
| TypeOfFailure | OtherFailure |
| Priority | normal |
| Resolution | Unresolved |
| Component | NoFib benchmark suite |
| Test case | |
| Differential revisions | |
| BlockedBy | |
| Related | |
| Blocking | |
| CC | |
| Operating system | |
| Architecture | |
</details>
https://gitlab.haskell.org/ghc/ghc/-/issues/15931
MonoLocalBinds + MonomorphismRestriction prevents generalization for a top level definition (Varun Gandhi, 2019-07-07)

Consider the following code sample:
```haskell
{-# LANGUAGE MonoLocalBinds #-}
{-# LANGUAGE MonomorphismRestriction #-}
tmp = 10
picker x y = if tmp > 11 then x else y
main = do
print (picker "x" "y")
print (picker 10 11)
```
It fails with the misleading error message "\* No instance for (Num \[Char\]) arising from the literal \`10'...", which seems to stem from an interaction between MonoLocalBinds and MonomorphismRestriction (turn either off and the error goes away).
Should this be happening only for local bindings, or is it correct for this error to occur for top-level definitions too?
In either case, would it be possible to give a better error message here?
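For what it's worth, the usual workaround (sketched here; it sidesteps rather than fixes the confusing error) is to give `tmp` an explicit signature, after which `picker` generalises as expected:

```haskell
{-# LANGUAGE MonoLocalBinds #-}
{-# LANGUAGE MonomorphismRestriction #-}

-- An explicit signature pins tmp's type, so no unresolved Num/Ord
-- constraint is left over to restrict picker.
tmp :: Int
tmp = 10

-- Now picker generalises to forall a. a -> a -> a.
picker :: a -> a -> a
picker x y = if tmp > 11 then x else y

main :: IO ()
main = do
  print (picker "x" "y")  -- prints "y", since tmp > 11 is False
  print (picker 10 11)    -- prints 11
```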
<details><summary>Trac metadata</summary>
| Trac field | Value |
| ---------------------- | ------------ |
| Version | 8.6.2 |
| Type | Bug |
| TypeOfFailure | OtherFailure |
| Priority | low |
| Resolution | Unresolved |
| Component | Compiler |
| Test case | |
| Differential revisions | |
| BlockedBy | |
| Related | |
| Blocking | |
| CC | |
| Operating system | |
| Architecture | |
</details>
https://gitlab.haskell.org/ghc/ghc/-/issues/15918
mkCastTy sometimes drops insoluble (Type ~ Constraint) coercions (Icelandjack, 2023-06-07)

**EDIT: See [ticket:15918\#comment:164033](https://gitlab.haskell.org//ghc/ghc/issues/15918#note_164033) for the real cause and [ticket:15918\#comment:169361](https://gitlab.haskell.org//ghc/ghc/issues/15918#note_169361) for the current plan.**
See also:
- TyCoRep `Note [Respecting definitional equality]` and its supporting `Note [Non-trivial definitional equality]`
- #11715 Constraint vs \*
- #15799
- #13650
TL;DR: there's a bug in here that we don't know how to solve. But it's not biting us, and once we solve #11715 that may point the way.
---------------------------
Minimized from https://gist.github.com/Icelandjack/683bd4b79027695ffc31632645c9d58b. I don't expect `Build []` to kind-check, but it shouldn't crash.
```hs
{-# Language PolyKinds #-}
{-# Language TypeFamilies #-}
{-# Language ConstraintKinds #-}
{-# Language FlexibleContexts #-}
{-# Language QuantifiedConstraints #-}
{-# Language UndecidableInstances #-}
import Data.Kind
class Rev f where
rev :: f a
instance (forall xx. cls xx => Rev xx) => Rev (Build cls) where
rev = undefined
data Build :: ((k -> Type) -> Constraint) -> (k -> Type)
uu = rev :: Build [] a
```
gives a panic
```
$ ./ghc-stage2 --interactive -ignore-dot-ghci ~/hs/711_bug.hs
GHCi, version 8.7.20181029: http://www.haskell.org/ghc/ :? for help
[1 of 1] Compiling Main ( ~/hs/711_bug.hs, interpreted )
ghc-stage2: panic! (the 'impossible' happened)
(GHC version 8.7.20181029 for x86_64-unknown-linux):
ASSERT failed!
irred_a1zW :: [(xx_a1zV[sk:3] |> Sym {co_a1zu})]
Call stack:
CallStack (from HasCallStack):
callStackDoc, called at compiler/utils/Outputable.hs:1160:37 in ghc:Outputable
pprPanic, called at compiler/utils/Outputable.hs:1219:5 in ghc:Outputable
assertPprPanic, called at compiler/typecheck/TcType.hs:1826:53 in ghc:TcType
Please report this as a GHC bug: http://www.haskell.org/ghc/reportabug
>
```

https://gitlab.haskell.org/ghc/ghc/-/issues/15901
Assert and record that code generation requires distinct uniques for let-binders (Sebastian Graf, 2019-07-07)

Simon writes in [\#15754](https://ghc.haskell.org/trac/ghc/ticket/15754#comment:10):
Another invariant that the code generator needs is (I believe) that every let-binder has a distinct unique. We can't re-use the same unique, even in a different scope, let alone shadowing.
Why? Because (I believe) the code generator uses these uniques to generate unique top-level labels for the entry code and info table for the closure.
Now, it's probably the case that they are unique anyway; but we should either
1. Write this down as an invariant of STG, or
1. Establish this invariant in the immediately-before-codegen pass that gets the free vars right, or
1. Not assume it in the code generator

https://gitlab.haskell.org/ghc/ghc/-/issues/15893
View Patterns affect typechecking in an unpredictable manner (Varun Gandhi, 2019-07-07)

Copying the text from my [StackOverflow question](https://stackoverflow.com/questions/53294823/viewpatterns-affects-typechecking-in-an-unpredictable-manner).
```
{-# LANGUAGE ViewPatterns #-}
import Data.Text (Text)
import qualified Data.Text as T
import qualified Data.Vector.Unboxed as UV
import qualified Data.Vector.Generic as V
bar :: Int -> UV.Vector Char -> (Text, Text)
bar i v = (t_pre, t_post)
where
f = T.pack . V.toList
(f -> t_pre, f -> t_post) = V.splitAt i v
```
This gives an unexpected type error (tested with GHC 8.4.3), concerning ambiguous type variables and `f` being out of scope.
```
• Ambiguous type variable ‘v0’ arising from a use of ‘V.toList’
prevents the constraint ‘(V.Vector v0 Char)’ from being solved.
Relevant bindings include
f :: v0 Char -> Text (bound at Weird.hs:11:5)
Probable fix: use a type annotation to specify what ‘v0’ should be.
These potential instances exist:
instance V.Vector UV.Vector Char
-- Defined in ‘Data.Vector.Unboxed.Base’
...plus one instance involving out-of-scope types
instance primitive-0.6.3.0:Data.Primitive.Types.Prim a =>
V.Vector Data.Vector.Primitive.Vector a
-- Defined in ‘Data.Vector.Primitive’
• In the second argument of ‘(.)’, namely ‘V.toList’
In the expression: T.pack . V.toList
In an equation for ‘f’: f = T.pack . V.toList
|
11 | f = T.pack . V.toList
| ^^^^^^^^
Weird.hs:13:6: error:
Variable not in scope: f :: UV.Vector Char -> t
|
13 | (f -> t_pre, f -> t_post) = V.splitAt i v
| ^
Weird.hs:13:18: error:
Variable not in scope: f :: UV.Vector Char -> t1
|
13 | (f -> t_pre, f -> t_post) = V.splitAt i v
| ^
```
Some of the other answers on SO have done some digging (e.g. enabling FlexibleContexts / NoMonomorphismRestriction doesn't solve the issue). I haven't yet figured out how to minimize the example further to remove the dependency on Vector...
Is this a compiler bug or a documentation bug?
<details><summary>Trac metadata</summary>
| Trac field | Value |
| ---------------------- | ------------ |
| Version | 8.4.3 |
| Type | Bug |
| TypeOfFailure | OtherFailure |
| Priority | normal |
| Resolution | Unresolved |
| Component | Compiler |
| Test case | |
| Differential revisions | |
| BlockedBy | |
| Related | |
| Blocking | |
| CC | |
| Operating system | |
| Architecture | |
</details>
https://gitlab.haskell.org/ghc/ghc/-/issues/15690
Add Eq1/Ord1/Read1/Show1 instances to newtypes in Data.Semigroup (Bodigrim, updated 2023-04-01)

I'd like to have `Eq1`/`Ord1`/`Read1`/`Show1` instances for newtypes in `Data.Semigroup`, at least for `Min`, `Max` and `Option`. Is there a specific reason for their absence? If not, I'll be happy to prepare a PR.
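The requested instances are mechanical liftings. Here is a sketch of what they might look like, written against a hypothetical stand-in newtype `Min'` so it cannot clash with instances that newer `base` versions may already export for the real `Min`:

```hs
import Data.Functor.Classes (Eq1 (..), Ord1 (..))

-- Hypothetical stand-in for Data.Semigroup.Min, used only for this sketch.
newtype Min' a = Min' a

instance Eq1 Min' where
  liftEq eq (Min' x) (Min' y) = eq x y

instance Ord1 Min' where
  liftCompare cmp (Min' x) (Min' y) = cmp x y
```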
<details><summary>Trac metadata</summary>
| Trac field | Value |
| ---------------------- | -------------- |
| Version | 8.6.1 |
| Type | FeatureRequest |
| TypeOfFailure | OtherFailure |
| Priority | normal |
| Resolution | Unresolved |
| Component | libraries/base |
| Test case | |
| Differential revisions | |
| BlockedBy | |
| Related | |
| Blocking | |
| CC | |
| Operating system | |
| Architecture | |
</details>
https://gitlab.haskell.org/ghc/ghc/-/issues/15681
Take exhaustiveness checking into consideration when using MonadFailDesugaring (Dmitrii Kovanikov, updated 2024-03-27)

Consider the following code:
```hs
import Data.List.NonEmpty (NonEmpty (..))
foo :: Monad m => m (NonEmpty a) -> m a
foo m = do
(x :| _) <- m
pure x
```
It works completely fine on GHC 8.6.1 and doesn't require a `MonadFail` constraint, because `NonEmpty` has only a single constructor, so there are no other cases in the pattern match. However, if I rewrite this code using `-XPatternSynonyms` with a `{-# COMPLETE #-}` pragma, it doesn't work anymore.
```hs
{-# LANGUAGE PatternSynonyms #-}
import Data.List.NonEmpty (NonEmpty (..))
newtype Foo a = Foo (NonEmpty a)
pattern (:||) :: a -> [a] -> Foo a
pattern x :|| xs <- Foo (x :| xs)
{-# COMPLETE (:||) #-}
foo :: Monad m => m (Foo a) -> m a
foo m = do
(x :|| _) <- m
pure x
```
And I see the following error:
```
• Could not deduce (Control.Monad.Fail.MonadFail m)
arising from a do statement
with the failable pattern ‘(x :|| _)’
from the context: MonadFoo m
bound by the type signature for:
foo :: forall (m :: * -> *) a. MonadFoo m => m (Foo a) -> m a
at /Users/fenx/haskell/sandbox/Fail.hs:13:1-37
Possible fix:
add (Control.Monad.Fail.MonadFail m) to the context of
the type signature for:
foo :: forall (m :: * -> *) a. MonadFoo m => m (Foo a) -> m a
• In a stmt of a 'do' block: (x :|| _) <- m
In the expression:
do (x :|| _) <- m
pure x
In an equation for ‘foo’:
foo m
= do (x :|| _) <- m
pure x
|
15 | (x :|| _) <- m
| ^^^^^^^^^^^^^^
```
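Until the exhaustiveness checker is consulted here, one workaround (my sketch, not from the report) is to bind the result with a plain variable and match in a `case` expression; a variable pattern is irrefutable and `case` never desugars to `fail`, so no `MonadFail` constraint arises:

```hs
{-# LANGUAGE PatternSynonyms #-}

import Data.List.NonEmpty (NonEmpty (..))

newtype Foo a = Foo (NonEmpty a)

pattern (:||) :: a -> [a] -> Foo a
pattern x :|| xs <- Foo (x :| xs)
{-# COMPLETE (:||) #-}

-- The variable pattern is irrefutable, so the do-bind needs no MonadFail;
-- the COMPLETE pragma keeps the case free of exhaustiveness warnings.
foo :: Monad m => m (Foo a) -> m a
foo m = do
  fooVal <- m
  case fooVal of
    x :|| _ -> pure x
```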
<details><summary>Trac metadata</summary>
| Trac field | Value |
| ---------------------- | ------------ |
| Version | 8.6.1 |
| Type | Bug |
| TypeOfFailure | OtherFailure |
| Priority | normal |
| Resolution | Unresolved |
| Component | Compiler |
| Test case | |
| Differential revisions | |
| BlockedBy | |
| Related | |
| Blocking | |
| CC | |
| Operating system | |
| Architecture | |
</details>
https://gitlab.haskell.org/ghc/ghc/-/issues/15678
Provide the provenance of unification variables in error messages when possible (Ryan Scott, updated 2023-12-14)

Consider the following code:
```hs
module Foo where
x :: Int
x = const 42 _
```
When compiled, this gives the following suggestion:
```
$ /opt/ghc/8.6.1/bin/ghc Bug.hs
[1 of 1] Compiling Foo ( Bug.hs, Bug.o )
Bug.hs:4:14: error:
• Found hole: _ :: b0
Where: ‘b0’ is an ambiguous type variable
• In the second argument of ‘const’, namely ‘_’
In the expression: const 42 _
In an equation for ‘x’: x = const 42 _
• Relevant bindings include x :: Int (bound at Bug.hs:4:1)
Valid hole fits include
x :: Int (bound at Bug.hs:4:1)
otherwise :: Bool
(imported from ‘Prelude’ at Bug.hs:1:8-10
(and originally defined in ‘GHC.Base’))
False :: Bool
(imported from ‘Prelude’ at Bug.hs:1:8-10
(and originally defined in ‘GHC.Types’))
True :: Bool
(imported from ‘Prelude’ at Bug.hs:1:8-10
(and originally defined in ‘GHC.Types’))
lines :: String -> [String]
(imported from ‘Prelude’ at Bug.hs:1:8-10
(and originally defined in ‘base-4.12.0.0:Data.OldList’))
unlines :: [String] -> String
(imported from ‘Prelude’ at Bug.hs:1:8-10
(and originally defined in ‘base-4.12.0.0:Data.OldList’))
(Some hole fits suppressed; use -fmax-valid-hole-fits=N or -fno-max-valid-hole-fits)
|
4 | x = const 42 _
| ^
```
One thing that's rather ugly about this is the use of the type `b0`. What exactly *is* `b0` anyway? The only hint that the error message gives is that it's an ambiguous type variable. But that's not terribly helpful to figure out where `b0` arises from. Ambiguous type variables like this one arise quite frequently when writing Haskell code, and it can often take some sleuthing to figure out why they pop up.
simonpj had one suggestion for making ambiguous type variables less confusing: report their provenance whenever possible. There is one notable example of a situation where it's simple to explain from where exactly in the source code a unification variable originates: function applications. In particular, the program above applies the function `const 42` to `_`, which means that the type of `const 42` is instantiated to be `b0 -> Int`. Let's report this! Something like:
```
• Found hole: _ :: b0
Where: ‘b0’ is an ambiguous type variable
Arising from an application of
(const 42 :: b0 -> Int)
In the expression: const 42 _
```
This would go a long way to clearing up what GHC is thinking when it reports these ambiguous type variable errors. While we can't easily report the provenance of *every* ambiguous type variable, those arising from function applications are quite doable. We might be able to reuse the `CtOrigin` machinery (or take heavy inspiration from it) to accomplish this feat.
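For concreteness (my example, not from the ticket): `b0` is born when `const :: a -> b -> a` is instantiated at the application site, and supplying any concrete second argument fixes it by unification, which is exactly the origin information the proposal wants to surface:

```hs
-- Filling the hole pins down the unification variable:
x1 :: Int
x1 = const 42 ()       -- here b0 is unified with ()

x2 :: Int
x2 = const 42 "hole"   -- here b0 is unified with String
```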
<details><summary>Trac metadata</summary>
| Trac field | Value |
| ---------------------- | ----------------------- |
| Version | 8.6.1 |
| Type | Bug |
| TypeOfFailure | OtherFailure |
| Priority | normal |
| Resolution | Unresolved |
| Component | Compiler (Type checker) |
| Test case | |
| Differential revisions | |
| BlockedBy | |
| Related | |
| Blocking | |
| CC | Tritlo |
| Operating system | |
| Architecture | |
</details>
https://gitlab.haskell.org/ghc/ghc/-/issues/15677
Valid hole fits and GADT type variable names (Ryan Scott, updated 2021-09-07)

Consider the following code:
```hs
{-# LANGUAGE DataKinds #-}
{-# LANGUAGE GADTs #-}
{-# LANGUAGE KindSignatures #-}
{-# LANGUAGE ScopedTypeVariables #-}
{-# LANGUAGE TypeOperators #-}
module Foo where
import Data.Kind
data HList :: [Type] -> Type where
HNil :: HList '[]
HCons :: x -> HList xs -> HList (x:xs)
foo :: HList a -> HList a
foo HNil = HNil
foo (HCons (b :: bType) bs) = HCons _ bs
```
Here is the suggestion that the typed hole in `foo` provides:
```
$ /opt/ghc/8.6.1/bin/ghc Bug.hs
[1 of 1] Compiling Foo ( Bug.hs, Bug.o )
Bug.hs:16:37: error:
• Found hole: _ :: x
Where: ‘x’ is a rigid type variable bound by
a pattern with constructor:
HCons :: forall x (xs :: [*]). x -> HList xs -> HList (x : xs),
in an equation for ‘foo’
at Bug.hs:16:6-26
• In the first argument of ‘HCons’, namely ‘_’
In the expression: HCons _ bs
In an equation for ‘foo’: foo (HCons (b :: bType) bs) = HCons _ bs
• Relevant bindings include
bs :: HList xs (bound at Bug.hs:16:25)
b :: x (bound at Bug.hs:16:13)
foo :: HList a -> HList a (bound at Bug.hs:15:1)
Constraints include a ~ (x : xs) (from Bug.hs:16:6-26)
Valid hole fits include b :: x (bound at Bug.hs:16:13)
|
16 | foo (HCons (b :: bType) bs) = HCons _ bs
| ^
```
One thing immediately stands out here: the hole has type `x`, but `x` appears nowhere in the definition of `foo`! I had expected this suggestion to mention `bType`, since I went through the effort of declaring `b` to have that type through a pattern signature, but GHC instead uses types from the definition of the `HCons` constructor itself. This seems less than ideal, since one would expect GHC to only ever mention types that are lexically in scope at a particular definition site.
One thing which complicates this idea is that there can be multiple in-scope type variables that all refer to the same type. For instance, if I define this function:
```hs
bar :: HList a -> HList a -> HList a
bar HNil HNil = HNil
bar (HCons (b :: bType) bs) (HCons (c :: cType) cs) = HCons _ bs
```
What should the suggested type of the hole be: `bType`, or `cType`? Either choice is equally valid. After talking with Tritlo and simonpj about this, we came to the consensus that we should just pick one of the type variables to report at the top of the error message:
```
• Found hole: _ :: bType
```
And then later in the message, include any type variable synonyms that have been brought into scope (via pattern signatures or otherwise). I imagine this might look something like:
```
• Type variable synonyms include
`cType` equals `bType`
```
This is quite similar to an existing feature of valid hole fits where we report `Constraints include`. (Indeed, we briefly considered just reporting these type variable synonyms as explicit equality constraints, but doing so would be somewhat misleading, since that's not how pattern signatures actually work in practice.)
One implementation challenge is to figure out how to construct a mapping from `x` to `bType`. One place where inspiration can be drawn from is the `ATyVar` constructor of `TcTyThing`:
```hs
data TcTyThing
= ...
| ATyVar Name TcTyVar -- The type variable to which the lexically scoped type
-- variable is bound. We only need the Name
-- for error-message purposes; it is the corresponding
-- Name in the domain of the envt
```
`ATyVar` already stores a "reverse mapping" of sorts to give a more accurate `Name` in the event that it is pretty-printed, which is quite similar to what we need to do with `x` and `bType`.
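The mechanism under discussion, a pattern signature binding a lexical name for a constructor-bound (existential) type variable, can be seen in a self-contained form. This is my minimal illustration, not code from the ticket:

```hs
{-# LANGUAGE GADTs #-}
{-# LANGUAGE ScopedTypeVariables #-}

data Some where
  Some :: Show a => a -> Some

-- The pattern signature binds `aType` to the existential variable of
-- Some, just as `bType` aliases the GADT's `x` in the example above.
showSome :: Some -> String
showSome (Some (x :: aType)) = show (x :: aType)
```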
<details><summary>Trac metadata</summary>
| Trac field | Value |
| ---------------------- | ------------ |
| Version | 8.6.1 |
| Type | Bug |
| TypeOfFailure | OtherFailure |
| Priority | normal |
| Resolution | Unresolved |
| Component | Compiler |
| Test case | |
| Differential revisions | |
| BlockedBy | |
| Related | |
| Blocking | |
| CC | Tritlo |
| Operating system | |
| Architecture | |
</details>
https://gitlab.haskell.org/ghc/ghc/-/issues/15651
Check if some auto apply code is dead and remove if appropriate (Andreas Klebinger, updated 2021-09-07)

There is a whole family of `stg_ap_stk_*` and `stg_stk_save_*` functions generated in AutoApply.cmm which, as far as I can tell, are not used anywhere in the compiler and can likely be removed.
In particular this would involve:
- Stop the code in question from being generated. (utils/genapply)
- Make sure that doesn't break things. (Validate/Run the testsuite).
- Grep for stk in the compiler to make sure we don't build calls to these in rare circumstances with string concatenation for extra safety.
- Document the change in patch notes just in case.
<details><summary>Trac metadata</summary>
| Trac field | Value |
| ---------------------- | -------------- |
| Version | 8.4.3 |
| Type | Task |
| TypeOfFailure | OtherFailure |
| Priority | normal |
| Resolution | Unresolved |
| Component | Runtime System |
| Test case | |
| Differential revisions | |
| BlockedBy | |
| Related | |
| Blocking | |
| CC | |
| Operating system | |
| Architecture | |
</details>
https://gitlab.haskell.org/ghc/ghc/-/issues/15642
Improve the worst case performance of weak pointers (David Feuer, updated 2021-09-07)

Garbage collecting weak pointers involves repeatedly traversing the weak pointer list, checking which keys are still alive, and in response marking the relevant values and finalizers live. This works well in most cases, but if there are long chains of weak pointers, it's terrible. I wonder if we can do something about that without significantly hurting performance elsewhere. Here's the (probably naive) idea:
1. Maintain a hash table mapping weak pointer keys to lists of weak pointers. This replaces the current weak pointer list.
1. Use a bit in the info table pointer of every heap object to indicate whether that object is referenced from any `Weak#`. This is the weakest link: if we don't have a spare bit there, or don't want to use one, then I think this whole idea is sunk.
When creating a weak pointer, set the appropriate bit in the key and insert the key and pointer into the hash table. When performing early finalization, clear the bit in the key if no other `Weak#` points to it.
When performing garbage collection: check the bit in each object as it is marked. If the bit is set, move the key and its attached `Weak#`s from the hash table into a new hash table (or an association list that gets turned into a hash table after evacuation or whatever), and mark all the linked weak pointer targets and finalizers live. In the end, the old hash table contains only unreachable objects. Now mark those objects live (for finalization and in case of resurrection), and queue up the finalizers.
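The pathological case can be written down directly with `System.Mem.Weak` (my illustration, not from the ticket): each weak pointer's value is the key of the next link, so a single traversal of the weak-pointer list can prove at most one additional key alive, and n links need on the order of n passes.

```hs
import Data.IORef (IORef, newIORef, readIORef)
import System.Mem.Weak (Weak, deRefWeak, mkWeak)

-- Build a chain of n weak pointers in which link i's value (an IORef)
-- becomes the key of link i+1, so later keys are reachable only through
-- earlier weak pointers.
chain :: Int -> IORef Int -> IO [Weak (IORef Int)]
chain 0 _   = pure []
chain n key = do
  next <- newIORef n
  w <- mkWeak key next Nothing  -- `next` stays alive only while `key` does
  ws <- chain (n - 1) next
  pure (w : ws)
```

While the root key is reachable every link can still be dereferenced; once the root dies, the collector has to discover the links' deaths one traversal at a time.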
<details><summary>Trac metadata</summary>
| Trac field | Value |
| ---------------------- | -------------- |
| Version | 8.6.1-beta1 |
| Type | FeatureRequest |
| TypeOfFailure | OtherFailure |
| Priority | normal |
| Resolution | Unresolved |
| Component | Runtime System |
| Test case | |
| Differential revisions | |
| BlockedBy | |
| Related | |
| Blocking | |
| CC | simonmar |
| Operating system | |
| Architecture | |
</details>