# GHC issues
https://gitlab.haskell.org/ghc/ghc/-/issues

# Issue #23494: "Deriving" plugins
https://gitlab.haskell.org/ghc/ghc/-/issues/23494 · 2023-06-13 · sheaf (sam.derbyshire@gmail.com)

I think it would be nice if the `deriving` infrastructure were extensible with plugins. This will probably need a GHC proposal in the end, but first I'd like to figure out how we would go about it.

I imagine it as a new deriving strategy which specifies a plugin name (a module name):
```haskell
data P a = P3 a a a
  deriving (Semigroup, Monoid)
    plugin MyPlugin
```
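For comparison, `DerivingVia` is the closest existing mechanism: it also names a "recipe" in the deriving clause. A standard, runnable example (plain GHC, no plugin involved):

```haskell
{-# LANGUAGE DerivingVia #-}
module Main where

import Data.Monoid (Sum(..))

-- Reuse the Semigroup/Monoid instances of 'Sum Int' for a newtype,
-- by naming the via-type in the deriving clause.
newtype Count = Count Int
  deriving stock (Eq, Show)
  deriving (Semigroup, Monoid) via Sum Int

main :: IO ()
main = print (Count 1 <> Count 2 <> mempty)
```

The proposed `plugin` strategy would sit alongside `stock`, `newtype`, `anyclass` and `via`, but delegate the instance generation to plugin code instead of a via-type.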
In [Sam's plugin](https://github.com/sheaf/sams-plugin), I hijacked the `deriving via` mechanism to emulate this within a typechecking plugin:
```haskell
{-# OPTIONS_GHC -fplugin=Sam'sPlugin #-}
import Sam'sOptics
data D a = MkD { fld1 :: a, fld2 :: Int# }
  deriving Sam'sOptics via Sam'sPlugin
```
This would create a collection of `HasField` instances:
```haskell
instance HasField "fld1" (D a) a
instance HasField "fld2" (D a) Int#
```
(for a custom `HasField` class which is representation-polymorphic).
The problem with this approach is that a typechecking plugin can't add instances to the instance environment without a lot of hacks. If we instead integrate it with `deriving`, then plugin-written instances will work as usual, without special logic.
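To make the payoff concrete, here is roughly what the generated instances give you, written by hand against a simplified, lifted stand-in for the custom class (the names `HasField'` and `getField'` are illustrative, not Sam's actual API, and representation polymorphism is omitted):

```haskell
{-# LANGUAGE AllowAmbiguousTypes #-}
{-# LANGUAGE DataKinds #-}
{-# LANGUAGE FlexibleInstances #-}
{-# LANGUAGE FunctionalDependencies #-}
{-# LANGUAGE KindSignatures #-}
{-# LANGUAGE TypeApplications #-}
module Main where

import GHC.TypeLits (Symbol)

-- Simplified stand-in for the custom HasField class (the real one is
-- representation-polymorphic so it can also handle fields like Int#).
class HasField' (x :: Symbol) r a | x r -> a where
  getField' :: r -> a

data D a = MkD { fld1 :: a, fld2 :: Int }

-- The instances the plugin would generate, one per record field:
instance HasField' "fld1" (D a) a   where getField' = fld1
instance HasField' "fld2" (D a) Int where getField' = fld2

main :: IO ()
main = print (getField' @"fld1" (MkD 'x' 3), getField' @"fld2" (MkD 'x' 3))
```

Writing these by hand scales badly with the number of fields, which is exactly what the single plugin-backed deriving clause would avoid.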
One difficulty in the above example, though I believe it is also a key usability feature, is that we wrote a single deriving clause but it derived a bunch of instances. It would be quite onerous, for a record with many fields, to have to derive all of the `HasField` instances individually.

---

# Issue #22401: Plugin-Swappable Parser
https://gitlab.haskell.org/ghc/ghc/-/issues/22401 · 2022-11-14 · Kai-Oliver Prott

*I originally started writing this as a GHC proposal, but apparently Plugin-API changes do not need one. This is a work-in-progress. Feedback welcome.*

*Update 1:* Added 2 parameters to the function in `ReplacementParser`.

# Plugin-Swappable Parser
For many people in the community, GHC plugins are a quick and easy way to extend the compiler with additional features or to prototype implementations.
While plugins allow their author to transform or analyze a module as well as changing the (type) constraint solver, it is currently not possible to change the static syntax accepted by GHC.
I propose to make the parser implementation replaceable using a GHC plugin. This way, a plugin author can not only change the semantics, but also the accepted syntax for a module.
## Motivation
At the moment, the only way to use different syntax in a Haskell module is to use quasiquotation.
However, this leads to some ugly code without any support from IDEs.
In the case that someone just wants to alter GHC's accepted syntax in a tiny way, this is unsatisfactory.
By replacing the parser using a plugin, most syntax highlighters will keep working even when a definition contains a new keyword.
Other features of an IDE might also continue working, such as hovering over a definition to see its type.
This is not possible if the full definition was created by a quasi quoter.
I can see different use cases for such a parser plugin.
1. Using GHC to implement Haskell-adjacent languages. In my case, I am trying to implement Curry using GHC; Curry uses Haskell's syntax with a few modifications (a new keyword, for example). Maybe the people working on Mu (the Haskell dialect from Standard Chartered) would be interested in this as well.
2. Prototyping proposed syntactic changes to GHC. Many GHC proposals introduce new syntax. I am not very familiar with the process of implementing a proposal, but I could imagine that it would be useful to prototype an implementation to test syntactic changes without re-building the compiler every time.
3. Restricting the syntax accepted by GHC for teaching.
Racket offers [different languages specifically for teaching](https://docs.racket-lang.org/drracket/htdp-langs.html).
These restrict the language to certain features, depending on the level, to make it easier to use for beginners.
At the moment, Haskell can be overwhelming for beginner students.
Having a way to restrict the language can help with that.
That said, the current plugin infrastructure might already be enough to implement a purely restrictive change.
## Proposed Change Specification
I propose to augment the `Plugin` data type as follows:
```haskell
data Plugin = Plugin
  { -- ...keep all current fields...
    parserPlugin :: ParserPlugin
  }

type ParserPlugin = [CommandLineOption] -> ParserSpecification

data ParserSpecification
  = DefaultParser
    -- ^ Do not replace the parser.
  | ReplacementParser (DynFlags -> RealSrcLoc -> StringBuffer -> ParseResult ParsedModule)
    -- ^ Replace the parser with the given function.
```
Here, `DynFlags`, `RealSrcLoc`, `StringBuffer` and `ParseResult` are currently used by GHC's parser and my proposal does not affect them.
We need all of these parameters to at least be able to use GHC's own parser in a plugin.
As long as none of the plugins active for a module uses the `ReplacementParser` constructor, nothing special happens.
When more than one plugin specifies a `ReplacementParser`, compilation is aborted with an error.
We can only replace the parser with a single definition, so having two plugins specifying different parsers is a problem and we should abort.
In the case that exactly one plugin specifies a new parser to be used, the module is parsed according to the function given in the `ReplacementParser` constructor.
Note that the new parser still has to use GHC's current AST representation for its output.
Thus, any plugin has to transform its own syntax to fit into the current AST.
When a GHC (error) message refers back to these transformed sections, e.g. in a type error, it should continue using the existing `SrcLoc` (and friends) annotations.
Any plugin author can decide to give meaningful source locations and fix any error messages via different mechanisms, if desired.
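To make the proposed API's shape concrete, here is a self-contained sketch in which `DynFlags`, `RealSrcLoc`, `StringBuffer`, `ParseResult` and `ParsedModule` are stubbed out with placeholders (the real ones are GHC's types), showing how a plugin would choose between the two constructors:

```haskell
module Main where

-- Placeholder stand-ins for the GHC types named in the proposal.
type CommandLineOption = String
data DynFlags     = DynFlags
data RealSrcLoc   = RealSrcLoc
type StringBuffer = String           -- the real one is a byte buffer
type ParseResult a = Either String a -- the real one carries parser state
type ParsedModule  = [String]        -- the real one is GHC's AST

type ParserPlugin = [CommandLineOption] -> ParserSpecification

data ParserSpecification
  = DefaultParser
  | ReplacementParser (DynFlags -> RealSrcLoc -> StringBuffer -> ParseResult ParsedModule)

-- A plugin that only replaces the parser when asked to via a plugin option.
myParserPlugin :: ParserPlugin
myParserPlugin opts
  | "use-toy-parser" `elem` opts = ReplacementParser (\_ _ buf -> Right (words buf))
  | otherwise                    = DefaultParser

main :: IO ()
main = case myParserPlugin ["use-toy-parser"] of
  ReplacementParser p -> print (p DynFlags RealSrcLoc "module M where")
  DefaultParser       -> putStrLn "default"
```

The command-line options let a single plugin package offer several dialects, while `DefaultParser` keeps the common case (no syntax change) zero-cost.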
## Effect and Interactions
The effect of a replaceable parser is that it allows a plugin author to modify the syntax accepted by GHC.
This increases the flexibility and usability of GHC as a compiler tool.
I am not involved in GHC development, so it is hard for me to imagine any interactions with this feature.
However, I will extend this section with any interactions from the comments.
## Costs and Drawbacks
I estimate that the cost of this change is relatively low, since it is only concerned with a small part of the compiler.
The change is also backwards compatible and should have a low maintenance cost.
Drawbacks:
- Completely replacing the parser does not compose when using multiple parser plugins as mentioned before. Adopting this solution makes it harder to reach a composable solution without being backwards-incompatible. However, I do not believe that there will be a better solution in the foreseeable future.
- *To be extended*
## Alternatives
I can imagine that a parser based on parser combinators (i.e. not using Happy) is easier to extend with a plugin. Thus, one alternative could be to replace the current parser implementation with an implementation based on parser combinators. At each syntax rule, the parser could then look at all active parser plugins and try any parsing combinators as an alternative.
Such a solution would compose better with multiple plugins, but it would incur a significant implementation and maintenance burden for a small benefit.
Additionally, the performance of the parser will probably suffer as well.
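The composability argument can be illustrated with ordinary parser combinators from `base` (a toy keyword grammar, not GHC's actual parser):

```haskell
module Main where

import Text.ParserCombinators.ReadP (ReadP, choice, readP_to_S, string)

-- A base set of keywords, with a hook where each active plugin could
-- contribute extra alternatives at this syntax rule.
keyword :: [ReadP String] -> ReadP String
keyword pluginAlts = choice (map string ["let", "case", "of"] ++ pluginAlts)

-- A "plugin" contributing a Curry-style keyword.
freeKeyword :: ReadP String
freeKeyword = string "free"

-- True iff the parser accepts the whole input.
parses :: ReadP String -> String -> Bool
parses p s = not (null [ r | (r, "") <- readP_to_S p s ])

main :: IO ()
main = print ( parses (keyword [])            "free"
             , parses (keyword [freeKeyword]) "free" )
```

Because combinators are ordinary values, multiple plugins could each add alternatives at the same rule; a Happy-generated parser has no comparable extension point.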
## Unresolved Questions
- Should a plugin be allowed to change the syntax of the module header and pragmas at the top of a module? Often, a plugin is activated per-module using `OPTIONS_GHC`. Specifying a plugin that alters the parsing of the header in the header itself is pretty weird.
- Should the GHC AST be adapted to accommodate plugin-parsed syntax by adding a constructor that behaves similarly to the `HsExpansion` data type?
E.g.
```haskell
data HsPluginExpansion a
  = HsPluginExpansion String a  -- ^ original source text plus generated AST
```
This makes the changes more involved and the new data type would need to be added in many places in the AST.
I do not really see the benefit, since GHC currently uses the original source code text to show the source in error messages (if I remember correctly).
## Implementation Plan
I could implement this, but would probably need some guidance.

---

# Issue #20303: Flags to produce full interface files and stop
https://gitlab.haskell.org/ghc/ghc/-/issues/20303 · 2021-09-09 · Ziyang Liu (unsafefixio@gmail.com)
## Motivation
`-fno-code -fwrite-interface` produces the interface files and stops, which is convenient for typechecking. The interface files produced with these flags do not contain unfoldings, which are not needed for typechecking. But when writing GHC Core plugins, we do need the unfoldings (so that the plugin has something to work with), yet often don't need anything beyond the interface files.
## Proposal
I'm not aware of any flag or combination of flags that behaves like `-fno-code -fwrite-interface` but includes unfoldings in the interface files. So the proposal is to add such flag(s).

---

# Issue #18147: spurious output with BuiltinRule
https://gitlab.haskell.org/ghc/ghc/-/issues/18147 · 2022-02-10 · Greg Pfeil

## Summary
Compiling with `-fplugin=Foo -fno-omit-interface-pragmas` causes GHC to output `toHsRule: builtin <ru_name>` if `Foo` adds a `BuiltinRule`.
The output comes from https://gitlab.haskell.org/ghc/ghc/-/blob/master/compiler/GHC/Iface/Make.hs#L696
## Steps to reproduce
Hopefully the summary suffices.
## Expected behavior
Not to output `toHsRule: builtin ...`. If the message indicates a potential issue with generating interface pragmas with `BuiltinRule`s, then the message should at least be more descriptive.
## Environment
* GHC version used: 8.8.3 (this still persists in master, though)

---

# Issue #15745: Panicking typechecker plugins
https://gitlab.haskell.org/ghc/ghc/-/issues/15745 · 2019-07-07 · Phil de Joux

If I use a typechecker plugin that fails, then GHC panics and I'm asked to report a bug with GHC:
```
ghc: panic! (the 'impossible' happened)
(GHC version 8.2.2 for x86_64-apple-darwin):
Prelude.undefined
CallStack (from HasCallStack):
error, called at libraries/base/GHC/Err.hs:79:14 in base:GHC.Err
undefined, called at plugin/Undefined/Solve/Plugin.hs:14:39 in
undefined-solve-plugin-1.0.0.0-56evBabJYBHHTUlrE3HO5m:Undefined.Solve.Plugin
Please report this as a GHC bug: http://www.haskell.org/ghc/reportabug
```
Could we please say that it is the typechecker plugin that is panicking, and ask for a bug report against the faulty plugin, giving the plugin's issue-tracker URL if we know it?
The following [undefined plugins](https://github.com/BlockScope/undefined-plugin) fail in the init, solve and stop functions, respectively.
```haskell
module Undefined.Init.Plugin (plugin) where

import Plugins (Plugin(..), tcPlugin, defaultPlugin)
import TcRnTypes (TcPluginM, TcPluginResult(..), Ct, TcPlugin(..))
import GHC.TcPluginM.Extra (tracePlugin)

plugin :: Plugin
plugin = defaultPlugin { tcPlugin = const $ Just undefinedPlugin }

undefinedPlugin :: TcPlugin
undefinedPlugin = tracePlugin "undefined-init-plugin" $
  TcPlugin
    { tcPluginInit  = undefined
    , tcPluginSolve = \_ _ _ _ -> return $ TcPluginOk [] []
    , tcPluginStop  = const $ return ()
    }
```
```haskell
module Undefined.Solve.Plugin (plugin) where

import Plugins (Plugin(..), tcPlugin, defaultPlugin)
import TcRnTypes (TcPluginM, TcPluginResult, Ct, TcPlugin(..))
import GHC.TcPluginM.Extra (tracePlugin)

plugin :: Plugin
plugin = defaultPlugin { tcPlugin = const $ Just undefinedPlugin }

undefinedPlugin :: TcPlugin
undefinedPlugin = tracePlugin "undefined-solve-plugin" $
  TcPlugin
    { tcPluginInit  = return ()
    , tcPluginSolve = \_ _ _ _ -> undefined
    , tcPluginStop  = const $ return ()
    }
```
```haskell
module Undefined.Stop.Plugin (plugin) where

import Plugins (Plugin(..), tcPlugin, defaultPlugin)
import TcRnTypes (TcPluginM, TcPluginResult(..), Ct, TcPlugin(..))
import GHC.TcPluginM.Extra (tracePlugin)

plugin :: Plugin
plugin = defaultPlugin { tcPlugin = const $ Just undefinedPlugin }

undefinedPlugin :: TcPlugin
undefinedPlugin = tracePlugin "undefined-stop-plugin" $
  TcPlugin
    { tcPluginInit  = return ()
    , tcPluginSolve = \_ _ _ _ -> return $ TcPluginOk [] []
    , tcPluginStop  = const $ undefined
    }
```
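The requested behaviour amounts to wrapping every plugin callback so that exceptions are attributed to the plugin. A sketch in plain `IO` (GHC's plugin monads are not `IO`, and `withPluginContext` is a made-up name, so this is only illustrative):

```haskell
{-# LANGUAGE ScopedTypeVariables #-}
module Main where

import Control.Exception (SomeException, evaluate, try)

-- Run a plugin-supplied action; if it throws (e.g. because the plugin
-- used 'undefined'), blame the plugin rather than printing a GHC panic.
withPluginContext :: String -> IO a -> IO (Either String a)
withPluginContext name act = do
  r <- try act
  pure $ case r of
    Left (e :: SomeException) ->
      Left ( "Plugin '" ++ name ++ "' failed: " ++ show e
           ++ "\nPlease report this to the plugin's maintainers, not GHC." )
    Right a -> Right a

main :: IO ()
main = do
  bad  <- withPluginContext "undefined-solve-plugin" (evaluate (undefined :: Int))
  good <- withPluginContext "ok-plugin" (pure (42 :: Int))
  print (either (const "blamed plugin") show bad, good)
```

GHC would still need to re-raise the failure, but the message would then point at the plugin instead of asking for a GHC bug report.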
<details><summary>Trac metadata</summary>
| Trac field | Value |
| ---------------------- | ----------------------- |
| Version | 8.2.2 |
| Type | FeatureRequest |
| TypeOfFailure | OtherFailure |
| Priority | normal |
| Resolution | Unresolved |
| Component | Compiler (Type checker) |
| Test case | |
| Differential revisions | |
| BlockedBy | |
| Related | |
| Blocking | |
| CC | |
| Operating system | |
| Architecture | |
</details>
---

# Issue #24486: ignored unrecognised input when compiling with a plugin
https://gitlab.haskell.org/ghc/ghc/-/issues/24486 · 2024-03-05 · Marcin Szamotulski

## Summary
When I compile a package with `ghc-tags-plugin` enabled, e.g. passing the following options (one will need to modify it depending on your system):
```
package io-classes
  ghc-options: -package-db=/home/coot/.local/state/cabal/store/ghc-9.6.4/package.db/
               -plugin-package=ghc-tags-plugin-0.6.1.0
               -fplugin=Plugin.GhcTags
               -fplugin-opt=Plugin.GhcTags:../tags
```
I am getting the following warnings:
```
Warning: ignoring unrecognised input `/home/coot/clients/iog/io-sim/dist-newstyle/build/x86_64-linux/ghc-9.6.4/io-classes-1.4.1.0/build/Control/Concurrent/Class/MonadMVar.dyn_o'
...
```
It doesn't really matter which package I try to compile. I haven't tried any other plugin, but I don't get that warning when I disable the `ghc-tags-plugin`.
## Steps to reproduce
Given above. To install `ghc-tags-plugin`, follow the instructions in the [readme](https://github.com/coot/ghc-tags-plugin/).
## Expected behavior
No warning.
## Environment
* GHC version used: `ghc-9.6.4`
Optional:
* Operating System: Linux
* System Architecture: x86_64

(Milestone: 9.10.2 · Assignee: Zubin)

---

# Issue #24402: GHC Plugin hook for end of the compilation run
https://gitlab.haskell.org/ghc/ghc/-/issues/24402 · 2024-02-07 · parsonsmatt
## Motivation
I am working on a GHC plugin [`opentelemetry-plugin`](https://github.com/MercuryTechnologies/opentelemetry-plugin) that is used to report module-level build timing to an OpenTelemetry provider, like Honeycomb. I would like to record the number of modules that GHC compiles in a given build. However, it is impossible to determine when compilation ends, and using `addFinalizer` does not work because the program exits and GC does not run.
If the GHC plugin interface allowed me to install a hook for plugin cleanup, that would allow me to close the span when compilation is done.
In general, this would give *all* plugins the ability to clean up any resources they need to release once compilation is done.
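The hook could look something like the following self-contained sketch, in which `Plugin`, `driverExitHook` and `CompileOutcome` are stubs standing in for (hypothetical extensions of) GHC's types:

```haskell
module Main where

import Data.IORef (modifyIORef', newIORef, readIORef)

-- Stand-in for the suggested "result of compilation" argument
-- (something like Maybe (NonEmpty Error) in the real proposal).
data CompileOutcome = Success | Failed [String]

-- Hypothetical extension of GHC's Plugin record: a hook the driver
-- runs exactly once, after the last module has been compiled.
newtype Plugin = Plugin { driverExitHook :: CompileOutcome -> IO () }

main :: IO ()
main = do
  modules <- newIORef (0 :: Int)
  let telemetryPlugin = Plugin
        { driverExitHook = \_outcome -> do
            n <- readIORef modules
            putStrLn ("closing span: compiled " ++ show n ++ " modules") }
  -- Simulate compiling three modules, then the driver fires the hook.
  mapM_ (\_ -> modifyIORef' modules (+ 1)) ["A", "B", "C"]
  driverExitHook telemetryPlugin Success
```

Unlike a GC finalizer, the driver would call this hook deterministically before exiting, so the span is always closed.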
## Proposal
Extending the GHC plugin type with a hook that runs directly after compilation is done. This could be an `IO ()` for my own needs, though it may be nice to have an input parameter representing the result of compilation (possibly a `Maybe (NonEmpty Error)`?).

---

# Issue #24395: Profiled GHC flavour segfaults when building a target project with TH or GHC plugin
https://gitlab.haskell.org/ghc/ghc/-/issues/24395 · 2024-03-18 · Ian-Woo Kim

## Summary
When I use GHC built with the `profiled_ghc` flavour, GHC segfaults when it compiles target code that uses TH or a GHC plugin (with profiling on for the target project).
The segfault happens most of the time on my machine, but sometimes the build passes, so the error occurs nondeterministically.
Less often, it instead leads to a GHC internal error indicating that this is related to code relocation during linking:
```
ghc: internal error: Relocation out of range for SUBTRACTOR
```
By the way, this problem occurs when building almost any of the many dependency packages that use TH or GHC plugins in a big private project with `library_profiling: True`.
## Steps to reproduce
I made a sample project here:
https://gist.github.com/wavewave/97c802cb80aecf8146a7d5af21cf4bf1
and one can reproduce it by the following build command:
```
$ cabal build --project-file=ghcHEAD-cabal-profiledGHC.project
Build profile: -w ghc-9.9.20240115 -O0
In order, the following will be built (use -v for more details):
- myfailure-2.11.0 (lib) (file MyFailure.hs changed)
Preprocessing library for myfailure-2.11.0..
Building library for myfailure-2.11.0..
<no location info>: warning: [GHC-42258] [-Wunused-packages]
The following packages were specified via -package or -package-id flags,
but were not needed for compilation:
- bytestring-0.12.0.2 (exposed by flag -package-id bytestring-0.12.0.2-inplace)
- http-conduit-2.3.8.3 (exposed by flag -package-id http-cndt-2.3.8.3-68b1c994)
- microlens-0.4.13.1 (exposed by flag -package-id mcrlns-0.4.13.1-4239c11e)
- aeson-2.2.0.0 (exposed by flag -package-id sn-2.2.0.0-921d37e9)
[1 of 1] Compiling MyFailure ( MyFailure.hs, /Users/ianwookim/tmp/profiledGHC/hoauth2-2.11.0/dist-newstyle/build/aarch64-osx/ghc-9.9.20240115/myfailure-2.11.0/noopt/build/MyFailure.o ) [Source file changed]
Error: cabal: Failed to build myfailure-2.11.0. The build process segfaulted
(i.e. SIGSEGV).
```
Note that I intentionally added redundant dependencies. When I remove one of those dependencies, the probability of a segfault is reduced (probably another indicator that this bug is related to relocation offsets).
## Expected behavior
A segfault should never happen. If something is wrong (for example, the linker couldn't find some symbols), GHC should show that error.
## Environment
* GHC version used: GHC master, 9.9.20240115 version (`da908790670cd5161cffc3d83757fb8048bdafda`)
Optional:
* Operating System: macOS Sonoma 14.2.1
* System Architecture: Apple M1 Max (AArch64)

(Assignee: Zubin)

---

# Issue #23932: Defaulting plugins are not run at the right time
https://gitlab.haskell.org/ghc/ghc/-/issues/23932 · 2023-09-12 · sheaf (sam.derbyshire@gmail.com)

It seems to me that defaulting plugins are not run at the right point in time, because they are handed a whole implication tree. This means that defaulting plugins only get to see things "after the fact", e.g. after we have already decided on quantification for top-level bindings.
## Problem 1: quantification
Let me give an example with a partial type signature for clarity (similar examples can be written without any type signature at all):
```haskell
{-# LANGUAGE PartialTypeSignatures, NamedWildCards #-}
fun1 :: _a -> Bool
fun1 x = x == x
```
Here GHC will decide to quantify `_a` into a skolem, and then the defaulting plugin will see an implication with `a[sk]`:
```
WC {wc_impl =
Implic {
TcLevel = 1
Skolems = a_a599[sk:1]
Given-eqs = NoGivenEqs
Status = Unsolved
Given =
Wanted =
WC {wc_simple = [W] $dEq_a59a {0}:: Eq a_a599[sk:1] (CDictCan)
wc_errors = {_a_a4Xk:a_a599[sk:1]}}
Binds = EvBindsVar<a59e>
the inferred type of fun1 :: a_a599[sk:1] -> Bool }}
```
What if the plugin wanted to default `_a` in order to solve `Eq _a`? It has no means of doing so; in fact defaulting plugins will never see any metavariables arising from such a program.
## Problem 2: nested implications
There is no way for a plugin to do defaulting inside an implication with Given constraints, as the `solveSimpleWanteds` run happens outside of the implication when the Givens are no longer available.
A slightly contrived example:
```haskell
import Data.Kind (Constraint, Type)

type C :: Type -> Type -> Constraint
class C a b where
  meth :: a -> b -> ()

type D :: Type -> Constraint
class D b where { }

instance D b => C Int b where { meth _ _ = () }

data Dict ct where
  Dict :: ct => Dict ct

mk_dict :: b -> (b, Dict (D b))
mk_dict = undefined

baz = case mk_dict undefined of
  (b, Dict) -> meth undefined b
```
Here we will end up with an implication of the form
```
[G] D beta => [W] C alpha beta
```
However, if we suggest defaulting `alpha := Int`, we end up in a solver run WITHOUT the Given `[G] D beta`, because it comes from an inner implication. Then, when using the instance `C Int b`, we end up with `[W] D beta`, which we can't solve, because we don't have the Given around anymore. This then causes us to abort the defaulting attempt (due to unsolved residual Wanted constraints).

---

# Issue #22122: Records with field names sharing the same underlying string are broken
https://gitlab.haskell.org/ghc/ghc/-/issues/22122 · 2023-05-02 · sheaf (sam.derbyshire@gmail.com)

Declaring records where the field `Name`s are different, but the `Name`s have the same underlying `FastString`, causes a myriad of issues.
Test case:
```haskell
{-# LANGUAGE DeriveTraversable #-}
{-# LANGUAGE DerivingStrategies #-}
{-# LANGUAGE RecordWildCards #-}
{-# LANGUAGE TemplateHaskell #-}
module T22122_aux where

import Language.Haskell.TH.Syntax
  ( Name, Type(ConT), Lit(CharL, StringL)
  , Dec(DataD, FunD), Con(RecC), Exp(LitE, VarE, RecUpdE), Pat(VarP)
  , Clause(Clause), Body(NormalB)
  , Bang(..), SourceUnpackedness(..), SourceStrictness(..)
  , newNameIO )
import System.IO.Unsafe
  ( unsafePerformIO )

data Names a
  = Names { d1_name, d2_name
          , mkd1_name, mkd2a_name, mkd2b_name
          , d1_fld1_name, d1_fld2_name, d2_fld1_name, d2_fld2_name
          , upd_name, upd_var_name :: a }
  deriving stock ( Functor, Foldable, Traversable )

string_names :: Names String
string_names =
  Names
    { d1_name = "D1"
    , d2_name = "D2"
    , mkd1_name = "MkD1"
    , mkd2a_name = "MkD2A"
    , mkd2b_name = "MkD2B"
    , d1_fld1_name = "fld" -- these are deliberately the same,
    , d1_fld2_name = "fld" -- to check that we correctly use the exact Names
    , d2_fld1_name = "fld" -- in a record update, and not simply the
    , d2_fld2_name = "fld" -- field label strings
    , upd_name = "upd"
    , upd_var_name = "r"
    }

names :: Names Name
names = unsafePerformIO $ traverse newNameIO string_names

noBang :: Bang
noBang = Bang NoSourceUnpackedness NoSourceStrictness

-- data D1 = MkD1 { fld1 :: Char, fld2 :: String }
-- data D2 = MkD2A { fld1 :: Char } | MkD2B { fld2 :: String }
data_decls :: [ Dec ]
data_decls = [ d1, d2 ]
  where
    Names { .. } = names
    d1 = DataD [] d1_name [] Nothing [mkd1] []
    d2 = DataD [] d2_name [] Nothing [mkd2_a, mkd2_b] []
    mkd1   = RecC mkd1_name  [(d1_fld1_name, noBang, ConT ''Char), (d1_fld2_name, noBang, ConT ''String)]
    mkd2_a = RecC mkd2a_name [(d2_fld1_name, noBang, ConT ''Char)]
    mkd2_b = RecC mkd2b_name [(d2_fld2_name, noBang, ConT ''String)]

-- rec_upd r = r { fld1 = 'c', fld2 = "foo" }
record_upds :: [ Dec ]
record_upds = [ rec_upd ]
  where
    Names { .. } = names
    rec_upd = FunD upd_name [upd_clause]
    upd_clause = Clause [VarP upd_var_name] (NormalB rec_upd_body) []
    rec_upd_body = RecUpdE (VarE upd_var_name) [ (d1_fld1_name, LitE (CharL 'c')), (d1_fld2_name, LitE (StringL "foo")) ]

====================================

{-# LANGUAGE DuplicateRecordFields #-}
{-# LANGUAGE NoFieldSelectors #-}
{-# LANGUAGE RecordWildCards #-}
{-# LANGUAGE TemplateHaskell #-}
module T22122 where

import T22122_aux ( data_decls, record_upds )

$(return data_decls)
$(return record_upds)
```
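The key ingredient of the test case, distinct `Name`s built from the same string, can be observed directly with `newNameIO` from the `template-haskell` library that ships with GHC:

```haskell
module Main where

import Language.Haskell.TH.Syntax (nameBase, newNameIO)

main :: IO ()
main = do
  -- Two fresh names built from the same string: equal bases, unequal Names.
  n1 <- newNameIO "fld"
  n2 <- newNameIO "fld"
  print (nameBase n1, nameBase n2, n1 == n2)
```

Spliced into declarations as above, both names print as `fld`, which is why the duplicate-symbol and wrong-field problems below can arise.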
## Issue 1
```
testsuite\tests\th\THRecordUpd.hs:10:2: error:
* Constructors MkD2A and MkD2B give different types for field `fld'
* In the data type declaration for `D2'
|
10 | $(return data_decls)
| ^^^^^
```
## Issue 2
Changing `"fld", "fld", "fld", "fld"` to `"fld1", "fld2", "fld1", "fld2"` causes the following error:
```
C:\Users\sheaf\AppData\Local\Temp\ghc6132_0\ghc_1.s:133:1: error:
error: symbol 'THRecordUpd_fld2_info' is already defined
|
133 | THRecordUpd_fld2_info:
| ^
THRecordUpd_fld2_info:
^
C:\Users\sheaf\AppData\Local\Temp\ghc6132_0\ghc_1.s:174:1: error:
error: symbol 'THRecordUpd_fld2_closure' is already defined
|
174 | THRecordUpd_fld2_closure:
| ^
THRecordUpd_fld2_closure:
^
C:\Users\sheaf\AppData\Local\Temp\ghc6132_0\ghc_1.s:190:1: error:
error: symbol 'THRecordUpd_fld1_info' is already defined
|
190 | THRecordUpd_fld1_info:
| ^
THRecordUpd_fld1_info:
^
C:\Users\sheaf\AppData\Local\Temp\ghc6132_0\ghc_1.s:231:1: error:
error: symbol 'THRecordUpd_fld1_closure' is already defined
|
231 | THRecordUpd_fld1_closure:
| ^
THRecordUpd_fld1_closure:
^
PLEASE submit a bug report to https://bugs.llvm.org/ and include the crash backtrace, preprocessed source, and associated run script.
Stack dump:
0. Program arguments: C:\\Haskell\\ghc\\record-update\\_build\\stage1\\lib\\..\\../mingw/bin/clang.exe --rtlib=compiler-rt -iquotetestsuite\\tests\\th -Wa,-mbig-obj -Qunused-arguments -x assembler -c C:\\Users\\sheaf\\AppData\\Local\\Temp\\ghc6132_0\\ghc_1.s -o testsuite\\tests\\th\\THRecordUpd.o.tmp
#0 0x00007ff9668fce2b llvm::MCWinCOFFObjectTargetWriter::recordRelocation(llvm::MCFixup const&) const (C:\Haskell\ghc\record-update\_build\mingw\bin\libLLVM-13.dll+0x174ce2b)
#1 0x00007ff96689cda7 llvm::MCAssembler::Finish() (C:\Haskell\ghc\record-update\_build\mingw\bin\libLLVM-13.dll+0x16ecda7)
#2 0x00007ff9668cb3f6 llvm::MCObjectStreamer::finishImpl() (C:\Haskell\ghc\record-update\_build\mingw\bin\libLLVM-13.dll+0x171b3f6)
#3 0x00007ff9668e07cc llvm::MCWinCOFFStreamer::finishImpl() (C:\Haskell\ghc\record-update\_build\mingw\bin\libLLVM-13.dll+0x17307cc)
#4 0x00007ff96690e193 llvm::createMCAsmParser(llvm::SourceMgr&, llvm::MCContext&, llvm::MCStreamer&, llvm::MCAsmInfo const&, unsigned int) (C:\Haskell\ghc\record-update\_build\mingw\bin\libLLVM-13.dll+0x175e193)
#5 0x00007ff6a53ec2e2 cc1as_main(llvm::ArrayRef<char const*>, char const*, void*) (C:\Haskell\ghc\record-update\_build\mingw\bin\clang.exe+0xc2e2)
#6 0x00007ff6a53e4a70 llvm::InitializeAllTargets() (C:\Haskell\ghc\record-update\_build\mingw\bin\clang.exe+0x4a70)
#7 0x00007ff96b5662e6 llvm::SmallVectorTemplateBase<llvm::SmallString<128u>, false>::grow(unsigned long long) (C:\Haskell\ghc\record-update\_build\mingw\bin\libclang-cpp.dll+0x17b62e6)
#8 0x00007ff965236e53 llvm::CrashRecoveryContext::RunSafely(llvm::function_ref<void ()>) (C:\Haskell\ghc\record-update\_build\mingw\bin\libLLVM-13.dll+0x86e53)
#9 0x00007ff96b565eb5 clang::driver::CC1Command::Execute(llvm::ArrayRef<llvm::Optional<llvm::StringRef> >, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> >*, bool*) const (C:\Haskell\ghc\record-update\_build\mingw\bin\libclang-cpp.dll+0x17b5eb5)
#10 0x00007ff96b53795f clang::driver::Compilation::ExecuteCommand(clang::driver::Command const&, clang::driver::Command const*&) const (C:\Haskell\ghc\record-update\_build\mingw\bin\libclang-cpp.dll+0x178795f)
#11 0x00007ff96b537d39 clang::driver::Compilation::ExecuteJobs(clang::driver::JobList const&, llvm::SmallVectorImpl<std::__1::pair<int, clang::driver::Command const*> >&) const (C:\Haskell\ghc\record-update\_build\mingw\bin\libclang-cpp.dll+0x1787d39)
#12 0x00007ff96b54d4b6 clang::driver::Driver::ExecuteCompilation(clang::driver::Compilation&, llvm::SmallVectorImpl<std::__1::pair<int, clang::driver::Command const*> >&) (C:\Haskell\ghc\record-update\_build\mingw\bin\libclang-cpp.dll+0x179d4b6)
#13 0x00007ff6a53e3eac main (C:\Haskell\ghc\record-update\_build\mingw\bin\clang.exe+0x3eac)
#14 0x00007ff6a53e13da WinMainCRTStartup (C:\Haskell\ghc\record-update\_build\mingw\bin\clang.exe+0x13da)
#15 0x00007ff6a53e1436 mainCRTStartup (C:\Haskell\ghc\record-update\_build\mingw\bin\clang.exe+0x1436)
#16 0x00007ffa4cbe54e0 (C:\WINDOWS\System32\KERNEL32.DLL+0x154e0)
#17 0x00007ffa4db6485b (C:\WINDOWS\SYSTEM32\ntdll.dll+0x485b)
clang: error: clang integrated assembler command failed due to signal (use -v to see invocation)
clang version 13.0.0
Target: x86_64-w64-windows-gnu
Thread model: posix
InstalledDir: C:/Haskell/ghc/record-update/_build/stage1/lib/../../mingw/bin
clang: note: diagnostic msg: Error generating preprocessed source(s) - no preprocessable inputs.
<no location info>: error:
`clang.exe' failed in phase `Assembler'. (Exit code: 1)
```
## Issue 3
Changing `"fld", "fld", "fld", "fld"` to `"fld1", "fld1", "fld2", "fld3"` causes the following:
```
* GHC internal error: `THRecordUpd.fld1' is not in scope during type checking, but it passed the renamer
tcl_env of environment: [a :-> Identifier[r_a::p1, NotLetBound]]
* In the expression: r_a {fld1_5 = 'c', fld1_6 = "foo"}
In an equation for `upd':
upd r_a = r_a {fld1_5 = 'c', fld1_6 = "foo"}
|
12 | $(return record_upds)
| ^^^^^^^^^^^^^^^^^^^^
```

# Adding import in GHC plugin shows unused-import
https://gitlab.haskell.org/ghc/ghc/-/issues/21730 (2022-10-02, Brandon Chinn)

## Summary
In a compiler plugin, I'm hooking into `parsedResultAction` to modify the module being compiled. I'm adding an import and then using something from that import, but GHC still complains about an unused import.
*Update 1*: It seems like this happens because of `generatedSrcSpan`. See Reddit thread for more details.
More generally, is there a way to use an identifier without needing to import it? I know with Template Haskell, if you use a ticked Name, it'll resolve successfully, even if it's not imported.
Original Reddit post: https://www.reddit.com/r/haskell/comments/vbwuzk/writing_a_ghc_plugin_autogenerated_import/
## Steps to reproduce
Minimal repro:
```hs
module Plugin (plugin) where
import GHC.Hs
import GHC.Plugins
import qualified GHC.Types.Name.Occurrence as NameSpace
plugin :: Plugin
plugin =
defaultPlugin
{ parsedResultAction = \_ _ modl -> pure modl{hpm_module = update <$> hpm_module modl}
}
update :: HsModule -> HsModule
update modl =
modl
{ hsmodImports =
[ genLoc $ simpleImportDecl $ mkModuleName "Data.List"
]
, hsmodDecls =
[ genLoc . genFuncDecl (mkRdrName "intercalate2") [] $
genLoc $ HsVar NoExtField (mkRdrName "intercalate")
]
}
-- | Make simple function declaration of the form `<funcName> <funcArgs> = <funcBody>`
genFuncDecl :: Located RdrName -> [LPat GhcPs] -> LHsExpr GhcPs -> HsDecl GhcPs
genFuncDecl funcName funcArgs funcBody =
(\body -> ValD NoExtField $ FunBind NoExtField funcName body [])
. (\match -> MG NoExtField (genLoc [genLoc match]) Generated)
. Match NoExtField (FunRhs funcName Prefix NoSrcStrict) funcArgs
. (\grhs -> GRHSs NoExtField [genLoc grhs] funcWhere)
$ GRHS NoExtField [] funcBody
where
funcWhere = genLoc $ EmptyLocalBinds NoExtField
genLoc :: e -> Located e
genLoc = L generatedSrcSpan
mkRdrName :: String -> Located RdrName
mkRdrName = genLoc . mkRdrUnqual . mkOccName NameSpace.varName
```
```hs
{-# OPTIONS_GHC -fplugin=Plugin #-}
module Foo where
```
```bash
stack ghc --resolver=ghc-9.0 --package ghc -- Foo.hs -Wunused-imports -Werror -dynamic-too
```
## Expected behavior
This should generate the equivalent of
```hs
module Foo where
import Data.List
intercalate2 = intercalate
```
So `Data.List` is clearly being used: if the import is removed, the reference to `intercalate` fails to resolve. It therefore shouldn't be reported as unused.
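As a sanity check, the hand-written equivalent of what the plugin is meant to generate compiles and runs (a minimal standalone sketch; `intercalate2` is the name generated by the plugin above):

```haskell
import Data.List (intercalate)

-- Hand-written version of the plugin-generated binding:
-- a new name whose body is the imported `intercalate`.
intercalate2 :: [a] -> [[a]] -> [a]
intercalate2 = intercalate

main :: IO ()
main = putStrLn (intercalate2 ", " ["import", "is", "used"])
```

Since `intercalate` is genuinely referenced here, `-Wunused-imports` is silent for the hand-written module; the warning only appears for the plugin-generated version.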
## Environment
* GHC version used: 9.0 (I haven't tested with 9.2 because I'm using the 9.0 version of the GHC API, and I didn't want to update it to 9.2 yet)
Optional:
* Operating System:
* System Architecture:

Milestone: 9.4.3

# What determines whether or not a function in `base` has unfolding?
https://gitlab.haskell.org/ghc/ghc/-/issues/21643 (2022-06-01, Ziyang Liu)

I was compiling a function `f` whose definition uses `Data.Monoid.getFirst`, and it turns out when I use `cabal build`, the unfolding of `getFirst` is available to the simplifier, and so in `f`'s unfolding, `getFirst` is inlined into a `Coercion`; whereas when I use `nix build`, the unfolding is not available. The GHC version and the set of optimisation flags are the same.
Since `base` ships with GHC I thought it should always behave exactly the same way with the same GHC version, but apparently it doesn't. What could possibly cause this to happen?

# Expose `GHC.Driver.Session.updOptLevelChanged`
https://gitlab.haskell.org/ghc/ghc/-/issues/21599 (2022-06-12, parsonsmatt)

I'm updating `inspection-testing` as part of ecosystem prep for GHC 9.4, and I'm porting the following line:
```haskell
let noopt = optLevel dflags < 1
```
Grepping through GHC, the function `updOptLevelChanged` is perfect - I can write `snd $ updOptLevelChanged 1 dflags`. Unfortunately, this is not exposed, and `DynFlags` does not have an `Eq` instance, so I cannot do `dflags == updOptLevel 1 dflags`.

# `flagRecompile` is unreliable, because modifying a function doesn't necessarily modify its fingerprint
https://gitlab.haskell.org/ghc/ghc/-/issues/21543 (2022-05-10, Ziyang Liu)

I ran into the following problem:
1. Module `Foo` contains a function `foo`. In `Foo.hi`, `foo` has no unfolding (because of the lack of `INLINABLE`, or `NOINLINE`, or `foo` is too large)
2. One modifies `Foo.foo`. Now `foo` is a different function, but its fingerprint in `Foo.hi` remains the same (this is not always the case, but it is sometimes).
3. There's a GHC plugin which depends on `Foo.foo`, and whose `pluginRecompile` is `flagRecompile`.
4. The plugin does not re-run after modifying `foo` because its fingerprint remains the same, even though its behavior has now changed.
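The crux is that a `.hi` fingerprint need not cover a function's definition when no unfolding is exported. A fingerprint computed over the definition itself would have the property step 2 lacks; below is a base-only sketch using `GHC.Fingerprint.fingerprintString` (hashing rendered source text is a deliberate simplification of whatever GHC would actually hash, and `fooV1`/`fooV2` are hypothetical definitions):

```haskell
import GHC.Fingerprint (fingerprintString)

main :: IO ()
main = do
  -- Two versions of `foo`: same name and type, different right-hand sides.
  let fooV1 = "foo :: Int -> Int\nfoo x = x + 1"
      fooV2 = "foo :: Int -> Int\nfoo x = x + 2"
  -- A fingerprint over the rendered definition changes whenever the
  -- definition changes, so identical fingerprints imply identical text.
  print (fingerprintString fooV1 == fingerprintString fooV1)
  print (fingerprintString fooV1 == fingerprintString fooV2)
```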
So it seems step 2 is the culprit. Is it possible to compute the fingerprint of a function from its definition, which ensures a different definition always has a different fingerprint?

Assignee: Matthew Pickering

# Add support for STG plugins
https://gitlab.haskell.org/ghc/ghc/-/issues/21388 (2022-05-26, Oleg Grenrus)

There is `installCoreToDos`, but there isn't `installStgToDos`. These could be handy (to e.g. dump STG in alternative formats).

# Should InScopeSet only include variables actually in scope.
https://gitlab.haskell.org/ghc/ghc/-/issues/20419 (2021-11-08, Ziyang Liu)

## Summary
In a module with two top-level bindings:
```haskell
a = let foo = ... in ...
b = ...
```
When simplifying `a`, `foo` essentially becomes another top-level binding, in the sense that it appears in the `InScopeSet` when simplifying `b`.
However, from `PostInlineUnconditionally`'s point of view, `foo` is not a top-level binding, so it may be inlined, and cease to exist, before we start simplifying `b`.
But when that happens, `foo` remains in the `InScopeSet` when simplifying `b`. If the simplified `b` refers to `foo`, GHC crashes.
## Steps to reproduce
It took me a while to find a small repro example. It turns out there must be at least two local bindings (`foo` and `bar` in the example below) for the bug to occur, which I didn't realize at first.
Anyway, here's the smallest example I found. To run: `ghc -package ghc -dynamic A.hs`. Result:
```
panic! (the 'impossible' happened)
GHC version 9.3.20210925:
refineFromInScope
InScope {wild_00 bar_a7ae a b $trModule $trModule_s7cH
$trModule_s7cI $trModule_s7cJ $trModule_s7cK}
foo_a7ad
```
```haskell
-- A.hs
{-# OPTIONS_GHC -fplugin Plugin #-}
module A where
import Plugin (runPlugin)
a, b :: Bool
a =
-- OccInfo foo = OneOcc { occ_n_br = 2 } (PostInlineUnconditionally)
-- OccInfo bar = ManyOccs (not inlined)
let foo = (True == True)
bar = (True == True)
in case not bar of
True -> foo
False -> foo && bar
-- `b` will be rewritten by the plugin into `not foo`.
b = runPlugin (True == True)
```
```haskell
{-# LANGUAGE OverloadedStrings #-}
module Plugin (plugin, runPlugin) where
import Data.List (find, isPrefixOf)
import Data.Maybe (fromJust)
import GHC (RdrName (..), mkModuleName)
import qualified GHC.Plugins as P
import GHC.Runtime.Loader (lookupRdrNameInModuleForPlugins)
import GHC.Types.TyThing (lookupId)
plugin :: P.Plugin
plugin = P.defaultPlugin {P.installCoreToDos = \_opts -> pure . install}
runPlugin :: Bool -> Bool
runPlugin = undefined
{-# NOINLINE runPlugin #-}
install :: [P.CoreToDo] -> [P.CoreToDo]
install = (todo :)
where
todo :: P.CoreToDo
todo = P.CoreDoPluginPass ("my pass") $ \guts -> do
hscEnv <- P.getHscEnv
runPluginId <- findId hscEnv "Plugin" "runPlugin"
notId <- findId hscEnv "Data.Bool" "not"
let try :: P.RuleFun
try _dflags inScope ident _exprs
| ident == runPluginId =
let inScopeVars = P.nonDetEltsUniqSet . P.getInScopeVars $ fst inScope
in Just $ case find
(\v -> "foo" `isPrefixOf` P.occNameString (P.nameOccName (P.varName v)))
inScopeVars of
-- `foo` is in `inScope`. Return `not foo`.
Just x -> P.App (P.Var notId) (P.Var x)
-- `foo` is not in `inScope`.
Nothing -> undefined
| otherwise = Nothing
rule =
P.BuiltinRule
{ P.ru_name = "my rule",
P.ru_fn = P.varName runPluginId,
P.ru_nargs = 1,
P.ru_try = try
}
-- Insert a rule that rewrites `runPlugin`.
pure $ guts {P.mg_rules = rule : P.mg_rules guts}
findId :: P.HscEnv -> String -> String -> P.CoreM P.Id
findId env modu occ =
lookupId
=<< P.liftIO
( fst . fromJust
<$> lookupRdrNameInModuleForPlugins
env
(mkModuleName modu)
(Unqual (P.mkVarOcc occ))
)
```
## Environment
* GHC version used: 9.3.20210925 (4b7ba3ae671ebd70fa7bdf744d30b8061db13016)

# Inconsistent naming of specialized functions in Core
https://gitlab.haskell.org/ghc/ghc/-/issues/20339 (2021-09-18, Ziyang Liu)

For a method `meth` in class `Foo a`, when `a` is specialized to `A` in Core, the name of the method is sometimes `$fFooA_$cmeth`, and sometimes something like `$fFooA3`.
Also, when `Foo` has a subclass, say `Foo a => Bar a`, sometimes the name of the method becomes `$fBarA_$meth`.
So there are multiple different namings of the same method, which can make writing Core plugins more difficult. For example, it makes it harder to detect whether a function is recursive: I've encountered cases where `$fFooA_$cmeth` is recursive, but its unfolding refers to `$fFooA3` rather than `$fFooA_$cmeth`.
Is it possible to make such naming consistent?

# Allow type-checking plugins to emit custom error messages
https://gitlab.haskell.org/ghc/ghc/-/issues/18977 (2020-12-04, sheaf)

@rae has indicated that the type-checking plugin API is going to change with the upcoming refactors to constraint solving (removal of flattening variables, removal of derived constraints), which will require plugin authors to adjust their plugins.
This seems like it could be a good time to add a new piece of functionality to the API, by allowing plugins to emit an error message directly rather than reporting insoluble constraints. I suggest changing `TcPluginResult` from
```haskell
data TcPluginResult
= TcPluginContradiction [Ct]
| TcPluginOk [(EvTerm,Ct)] [Ct]
```
to
```haskell
data Insolubles
= Insolubles
{ insolubleWantedConstraints :: [Ct]
, insolubleErrorMessages :: [SDoc]
}
deriving (Show, Semigroup, Monoid)
data TcPluginResult
= TcPluginInsolubles Insolubles
| TcPluginOk [(EvTerm,Ct)] [Ct]
```
allowing the plugin to directly report error messages in the form of `SDoc`. We would require a similar property as currently: one of the fields of `Insolubles` must be non-empty.
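Note that the `deriving (Show, Semigroup, Monoid)` clause in the sketch above would in practice need hand-written (or newtype-derived) pointwise instances. A minimal base-only model of the proposed record, with `String` standing in for `Ct` and `SDoc`, might look like:

```haskell
-- Minimal model of the proposed Insolubles record. `String` stands in
-- for GHC's `Ct` and `SDoc` so the sketch runs with base alone.
data Insolubles = Insolubles
  { insolubleWantedConstraints :: [String]
  , insolubleErrorMessages     :: [String]
  } deriving Show

-- Pointwise combination: merging two plugin results concatenates both
-- the insoluble constraints and the error messages.
instance Semigroup Insolubles where
  Insolubles cts1 msgs1 <> Insolubles cts2 msgs2 =
    Insolubles (cts1 <> cts2) (msgs1 <> msgs2)

instance Monoid Insolubles where
  mempty = Insolubles [] []

-- The invariant required above: a reported Insolubles must have at
-- least one non-empty field.
wellFormed :: Insolubles -> Bool
wellFormed (Insolubles cts msgs) = not (null cts) || not (null msgs)

main :: IO ()
main = do
  let r = Insolubles ["C1"] [] <> Insolubles [] ["no instance for ..."]
  print (wellFormed r)
  print (wellFormed (mempty :: Insolubles))
```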
This would allow one to write **error-message plugins** which would never produce any evidence, i.e. they would always return either `TcPluginInsolubles ( mempty { insolubleErrorMessages = messages } )` or `TcPluginOk [] cts`. This could also be a way to experiment with interactive error messages, if we parametrise over the error message field:
```haskell
class DisplayableFormat messageFormat where { ... }
data Insolubles messageFormat
= Insolubles
{ insolubleWantedConstraints :: [Ct]
, insolubleErrorMessages :: [messageFormat]
}
data TcPluginResult where
TcPluginInsolubles
:: DisplayableFormat messageFormat
=> Insolubles messageFormat
-> TcPluginResult
TcPluginOk
:: [(EvTerm,Ct)]
-> [Ct]
-> TcPluginResult
pattern TcPluginContradiction :: [Ct] -> TcPluginResult
pattern TcPluginContradiction insolubles = TcPluginInsolubles (Insolubles insolubles ([] :: [SDoc]))
```
Currently, if a plugin wants to report a specific error message, I believe the only option is to emit constraints using the `TypeError` mechanism.

# Allow access to "unoptimised" Core binds in GHC Plugins
https://gitlab.haskell.org/ghc/ghc/-/issues/18671 (2020-09-23, Alfredo Di Napoli)

## Summary
cc @rae
As part of my work on Liquid Haskell, I needed access to the Core binds of the input program as GHC would generate them
when compiling with optimisations turned off (i.e. `-O0`). To say it differently, regardless of which is the optimisation
level the user picked when compiling the program, my plugin would need to be able to extract the "unoptimised" Core binds,
to be fed to the LH API later on.
Unfortunately this doesn't seem to be possible using the current GHC API, which led us to replicate some work, as described in [this blog post](http://www.well-typed.com/blog/2020/08/implementing-a-ghc-plugin-for-liquid-haskell/). The discrepancy seems to happen when `UNPACK` is used. Am I missing something glaringly obvious here?
## Steps to reproduce
I have created a small reproduction example here:
https://github.com/adinapoli/ghc-unoptimised-core-binds-repro
It can be compiled with `cabal v2-build all`, and a `README` is provided. When compiled, it will print two sets of Core binds: one that should allegedly be "unoptimised", and the standard binds we would get from the `ModGuts` of the `Core` pass (i.e. optimised, as the input program is compiled with `-O2`).
## Expected behavior
I would expect the first Core binds printed on stdout to be something like this (which is what I'd get when compiling the whole project with `-O0`):
```
unFoo :: Foo -> Int
unFoo
= \ (ds_d3hC :: Foo) -> case ds_d3hC of { Foo ds_d3hD -> ds_d3hD }
```
Instead, no matter which magic GHC API incantation I try, I get:
```
unFoo :: Foo -> Int
unFoo
= \ (ds_d3pn :: Foo) -> case ds_d3pn of { Foo dt_d3pS -> I# dt_d3pS }
```
If I change `unoptFlags` to also change the `hscTarget` like this:
```
unoptFlags :: DynFlags -> DynFlags
unoptFlags df = updOptLevel 0 df
{ debugLevel = 1
, ghcLink = LinkInMemory
, hscTarget = HscInterpreted
, ghcMode = CompManager
}
```
This would yield something that still looks different from the first snippet:
```
unFoo :: Foo -> Int
unFoo
= \ (ds_dXp :: Foo) ->
src<Main.hs:6:18-22>
case ds_dXp of { Foo dt_dXt ->
let {
unFoo :: Int
unFoo = I# dt_dXt } in
src<Main.hs:6:18-22> break<2>(unFoo) unFoo
}
```
## Environment
* GHC version used: 8.10.2
Optional:
* Operating System: Mac OS X
* System Architecture: x86_64

# Plugins should be able to write data to extensible interface fields during the pipeline
https://gitlab.haskell.org/ghc/ghc/-/issues/18503 (2021-02-25, Josh Meredith)

## Motivation
Implemented in https://gitlab.haskell.org/ghc/ghc/-/merge_requests/3758
Currently, extensible interface files (https://gitlab.haskell.org/ghc/ghc/-/wikis/Extensible-Interface-Files) are able to either have fields added by GHC itself, or by users of the GHC API - but the latter case only allows this to be done after the `.hi` exists on disk.
These extensible fields are significantly more useful if plugins are also able to add data to the interface while the pipeline is still in progress.
## Proposal
Add functions for plugins to write to a set of interface fields that will be later written to the disk, alongside the rest of the interface file:
- `registerInterfaceData :: Binary a => FieldName -> HscEnv -> a -> IO ()`
- `registerInterfaceDataWith :: FieldName -> HscEnv -> (BinHandle -> IO ()) -> IO ()`
- `unregisterInterfaceData :: FieldName -> HscEnv -> IO ()`
Note that `Binary` and `BinHandle` are from GHC's internal binary implementation.
These data are then stored serialised in an `IORef` within the `HscEnv` and written with the `ModIface`.