# GHC issues
https://gitlab.haskell.org/ghc/ghc/-/issues

# Issue #16737: cpp is not run when building .c or .cpp files
Author: Zejun Wu · Updated: 2023-10-05
https://gitlab.haskell.org/ghc/ghc/-/issues/16737

# Summary
The C pre-processor is not run, and `-D` or `-optP` flags are not respected, when building `.c` or `.cpp` files.
# Steps to reproduce
```bash
$ cat test.c
#ifndef TEST
#define TEST 0
#endif
#include <stdio.h>
int main(void) {
printf("%d\n", TEST);
return 0;
}
$ ghc -DTEST=9 test.c -no-hs-main && ./a.out
0
```
or
```bash
$ mkdir -p x && touch x/x.h
$ echo '#include "x.h"' > y.c
$ ghc -Ix -c y.c
$ ghc -optP=-Ix -c y.c
y.c:1:10: error:
     fatal error: x.h: No such file or directory
     #include "x.h"
              ^~~~~
  |
1 | #include "x.h"
  | ^
compilation terminated.
`gcc' failed in phase `C Compiler'. (Exit code: 1)
```
# Expected behavior
The first example should output `9`.
Both `ghc -Ix -c y.c` and `ghc -optP=-Ix -c y.c` should succeed in the second example.
This is because we don't run cpp for the `Cc`, `Ccxx`, `Cobjc` and `Cobjcxx` phases, and we probably should. For the `HCc` phase, cpp has usually already been run in a previous phase, so we probably don't want to run it again.

Milestone: 8.8.1 · Assignee: Zejun Wu

# Issue #15848: ghc builds cbits with -fPIC even when -fPIC is not passed to ghc on linux
Author: Zejun Wu · Updated: 2019-07-07
https://gitlab.haskell.org/ghc/ghc/-/issues/15848

GHC assumes that on Linux the C compiler (e.g. gcc) builds non-PIC object code when `-fPIC` is not passed. This is no longer true for recent gcc versions on some distributions, e.g.
```
$ uname -a
Linux watashi-arch32 4.18.5-arch1-1.0-ARCH #1 SMP PREEMPT Tue Aug 28 20:45:30 CEST 2018 i686 GNU/Linux
$ gcc --version
gcc (GCC) 7.3.1 20180312
$ touch dummy.c
$ gcc -Q -v dummy.c 2>&1 | grep PIC
options enabled: -fPIC -fPIE -faggressive-loop-optimizations
```
We need to pass `-fno-PIC` explicitly, just as we already do for `-no-pie` (see #12759).

In particular, this causes 300+ ext-interp related tests to fail on i386 when built with a gcc that has `-fPIC` on by default, because we don't support loading non-PIC .o files built from .c on i386. (See #15847; fixing this bug will mitigate #15847.)

Milestone: 8.8.1 · Assignee: Zejun Wu

# Issue #10869: Option to dump preprocessed source
Author: phischu · Updated: 2019-07-07
https://gitlab.haskell.org/ghc/ghc/-/issues/10869

It would be awesome if GHC had an option `-ddump-preprocessed` that dumps the source code for each module after preprocessing. I am not sure what the current definition of "preprocessing" is, but I mean the output of at least the following tools: happy, alex, c2hs, hsc2hs and cpp. Additionally, even if a module was not subject to any preprocessing, it should be dumped anyway.
Use case: I want to parse module files from Hackage packages with `haskell-src-exts`, but find it prohibitively difficult to get the preprocessing right. The idea is that after running `cabal install` with the GHC options `-ddump-preprocessed -ddump-to-file -dumpdir real_modules`, you get a complete working set of Haskell modules in the folder `real_modules` that can be parsed directly without any preprocessing.
<details><summary>Trac metadata</summary>
| Trac field | Value |
| ---------------------- | -------------- |
| Version | 7.10.2 |
| Type | FeatureRequest |
| TypeOfFailure | OtherFailure |
| Priority | low |
| Resolution | Unresolved |
| Component | Driver |
| Test case | |
| Differential revisions | |
| BlockedBy | |
| Related | |
| Blocking | |
| CC | |
| Operating system | |
| Architecture | |
</details>
Milestone: 8.8.1 · Assignee: Roland Senn