Compare revisions

Changes are shown as if the source revision was being merged into the target revision.

Commits on Source (571)
Showing 1981 additions and 715 deletions
--- a/.gitignore
+++ b/.gitignore
@@ -189,6 +189,7 @@ _darcs/
 /utils/mkUserGuidePart/mkUserGuidePart.cabal
 /utils/runghc/runghc.cabal
 /utils/gen-dll/gen-dll.cabal
+/utils/ghc-pkg/ghc-pkg.cabal
 utils/lndir/fs.*
 utils/unlit/fs.*
 rts/fs.*
--- a/.gitlab-ci.yml
+++ b/.gitlab-ci.yml
@@ -2,39 +2,98 @@ variables:
   GIT_SSL_NO_VERIFY: "1"
 
   # Commit of ghc/ci-images repository from which to pull Docker images
-  DOCKER_REV: 8beb70e553e521796f4250000107c008b477040f
+  DOCKER_REV: 3ae30d6ca1dad3670b0e3d3d2cdfdf4ad145b05f
 
   # Sequential version number capturing the versions of all tools fetched by
-  # .gitlab/win32-init.sh.
+  # .gitlab/ci.sh.
   WINDOWS_TOOLCHAIN_VERSION: 1
 
   # Disable shallow clones; they break our linting rules
   GIT_DEPTH: 0
 
-before_script:
-  - git submodule sync --recursive
-  - git submodule update --init --recursive
-  - git checkout .gitmodules
-  - "git fetch https://gitlab.haskell.org/ghc/ghc-performance-notes.git refs/notes/perf:refs/notes/perf || true"
+  # Always start with a fresh clone to avoid non-hermetic builds
+  GIT_STRATEGY: clone
+
+  # Overridden by individual jobs
+  CONFIGURE_ARGS: ""
+
+  GIT_SUBMODULE_STRATEGY: "recursive"
+
+  HACKAGE_INDEX_STATE: "2020-12-21T14:48:20Z"
+
+  # Reduce XZ compression level for regular jobs (it is bumped to 9 for releases
+  # and nightly jobs). In my experiments I've got the following bindist size in
+  # the given time for each compression level (with the quick flavour):
+  #
+  #     XZ_OPT        Time    Size
+  #     -9            4m06s   112 MB
+  #     -8            4m00s   114 MB
+  #     -7            3m50s   116 MB
+  #     -6 (default)  3m40s   118 MB
+  #     -5            2m47s   123 MB
+  #     -4            1m57s   134 MB
+  #     -3            1m03s   129 MB
+  #     -2            49.73s  136 MB
+  #     -1            37.72s  142 MB
+  #     -0            34.40s  156 MB
+  #
+  XZ_OPT: "-1"
+
+default:
+  interruptible: true
 
 stages:
   - lint         # Source linting
   - quick-build  # A very quick smoke-test to weed out broken commits
   - build        # A quick smoke-test to weed out broken commits
   - full-build   # Build all the things
+  - cleanup      # See Note [Cleanup after the shell executor]
   - packaging    # Source distribution, etc.
   - testing      # head.hackage correctness and compiler performance testing
   - deploy       # push documentation
 
-# N.B.Don't run on wip/ branches, instead on run on merge requests.
-.only-default: &only-default
-  only:
-    - master
-    - /ghc-[0-9]+\.[0-9]+/
-    - merge_requests
-    - tags
-    - web
+# Note [The CI Story]
+# ~~~~~~~~~~~~~~~~~~~
+#
+# There are two different types of pipelines:
+#
+#  - marge-bot merges to `master`. Here we perform an exhaustive validation
+#    across all of the platforms which we support. In addition, we push
+#    performance metric notes upstream, providing a persistent record of the
+#    performance characteristics of the compiler.
+#
+#  - merge requests. Here we perform a slightly less exhaustive battery of
+#    testing. Namely we omit some configurations (e.g. the unregisterised job).
+#    These use the merge request's base commit for performance metric
+#    comparisons.
+#
+workflow:
+  # N.B. Don't run on wip/ branches, instead run on merge requests.
+  rules:
+    - if: $CI_MERGE_REQUEST_ID
+    - if: $CI_COMMIT_TAG
+    - if: '$CI_COMMIT_BRANCH == "master"'
+    - if: '$CI_COMMIT_BRANCH =~ /ghc-[0-9]+\.[0-9]+/'
+    - if: '$CI_PIPELINE_SOURCE == "web"'
+
+.nightly: &nightly
+  variables:
+    XZ_OPT: "-9"
+  rules:
+    - if: $NIGHTLY
+  artifacts:
+    when: always
+    expire_in: 8 weeks
+
+.release: &release
+  variables:
+    BUILD_FLAVOUR: "perf"
+    XZ_OPT: "-9"
+    FLAVOUR: "perf"
+  artifacts:
+    when: always
+    expire_in: 1 year
+  rules:
+    - if: '$RELEASE_JOB == "yes"'
 
 ############################################################
 # Runner Tags
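The XZ_OPT table in the hunk above trades compression time against bindist size. The measurement can be repeated locally with a sketch along these lines (the payload file and the chosen levels are illustrative, not part of the CI setup; real bindists compress far less uniformly than a synthetic payload, so only relative numbers are meaningful):

```shell
# Compare xz levels on a throwaway payload, mirroring the XZ_OPT table above.
set -eu
f=payload.bin
head -c 1000000 /dev/zero > "$f"   # 1 MB placeholder instead of a real bindist
for level in 1 6 9; do
  start=$(date +%s)
  xz -k -f "-$level" "$f"          # -k keeps the input, -f overwrites any old .xz
  end=$(date +%s)
  size=$(wc -c < "$f.xz")
  echo "xz -$level: $((end - start))s, $size bytes"
done
```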
@@ -68,13 +127,11 @@ ghc-linters:
   dependencies: []
   tags:
     - lint
-  only:
-    refs:
-      - merge_requests
+  rules:
+    - if: $CI_MERGE_REQUEST_ID
 
 # Run mypy Python typechecker on linter scripts.
 lint-linters:
-  <<: *only-default
   stage: lint
   image: "registry.gitlab.haskell.org/ghc/ci-images/linters:$DOCKER_REV"
   script:
@@ -85,7 +142,6 @@ lint-linters:
 
 # Check that .T files all parse by listing broken tests.
 lint-testsuite:
-  <<: *only-default
   stage: lint
   image: "registry.gitlab.haskell.org/ghc/ci-images/x86_64-linux-deb9:$DOCKER_REV"
   script:
@@ -96,7 +152,6 @@ lint-testsuite:
 
 # Run mypy Python typechecker on testsuite driver
 typecheck-testsuite:
-  <<: *only-default
   stage: lint
   image: "registry.gitlab.haskell.org/ghc/ci-images/linters:$DOCKER_REV"
   script:
@@ -106,10 +161,9 @@ typecheck-testsuite:
     - lint
 
 # We allow the submodule checker to fail when run on merge requests (to
-# accomodate, e.g., haddock changes not yet upstream) but not on `master` or
+# accommodate, e.g., haddock changes not yet upstream) but not on `master` or
 # Marge jobs.
 .lint-submods:
-  <<: *only-default
   stage: lint
   image: "registry.gitlab.haskell.org/ghc/ci-images/linters:$DOCKER_REV"
   script:
@@ -117,8 +171,7 @@ typecheck-testsuite:
     - base="$(git merge-base FETCH_HEAD $CI_COMMIT_SHA)"
     - "echo Linting submodule changes between $base..$CI_COMMIT_SHA"
     - git submodule foreach git remote update
-    # TODO: Fix submodule linter
-    - submodchecker . $(git rev-list $base..$CI_COMMIT_SHA) || true
+    - submodchecker . $(git rev-list $base..$CI_COMMIT_SHA)
   dependencies: []
   tags:
     - lint
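The submodule-linting recipe above reduces to: find the merge base against the target ref, then enumerate every commit since it and hand the list to the checker. A self-contained sketch against a scratch repository (the `target` branch name and commit messages are illustrative; the real job feeds the resulting list to `submodchecker`):

```shell
set -eu
tmp=$(mktemp -d)
cd "$tmp"
git init -q repo
cd repo
git -c user.email=ci@example.com -c user.name=CI commit -q --allow-empty -m initial
git branch -q target                 # stand-in for FETCH_HEAD / master
git checkout -q -b feature
git -c user.email=ci@example.com -c user.name=CI commit -q --allow-empty -m change
base=$(git merge-base target HEAD)   # mirrors: git merge-base FETCH_HEAD $CI_COMMIT_SHA
count=$(git rev-list "$base"..HEAD | wc -l)
echo "commits to lint: $count"       # the real job runs: submodchecker . <this list>
```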
@@ -149,38 +202,30 @@ lint-submods-branch:
     - "echo Linting submodule changes between $CI_COMMIT_BEFORE_SHA..$CI_COMMIT_SHA"
     - git submodule foreach git remote update
     - submodchecker . $(git rev-list $CI_COMMIT_BEFORE_SHA..$CI_COMMIT_SHA)
-  only:
-    refs:
-      - master
-      - /ghc-[0-9]+\.[0-9]+/
+  rules:
+    - if: '$CI_COMMIT_BRANCH == "master"'
+    - if: '$CI_COMMIT_BRANCH =~ /ghc-[0-9]+\.[0-9]+/'
 
 .lint-changelogs:
-  <<: *only-default
   stage: lint
   image: "registry.gitlab.haskell.org/ghc/ci-images/linters:$DOCKER_REV"
   dependencies: []
   tags:
     - lint
   script:
-    - |
-      grep TBA libraries/*/changelog.md && (
-          echo "Error: Found \"TBA\"s in changelogs."
-          exit 1
-      )
+    - bash .gitlab/linters/check-changelogs.sh
 
 lint-changelogs:
   extends: .lint-changelogs
   # Allow failure since this isn't a final release.
   allow_failure: true
-  only:
-    refs:
-      - /ghc-[0-9]+\.[0-9]+/
+  rules:
+    - if: '$CI_COMMIT_BRANCH =~ /ghc-[0-9]+\.[0-9]+/'
 
 lint-release-changelogs:
   extends: .lint-changelogs
-  only:
-    refs:
-      - /ghc-[0-9]+\.[0-9]+\.[0-9]+-.*/
+  rules:
+    - if: '$CI_COMMIT_BRANCH =~ /ghc-[0-9]+\.[0-9]+/'
@@ -188,34 +233,19 @@ lint-release-changelogs:
 ############################################################
 
 .validate-hadrian:
-  <<: *only-default
   variables:
     FLAVOUR: "validate"
   script:
-    - cabal update
-    - git clean -xdf && git submodule foreach git clean -xdf
-    - .gitlab/prepare-system.sh
-    - if [[ -d ./cabal-cache ]]; then cp -R ./.cabal-cache ~/.cabal-cache; fi
-    - ./boot
-    - ./configure $CONFIGURE_ARGS
-    - hadrian/build.cabal.sh --flavour=$FLAVOUR -j`mk/detect-cpu-count.sh` --docs=no-sphinx binary-dist
-    - mv _build/bindist/ghc*.tar.xz ghc.tar.xz
-    - export TOP=$(pwd)
-    - cd _build/bindist/ghc-*/ && ./configure --prefix=$TOP/_build/install && make install && cd ../../../
-    - |
-      # Prepare to push git notes.
-      export METRICS_FILE=$CI_PROJECT_DIR/performance-metrics.tsv
-      git config user.email "ben+ghc-ci@smart-cactus.org"
-      git config user.name "GHC GitLab CI"
-    - hadrian/build.cabal.sh --flavour=$FLAVOUR -j`mk/detect-cpu-count.sh` --docs=no-sphinx test --summary-junit=./junit.xml --test-compiler=$TOP/_build/install/bin/ghc || (.gitlab/push-test-metrics.sh && false)
-    - |
-      # Push git notes.
-      .gitlab/push-test-metrics.sh
+    - .gitlab/ci.sh setup
+    - .gitlab/ci.sh configure
+    - .gitlab/ci.sh build_hadrian
+    - .gitlab/ci.sh test_hadrian
   cache:
     key: hadrian
     paths:
       - cabal-cache
   artifacts:
+    when: always
     reports:
       junit: junit.xml
     expire_in: 2 week
@@ -235,6 +265,8 @@ lint-release-changelogs:
     - git submodule update --init --recursive
     - git checkout .gitmodules
     - "git fetch https://gitlab.haskell.org/ghc/ghc-performance-notes.git refs/notes/perf:refs/notes/perf || true"
+  after_script:
+    - .gitlab/ci.sh clean
   tags:
     - x86_64-linux
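Several hunks in this diff replace ad-hoc inline scripts with calls like `.gitlab/ci.sh setup` and `.gitlab/ci.sh clean`. That script is essentially a subcommand dispatcher keyed on its first argument; a hypothetical minimal skeleton of the pattern (the phase bodies here are placeholders, not GHC's actual steps):

```shell
# Hypothetical skeleton of a phase-dispatching CI driver in the style of .gitlab/ci.sh.
ci() {
  case "${1:-}" in
    setup)      echo "phase: fetch toolchain and caches" ;;
    configure)  echo "phase: ./boot && ./configure" ;;
    build_make) echo "phase: make" ;;
    test_make)  echo "phase: run testsuite" ;;
    clean)      echo "phase: scrub workspace" ;;
    *)          echo "usage: ci <setup|configure|build_make|test_make|clean>" >&2
                return 2 ;;
  esac
}

out=$(ci setup)
echo "$out"
```

Each CI job then composes the phases it needs, instead of repeating the boilerplate inline in YAML.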
@@ -250,7 +282,6 @@ validate-x86_64-linux-deb9-unreg-hadrian:
     TEST_ENV: "x86_64-linux-deb9-unreg-hadrian"
 
 hadrian-ghc-in-ghci:
-  <<: *only-default
   stage: quick-build
   image: "registry.gitlab.haskell.org/ghc/ci-images/x86_64-linux-deb9:$DOCKER_REV"
   before_script:
@@ -267,7 +298,7 @@ hadrian-ghc-in-ghci:
     - cabal update
     - cd hadrian; cabal new-build --project-file=ci.project; cd ..
     - git clean -xdf && git submodule foreach git clean -xdf
-    - .gitlab/prepare-system.sh
+    - .gitlab/ci.sh setup
     - if [[ -d ./cabal-cache ]]; then cp -R ./.cabal-cache ~/.cabal-cache; fi
     - ./boot
     - ./configure $CONFIGURE_ARGS
@@ -283,32 +314,48 @@ hadrian-ghc-in-ghci:
 ############################################################
 
 .validate:
-  <<: *only-default
   variables:
     TEST_TYPE: test
-  before_script:
-    - git clean -xdf && git submodule foreach git clean -xdf
-  script:
-    - ./boot
-    - ./configure $CONFIGURE_ARGS
-    - |
-      THREADS=`mk/detect-cpu-count.sh`
-      make V=0 -j$THREADS WERROR=-Werror
-    - make binary-dist-prep TAR_COMP_OPTS="-1"
-    - make test_bindist TEST_PREP=YES
-    - |
-      # Prepare to push git notes.
-      METRICS_FILE=$CI_PROJECT_DIR/performance-metrics.tsv
-      git config user.email "ben+ghc-ci@smart-cactus.org"
-      git config user.name "GHC GitLab CI"
-    - |
-      THREADS=`mk/detect-cpu-count.sh`
-      make $TEST_TYPE THREADS=$THREADS JUNIT_FILE=../../junit.xml METRICS_FILE=$METRICS_FILE || (METRICS_FILE=$METRICS_FILE .gitlab/push-test-metrics.sh && false)
-    - |
-      # Push git notes.
-      METRICS_FILE=$METRICS_FILE .gitlab/push-test-metrics.sh
+    MAKE_ARGS: "-Werror"
+  script: |
+    # Build hyperlinked sources for documentation when building releases
+    if [[ "$RELEASE_JOB" = "yes" ]]; then
+      HADDOCK_HYPERLINKED_SOURCES=1
+    fi
+
+    TIME_START=$(date +%s)
+    time .gitlab/ci.sh setup 2>&1
+    TIME_SETUP=$(date +%s)
+    TIME_SETUP_DELTA=$(expr $TIME_SETUP - $TIME_START)
+    echo "Setup took $TIME_SETUP_DELTA seconds"
+
+    time .gitlab/ci.sh configure 2>&1
+    TIME_CONFIGURE=$(date +%s)
+    TIME_CONFIGURE_DELTA=$(expr $TIME_CONFIGURE - $TIME_SETUP)
+    echo "Configure took $TIME_CONFIGURE_DELTA seconds"
+
+    time .gitlab/ci.sh build_make 2>&1
+    TIME_BUILD=$(date +%s)
+    TIME_BUILD_DELTA=$(expr $TIME_BUILD - $TIME_CONFIGURE)
+    echo "Build took $TIME_BUILD_DELTA seconds"
+
+    set +e
+    time .gitlab/ci.sh test_make 2>&1
+    status=$?
+    set -e
+
+    TIME_TEST=$(date +%s)
+    TIME_TEST_DELTA=$(expr $TIME_TEST - $TIME_BUILD)
+    echo "Test took $TIME_TEST_DELTA seconds"
+
+    echo "=== TIMINGS ==="
+    echo "Setup     | $TIME_SETUP_DELTA"
+    echo "Configure | $TIME_CONFIGURE_DELTA"
+    echo "Build     | $TIME_BUILD_DELTA"
+    echo "Test      | $TIME_TEST_DELTA"
+
+    exit $status
   dependencies: []
   artifacts:
+    when: always
     reports:
       junit: junit.xml
     expire_in: 2 week
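The phase-timing pattern used in the script above deserves a note: `set +e` is opened only around the test phase so its exit code can be captured without aborting the job, and the code is re-raised after the timing summary has printed. A standalone sketch, with `fake_test_phase` standing in for `.gitlab/ci.sh test_make`:

```shell
set -eu

fake_test_phase() {   # stand-in for `.gitlab/ci.sh test_make`
  sleep 1
  return 3            # simulate a testsuite failure
}

TIME_START=$(date +%s)
set +e                # tolerate failure of just this phase...
fake_test_phase
status=$?             # ...but remember its exit code
set -e
TIME_DELTA=$(( $(date +%s) - TIME_START ))

echo "Test took $TIME_DELTA seconds"
echo "would exit with status $status"   # the CI script ends with: exit $status
```

Without the `set +e`/`set -e` bracket, a failing test phase would kill the script before the summary, and the measured duration would be lost.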
@@ -317,6 +364,84 @@ hadrian-ghc-in-ghci:
       - junit.xml
       - performance-metrics.tsv
 
+#################################
+# x86_64-freebsd
+#################################
+
+.build-x86_64-freebsd:
+  extends: .validate
+  tags:
+    - x86_64-freebsd
+  allow_failure: true
+  variables:
+    # N.B. we use iconv from ports as I see linker errors when we attempt
+    # to use the "native" iconv embedded in libc as suggested by the
+    # porting guide [1].
+    # [1] https://www.freebsd.org/doc/en/books/porters-handbook/using-iconv.html)
+    CONFIGURE_ARGS: "--with-gmp-includes=/usr/local/include --with-gmp-libraries=/usr/local/lib --with-iconv-includes=/usr/local/include --with-iconv-libraries=/usr/local/lib"
+    GHC_VERSION: 8.6.3
+    CABAL_INSTALL_VERSION: 3.0.0.0
+    BIN_DIST_PREP_TAR_COMP: "ghc-x86_64-portbld-freebsd.tar.xz"
+    TEST_ENV: "x86_64-freebsd"
+    BUILD_FLAVOUR: "validate"
+  after_script:
+    - cp -Rf $HOME/.cabal cabal-cache
+    - .gitlab/ci.sh clean
+  artifacts:
+    when: always
+    expire_in: 2 week
+  cache:
+    key: "freebsd-$GHC_VERSION"
+    paths:
+      - cabal-cache
+      - toolchain
+
+# Disabled due to lack of builder capacity
+.validate-x86_64-freebsd:
+  extends: .build-x86_64-freebsd
+  stage: full-build
+
+nightly-x86_64-freebsd:
+  <<: *nightly
+  extends: .build-x86_64-freebsd
+  stage: full-build
+
+release-x86_64-freebsd:
+  <<: *release
+  extends: .build-x86_64-freebsd
+  stage: full-build
+
+.build-x86_64-freebsd-hadrian:
+  extends: .validate-hadrian
+  stage: full-build
+  tags:
+    - x86_64-freebsd
+  allow_failure: true
+  variables:
+    CONFIGURE_ARGS: "--with-gmp-includes=/usr/local/include --with-gmp-libraries=/usr/local/lib --with-iconv-includes=/usr/local/include --with-iconv-libraries=/usr/local/lib"
+    HADRIAN_ARGS: "--docs=no-sphinx"
+    GHC_VERSION: 8.6.3
+    CABAL_INSTALL_VERSION: 3.0.0.0
+    BIN_DIST_PREP_TAR_COMP: "ghc-x86_64-portbld-freebsd.tar.xz"
+    TEST_ENV: "x86_64-freebsd-hadrian"
+    FLAVOUR: "validate"
+  after_script:
+    - cp -Rf $HOME/.cabal cabal-cache
+    - .gitlab/ci.sh clean
+  artifacts:
+    when: always
+    expire_in: 2 week
+  cache:
+    key: "freebsd-$GHC_VERSION"
+    paths:
+      - cabal-cache
+      - toolchain
+
+# Disabled due to lack of builder capacity
+.validate-x86_64-freebsd-hadrian:
+  extends: .build-x86_64-freebsd-hadrian
+  stage: full-build
+
 #################################
 # x86_64-darwin
 #################################
...@@ -325,40 +450,181 @@ validate-x86_64-darwin: ...@@ -325,40 +450,181 @@ validate-x86_64-darwin:
extends: .validate extends: .validate
stage: full-build stage: full-build
tags: tags:
- x86_64-darwin - x86_64-darwin-m1
# for now make this non mandatory to pass.
allow_failure: true
variables: variables:
GHC_VERSION: 8.6.5 TEST_TYPE: test
CABAL_INSTALL_VERSION: 2.4.1.0 MAKE_ARGS: "-Werror"
GHC_VERSION: 8.10.3
CABAL_INSTALL_VERSION: 3.2.0.0
BUILD_FLAVOUR: "perf"
BIN_DIST_PREP_TAR_COMP: "ghc-x86_64-apple-darwin.tar.xz" BIN_DIST_PREP_TAR_COMP: "ghc-x86_64-apple-darwin.tar.xz"
MACOSX_DEPLOYMENT_TARGET: "10.7" MACOSX_DEPLOYMENT_TARGET: "10.7"
# Only Sierra and onwards supports clock_gettime. See #12858 # Only Sierra and onwards supports clock_gettime. See #12858
ac_cv_func_clock_gettime: "no" ac_cv_func_clock_gettime: "no"
# we run on M1's for now, getconf can't be built with nix yet,
# and we use a pure shell, so we can't/shouldn't use /usr/bin/getconf
# inside th shell.
CPUS: 8
LANG: "en_US.UTF-8" LANG: "en_US.UTF-8"
CONFIGURE_ARGS: --with-intree-gmp # WARNING: this is overridden in the shell.nix, see shell.nix!
TEST_ENV: "x86_64-darwin" CONFIGURE_ARGS: "--with-intree-gmp"
before_script:
- git clean -xdf && git submodule foreach git clean -xdf # I wish we could just use the nix #! logic, but we can't --run and -i bash
- git submodule sync --recursive # behave very differently. -i bash does not pass any nix related env vars
- git submodule update --init --recursive # the whole $stdenv/setup part seems to be missing.
- git checkout .gitmodules script: |
- "git fetch https://gitlab.haskell.org/ghc/ghc-performance-notes.git refs/notes/perf:refs/notes/perf || true" set -Eeuo pipefail
function runInNixShell() {
time nix-shell .gitlab/shell.nix \
-I nixpkgs=https://github.com/angerman/nixpkgs/archive/75f7281738b.tar.gz \
--argstr system "x86_64-darwin" \
--pure \
--keep GHC_VERSION --keep CABAL_INSTALL_VERSION --keep BUILD_FLAVOUR \
--keep BIN_DIST_PREP_TAR_COMP --keep CPUS --keep PROJECT_DIR \
--keep CI_PROJECT_DIR --keep MAKE_ARGS \
--keep LANG --keep CONFIGURE_ARGS \
--keep MACOSX_DEPLOYMENT_TARGET --keep ac_cv_func_clock_gettime \
--run "$1" 2>&1
}
# fix up config.sub in libraries for the time.
# aarch64-darwin is not supported in older config.sub's
TIME_START=$(date +%s)
find libraries -name config.sub -exec cp config.sub {} \;
runInNixShell ".gitlab/ci.sh setup" 2>&1
TIME_SETUP=$(date +%s)
TIME_SETUP_DELTA=$(expr $TIME_SETUP - $TIME_START)
echo "Setup took $TIME_SETUP_DELTA seconds"
runInNixShell ".gitlab/ci.sh configure" 2>&1
TIME_CONFIGURE=$(date +%s)
TIME_CONFIGURE_DELTA=$(expr $TIME_CONFIGURE - $TIME_SETUP)
echo "Configure took $TIME_CONFIGURE_DELTA seconds"
runInNixShell ".gitlab/ci.sh build_make" 2>&1
TIME_BUILD=$(date +%s)
TIME_BUILD_DELTA=$(expr $TIME_BUILD - $TIME_CONFIGURE)
echo "Build took $TIME_BUILD_DELTA seconds"
set +e
runInNixShell ".gitlab/ci.sh test_make" 2>&1
status=$?
set -e
TIME_TEST=$(date +%s)
TIME_TEST_DELTA=$(expr $TIME_TEST - $TIME_BUILD)
echo "Test took $TIME_TEST_DELTA seconds"
echo "=== TIMINGS ==="
echo "Setup | $TIME_SETUP_DELTA"
echo "Configure | $TIME_CONFIGURE_DELTA"
echo "Build | $TIME_BUILD_DELTA"
echo "Test | $TIME_TEST_DELTA"
exit $status
- bash .gitlab/darwin-init.sh
- PATH="`pwd`/toolchain/bin:$PATH"
after_script:
- cp -Rf $HOME/.cabal cabal-cache
artifacts: artifacts:
when: always when: always
reports:
junit: junit.xml
expire_in: 2 week expire_in: 2 week
cache:
key: "darwin-$GHC_VERSION"
paths: paths:
- cabal-cache - $BIN_DIST_PREP_TAR_COMP
- toolchain - junit.xml
- performance-metrics.tsv
validate-aarch64-darwin:
extends: .validate
stage: full-build
tags:
- aarch64-darwin-m1
# For now, make passing this job non-mandatory.
allow_failure: true
variables:
TEST_TYPE: test
MAKE_ARGS: "-Werror"
GHC_VERSION: 8.10.3
CABAL_INSTALL_VERSION: 3.2.0.0
BUILD_FLAVOUR: "perf"
BIN_DIST_PREP_TAR_COMP: "ghc-arm64-apple-darwin.tar.xz"
# We run on M1s for now; getconf can't be built with nix yet,
# and since we use a pure shell we can't/shouldn't use /usr/bin/getconf
# inside the shell.
CPUS: 8
LANG: "en_US.UTF-8"
# WARNING: this is overridden in the shell.nix, see shell.nix!
CONFIGURE_ARGS: "--with-intree-gmp"
# I wish we could just use the nix #! logic, but we can't: --run and -i bash
# behave very differently. -i bash does not pass any nix-related env vars,
# and the whole $stdenv/setup part seems to be missing.
script: |
set -Eeuo pipefail
function runInNixShell() {
time nix-shell .gitlab/shell.nix \
-I nixpkgs=https://github.com/angerman/nixpkgs/archive/75f7281738b.tar.gz \
--argstr system "aarch64-darwin" \
--pure \
--keep GHC_VERSION --keep CABAL_INSTALL_VERSION --keep BUILD_FLAVOUR \
--keep BIN_DIST_PREP_TAR_COMP --keep CPUS --keep PROJECT_DIR \
--keep CI_PROJECT_DIR --keep MAKE_ARGS \
--keep LANG --keep CONFIGURE_ARGS \
--run "$1" 2>&1
}
# Fix up config.sub in the libraries for the time being;
# aarch64-darwin is not supported by older config.sub versions.
TIME_START=$(date +%s)
find libraries -name config.sub -exec cp config.sub {} \;
time runInNixShell ".gitlab/ci.sh setup" 2>&1
TIME_SETUP=$(date +%s)
TIME_SETUP_DELTA=$(expr $TIME_SETUP - $TIME_START)
echo "Setup took $TIME_SETUP_DELTA seconds"
runInNixShell ".gitlab/ci.sh configure" 2>&1
TIME_CONFIGURE=$(date +%s)
TIME_CONFIGURE_DELTA=$(expr $TIME_CONFIGURE - $TIME_SETUP)
echo "Setup took $TIME_CONFIGURE_DELTA seconds"
runInNixShell ".gitlab/ci.sh build_make" 2>&1
TIME_BUILD=$(date +%s)
TIME_BUILD_DELTA=$(expr $TIME_BUILD - $TIME_CONFIGURE)
echo "Build took $TIME_BUILD_DELTA seconds"
set +e
runInNixShell ".gitlab/ci.sh test_make" 2>&1
status=$?
set -e
TIME_TEST=$(date +%s)
TIME_TEST_DELTA=$(expr $TIME_TEST - $TIME_BUILD)
echo "Test took $TIME_TEST_DELTA seconds"
echo "=== TIMINGS ==="
echo "Setup | $TIME_SETUP_DELTA"
echo "Configure | $TIME_CONFIGURE_DELTA"
echo "Build | $TIME_BUILD_DELTA"
echo "Test | $TIME_TEST_DELTA"
exit $status
artifacts:
when: always
reports:
junit: junit.xml
expire_in: 2 week
paths:
- $BIN_DIST_PREP_TAR_COMP
- junit.xml
- performance-metrics.tsv
# Disabled because of OS X CI capacity
.validate-x86_64-darwin-hadrian:
<<: *only-default
stage: full-build
tags:
- x86_64-darwin
@@ -370,26 +636,14 @@ validate-x86_64-darwin:
CONFIGURE_ARGS: --with-intree-gmp
TEST_ENV: "x86_64-darwin-hadrian"
FLAVOUR: "validate"
script:
- .gitlab/ci.sh setup
- .gitlab/ci.sh configure
- .gitlab/ci.sh build_hadrian
- .gitlab/ci.sh test_hadrian
after_script:
- cp -Rf $HOME/.cabal cabal-cache
- .gitlab/ci.sh clean
artifacts:
when: always
expire_in: 2 week
@@ -403,19 +657,9 @@ validate-x86_64-darwin:
extends: .validate
tags:
- x86_64-linux
before_script:
# workaround for docker permissions
- sudo chown ghc:ghc -R .
after_script:
@@ -427,71 +671,63 @@ validate-x86_64-darwin:
- toolchain
#################################
# aarch64-linux-deb10
#################################
.build-aarch64-linux-deb10:
extends: .validate-linux
stage: full-build
image: "registry.gitlab.haskell.org/ghc/ci-images/aarch64-linux-deb10:$DOCKER_REV"
allow_failure: true
variables:
TEST_ENV: "aarch64-linux-deb10"
BIN_DIST_PREP_TAR_COMP: "ghc-aarch64-linux-deb10.tar.xz"
cache:
key: linux-aarch64-deb10
tags:
- aarch64-linux
validate-aarch64-linux-deb10:
extends: .build-aarch64-linux-deb10
artifacts:
when: always
expire_in: 2 week
nightly-aarch64-linux-deb10:
<<: *nightly
extends: .build-aarch64-linux-deb10
variables:
TEST_TYPE: slowtest
#################################
# armv7-linux-deb10
#################################
.build-armv7-linux-deb10:
extends: .validate-linux
stage: full-build
image: "registry.gitlab.haskell.org/ghc/ci-images/armv7-linux-deb10:$DOCKER_REV"
variables:
TEST_ENV: "armv7-linux-deb10"
BIN_DIST_PREP_TAR_COMP: "ghc-armv7-linux-deb10.tar.xz"
CONFIGURE_ARGS: "--host=armv7-linux-gnueabihf --build=armv7-linux-gnueabihf --target=armv7-linux-gnueabihf"
cache:
key: linux-armv7-deb10
tags:
- armv7-linux
validate-armv7-linux-deb10:
extends: .build-armv7-linux-deb10
allow_failure: true
artifacts:
when: always
expire_in: 2 week
nightly-armv7-linux-deb10:
<<: *nightly
extends: .build-armv7-linux-deb10
variables:
TEST_TYPE: slowtest
#################################
# i386-linux-deb9
@@ -514,15 +750,10 @@ validate-i386-linux-deb9:
expire_in: 2 week
nightly-i386-linux-deb9:
<<: *nightly
extends: .build-i386-linux-deb9
variables:
TEST_TYPE: slowtest
#################################
# x86_64-linux-deb9
@@ -530,7 +761,6 @@ nightly-i386-linux-deb9:
.build-x86_64-linux-deb9:
extends: .validate-linux
image: "registry.gitlab.haskell.org/ghc/ci-images/x86_64-linux-deb9:$DOCKER_REV"
variables:
TEST_ENV: "x86_64-linux-deb9"
@@ -541,24 +771,27 @@ nightly-i386-linux-deb9:
# Disabled to reduce CI load
.validate-x86_64-linux-deb9:
extends: .build-x86_64-linux-deb9
artifacts:
when: always
expire_in: 2 week
release-x86_64-linux-deb9:
<<: *release
extends: .build-x86_64-linux-deb9
stage: full-build
nightly-x86_64-linux-deb9:
<<: *nightly
extends: .build-x86_64-linux-deb9
stage: full-build
variables:
TEST_TYPE: slowtest
# N.B. Has DEBUG assertions enabled in stage2
validate-x86_64-linux-deb9-debug:
extends: .build-x86_64-linux-deb9
stage: full-build
variables:
BUILD_FLAVOUR: validate
# Ensure that stage2 also has DEBUG enabled
@@ -567,7 +800,7 @@ validate-x86_64-linux-deb9-debug:
BUILD_SPHINX_PDF: "YES"
TEST_TYPE: slowtest
TEST_ENV: "x86_64-linux-deb9-debug"
BIN_DIST_PREP_TAR_COMP: "ghc-x86_64-deb9-linux-debug.tar.xz"
artifacts:
when: always
expire_in: 2 week
@@ -581,109 +814,166 @@ validate-x86_64-linux-deb9-debug:
TEST_ENV: "x86_64-linux-deb9-llvm"
nightly-x86_64-linux-deb9-llvm:
<<: *nightly
extends: .build-x86_64-linux-deb9
stage: full-build
variables:
BUILD_FLAVOUR: perf-llvm
TEST_ENV: "x86_64-linux-deb9-llvm"
validate-x86_64-linux-deb9-integer-simple:
extends: .build-x86_64-linux-deb9
stage: full-build
variables:
BUILD_FLAVOUR: validate
INTEGER_LIBRARY: integer-simple
TEST_ENV: "x86_64-linux-deb9-integer-simple-validate"
BIN_DIST_PREP_TAR_COMP: "ghc-x86_64-deb9-linux-integer-simple.tar.xz"
nightly-x86_64-linux-deb9-integer-simple:
<<: *nightly
extends: .build-x86_64-linux-deb9
stage: full-build
variables:
INTEGER_LIBRARY: integer-simple
TEST_ENV: "x86_64-linux-deb9-integer-simple"
TEST_TYPE: slowtest
.build-x86_64-linux-deb9-tsan:
extends: .validate-linux-hadrian
stage: full-build
variables:
TEST_ENV: "x86_64-linux-deb9-tsan"
BUILD_FLAVOUR: "thread-sanitizer"
TSAN_OPTIONS: "suppressions=$CI_PROJECT_DIR/rts/.tsan-suppressions"
# Haddock is large enough to make TSAN choke without massive quantities of
# memory.
HADRIAN_ARGS: "--docs=none"
nightly-x86_64-linux-deb9-tsan:
<<: *nightly
extends: .build-x86_64-linux-deb9-tsan
validate-x86_64-linux-deb9-tsan:
extends: .build-x86_64-linux-deb9-tsan
when: manual
validate-x86_64-linux-deb9-dwarf:
extends: .build-x86_64-linux-deb9
stage: full-build
variables:
CONFIGURE_ARGS: "--enable-dwarf-unwind"
BUILD_FLAVOUR: dwarf
TEST_ENV: "x86_64-linux-deb9-dwarf"
BIN_DIST_PREP_TAR_COMP: "ghc-x86_64-deb9-linux-dwarf.tar.xz"
#################################
# x86_64-linux-deb10
#################################
.build-x86_64-linux-deb10:
extends: .validate-linux
stage: full-build
image: "registry.gitlab.haskell.org/ghc/ci-images/x86_64-linux-deb10:$DOCKER_REV"
variables:
TEST_ENV: "x86_64-linux-deb10"
BIN_DIST_PREP_TAR_COMP: "./ghc-x86_64-deb10-linux.tar.xz"
LD: "ld.gold"
cache:
key: linux-x86_64-deb10
# Disabled to alleviate CI load
.validate-x86_64-linux-deb10:
extends: .build-x86_64-linux-deb10
stage: full-build
nightly-x86_64-linux-deb10:
<<: *nightly
extends: .build-x86_64-linux-deb10
variables:
TEST_TYPE: slowtest
release-x86_64-linux-deb10:
<<: *release
extends: .build-x86_64-linux-deb10
release-x86_64-linux-deb10-dwarf:
<<: *release
extends: .build-x86_64-linux-deb10
variables:
CONFIGURE_ARGS: "--enable-dwarf-unwind"
BUILD_FLAVOUR: dwarf
TEST_ENV: "x86_64-linux-deb10-dwarf"
BIN_DIST_PREP_TAR_COMP: "ghc-x86_64-deb10-linux-dwarf.tar.xz"
artifacts:
when: always
expire_in: 1 year
#################################
# x86_64-linux-deb8
#################################
.build-x86_64-linux-deb8:
extends: .validate-linux
stage: full-build
image: "registry.gitlab.haskell.org/ghc/ci-images/x86_64-linux-deb8:$DOCKER_REV"
variables:
TEST_ENV: "x86_64-linux-deb8"
BIN_DIST_PREP_TAR_COMP: "ghc-x86_64-deb8-linux.tar.xz"
# Debian 8's Sphinx is too old to support the table directive's :widths:
# option: https://sourceforge.net/p/docutils/patches/120/
BUILD_SPHINX_HTML: "NO"
BUILD_SPHINX_INFO: "NO"
BUILD_SPHINX_PDF: "NO"
BUILD_SPHINX_MAN: "NO"
cache:
key: linux-x86_64-deb8
release-x86_64-linux-deb8:
<<: *release
extends: .build-x86_64-linux-deb8
#################################
# x86_64-linux-alpine
#################################
.build-x86_64-linux-alpine-hadrian:
extends: .validate-linux-hadrian
stage: full-build
image: "registry.gitlab.haskell.org/ghc/ci-images/x86_64-linux-alpine:$DOCKER_REV"
# There are currently a few failing tests
allow_failure: true
variables:
TEST_ENV: "x86_64-linux-alpine"
BIN_DIST_PREP_TAR_COMP: "ghc-x86_64-alpine-linux.tar.xz"
# Can't use ld.gold due to #13958.
CONFIGURE_ARGS: "--disable-ld-override"
HADRIAN_ARGS: "--docs=no-sphinx"
# encoding004 due to lack of locale support
# T10458 due to the fact that the dynamic linker tries to reload libAS
BROKEN_TESTS: "encoding004 T10458"
cache:
key: linux-x86_64-alpine
artifacts:
when: always
expire_in: 2 week
release-x86_64-linux-alpine-integer-simple:
<<: *release
extends: .build-x86_64-linux-alpine-hadrian
variables:
HADRIAN_ARGS: "--docs=no-sphinx --integer-simple"
nightly-x86_64-linux-alpine:
<<: *nightly
extends: .build-x86_64-linux-alpine-hadrian
#################################
# x86_64-linux-centos7
#################################
.build-x86_64-linux-centos7:
extends: .validate-linux
stage: full-build
image: "registry.gitlab.haskell.org/ghc/ci-images/x86_64-linux-centos7:$DOCKER_REV"
@@ -693,19 +983,20 @@ release-x86_64-linux-centos7:
BUILD_SPHINX_PDF: "NO"
TEST_ENV: "x86_64-linux-centos7"
BIN_DIST_PREP_TAR_COMP: "ghc-x86_64-centos7-linux.tar.xz"
# CentOS seems to default to ascii
LANG: "en_US.UTF-8"
cache:
key: linux-x86_64-centos7
release-x86_64-linux-centos7:
<<: *release
extends: .build-x86_64-linux-centos7
#################################
# x86_64-linux-fedora27
#################################
.build-x86_64-linux-fedora27:
extends: .validate-linux
stage: full-build
image: "registry.gitlab.haskell.org/ghc/ci-images/x86_64-linux-fedora27:$DOCKER_REV"
@@ -714,68 +1005,76 @@ validate-x86_64-linux-fedora27:
BIN_DIST_PREP_TAR_COMP: "ghc-x86_64-fedora27-linux.tar.xz"
cache:
key: linux-x86_64-fedora27
validate-x86_64-linux-fedora27:
extends: .build-x86_64-linux-fedora27
artifacts:
when: always
# These are used for head.hackage jobs therefore we keep them around for
# longer.
expire_in: 8 week
release-x86_64-linux-fedora27:
<<: *release
extends: .build-x86_64-linux-fedora27
release-x86_64-linux-fedora27-dwarf:
<<: *release
extends: .build-x86_64-linux-fedora27
variables:
CONFIGURE_ARGS: "--enable-dwarf-unwind"
BUILD_FLAVOUR: dwarf
TEST_ENV: "x86_64-linux-fedora27-dwarf"
BIN_DIST_PREP_TAR_COMP: "ghc-x86_64-fedora27-linux-dwarf.tar.xz"
############################################################
# Validation via Pipelines (Windows)
############################################################
.build-windows:
# For the reasons given in #17777 this build isn't reliable.
allow_failure: true
before_script:
- git clean -xdf
# Setup toolchain
- bash .gitlab/ci.sh setup
after_script:
- |
  Copy-Item -Recurse -Path $Env:APPDATA\cabal -Destination cabal-cache
- bash .gitlab/ci.sh clean
dependencies: []
variables:
#FORCE_SYMLINKS: 1
LANG: "en_US.UTF-8"
SPHINXBUILD: "/mingw64/bin/sphinx-build.exe"
CABAL_INSTALL_VERSION: 3.0.0.0
GHC_VERSION: "8.8.3"
cache:
paths:
- cabal-cache
- toolchain
- ghc-tarballs
.build-windows-hadrian:
extends: .build-windows
stage: full-build
variables:
FLAVOUR: "validate"
# skipping perf tests for now since we build a quick-flavoured GHC,
# which might result in some broken perf tests?
HADRIAN_ARGS: "--docs=no-sphinx --skip-perf"
# due to #16574 this currently fails
allow_failure: true
script:
- bash .gitlab/ci.sh configure
- bash .gitlab/ci.sh build_hadrian
- bash .gitlab/ci.sh test_hadrian
tags:
- new-x86_64-windows
- test
artifacts:
reports:
junit: junit.xml
@@ -794,152 +1093,103 @@ validate-x86_64-windows-hadrian:
key: "x86_64-windows-hadrian-$WINDOWS_TOOLCHAIN_VERSION"
nightly-i386-windows-hadrian:
<<: *nightly
extends: .build-windows-hadrian
variables:
MSYSTEM: MINGW32
TEST_ENV: "i386-windows-hadrian"
cache:
key: "i386-windows-hadrian-$WINDOWS_TOOLCHAIN_VERSION"
.build-windows-make:
extends: .build-windows
stage: full-build
variables:
BUILD_FLAVOUR: "quick"
BIN_DIST_PREP_TAR_COMP: "ghc-x86_64-mingw32.tar.xz"
script:
- bash .gitlab/ci.sh configure
- bash .gitlab/ci.sh build_make
- bash .gitlab/ci.sh test_make
tags:
- new-x86_64-windows
- test
artifacts:
when: always
expire_in: 2 week
reports:
junit: junit.xml
paths:
# N.B. variable interpolation apparently doesn't work on Windows so
# this can't be $BIN_DIST_PREP_TAR_COMP
- "ghc-x86_64-mingw32.tar.xz"
- junit.xml
.build-x86_64-windows-make:
extends: .build-windows-make
variables:
MSYSTEM: MINGW64
TEST_ENV: "x86_64-windows"
cache:
key: "x86_64-windows-$WINDOWS_TOOLCHAIN_VERSION"
validate-x86_64-windows:
extends: .build-x86_64-windows-make
nightly-x86_64-windows:
<<: *nightly
extends: .build-x86_64-windows-make
stage: full-build
variables:
BUILD_FLAVOUR: "validate"
# Normal Windows validate builds are profiled; that won't do for releases.
release-x86_64-windows:
<<: *release
extends: validate-x86_64-windows
variables:
BUILD_FLAVOUR: "perf"
release-x86_64-windows-integer-simple:
<<: *release
extends: validate-x86_64-windows
variables:
INTEGER_LIBRARY: integer-simple
BUILD_FLAVOUR: "perf"
.build-i386-windows-make:
extends: .build-windows-make
variables:
MSYSTEM: MINGW32
# Due to #15934
BUILD_PROF_LIBS: "NO"
TEST_ENV: "i386-windows"
# Due to #17736
allow_failure: true
cache:
key: "i386-windows-$WINDOWS_TOOLCHAIN_VERSION"
validate-i386-windows:
extends: .build-i386-windows-make
variables:
BUILD_FLAVOUR: "perf"
release-i386-windows:
<<: *release
extends: .build-i386-windows-make
variables:
BUILD_FLAVOUR: "perf"
nightly-i386-windows:
<<: *nightly
extends: .build-i386-windows-make
############################################################
# Packaging
############################################################
doc-tarball:
stage: packaging
tags:
- x86_64-linux
@@ -948,7 +1198,7 @@ doc-tarball:
- validate-x86_64-linux-deb9-debug
- validate-x86_64-windows
variables:
LINUX_BINDIST: "ghc-x86_64-deb9-linux-debug.tar.xz"
WINDOWS_BINDIST: "ghc-x86_64-mingw32.tar.xz"
# Due to Windows allow_failure
allow_failure: true
@@ -980,14 +1230,15 @@ source-tarball:
- x86_64-linux
image: "registry.gitlab.haskell.org/ghc/ci-images/x86_64-linux-deb9:$DOCKER_REV"
dependencies: []
rules:
- if: $CI_COMMIT_TAG
  when: always
artifacts:
paths:
- ghc-*.tar.xz
- version
script:
- python3 mk/get-win32-tarballs.py download all
- ./boot
- ./configure
- make sdist
@@ -1008,7 +1259,6 @@ source-tarball:
# pipeline.
.hackage:
stage: testing
image: ghcci/x86_64-linux-deb9:0.2
tags:
@@ -1025,15 +1275,13 @@ hackage:
hackage-label:
extends: .hackage
rules:
- if: '$CI_MERGE_REQUEST_LABELS =~ /.*user-facing.*/'
nightly-hackage:
rules:
- if: $NIGHTLY
extends: .hackage
#################################
# Nofib testing
@@ -1042,13 +1290,12 @@ nightly-hackage:
perf-nofib:
stage: testing
dependencies:
- validate-x86_64-linux-deb9-dwarf
image: "registry.gitlab.haskell.org/ghc/ci-images/x86_64-linux-deb9:$DOCKER_REV"
rules:
- if: $CI_MERGE_REQUEST_ID
- if: '$CI_COMMIT_BRANCH == "master"'
- if: '$CI_COMMIT_BRANCH =~ /ghc-[0-9]+\.[0-9]+/'
tags:
- x86_64-linux
script:
@@ -1097,8 +1344,10 @@ pages:
<meta http-equiv="refresh" content="1; url=doc/">
EOF
- cp -f index.html public/doc
rules:
# N.B. only run this on ghc/ghc since the deployed pages are quite large
# and we only serve GitLab Pages for ghc/ghc.
- if: '$CI_COMMIT_BRANCH == "master" && $CI_PROJECT_NAMESPACE == "ghc"'
artifacts: artifacts:
paths: paths:
- public - public
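The hunks above all follow the same mechanical migration: GitLab's deprecated `only:`/`except:` keys are replaced by equivalent `rules:` entries. As a generic sketch (the job name and script are hypothetical, not from this pipeline), a job that previously ran only for tags and on `master` becomes:

```yaml
my-job:
  script:
    - ./build.sh
  rules:
    # run for tagged commits...
    - if: $CI_COMMIT_TAG
    # ...and for pushes to the master branch
    - if: '$CI_COMMIT_BRANCH == "master"'
```

Each `- if:` entry is tried in order; the first matching rule decides whether the job is created.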
#!/usr/bin/env bash
# shellcheck disable=SC2230
# This is the primary driver of the GitLab CI infrastructure.
set -Eeuo pipefail
# Configuration:
hackage_index_state="@1579718451"
# Colors
BLACK="0;30"
GRAY="1;30"
RED="0;31"
LT_RED="1;31"
BROWN="0;33"
LT_BROWN="1;33"
GREEN="0;32"
LT_GREEN="1;32"
BLUE="0;34"
LT_BLUE="1;34"
PURPLE="0;35"
LT_PURPLE="1;35"
CYAN="0;36"
LT_CYAN="1;36"
WHITE="1;37"
LT_GRAY="0;37"
# GitLab Pipelines log section delimiters
# https://gitlab.com/gitlab-org/gitlab-foss/issues/14664
start_section() {
name="$1"
echo -e "section_start:$(date +%s):$name\015\033[0K"
}
end_section() {
name="$1"
echo -e "section_end:$(date +%s):$name\015\033[0K"
}
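These delimiters make GitLab render everything between them as a collapsible, timestamped section in the job log. A minimal self-contained demonstration (the section name "build" is illustrative; `\r` is the same CR that the script writes as `\015`):

```shell
#!/usr/bin/env bash
# Re-declare the two delimiter helpers and wrap a job step in a section.
start_section() { echo -e "section_start:$(date +%s):$1\r\033[0K"; }
end_section()   { echo -e "section_end:$(date +%s):$1\r\033[0K"; }

start_section "build"
echo "compiling..."
end_section "build"
```

Outside of GitLab the escape sequences are harmless; in a pipeline log the "compiling..." line folds under a "build" header.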
echo_color() {
local color="$1"
local msg="$2"
echo -e "\033[${color}m${msg}\033[0m"
}
error() { echo_color "${RED}" "$1"; }
warn() { echo_color "${LT_BROWN}" "$1"; }
info() { echo_color "${LT_BLUE}" "$1"; }
fail() { error "error: $1"; exit 1; }
function run() {
  info "Running $*..."
  # Use a brace group, not a subshell, so `return` propagates the failure
  # status out of this function.
  "$@" || { error "$* failed"; return 1; }
}
TOP="$(pwd)"
function mingw_init() {
case "$MSYSTEM" in
MINGW32)
triple="i386-unknown-mingw32"
boot_triple="i386-unknown-mingw32" # triple of bootstrap GHC
;;
MINGW64)
triple="x86_64-unknown-mingw32"
boot_triple="x86_64-unknown-mingw32" # triple of bootstrap GHC
;;
*)
fail "win32-init: Unknown MSYSTEM $MSYSTEM"
;;
esac
# Bring mingw toolchain into PATH.
# This is extracted from /etc/profile since this script inexplicably fails to
# run under gitlab-runner.
# shellcheck disable=SC1091
source /etc/msystem
MINGW_MOUNT_POINT="${MINGW_PREFIX}"
PATH="$MINGW_MOUNT_POINT/bin:$PATH"
# We always use mingw64 Python to avoid path length issues like #17483.
export PYTHON="/mingw64/bin/python3"
}
# This will contain GHC's local native toolchain
toolchain="$TOP/toolchain"
mkdir -p "$toolchain/bin"
PATH="$toolchain/bin:$PATH"
export METRICS_FILE="$CI_PROJECT_DIR/performance-metrics.tsv"
cores="$(mk/detect-cpu-count.sh)"
# Use a local temporary directory to ensure that concurrent builds don't
# interfere with one another
mkdir -p "$TOP/tmp"
export TMP="$TOP/tmp"
export TEMP="$TOP/tmp"
function show_tool() {
local tool="$1"
info "$tool = ${!tool}"
${!tool} --version
}
function set_toolchain_paths() {
needs_toolchain=1
case "$(uname -m)-$(uname)" in
*-Linux) needs_toolchain="" ;;
*) ;;
esac
if [[ -n "${IN_NIX_SHELL:-}" ]]; then
needs_toolchain=""
GHC="$(which ghc)"
CABAL="$(which cabal)"
HAPPY="$(which happy)"
ALEX="$(which alex)"
elif [[ -n "$needs_toolchain" ]]; then
# These are populated by setup_toolchain
GHC="$toolchain/bin/ghc$exe"
CABAL="$toolchain/bin/cabal$exe"
HAPPY="$toolchain/bin/happy$exe"
ALEX="$toolchain/bin/alex$exe"
else
GHC="$(which ghc)"
CABAL="/usr/local/bin/cabal"
HAPPY="$HOME/.cabal/bin/happy"
ALEX="$HOME/.cabal/bin/alex"
fi
export GHC
export CABAL
export HAPPY
export ALEX
# FIXME: Temporarily use ghc from ports
case "$(uname)" in
FreeBSD) GHC="/usr/local/bin/ghc" ;;
*) ;;
esac
}
# Extract GHC toolchain
function setup() {
if [ -d "$TOP/cabal-cache" ]; then
info "Extracting cabal cache..."
mkdir -p "$cabal_dir"
cp -Rf cabal-cache/* "$cabal_dir"
fi
if [[ -n "$needs_toolchain" ]]; then
setup_toolchain
fi
# Make sure that git works
git config user.email "ghc-ci@gitlab-haskell.org"
git config user.name "GHC GitLab CI"
info "====================================================="
info "Toolchain versions"
info "====================================================="
show_tool GHC
show_tool CABAL
show_tool HAPPY
show_tool ALEX
}
function fetch_ghc() {
  local v="${GHC_VERSION:-}"
if [[ -z "$v" ]]; then
fail "GHC_VERSION is not set"
fi
if [ ! -e "$GHC" ]; then
start_section "fetch GHC"
url="https://downloads.haskell.org/~ghc/${GHC_VERSION}/ghc-${GHC_VERSION}-${boot_triple}.tar.xz"
info "Fetching GHC binary distribution from $url..."
curl "$url" > ghc.tar.xz || fail "failed to fetch GHC binary distribution"
tar -xJf ghc.tar.xz || fail "failed to extract GHC binary distribution"
case "$(uname)" in
MSYS_*|MINGW*)
cp -r "ghc-${GHC_VERSION}"/* "$toolchain"
;;
*)
pushd "ghc-${GHC_VERSION}"
./configure --prefix="$toolchain"
"$MAKE" install
popd
;;
esac
rm -Rf "ghc-${GHC_VERSION}" ghc.tar.xz
end_section "fetch GHC"
fi
}
function fetch_cabal() {
  local v="${CABAL_INSTALL_VERSION:-}"
if [[ -z "$v" ]]; then
fail "CABAL_INSTALL_VERSION is not set"
fi
if [ ! -e "$CABAL" ]; then
    start_section "fetch cabal"
case "$(uname)" in
# N.B. Windows uses zip whereas all others use .tar.xz
MSYS_*|MINGW*)
case "$MSYSTEM" in
MINGW32) cabal_arch="i386" ;;
MINGW64) cabal_arch="x86_64" ;;
*) fail "unknown MSYSTEM $MSYSTEM" ;;
esac
url="https://downloads.haskell.org/~cabal/cabal-install-$v/cabal-install-$v-$cabal_arch-unknown-mingw32.zip"
info "Fetching cabal binary distribution from $url..."
curl "$url" > "$TMP/cabal.zip"
unzip "$TMP/cabal.zip"
mv cabal.exe "$CABAL"
;;
*)
        local base_url="https://downloads.haskell.org/~cabal/cabal-install-$v"
case "$(uname)" in
Darwin) cabal_url="$base_url/cabal-install-$v-x86_64-apple-darwin17.7.0.tar.xz" ;;
FreeBSD)
#cabal_url="$base_url/cabal-install-$v-x86_64-portbld-freebsd.tar.xz" ;;
cabal_url="http://home.smart-cactus.org/~ben/ghc/cabal-install-3.0.0.0-x86_64-portbld-freebsd.tar.xz" ;;
*) fail "don't know where to fetch cabal-install for $(uname)"
esac
echo "Fetching cabal-install from $cabal_url"
curl "$cabal_url" > cabal.tar.xz
tar -xJf cabal.tar.xz
mv cabal "$toolchain/bin"
;;
esac
    end_section "fetch cabal"
fi
}
# For non-Docker platforms we prepare the bootstrap toolchain
# here. For Docker platforms this is done in the Docker image
# build.
function setup_toolchain() {
fetch_ghc
fetch_cabal
cabal_install="$CABAL v2-install --index-state=$hackage_index_state --installdir=$toolchain/bin"
# Avoid symlinks on Windows
case "$(uname)" in
MSYS_*|MINGW*) cabal_install="$cabal_install --install-method=copy" ;;
*) ;;
esac
if [ ! -e "$HAPPY" ]; then
info "Building happy..."
cabal update
$cabal_install happy
fi
if [ ! -e "$ALEX" ]; then
info "Building alex..."
cabal update
$cabal_install alex
fi
}
function cleanup_submodules() {
start_section "clean submodules"
info "Cleaning submodules..."
# On Windows submodules can inexplicably get into funky states where git
# believes that the submodule is initialized yet its associated repository
# is not valid. Avoid failing in this case with the following insanity.
git submodule sync --recursive || git submodule deinit --force --all
git submodule update --init --recursive
git submodule foreach git clean -xdf
end_section "clean submodules"
}
function prepare_build_mk() {
  if [[ -z "${BUILD_FLAVOUR:-}" ]]; then fail "BUILD_FLAVOUR is not set"; fi
if [[ -z ${BUILD_SPHINX_HTML:-} ]]; then BUILD_SPHINX_HTML=YES; fi
if [[ -z ${BUILD_SPHINX_PDF:-} ]]; then BUILD_SPHINX_PDF=YES; fi
if [[ -z ${INTEGER_LIBRARY:-} ]]; then INTEGER_LIBRARY=integer-gmp; fi
cat > mk/build.mk <<EOF
INTEGER_LIBRARY=$INTEGER_LIBRARY
BuildFlavour=$BUILD_FLAVOUR
include mk/flavours/${BUILD_FLAVOUR}.mk
GhcLibHcOpts+=-haddock
EOF
if [ -n "${HADDOCK_HYPERLINKED_SOURCES:-}" ]; then
echo "EXTRA_HADDOCK_OPTS += --hyperlinked-source --quickjump" >> mk/build.mk
fi
info "build.mk is:"
cat mk/build.mk
}
function configure() {
start_section "booting"
run python3 boot
end_section "booting"
local target_args=""
if [[ -n "${triple:-}" ]]; then
target_args="--target=$triple"
fi
start_section "configuring"
run ./configure \
--enable-tarballs-autodownload \
$target_args \
${CONFIGURE_ARGS:-} \
GHC="$GHC" \
HAPPY="$HAPPY" \
ALEX="$ALEX" \
|| ( cat config.log; fail "configure failed" )
end_section "configuring"
}
function build_make() {
prepare_build_mk
  if [[ -z "${BIN_DIST_PREP_TAR_COMP:-}" ]]; then
fail "BIN_DIST_PREP_TAR_COMP is not set"
fi
if [[ -n "${VERBOSE:-}" ]]; then
MAKE_ARGS="${MAKE_ARGS:-} V=1"
else
MAKE_ARGS="${MAKE_ARGS:-} V=0"
fi
  # MAKE_ARGS is deliberately left unquoted so it splits into separate words.
  run "$MAKE" -j"$cores" $MAKE_ARGS
run "$MAKE" -j"$cores" binary-dist-prep TAR_COMP_OPTS=-1
ls -lh "$BIN_DIST_PREP_TAR_COMP"
}
function fetch_perf_notes() {
info "Fetching perf notes..."
"$TOP/.gitlab/test-metrics.sh" pull
}
function push_perf_notes() {
info "Pushing perf notes..."
"$TOP/.gitlab/test-metrics.sh" push
}
function test_make() {
run "$MAKE" test_bindist TEST_PREP=YES
run "$MAKE" V=0 VERBOSE=1 test \
THREADS="$cores" \
JUNIT_FILE=../../junit.xml
}
function build_hadrian() {
  if [ -z "${FLAVOUR:-}" ]; then
fail "FLAVOUR not set"
fi
run_hadrian binary-dist
mv _build/bindist/ghc*.tar.xz ghc.tar.xz
}
function test_hadrian() {
cd _build/bindist/ghc-*/
case "$(uname)" in
MSYS_*|MINGW*)
mkdir -p "$TOP"/_build/install
cp -a * "$TOP"/_build/install
;;
*)
run ./configure --prefix="$TOP"/_build/install
run "$MAKE" install
;;
esac
cd ../../../
run_hadrian \
test \
--summary-junit=./junit.xml \
--test-compiler="$TOP/_build/install/bin/ghc$exe" \
"runtest.opts+=${RUNTEST_ARGS:-}"
}
function clean() {
  rm -Rf tmp
run "$MAKE" --quiet clean || true
run rm -Rf _build
}
function run_hadrian() {
if [ -n "${VERBOSE:-}" ]; then HADRIAN_ARGS="${HADRIAN_ARGS:-} -V"; fi
run hadrian/build.cabal.sh \
--flavour="$FLAVOUR" \
-j"$cores" \
${HADRIAN_ARGS:-} \
    "$@"
}
# A convenience function to allow debugging in the CI environment.
function shell() {
  local cmd="$*"
if [ -z "$cmd" ]; then
cmd="bash -i"
fi
run $cmd
}
# Determine Cabal data directory
case "$(uname)" in
MSYS_*|MINGW*) exe=".exe"; cabal_dir="$APPDATA/cabal" ;;
*) cabal_dir="$HOME/.cabal"; exe="" ;;
esac
# Platform-specific environment initialization
MAKE="make"
case "$(uname)" in
MSYS_*|MINGW*) mingw_init ;;
Darwin) boot_triple="x86_64-apple-darwin" ;;
FreeBSD)
boot_triple="x86_64-portbld-freebsd"
MAKE="gmake"
;;
Linux) ;;
*) fail "uname $(uname) is not supported" ;;
esac
set_toolchain_paths
case $1 in
setup) setup && cleanup_submodules ;;
configure) configure ;;
build_make) build_make ;;
test_make) fetch_perf_notes; test_make; push_perf_notes ;;
build_hadrian) build_hadrian ;;
test_hadrian) fetch_perf_notes; test_hadrian; push_perf_notes ;;
  run_hadrian) shift; run_hadrian "$@" ;;
clean) clean ;;
  shell) shift; shell "$@" ;;
*) fail "unknown mode $1" ;;
esac
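The driver above is built around a mode-dispatch `case`: the first argument selects a mode, and any remaining arguments belong to that mode. A hypothetical, self-contained sketch of the pattern (the `greet` mode is invented for illustration):

```shell
#!/usr/bin/env bash
# Minimal mode dispatcher: $1 picks the mode, the rest is forwarded to it.
dispatch() {
  local mode="$1"
  shift  # drop the mode so "$@" now holds only the mode's own arguments
  case "$mode" in
    greet) echo "hello $*" ;;
    *) echo "unknown mode: $mode" >&2; return 1 ;;
  esac
}

dispatch greet world
```

The `shift` before forwarding is the important detail: without it the mode name itself would be passed along as the first argument.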
#!/usr/bin/env bash
# vim: sw=2 et
set -euo pipefail
fail() {
echo "ERROR: $*" >&2
exit 1
}
hackage_index_state="@1522046735"
if [[ -z ${BUILD_SPHINX_HTML:-} ]]; then BUILD_SPHINX_HTML=YES; fi
if [[ -z ${BUILD_SPHINX_PDF:-} ]]; then BUILD_SPHINX_PDF=YES; fi
if [[ -z ${INTEGER_LIBRARY:-} ]]; then INTEGER_LIBRARY=integer-gmp; fi
if [[ -z ${BUILD_FLAVOUR:-} ]]; then BUILD_FLAVOUR=perf; fi
cat > mk/build.mk <<EOF
V=1
HADDOCK_DOCS=YES
LATEX_DOCS=YES
HSCOLOUR_SRCS=YES
BUILD_SPHINX_HTML=$BUILD_SPHINX_HTML
BUILD_SPHINX_PDF=$BUILD_SPHINX_PDF
BeConservative=YES
INTEGER_LIBRARY=$INTEGER_LIBRARY
XZ_CMD=pxz
EOF
cat <<EOF >> mk/build.mk
BuildFlavour=$BUILD_FLAVOUR
ifneq "\$(BuildFlavour)" ""
include mk/flavours/\$(BuildFlavour).mk
endif
GhcLibHcOpts+=-haddock
EOF
case "$(uname)" in
Linux)
if [[ -n ${TARGET:-} ]]; then
if [[ $TARGET = FreeBSD ]]; then
# cross-compiling to FreeBSD
echo 'HADDOCK_DOCS = NO' >> mk/build.mk
echo 'WERROR=' >> mk/build.mk
# https://circleci.com/docs/2.0/env-vars/#interpolating-environment-variables-to-set-other-environment-variables
      echo 'export PATH=/opt/ghc/bin:$PATH' >> "$BASH_ENV"
else
      fail "TARGET=$TARGET not supported"
fi
fi
;;
Darwin)
if [[ -n ${TARGET:-} ]]; then
fail "uname=$(uname) not supported for cross-compilation"
fi
# It looks like we already have python2 here and just installing python3
# does not work.
brew upgrade python
brew install ghc cabal-install ncurses gmp
pip3 install sphinx
# PDF documentation disabled as MacTeX apparently doesn't include xelatex.
#brew cask install mactex
cabal update
cabal install --reinstall alex happy haddock hscolour --index-state=$hackage_index_state
# put them on the $PATH, don't fail if already installed
    ln -s "$HOME/.cabal/bin/alex" /usr/local/bin/alex || true
    ln -s "$HOME/.cabal/bin/happy" /usr/local/bin/happy || true
    ln -s "$HOME/.cabal/bin/HsColour" /usr/local/bin/HsColour || true
echo "libraries/integer-gmp_CONFIGURE_OPTS += --configure-option=--with-intree-gmp" >> mk/build.mk
;;
*)
fail "uname=$(uname) not supported"
esac
echo "================================================="
echo "Build.mk:"
echo ""
cat mk/build.mk
echo "================================================="
{ system ? "aarch64-darwin"
#, nixpkgs ? fetchTarball https://github.com/angerman/nixpkgs/archive/257cb120334.tar.gz #apple-silicon.tar.gz
, pkgs ? import <nixpkgs> { inherit system; }
, compiler ? if system == "aarch64-darwin" then "ghc8103Binary" else "ghc8103"
}: pkgs.mkShell {
# this prevents nix from trying to write the env-vars file.
# we can't really, as NIX_BUILD_TOP/env-vars is not set.
noDumpEnvVars=1;
# stop polluting LDFLAGS with -liconv
dontAddExtraLibs = true;
# we need to inject ncurses into --with-curses-libraries.
# the real fix is to teach terminfo to use libcurses on macOS.
# CONFIGURE_ARGS = "--with-intree-gmp --with-curses-libraries=${pkgs.ncurses.out}/lib";
CONFIGURE_ARGS = "--with-intree-gmp";
# magic speedup pony :facepalm:
#
  # nix has the ugly habit of duplicating ld flags more than necessary. This
  # somewhat consolidates them.
shellHook = ''
echo $NIX_LDFLAGS
export NIX_LDFLAGS=$(for a in $NIX_LDFLAGS; do echo $a; done |sort|uniq|xargs)
export NIX_LDFLAGS_FOR_TARGET=$(for a in $NIX_LDFLAGS_FOR_TARGET; do echo $a; done |sort|uniq|xargs)
export NIX_LDFLAGS_FOR_TARGET=$(comm -3 <(for l in $NIX_LDFLAGS_FOR_TARGET; do echo $l; done) <(for l in $NIX_LDFLAGS; do echo $l; done))
# Impurity hack for GHC releases.
#################################
# We don't want binary releases to depend on nix, thus we'll need to make sure we don't leak in references.
# GHC externally depends only on iconv and curses. However we can't force a specific curses library for
# the terminfo package, as such we'll need to make sure we only look in the system path for the curses library
# and not pick up the tinfo from the nix provided ncurses package.
#
# We also need to force us to use the systems COREFOUNDATION, not the one that nix builds. Again this is impure,
# but it will allow us to have proper binary distributions.
#
# do not use nixpkgs provided core foundation
export NIX_COREFOUNDATION_RPATH=/System/Library/Frameworks
# drop curses from the LDFLAGS, we really want the system ones, not the nix ones.
export NIX_LDFLAGS=$(for lib in $NIX_LDFLAGS; do case "$lib" in *curses*);; *) echo -n "$lib ";; esac; done;)
export NIX_CFLAGS_COMPILE+=" -Wno-nullability-completeness -Wno-availability -Wno-expansion-to-defined -Wno-builtin-requires-header -Wno-unused-command-line-argument"
# unconditionally add the MacOSX.sdk and TargetConditional.h
export NIX_CFLAGS_COMPILE+=" -isystem /Library/Developer/CommandLineTools/SDKs/MacOSX.sdk/usr/include"
export NIX_LDFLAGS="-L/Library/Developer/CommandLineTools/SDKs/MacOSX.sdk/usr/lib $NIX_LDFLAGS"
echo $NIX_LDFLAGS
echo $MACOSX_DEPLOYMENT_TARGET
export MACOSX_DEPLOYMENT_TARGET=10.7
echo $MACOSX_DEPLOYMENT_TARGET
'';
nativeBuildInputs = (with pkgs; [
# This needs to come *before* ghc,
    # otherwise we might end up with the clang from
# the bootstrap GHC in PATH with higher priority.
clang_11
llvm_11
haskell.compiler.${compiler}
haskell.packages.${compiler}.cabal-install
haskell.packages.${compiler}.alex
haskell.packages.${compiler}.happy_1_19_12
automake
autoconf
m4
gmp
zlib.out
zlib.dev
glibcLocales
# locale doesn't build yet :-/
# locale
git
python3
# python3Full
# python3Packages.sphinx
perl
which
wget
file
xz
xlibs.lndir
cacert ])
++ (with pkgs.darwin.apple_sdk.frameworks; [ Foundation Security ]);
}
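The flag de-duplication trick in the shellHook above can be demonstrated in isolation: split the flag string on whitespace, sort, drop duplicates, and re-join with `xargs`. Note that this reorders the flags, which is only safe when link order does not matter for the libraries involved (the flag values below are made up for the demo):

```shell
#!/usr/bin/env bash
# Deduplicate a whitespace-separated flag list, as the shellHook does.
NIX_LDFLAGS="-la -lb -la -lc -lb"
NIX_LDFLAGS=$(for a in $NIX_LDFLAGS; do echo "$a"; done | sort | uniq | xargs)
echo "$NIX_LDFLAGS"
```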
#!/usr/bin/env bash
# vim: sw=2 et
set -euo pipefail
NOTES_ORIGIN="https://gitlab.haskell.org/ghc/ghc-performance-notes.git"
NOTES_ORIGIN_PUSH="git@gitlab.haskell.org:ghc/ghc-performance-notes.git"
REF="perf"
run() {
echo "$@"
  "$@"
}
fail() {
echo "ERROR: $*" >&2
exit 1
}
function pull() {
local ref="refs/notes/$REF"
run git fetch -f $NOTES_ORIGIN $ref:$ref
echo "perf notes ref $ref is $(git rev-parse $ref)"
}
function setup_ssh() {
# Add gitlab as a known host.
mkdir -p ~/.ssh
echo "|1|+AUrMGS1elvPeLNt+NHGa5+c6pU=|4XvfRsQftO1OgZD4c0JJ7oNaii8= ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDXilA5l4kOZPx0nM6xDATF+t4fS6te0eYPDwBI/jLWD9cJVtCnsrwMl5ar+/NfmcD0jnCYztUiVHuXyTaWPJYSQpwltfpTeqpo9/z/0MxkPtSl1uMP2cLbDiqA01OWveChktOXwU6hRQ+7MmO+dNRS/iXrRmYrGv/p1W811QgLBLS9fefEdF25n+0dP71L7Ov7riOawlDmd0C11FraE/R8HX6gs6lbXta1kisdxGyKojYSiCtobUaJxRoatMfUP0a9rwTAyl8tf56LgB+igjMky879VAbL7eQ/AmfHYPrSGJ/YlWP6Jj23Dnos5nOVlWL/rVTs9Y/NakLpPwMs75KTC0Pd74hdf2e3folDdAi2kLrQgO2SI6so7rOYZ+mFkCM751QdDVy4DzjmDvSgSIVf9SV7RQf7e7unE7pSZ/ILupZqz9KhR1MOwVO+ePa5qJMNSdC204PIsRWkIO5KP0QLl507NI9Ri84+aODoHD7gDIWNhU08J2P8/E6r0wcC8uWaxh+HaOjI9BkHjqRYsrgfn54BAuO9kw1cDvyi3c8n7VFlNtvQP15lANwim3gr9upV+r95KEPJCgZMYWJBDPIVtp4GdYxCfXxWj5oMXbA5pf0tNixwNJjAsY7I6RN2htHbuySH36JybOZk+gCj6mQkxpCT/tKaUn14hBJWLq7Q+Q==" >> ~/.ssh/known_hosts
echo "|1|JZkdAPJmpX6SzGeqhmQLfMWLGQA=|4vTELroOlbFxbCr0WX+PK9EcpD0= ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIJknufU+I6A5Nm58lmse4/o11Ai2UzYbYe7782J1+kRk" >> ~/.ssh/known_hosts
# Setup ssh keys.
  eval "$(ssh-agent)"
echo "ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDJPR1vrZgeGTXmgJw2PsJfMjf22LcDnVVwt3l0rwTZ+8Q2J0bHaYxMRKBco1sON6LGcZepw0Hy76RQ87v057pTz18SXvnfE7U/B6v9qBk0ILJz+4BOX9sEhxu2XmScp/wMxkG9IoyruMlsxXzd1sz09o+rzzx24U2Rp27PRm08vG0oipve6BWLbYEqYrE4/nCufqOJmGd56fju7OTU0lTpEkGDEDWGMxutaX2CbTbDju7qy07Ld8BjSc9aHfvuQaslUbj3ex3EF8EXahURzGpHQn/UFFzVGMokFumiJCAagHQb7cj6jOkKseZLaysbA/mTBQsOzjWiRmkN23bQf1wF ben+ghc-ci@smart-cactus.org" > ~/.ssh/perf_rsa.pub
touch ~/.ssh/perf_rsa
chmod 0600 ~/.ssh/perf_rsa
echo "$PERF_NOTE_KEY" >> ~/.ssh/perf_rsa
ssh-add ~/.ssh/perf_rsa
}
# Reset the git notes and append the metrics file to the notes, then push and return the result.
# This is favoured over a git notes merge as it avoids potential data loss/duplication from the merge strategy.
function reset_append_note_push {
pull || true
  run git notes --ref="$REF" append -F "$METRICS_FILE" HEAD
run git push $NOTES_ORIGIN_PUSH refs/notes/$REF
}
function push() {
# Check that private key is available (Set on all GitLab protected branches).
if [ -z ${PERF_NOTE_KEY+"$PERF_NOTE_KEY"} ]
then
echo "Not pushing performance git notes: PERF_NOTE_KEY is not set."
exit 0
fi
# TEST_ENV must be set.
if [ -z ${TEST_ENV+"$TEST_ENV"} ]
then
fail "Not pushing performance git notes: TEST_ENV must be set."
fi
# Assert that the METRICS_FILE exists and can be read.
if [ -z ${METRICS_FILE+"$METRICS_FILE"} ]
then
fail "\$METRICS_FILE not set."
fi
  if ! [ -r "$METRICS_FILE" ]
then
fail "Metrics file not found: $METRICS_FILE"
fi
setup_ssh
# Push the metrics file as a git note. This may fail if another task pushes a note first. In that case
# the latest note is fetched and appended.
MAX_RETRY=20
until reset_append_note_push || [ $MAX_RETRY -le 0 ]
do
((MAX_RETRY--))
echo ""
echo "Failed to push git notes. Fetching, appending, and retrying... $MAX_RETRY retries left."
done
}
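The `until`-loop in push() is a bounded-retry idiom: attempt an operation until it succeeds or the retry budget is exhausted. A minimal stand-alone sketch, where `attempt` is a hypothetical stand-in for reset_append_note_push that always fails, so the loop runs the budget down to zero:

```shell
#!/usr/bin/env bash
# Bounded retry: loop until `attempt` succeeds or MAX_RETRY reaches 0.
attempt() { false; }  # stand-in for an operation that may fail
MAX_RETRY=3
until attempt || [ $MAX_RETRY -le 0 ]; do
  ((MAX_RETRY--))
  echo "retrying... $MAX_RETRY retries left"
done
```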
case $1 in
push) push ;;
pull) pull ;;
*) fail "Invalid mode $1" ;;
esac
#!/bin/bash
set -e
toolchain="$(pwd)/toolchain"
PATH="$toolchain/bin:/mingw64/bin:$PATH"
if [ -d "$(pwd)/cabal-cache" ]; then
  cp -Rf cabal-cache "$APPDATA/cabal"
fi
if [ ! -e "$toolchain/bin/ghc" ]; then
case $MSYSTEM in
MINGW32)
triple="i386-unknown-mingw32"
;;
MINGW64)
triple="x86_64-unknown-mingw32"
;;
*)
echo "win32-init: Unknown MSYSTEM $MSYSTEM"
exit 1
;;
esac
  curl "https://downloads.haskell.org/~ghc/$GHC_VERSION/ghc-$GHC_VERSION-$triple.tar.xz" | tar -xJ
  mv "ghc-$GHC_VERSION" toolchain
fi
if [ ! -e "$toolchain/bin/cabal" ]; then
  url="https://downloads.haskell.org/~cabal/cabal-install-2.4.1.0/cabal-install-2.4.1.0-x86_64-unknown-mingw32.zip"
  curl "$url" > /tmp/cabal.zip
  unzip /tmp/cabal.zip
  mv cabal.exe "$toolchain/bin"
fi
if [ ! -e "$toolchain/bin/happy" ]; then
  cabal update
  cabal install happy
  cp "$APPDATA/cabal/bin/happy" "$toolchain/bin"
fi
if [ ! -e "$toolchain/bin/alex" ]; then
  cabal update
  cabal install alex
  cp "$APPDATA/cabal/bin/alex" "$toolchain/bin"
fi
# Install new process to mitigate #17480.
cabal install libraries/process
@@ -3,6 +3,17 @@
 # To be a good autoconf citizen, names of local macros have prefixed with FP_ to
 # ensure we don't clash with any pre-supplied autoconf ones.
 
+m4_include([m4/ax_compare_version.m4])
+
+# FPTOOLS_WRITE_FILE
+# ------------------
+# Write $2 to the file named $1.
+AC_DEFUN([FPTOOLS_WRITE_FILE],
+[
+cat >$1 <<ACEOF
+$2
+ACEOF
+])
 
 AC_DEFUN([GHC_SELECT_FILE_EXTENSIONS],
 [
@@ -109,7 +120,7 @@ AC_DEFUN([FPTOOLS_SET_PLATFORM_VARS],
         GHC_CONVERT_OS([$target_os], [$TargetArch], [TargetOS])
     fi
 
-    GHC_LLVM_TARGET([$target_cpu],[$target_vendor],[$target_os],[LlvmTarget])
+    GHC_LLVM_TARGET([$target],[$target_cpu],[$target_vendor],[$target_os],[LlvmTarget])
 
     GHC_SELECT_FILE_EXTENSIONS([$host], [exeext_host], [soext_host])
     GHC_SELECT_FILE_EXTENSIONS([$target], [exeext_target], [soext_target])
@@ -209,7 +220,7 @@ AC_DEFUN([FPTOOLS_SET_HASKELL_PLATFORM_VARS],
         test -z "[$]2" || eval "[$]2=\"ArchARM {armISA = \$ARM_ISA, armISAExt = \$ARM_ISA_EXT, armABI = \$ARM_ABI}\""
         ;;
     aarch64)
-        test -z "[$]2" || eval "[$]2=ArchARM64"
+        test -z "[$]2" || eval "[$]2=ArchAArch64"
         ;;
     alpha)
         test -z "[$]2" || eval "[$]2=ArchAlpha"
@@ -318,9 +329,15 @@ AC_DEFUN([FPTOOLS_SET_HASKELL_PLATFORM_VARS],
         AC_LINK_IFELSE(
           [AC_LANG_PROGRAM([], [__asm__ (".subsections_via_symbols");])],
           [AC_MSG_RESULT(yes)
-           TargetHasSubsectionsViaSymbols=YES
-           AC_DEFINE([HAVE_SUBSECTIONS_VIA_SYMBOLS],[1],
-                   [Define to 1 if Apple-style dead-stripping is supported.])
+           if test x"$TargetArch" = xaarch64; then
+               dnl subsections via symbols is busted on arm64 (aarch64-darwin)
+               dnl see #19379
+               TargetHasSubsectionsViaSymbols=NO
+           else
+               TargetHasSubsectionsViaSymbols=YES
+               AC_DEFINE([HAVE_SUBSECTIONS_VIA_SYMBOLS],[1],
+                   [Define to 1 if Apple-style dead-stripping is supported.])
+           fi
           ],
           [TargetHasSubsectionsViaSymbols=NO
            AC_MSG_RESULT(no)])
@@ -498,6 +515,7 @@ AC_DEFUN([GET_ARM_ISA],
 # FP_SETTINGS
 # ----------------------------------
 # Set the variables used in the settings file
+# See Note [tooldir: How GHC finds mingw on Windows]
 AC_DEFUN([FP_SETTINGS],
 [
     if test "$windows" = YES -a "$EnableDistroToolchain" = "NO"
@@ -507,6 +525,10 @@ AC_DEFUN([FP_SETTINGS],
         SettingsHaskellCPPCommand="${mingw_bin_prefix}gcc.exe"
         SettingsHaskellCPPFlags="$HaskellCPPArgs"
         SettingsLdCommand="${mingw_bin_prefix}ld.exe"
+        # Overrides FIND_MERGE_OBJECTS in order to avoid hard-coding linker
+        # path on Windows (#18550).
+        SettingsMergeObjectsCommand="${SettingsLdCommand}"
+        SettingsMergeObjectsFlags="-r --oformat=pe-bigobj-x86-64"
         SettingsArCommand="${mingw_bin_prefix}ar.exe"
         SettingsRanlibCommand="${mingw_bin_prefix}ranlib.exe"
         SettingsDllWrapCommand="${mingw_bin_prefix}dllwrap.exe"
@@ -520,6 +542,8 @@ AC_DEFUN([FP_SETTINGS],
         SettingsHaskellCPPCommand="$(basename $HaskellCPPCmd)"
         SettingsHaskellCPPFlags="$HaskellCPPArgs"
         SettingsLdCommand="$(basename $LdCmd)"
+        SettingsMergeObjectsCommand="$(basename $MergeObjsCmd)"
+        SettingsMergeObjectsFlags="$MergeObjsArgs"
         SettingsArCommand="$(basename $ArCmd)"
         SettingsDllWrapCommand="$(basename $DllWrapCmd)"
         SettingsWindresCommand="$(basename $WindresCmd)"
@@ -529,6 +553,8 @@ AC_DEFUN([FP_SETTINGS],
         SettingsHaskellCPPCommand="$HaskellCPPCmd"
         SettingsHaskellCPPFlags="$HaskellCPPArgs"
         SettingsLdCommand="$LdCmd"
+        SettingsMergeObjectsCommand="$MergeObjsCmd"
+        SettingsMergeObjectsFlags="$MergeObjsArgs"
         SettingsArCommand="$ArCmd"
         SettingsRanlibCommand="$RanlibCmd"
         if test -z "$DllWrapCmd"
@@ -569,11 +595,24 @@ AC_DEFUN([FP_SETTINGS],
     else
         SettingsOptCommand="$OptCmd"
     fi
+    if test -z "$OtoolCmd"
+    then
+        SettingsOtoolCommand="otool"
+    else
+        SettingsOtoolCommand="$OtoolCmd"
+    fi
+    if test -z "$InstallNameToolCmd"
+    then
+        SettingsInstallNameToolCommand="install_name_tool"
+    else
+        SettingsInstallNameToolCommand="$InstallNameToolCmd"
+    fi
     SettingsCCompilerFlags="$CONF_CC_OPTS_STAGE2"
     SettingsCxxCompilerFlags="$CONF_CXX_OPTS_STAGE2"
     SettingsCCompilerLinkFlags="$CONF_GCC_LINKER_OPTS_STAGE2"
     SettingsCCompilerSupportsNoPie="$CONF_GCC_SUPPORTS_NO_PIE"
     SettingsLdFlags="$CONF_LD_LINKER_OPTS_STAGE2"
+    SettingsUseDistroMINGW="$EnableDistroToolchain"
     AC_SUBST(SettingsCCompilerCommand)
     AC_SUBST(SettingsHaskellCPPCommand)
     AC_SUBST(SettingsHaskellCPPFlags)
@@ -583,8 +622,12 @@ AC_DEFUN([FP_SETTINGS],
     AC_SUBST(SettingsCCompilerSupportsNoPie)
     AC_SUBST(SettingsLdCommand)
     AC_SUBST(SettingsLdFlags)
+    AC_SUBST(SettingsMergeObjectsCommand)
+    AC_SUBST(SettingsMergeObjectsFlags)
     AC_SUBST(SettingsArCommand)
     AC_SUBST(SettingsRanlibCommand)
+    AC_SUBST(SettingsOtoolCommand)
+    AC_SUBST(SettingsInstallNameToolCommand)
     AC_SUBST(SettingsDllWrapCommand)
     AC_SUBST(SettingsWindresCommand)
     AC_SUBST(SettingsLibtoolCommand)
@@ -592,6 +635,7 @@ AC_DEFUN([FP_SETTINGS],
     AC_SUBST(SettingsClangCommand)
     AC_SUBST(SettingsLlcCommand)
     AC_SUBST(SettingsOptCommand)
+    AC_SUBST(SettingsUseDistroMINGW)
 ])
 
 # Helper for cloning a shell variable's state
@@ -617,7 +661,7 @@ AC_DEFUN([FP_SET_CFLAGS_C99],
   CPPFLAGS="$$3"
   unset ac_cv_prog_cc_c99
   dnl perform detection
-  _AC_PROG_CC_C99
+  AC_PROG_CC_C99
   fp_cc_c99="$ac_cv_prog_cc_c99"
   case "x$ac_cv_prog_cc_c99" in
     x)   ;; # noop
@@ -808,7 +852,7 @@ AC_SUBST(ContextDiffCmd, [$fp_cv_context_diff])
 # is supported in autoconf versions 2.50 up to the actual 2.57, so there is
 # little risk.
 AC_DEFUN([FP_COMPUTE_INT],
-[_AC_COMPUTE_INT([$1], [$2], [$3], [$4])[]dnl
+[AC_COMPUTE_INT([$2], [$1], [$3], [$4])[]dnl
 ])# FP_COMPUTE_INT
@@ -865,7 +909,7 @@ AC_DEFUN(
  [FP_DEFAULT_CHOICE_OVERRIDE_CHECK],
  [AC_ARG_ENABLE(
    [$1],
-   [AC_HELP_STRING(
+   [AS_HELP_STRING(
      [--enable-$1],
      [$5])],
    [AS_IF(
@@ -1003,6 +1047,8 @@ if test ! -f compiler/parser/Parser.hs || test ! -f compiler/cmm/CmmParse.hs
 then
     FP_COMPARE_VERSIONS([$fptools_cv_happy_version],[-lt],[1.19.10],
       [AC_MSG_ERROR([Happy version 1.19.10 or later is required to compile GHC.])])[]
+    FP_COMPARE_VERSIONS([$fptools_cv_happy_version],[-ge],[1.20.0],
+      [AC_MSG_ERROR([Happy version 1.19 is required to compile GHC.])])[]
 fi
 HappyVersion=$fptools_cv_happy_version;
 AC_SUBST(HappyVersion)
@@ -1318,19 +1364,25 @@ AC_DEFUN([FP_PROG_AR_NEEDS_RANLIB],[
 # (unsubstituted) output variable GccVersion.
 AC_DEFUN([FP_GCC_VERSION], [
 AC_REQUIRE([AC_PROG_CC])
-if test -z "$CC"
-then
-    AC_MSG_ERROR([gcc is required])
+if test -z "$CC"; then
+    AC_MSG_ERROR([C compiler is required])
 fi
-AC_CACHE_CHECK([version of gcc], [fp_cv_gcc_version],
-[
-    # Be sure only to look at the first occurrence of the "version " string;
-    # Some Apple compilers emit multiple messages containing this string.
-    fp_cv_gcc_version="`$CC -v 2>&1 | sed -n -e '1,/version /s/.*version [[^0-9]]*\([[0-9.]]*\).*/\1/p'`"
-    FP_COMPARE_VERSIONS([$fp_cv_gcc_version], [-lt], [4.6],
-                        [AC_MSG_ERROR([Need at least gcc version 4.6 (4.7+ recommended)])])
-])
-GccVersion="$fp_cv_gcc_version"
+if $CC --version | grep --quiet gcc; then
+    AC_CACHE_CHECK([version of gcc], [fp_cv_gcc_version],
+    [
+        # Be sure only to look at the first occurrence of the "version " string;
+        # Some Apple compilers emit multiple messages containing this string.
+        AC_MSG_CHECKING([version of gcc])
+        fp_cv_gcc_version="`$CC -v 2>&1 | sed -n -e '1,/version /s/.*version [[^0-9]]*\([[0-9.]]*\).*/\1/p'`"
+        AC_MSG_RESULT([$fp_cv_gcc_version])
+        FP_COMPARE_VERSIONS([$fp_cv_gcc_version], [-lt], [4.6],
+                            [AC_MSG_ERROR([Need at least gcc version 4.6 (4.7+ recommended)])])
+    ])
+    AC_SUBST([GccVersion], [$fp_cv_gcc_version])
+else
+    AC_MSG_NOTICE([\$CC is not gcc; assuming it's a reasonably new C compiler])
+fi
 ])# FP_GCC_VERSION
 dnl Check to see if the C compiler is clang or llvm-gcc
@@ -1555,13 +1607,12 @@ AC_SUBST([GhcPkgCmd])
 AC_DEFUN([FP_GCC_EXTRA_FLAGS],
 [AC_REQUIRE([FP_GCC_VERSION])
 AC_CACHE_CHECK([for extra options to pass gcc when compiling via C], [fp_cv_gcc_extra_opts],
-[fp_cv_gcc_extra_opts=
- FP_COMPARE_VERSIONS([$fp_cv_gcc_version], [-ge], [3.4],
-  [fp_cv_gcc_extra_opts="$fp_cv_gcc_extra_opts -fwrapv"],
-  [])
- FP_COMPARE_VERSIONS([$fp_cv_gcc_version], [-ge], [4.0],
-  [fp_cv_gcc_extra_opts="$fp_cv_gcc_extra_opts -fno-builtin"],
-  [])
+[
+ if test "$Unregisterised" = "YES"; then
+   # These used to be conditioned on gcc version but we no longer support
+   # GCC versions which lack support for these flags
+   fp_cv_gcc_extra_opts="-fwrapv -fno-builtin"
+ fi
 ])
 AC_SUBST([GccExtraViaCOpts],$fp_cv_gcc_extra_opts)
 ])
@@ -1796,12 +1847,12 @@ AC_DEFUN([FP_ICONV],
  dnl environment.
  AC_ARG_WITH([iconv-includes],
-   [AC_HELP_STRING([--with-iconv-includes],
+   [AS_HELP_STRING([--with-iconv-includes],
      [directory containing iconv.h])],
    [ICONV_INCLUDE_DIRS=$withval])

  AC_ARG_WITH([iconv-libraries],
-   [AC_HELP_STRING([--with-iconv-libraries],
+   [AS_HELP_STRING([--with-iconv-libraries],
      [directory containing iconv library])],
    [ICONV_LIB_DIRS=$withval])
@@ -1818,23 +1869,23 @@ AC_DEFUN([FP_GMP],
dnl--------------------------------------------------------------------
AC_ARG_WITH([gmp-includes],
- [AC_HELP_STRING([--with-gmp-includes],
+ [AS_HELP_STRING([--with-gmp-includes],
    [directory containing gmp.h])],
  [GMP_INCLUDE_DIRS=$withval])

AC_ARG_WITH([gmp-libraries],
- [AC_HELP_STRING([--with-gmp-libraries],
+ [AS_HELP_STRING([--with-gmp-libraries],
    [directory containing gmp library])],
  [GMP_LIB_DIRS=$withval])

AC_ARG_WITH([intree-gmp],
- [AC_HELP_STRING([--with-intree-gmp],
+ [AS_HELP_STRING([--with-intree-gmp],
    [force using the in-tree GMP])],
  [GMP_FORCE_INTREE=YES],
  [GMP_FORCE_INTREE=NO])

AC_ARG_WITH([gmp-framework-preferred],
- [AC_HELP_STRING([--with-gmp-framework-preferred],
+ [AS_HELP_STRING([--with-gmp-framework-preferred],
    [on OSX, prefer the GMP framework to the gmp lib])],
  [GMP_PREFER_FRAMEWORK=YES],
  [GMP_PREFER_FRAMEWORK=NO])
@@ -1854,12 +1905,21 @@ AC_DEFUN([FP_CURSES],
dnl--------------------------------------------------------------------
AC_ARG_WITH([curses-libraries],
- [AC_HELP_STRING([--with-curses-libraries],
+ [AS_HELP_STRING([--with-curses-libraries],
    [directory containing curses libraries])],
  [CURSES_LIB_DIRS=$withval])

AC_SUBST(CURSES_INCLUDE_DIRS)
AC_SUBST(CURSES_LIB_DIRS)
+
+AC_ARG_WITH([curses-libraries-stage0],
+ [AS_HELP_STRING([--with-curses-libraries-stage0],
+   [directory containing build-platform curses libraries (for stage0 build)])],
+ [CURSES_LIB_DIRS_STAGE0=$withval],
+ [CURSES_LIB_DIRS_STAGE0=$CURSES_LIB_DIRS])
+
+AC_SUBST(CURSES_LIB_DIRS_STAGE0)

])# FP_CURSES

# --------------------------------------------------------------

@@ -1905,7 +1965,7 @@ AC_MSG_CHECKING(for path to top of build tree)
# converts cpu from gnu to ghc naming, and assigns the result to $target_var
AC_DEFUN([GHC_CONVERT_CPU],[
case "$1" in
-  aarch64*)
+  aarch64*|arm64*)
    $2="aarch64"
    ;;
  alpha*)

@@ -1987,37 +2047,50 @@ case "$1" in
esac
])
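The effect of the widened pattern can be checked with a plain shell `case`: Apple's `arm64` spelling now normalizes the same way as the GNU `aarch64` name. The helper function and triples below are illustrative, not part of the build system:

```shell
# Minimal stand-in for GHC_CONVERT_CPU's aarch64 arm: both the GNU name
# (aarch64) and Apple's spelling (arm64) map to GHC's "aarch64".
convert_cpu() {
  case "$1" in
    aarch64*|arm64*) echo "aarch64" ;;
    *)               echo "unknown" ;;
  esac
}

convert_cpu aarch64-unknown-linux   # -> aarch64
convert_cpu arm64-apple-darwin      # -> aarch64
```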
-# GHC_LLVM_TARGET(target_cpu, target_vendor, target_os, llvm_target_var)
+# GHC_LLVM_TARGET(target, target_cpu, target_vendor, target_os, llvm_target_var)
# --------------------------------
# converts the canonicalized target into something llvm can understand
AC_DEFUN([GHC_LLVM_TARGET], [
-  case "$2-$3" in
+  llvm_target_cpu=$2
+  case "$1" in
    *-freebsd*-gnueabihf)
      llvm_target_vendor="unknown"
      llvm_target_os="freebsd-gnueabihf"
      ;;
-   hardfloat-*eabi)
+   *-hardfloat-*eabi)
      llvm_target_vendor="unknown"
-     llvm_target_os="$3""hf"
+     llvm_target_os="$4""hf"
      ;;
    *-mingw32|*-mingw64|*-msys)
      llvm_target_vendor="unknown"
      llvm_target_os="windows"
      ;;
-   # retain any android and gnueabi linux flavours
-   # for the LLVM Target. Otherwise these would be
-   # turned into just `-linux` and fail to be found
-   # in the `llvm-targets` file.
-   *-android*|*-gnueabi*|*-musleabi*)
-     GHC_CONVERT_VENDOR([$2],[llvm_target_vendor])
-     llvm_target_os="$3"
+   # apple is a bit particular about their naming scheme for
+   # aarch64; and clang on macOS doesn't know that
+   # aarch64 would be arm64. So for LLVM we'll need
+   # to call it arm64; while we'll refer to it internally
+   # as aarch64 for consistency and sanity.
+   aarch64-apple-*|arm64-apple-*)
+     llvm_target_cpu="arm64"
+     GHC_CONVERT_VENDOR([$3],[llvm_target_vendor])
+     GHC_CONVERT_OS([$4],[$2],[llvm_target_os])
      ;;
    *)
-     GHC_CONVERT_VENDOR([$2],[llvm_target_vendor])
-     GHC_CONVERT_OS([$3],[$1],[llvm_target_os])
+     GHC_CONVERT_VENDOR([$3],[llvm_target_vendor])
+     GHC_CONVERT_OS([$4],[$2],[llvm_target_os])
      ;;
  esac
-  $4="$1-$llvm_target_vendor-$llvm_target_os"
+  case "$4" in
+    # retain any android and gnueabi linux flavours
+    # for the LLVM Target. Otherwise these would be
+    # turned into just `-linux` and fail to be found
+    # in the `llvm-targets` file.
+    *-android*|*-gnueabi*|*-musleabi*)
+      GHC_CONVERT_VENDOR([$3],[llvm_target_vendor])
+      llvm_target_os="$4"
+      ;;
+  esac
+  $5="$llvm_target_cpu-$llvm_target_vendor-$llvm_target_os"
])

@@ -2108,39 +2181,39 @@ fi
AC_SUBST($1)
])
-# LIBRARY_VERSION(lib, [dir])
+# LIBRARY_VERSION(lib, [cabal_file])
# --------------------------------
# Gets the version number of a library.
# If $1 is ghc-prim, then we define LIBRARY_ghc_prim_VERSION as 1.2.3
# $2 points to the directory under libraries/
AC_DEFUN([LIBRARY_VERSION],[
-dir=m4_default([$2],[$1])
-LIBRARY_[]translit([$1], [-], [_])[]_VERSION=`grep -i "^version:" libraries/${dir}/$1.cabal | sed "s/.* //"`
+cabal_file=m4_default([$2],[$1/$1.cabal])
+LIBRARY_[]translit([$1], [-], [_])[]_VERSION=`grep -i "^version:" libraries/${cabal_file} | sed "s/.* //"`
AC_SUBST(LIBRARY_[]translit([$1], [-], [_])[]_VERSION)
])
# XCODE_VERSION()
# --------------------------------
-# Gets the version number of XCode, if on a Mac
+# Gets the version number of Xcode, if on a Mac
AC_DEFUN([XCODE_VERSION],[
    if test "$TargetVendor_CPP" = "apple"
    then
-       AC_MSG_CHECKING(XCode version)
-       XCodeVersion=`xcodebuild -version | grep Xcode | sed "s/Xcode //"`
-       # Old XCode versions don't actually give the XCode version
-       if test "$XCodeVersion" = ""
+       AC_MSG_CHECKING(Xcode version)
+       XcodeVersion=`(xcode-select -p >& /dev/null && xcodebuild -version) | grep Xcode | sed "s/Xcode //"`
+       # Old Xcode versions don't actually give the Xcode version
+       if test "$XcodeVersion" = ""
        then
            AC_MSG_RESULT(not found (too old?))
-           XCodeVersion1=0
-           XCodeVersion2=0
+           XcodeVersion1=0
+           XcodeVersion2=0
        else
-           AC_MSG_RESULT($XCodeVersion)
-           XCodeVersion1=`echo "$XCodeVersion" | sed 's/\..*//'`
+           AC_MSG_RESULT($XcodeVersion)
+           XcodeVersion1=`echo "$XcodeVersion" | sed 's/\..*//'`
            changequote(, )dnl
-           XCodeVersion2=`echo "$XCodeVersion" | sed 's/[^.]*\.\([^.]*\).*/\1/'`
+           XcodeVersion2=`echo "$XcodeVersion" | sed 's/[^.]*\.\([^.]*\).*/\1/'`
            changequote([, ])dnl
-           AC_MSG_NOTICE(XCode version component 1: $XCodeVersion1)
-           AC_MSG_NOTICE(XCode version component 2: $XCodeVersion2)
+           AC_MSG_NOTICE(Xcode version component 1: $XcodeVersion1)
+           AC_MSG_NOTICE(Xcode version component 2: $XcodeVersion2)
        fi
    fi
])
@@ -2153,22 +2226,29 @@ AC_DEFUN([XCODE_VERSION],[
#
# $1 = the variable to set
# $2 = the command to look for
-# $3 = the version of the command to look for
+# $3 = the lower bound version of the command to look for
+# $4 = the upper bound version of the command to look for.
#
AC_DEFUN([FIND_LLVM_PROG],[
    # Test for program with and without version name.
-   AC_CHECK_TOOLS([$1], [$2-$3 $2-$3.0 $2], [:])
-   if test "$$1" != ":"; then
-       AC_MSG_CHECKING([$$1 is version $3])
-       if test `$$1 --version | grep -c "version $3"` -gt 0 ; then
-           AC_MSG_RESULT(yes)
-       else
-           AC_MSG_RESULT(no)
-           $1=""
-       fi
-   else
-       $1=""
-   fi
+   PROG_VERSION_CANDIDATES=$(for llvmVersion in `seq $4 -1 $3`; do echo "$2-$llvmVersion $2-$llvmVersion.0"; done)
+   AC_CHECK_TOOLS([$1], [$PROG_VERSION_CANDIDATES $2], [])
+   AS_IF([test x"$$1" != x],[
+       PROG_VERSION=`$$1 --version | awk '/.*version [[0-9\.]]+/{for(i=1;i<=NF;i++){ if(\$i ~ /^[[0-9\.]]+$/){print \$i}}}'`
+       AS_IF([test x"$PROG_VERSION" == x],
+             [AC_MSG_RESULT(no)
+              $1=""
+              AC_MSG_NOTICE([We only support llvm $3 to $4 (no version found).])],
+             [AC_MSG_CHECKING([$$1 version ($PROG_VERSION) is between $3 and $4])
+              AX_COMPARE_VERSION([$PROG_VERSION], [lt], [$3],
+                                 [AC_MSG_RESULT(no)
+                                  $1=""
+                                  AC_MSG_NOTICE([We only support llvm $3 to $4 (found $PROG_VERSION).])],
+                                 [AX_COMPARE_VERSION([$PROG_VERSION], [ge], [$4],
+                                                     [AC_MSG_RESULT(no)
+                                                      $1=""
+                                                      AC_MSG_NOTICE([We only support llvm $3 to $4 (found $PROG_VERSION).])],
+                                                     [AC_MSG_RESULT(yes)])])])])
])
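Two pieces of the new macro are worth seeing in isolation: the candidate-name generation with `seq`, and the awk scan that pulls a dotted version token out of `--version` output (with the m4 `[[...]]` quadrigraphs unquoted). The tool name and clang banner below are illustrative:

```shell
# Candidate tool names from newest to oldest, as FIND_LLVM_PROG would build
# them for a hypothetical FIND_LLVM_PROG([OPT], [opt], [9], [11]).
candidates=$(for llvmVersion in $(seq 11 -1 9); do
  echo "opt-$llvmVersion opt-$llvmVersion.0"
done)
echo $candidates   # -> opt-11 opt-11.0 opt-10 opt-10.0 opt-9 opt-9.0

# Version extraction: on lines mentioning "version ", print every
# whitespace-separated token made up only of digits and dots.
banner="clang version 9.0.1 (tags/RELEASE_901/final)"
ver=$(printf '%s\n' "$banner" \
  | awk '/.*version [0-9\.]+/{for(i=1;i<=NF;i++){ if($i ~ /^[0-9\.]+$/){print $i}}}')
echo "$ver"   # -> 9.0.1
```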
# CHECK_LD_COPY_BUG()

@@ -2269,7 +2349,7 @@ AC_DEFUN([FP_CPP_CMD_WITH_ARGS],[
dnl ** what cpp to use?
dnl --------------------------------------------------------------
AC_ARG_WITH(hs-cpp,
-   [AC_HELP_STRING([--with-hs-cpp=ARG],
+   [AS_HELP_STRING([--with-hs-cpp=ARG],
      [Path to the (C) preprocessor for Haskell files [default=autodetect]])],
    [
      if test "$HostOS" = "mingw32"

@@ -2323,7 +2403,7 @@ AC_ARG_WITH(hs-cpp,
dnl ** what cpp flags to use?
dnl -----------------------------------------------------------
AC_ARG_WITH(hs-cpp-flags,
-   [AC_HELP_STRING([--with-hs-cpp-flags=ARG],
+   [AS_HELP_STRING([--with-hs-cpp-flags=ARG],
      [Flags to the (C) preprocessor for Haskell files [default=autodetect]])],
    [
      if test "$HostOS" = "mingw32"

@@ -2365,7 +2445,7 @@ $2=$HS_CPP_ARGS
AC_DEFUN([FP_BFD_SUPPORT], [
    AC_SUBST([CabalHaveLibbfd], [False])
    AC_ARG_ENABLE(bfd-debug,
-       [AC_HELP_STRING([--enable-bfd-debug],
+       [AS_HELP_STRING([--enable-bfd-debug],
          [Enable symbol resolution for -debug rts ('+RTS -Di') via binutils' libbfd [default=no]])],
        [
          # don't pollute general LIBS environment

@@ -2377,8 +2457,8 @@ AC_DEFUN([FP_BFD_SUPPORT], [
          dnl 'bfd_init' is a rare non-macro in libbfd
          AC_CHECK_LIB(bfd, bfd_init)

-         AC_TRY_LINK([#include <bfd.h>],
-                     [
+         AC_LINK_IFELSE([AC_LANG_PROGRAM([[#include <bfd.h>]],
+                     [[
                         /* mimic our rts/Printer.c */
                         bfd* abfd;
                         const char * name;

@@ -2400,7 +2480,7 @@ AC_DEFUN([FP_BFD_SUPPORT], [
                             number_of_symbols = bfd_canonicalize_symtab (abfd, symbol_table);
                             bfd_get_symbol_info(abfd,symbol_table[0],&info);
                         }
-                     ],
+                     ]])],
                     [AC_SUBST([CabalHaveLibbfd], [True])],dnl bfd seems to work
                    [AC_MSG_ERROR([can't use 'bfd' library])])
          LIBS="$save_LIBS"

@@ -2441,7 +2521,7 @@ AC_DEFUN([FP_CC_LINKER_FLAG_TRY], [
#
AC_DEFUN([FIND_LD],[
    AC_ARG_ENABLE(ld-override,
-     [AC_HELP_STRING([--disable-ld-override],
+     [AS_HELP_STRING([--disable-ld-override],
       [Prevent GHC from overriding the default linker used by gcc. If ld-override is enabled GHC will try to tell gcc to use whichever linker is selected by the LD environment variable. [default=override enabled]])],
      [],
      [enable_ld_override=yes])

@@ -2450,7 +2530,6 @@ AC_DEFUN([FIND_LD],[
        # Make sure the user didn't specify LD manually.
        if test "z$LD" != "z"; then
          AC_CHECK_TARGET_TOOL([LD], [ld])
-         LD_NO_GOLD=$LD
          return
        fi

@@ -2463,7 +2542,6 @@ AC_DEFUN([FIND_LD],[
            if test "x$TmpLd" = "x"; then continue; fi

            out=`$TmpLd --version`
-           LD_NO_GOLD=$TmpLd
            case $out in
              "GNU ld"*)
                FP_CC_LINKER_FLAG_TRY(bfd, $2) ;;

@@ -2471,8 +2549,6 @@ AC_DEFUN([FIND_LD],[
                FP_CC_LINKER_FLAG_TRY(gold, $2)
                if test "$cross_compiling" = "yes"; then
                  AC_MSG_NOTICE([Using ld.gold and assuming that it is not affected by binutils issue 22266]);
-               else
-                 LD_NO_GOLD=ld;
                fi
                ;;
              "LLD"*)

@@ -2493,19 +2569,147 @@ AC_DEFUN([FIND_LD],[
        # Fallback
        AC_CHECK_TARGET_TOOL([LD], [ld])
-
-       # This isn't entirely safe since $LD may have been discovered to be
-       # ld.gold, but what else can we do?
-       if test "x$LD_NO_GOLD" = "x"; then LD_NO_GOLD=$LD; fi
    }

    if test "x$enable_ld_override" = "xyes"; then
        find_ld
    else
        AC_CHECK_TARGET_TOOL([LD], [ld])
-       if test "x$LD_NO_GOLD" = "x"; then LD_NO_GOLD=$LD; fi
    fi

    CHECK_LD_COPY_BUG([$1])
])
# CHECK_FOR_GOLD_T22266
# ----------------------
#
# Test for binutils #22266. This bug manifested as GHC bug #14328 (see also:
# #14675, #14291).
# Uses test from
# https://sourceware.org/git/gitweb.cgi?p=binutils-gdb.git;h=033bfb739b525703bfe23f151d09e9beee3a2afe
#
# $1 = linker to test
# Sets $result to 0 if not affected, 1 otherwise
AC_DEFUN([CHECK_FOR_GOLD_T22266],[
AC_MSG_CHECKING([for ld.gold object merging bug (binutils 22266)])
if ! $1 --version | grep -q "GNU gold"; then
# Not gold
result=0
elif test "$cross_compiling" = "yes"; then
AC_MSG_RESULT([cross-compiling, assuming LD can merge objects correctly.])
result=0
else
FPTOOLS_WRITE_FILE([conftest.a.c], [
__attribute__((section(".data.a")))
static int int_from_a_1 = 0x11223344;
__attribute__((section(".data.rel.ro.a")))
int *p_int_from_a_2 = &int_from_a_1;
const char *hello (void);
const char *
hello (void)
{
return "XXXHello, world!" + 3;
}
])
FPTOOLS_WRITE_FILE([conftest.main.c], [
#include <stdlib.h>
#include <string.h>
extern int *p_int_from_a_2;
extern const char *hello (void);
int main (void) {
if (*p_int_from_a_2 != 0x11223344)
abort ();
if (strcmp(hello(), "Hello, world!") != 0)
abort ();
return 0;
}
])
FPTOOLS_WRITE_FILE([conftest.t], [
SECTIONS
{
.text : {
*(.text*)
}
.rodata :
{
*(.rodata .rodata.* .gnu.linkonce.r.*)
}
.data.rel.ro : {
*(.data.rel.ro*)
}
.data : {
*(.data*)
}
.bss : {
*(.bss*)
}
}
])
$CC -c -o conftest.a.o conftest.a.c || AC_MSG_ERROR([Failed to compile test])
$MergeObjsCmd $MergeObjsArgs -T conftest.t conftest.a.o -o conftest.ar.o || AC_MSG_ERROR([Failed to merge test object])
$CC -c -o conftest.main.o conftest.main.c || AC_MSG_ERROR([Failed to compile test driver])
$CC conftest.ar.o conftest.main.o -o conftest || AC_MSG_ERROR([Failed to link test driver])
if ./conftest; then
AC_MSG_RESULT([not affected])
result=0
else
AC_MSG_RESULT([affected])
result=1
fi
        rm -f conftest.a.o conftest.a.c conftest.ar.o conftest.main.c conftest.main.o conftest
fi
])
# FIND_MERGE_OBJECTS
# ------------------
# Find which linker to use to merge object files.
#
# See Note [Merging object files for GHCi] in GHC.Driver.Pipeline.
AC_DEFUN([FIND_MERGE_OBJECTS],[
AC_REQUIRE([FIND_LD])
if test -z "$MergeObjsCmd"; then
MergeObjsCmd="$LD"
fi
if test -z "$MergeObjsArgs"; then
MergeObjsArgs="-r"
fi
CHECK_FOR_GOLD_T22266($MergeObjsCmd)
if test "$result" = "1"; then
AC_MSG_NOTICE([$MergeObjsCmd is broken due to binutils 22266, looking for another linker...])
MergeObjsCmd=""
AC_CHECK_TARGET_TOOL([MergeObjsCmd], [ld])
CHECK_FOR_GOLD_T22266($MergeObjsCmd)
if test "$result" = "1"; then
AC_MSG_ERROR([Linker is affected by binutils 22266 but couldn't find another unaffected linker. Please set the MergeObjsCmd variable to a functional linker.])
fi
fi
AC_SUBST([MergeObjsCmd])
AC_SUBST([MergeObjsArgs])
])
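FIND_MERGE_OBJECTS defaults to `$LD -r` and only hunts for an alternative when the gold bug fires. The fallback control flow reduces to a small shell skeleton; the `check_linker` stub below stands in for CHECK_FOR_GOLD_T22266 and flags linkers named "gold" purely for illustration:

```shell
# Skeleton of FIND_MERGE_OBJECTS' fallback logic. check_linker sets
# result=1 for an affected linker, 0 otherwise (here: a name-based stub).
check_linker() {
  case "$1" in
    *gold*) result=1 ;;
    *)      result=0 ;;
  esac
}

MergeObjsCmd="${MergeObjsCmd:-ld.gold}"   # default: whatever $LD was
MergeObjsArgs="${MergeObjsArgs:--r}"

check_linker "$MergeObjsCmd"
if [ "$result" = "1" ]; then
  # Affected: fall back to plain ld and re-test.
  MergeObjsCmd="ld"
  check_linker "$MergeObjsCmd"
  if [ "$result" = "1" ]; then
    echo "error: no unaffected linker found" >&2
    exit 1
  fi
fi
echo "$MergeObjsCmd $MergeObjsArgs"   # -> ld -r
```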
# FIND_PYTHON
# -----------
# Find the version of `python` to use (for the testsuite driver)
#
AC_DEFUN([FIND_PYTHON],[
dnl Prefer the mingw64 distribution on Windows due to #17483.
AC_PATH_PROG([PYTHON], [python3], [], [/mingw64/bin $PATH])
PythonCmd="$PYTHON"
AC_SUBST([PythonCmd])
])
# LocalWords: fi
@@ -8,6 +8,7 @@ import argparse
from textwrap import dedent
import subprocess
import re
+import shutil

cwd = os.getcwd()

@@ -172,6 +173,8 @@ def autoreconf():
    for dir_ in ['.'] + glob.glob('libraries/*/'):
        if os.path.isfile(os.path.join(dir_, 'configure.ac')):
            print("Booting %s" % dir_)
+           if dir_ != '.':
+               shutil.copyfile('config.sub', os.path.join(dir_, 'config.sub'))
            processes[dir_] = subprocess.Popen(['sh', '-c', reconf_cmd], cwd=dir_)

# Wait for all child processes to finish.
@@ -81,7 +81,7 @@ module GHC.Hs.Decls (
  RoleAnnotDecl(..), LRoleAnnotDecl, roleAnnotDeclName,

  -- ** Injective type families
  FamilyResultSig(..), LFamilyResultSig, InjectivityAnn(..), LInjectivityAnn,
- resultVariableName,
+ resultVariableName, familyDeclLName, familyDeclName,

  -- * Grouping
  HsGroup(..), emptyRdrGroup, emptyRnGroup, appendGroups, hsGroupInstDecls

@@ -661,11 +661,14 @@ tyFamInstDeclLName (TyFamInstDecl (HsIB _ (XFamEqn nec)))
tyFamInstDeclLName (TyFamInstDecl (XHsImplicitBndrs nec))
  = noExtCon nec

-tyClDeclLName :: TyClDecl pass -> Located (IdP pass)
-tyClDeclLName (FamDecl { tcdFam = FamilyDecl { fdLName = ln } }) = ln
-tyClDeclLName decl = tcdLName decl
+tyClDeclLName :: TyClDecl (GhcPass p) -> Located (IdP (GhcPass p))
+tyClDeclLName (FamDecl { tcdFam = fd })     = familyDeclLName fd
+tyClDeclLName (SynDecl { tcdLName = ln })   = ln
+tyClDeclLName (DataDecl { tcdLName = ln })  = ln
+tyClDeclLName (ClassDecl { tcdLName = ln }) = ln
+tyClDeclLName (XTyClDecl nec)               = noExtCon nec

-tcdName :: TyClDecl pass -> IdP pass
+tcdName :: TyClDecl (GhcPass p) -> IdP (GhcPass p)
tcdName = unLoc . tyClDeclLName

tyClDeclTyVars :: TyClDecl pass -> LHsQTyVars pass
@@ -1091,6 +1094,16 @@ data FamilyInfo pass
  -- said "type family Foo x where .."
  | ClosedTypeFamily (Maybe [LTyFamInstEqn pass])

+------------- Functions over FamilyDecls -----------
+
+familyDeclLName :: FamilyDecl (GhcPass p) -> Located (IdP (GhcPass p))
+familyDeclLName (FamilyDecl { fdLName = n }) = n
+familyDeclLName (XFamilyDecl nec)            = noExtCon nec
+
+familyDeclName :: FamilyDecl (GhcPass p) -> IdP (GhcPass p)
+familyDeclName = unLoc . familyDeclLName
+
famResultKindSignature :: FamilyResultSig (GhcPass p) -> Maybe (LHsKind (GhcPass p))
famResultKindSignature (NoSig _) = Nothing
famResultKindSignature (KindSig _ ki) = Just ki

@@ -1106,6 +1119,8 @@ resultVariableName :: FamilyResultSig (GhcPass a) -> Maybe (IdP (GhcPass a))
resultVariableName (TyVarSig _ sig) = Just $ hsLTyVarName sig
resultVariableName _                = Nothing

+------------- Pretty printing FamilyDecls -----------
+
instance OutputableBndrId p
       => Outputable (FamilyDecl (GhcPass p)) where
  ppr = pprFamilyDecl TopLevel
@@ -37,6 +37,7 @@ import CoreSyn
import DynFlags ( gopt, GeneralFlag(Opt_PrintExplicitCoercions) )
import Name
import NameSet
+import RdrName ( GlobalRdrEnv )
import BasicTypes
import ConLike
import SrcLoc

@@ -188,6 +189,104 @@ is Less Cool because
  typecheck do-notation with (>>=) :: m1 a -> (a -> m2 b) -> m2 b.)
-}
-- | An unbound variable; used for treating
-- out-of-scope variables as expression holes
--
-- Either "x", "y" Plain OutOfScope
-- or "_", "_x" A TrueExprHole
--
-- Both forms indicate an out-of-scope variable, but the latter
-- indicates that the user /expects/ it to be out of scope, and
-- just wants GHC to report its type
data UnboundVar
= OutOfScope OccName GlobalRdrEnv -- ^ An (unqualified) out-of-scope
-- variable, together with the GlobalRdrEnv
-- with respect to which it is unbound
-- See Note [OutOfScope and GlobalRdrEnv]
| TrueExprHole OccName -- ^ A "true" expression hole (_ or _x)
deriving Data
instance Outputable UnboundVar where
ppr (OutOfScope occ _) = text "OutOfScope" <> parens (ppr occ)
ppr (TrueExprHole occ) = text "ExprHole" <> parens (ppr occ)
unboundVarOcc :: UnboundVar -> OccName
unboundVarOcc (OutOfScope occ _) = occ
unboundVarOcc (TrueExprHole occ) = occ
{-
Note [OutOfScope and GlobalRdrEnv]
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
To understand why we bundle a GlobalRdrEnv with an out-of-scope variable,
consider the following module:
module A where
foo :: ()
foo = bar
bat :: [Double]
bat = [1.2, 3.4]
$(return [])
bar = ()
bad = False
When A is compiled, the renamer determines that `bar` is not in scope in the
declaration of `foo` (since `bar` is declared in the following inter-splice
group). Once it has finished typechecking the entire module, the typechecker
then generates the associated error message, which specifies both the type of
`bar` and a list of possible in-scope alternatives:
A.hs:6:7: error:
• Variable not in scope: bar :: ()
• ‘bar’ (line 13) is not in scope before the splice on line 11
Perhaps you meant ‘bat’ (line 9)
When it calls RnEnv.unknownNameSuggestions to identify these alternatives, the
typechecker must provide a GlobalRdrEnv. If it provided the current one, which
contains top-level declarations for the entire module, the error message would
incorrectly suggest the out-of-scope `bar` and `bad` as possible alternatives
for `bar` (see #11680). Instead, the typechecker must use the same
GlobalRdrEnv the renamer used when it determined that `bar` is out-of-scope.
To obtain this GlobalRdrEnv, can the typechecker simply use the out-of-scope
`bar`'s location to either reconstruct it (from the current GlobalRdrEnv) or to
look it up in some global store? Unfortunately, no. The problem is that
location information is not always sufficient for this task. This is most
apparent when dealing with the TH function addTopDecls, which adds its
declarations to the FOLLOWING inter-splice group. Consider these declarations:
ex9 = cat -- cat is NOT in scope here
$(do -------------------------------------------------------------
ds <- [d| f = cab -- cat and cap are both in scope here
cat = ()
|]
addTopDecls ds
[d| g = cab -- only cap is in scope here
cap = True
|])
ex10 = cat -- cat is NOT in scope here
$(return []) -----------------------------------------------------
ex11 = cat -- cat is in scope
Here, both occurrences of `cab` are out-of-scope, and so the typechecker needs
the GlobalRdrEnvs which were used when they were renamed. These GlobalRdrEnvs
are different (`cat` is present only in the GlobalRdrEnv for f's `cab'), but the
locations of the two `cab`s are the same (they are both created in the same
splice). Thus, we must include some additional information with each `cab` to
allow the typechecker to obtain the correct GlobalRdrEnv. Clearly, the simplest
information to use is the GlobalRdrEnv itself.
-}
-- | A Haskell expression.
data HsExpr p
  = HsVar     (XVar p)
@@ -196,7 +295,7 @@ data HsExpr p
  -- See Note [Located RdrNames]

  | HsUnboundVar (XUnboundVar p)
-                OccName    -- ^ Unbound variable; also used for "holes"
+                UnboundVar -- ^ Unbound variable; also used for "holes"
                           --   (_ or _x).
                           -- Turned from HsVar to HsUnboundVar by the
                           -- renamer, when it finds an out-of-scope

@@ -849,7 +948,7 @@ ppr_lexpr e = ppr_expr (unLoc e)
ppr_expr :: forall p. (OutputableBndrId p)
         => HsExpr (GhcPass p) -> SDoc
ppr_expr (HsVar _ (L _ v))  = pprPrefixOcc v
-ppr_expr (HsUnboundVar _ uv)= pprPrefixOcc uv
+ppr_expr (HsUnboundVar _ uv)= pprPrefixOcc (unboundVarOcc uv)
ppr_expr (HsConLikeOut _ c) = pprPrefixOcc c
ppr_expr (HsIPVar _ v)      = ppr v
ppr_expr (HsOverLabel _ _ l)= char '#' <> ppr l

@@ -1042,7 +1141,7 @@ ppr_infix_expr :: (OutputableBndrId p) => HsExpr (GhcPass p) -> Maybe SDoc
ppr_infix_expr (HsVar _ (L _ v))    = Just (pprInfixOcc v)
ppr_infix_expr (HsConLikeOut _ c)   = Just (pprInfixOcc (conLikeName c))
ppr_infix_expr (HsRecFld _ f)       = Just (pprInfixOcc f)
-ppr_infix_expr (HsUnboundVar _ occ) = Just (pprInfixOcc occ)
+ppr_infix_expr (HsUnboundVar _ h@TrueExprHole{}) = Just (pprInfixOcc (unboundVarOcc h))
ppr_infix_expr (HsWrap _ _ e)       = ppr_infix_expr e
ppr_infix_expr _                    = Nothing
@@ -2181,22 +2280,30 @@ pprStmt (ApplicativeStmt _ args mb_join)
     else text "join" <+> parens ap_expr

   pp_arg :: (a, ApplicativeArg (GhcPass idL)) -> SDoc
-  pp_arg (_, ApplicativeArgOne _ pat expr isBody _)
-    | isBody =  -- See Note [Applicative BodyStmt]
-      ppr (BodyStmt (panic "pprStmt") expr noSyntaxExpr noSyntaxExpr
-            :: ExprStmt (GhcPass idL))
-    | otherwise =
-      ppr (BindStmt (panic "pprStmt") pat expr noSyntaxExpr noSyntaxExpr
-            :: ExprStmt (GhcPass idL))
-  pp_arg (_, ApplicativeArgMany _ stmts return pat) =
-    ppr pat <+>
-    text "<-" <+>
-    ppr (HsDo (panic "pprStmt") DoExpr (noLoc
-                (stmts ++
-                   [noLoc (LastStmt noExtField (noLoc return) False noSyntaxExpr)])))
-  pp_arg (_, XApplicativeArg x) = ppr x
-pprStmt (XStmtLR x) = ppr x
+  pp_arg (_, applicativeArg) = ppr applicativeArg
+
+pprStmt (XStmtLR x) = ppr x
+
+instance (OutputableBndrId idL)
+      => Outputable (ApplicativeArg (GhcPass idL)) where
+  ppr = pprArg
+
+pprArg :: forall idL . (OutputableBndrId idL) => ApplicativeArg (GhcPass idL) -> SDoc
+pprArg (ApplicativeArgOne _ pat expr isBody _)
+  | isBody =  -- See Note [Applicative BodyStmt]
+    ppr (BodyStmt (panic "pprStmt") expr noSyntaxExpr noSyntaxExpr
+          :: ExprStmt (GhcPass idL))
+  | otherwise =
+    ppr (BindStmt (panic "pprStmt") pat expr noSyntaxExpr noSyntaxExpr
+          :: ExprStmt (GhcPass idL))
+pprArg (ApplicativeArgMany _ stmts return pat) =
+  ppr pat <+>
+  text "<-" <+>
+  ppr (HsDo (panic "pprStmt") DoExpr (noLoc
+              (stmts ++
+                 [noLoc (LastStmt noExtField (noLoc return) False noSyntaxExpr)])))
+pprArg (XApplicativeArg x) = ppr x

pprTransformStmt :: (OutputableBndrId p)
                 => [IdP (GhcPass p)] -> LHsExpr (GhcPass p)
@@ -56,8 +56,7 @@ module GHC.Hs.Types (
   hsLTyVarName, hsLTyVarNames, hsLTyVarLocName, hsExplicitLTyVarNames,
   splitLHsInstDeclTy, getLHsInstDeclHead, getLHsInstDeclClass_maybe,
   splitLHsPatSynTy,
-  splitLHsForAllTy, splitLHsForAllTyInvis,
-  splitLHsQualTy, splitLHsSigmaTy, splitLHsSigmaTyInvis,
+  splitLHsForAllTyInvis, splitLHsQualTy, splitLHsSigmaTyInvis,
   splitHsFunType, hsTyGetAppHead_maybe,
   mkHsOpTy, mkHsAppTy, mkHsAppTys, mkHsAppKindTy,
   ignoreParens, hsSigType, hsSigWcType,
@@ -1248,21 +1247,9 @@ splitLHsPatSynTy ty = (univs, reqs, exis, provs, ty4)
     (provs, ty4) = splitLHsQualTy ty3

 -- | Decompose a sigma type (of the form @forall <tvs>. context => body@)
--- into its constituent parts.
---
--- Note that this function looks through parentheses, so it will work on types
--- such as @(forall a. <...>)@. The downside to this is that it is not
--- generally possible to take the returned types and reconstruct the original
--- type (parentheses and all) from them.
-splitLHsSigmaTy :: LHsType pass
-                -> ([LHsTyVarBndr pass], LHsContext pass, LHsType pass)
-splitLHsSigmaTy ty
-  | (tvs, ty1)  <- splitLHsForAllTy ty
-  , (ctxt, ty2) <- splitLHsQualTy ty1
-  = (tvs, ctxt, ty2)
-
--- | Like 'splitLHsSigmaTy', but only splits type variable binders that were
--- quantified invisibly (e.g., @forall a.@, with a dot).
+-- into its constituent parts. Note that only /invisible/ @forall@s
+-- (i.e., @forall a.@, with a dot) are split apart; /visible/ @forall@s
+-- (i.e., @forall a ->@, with an arrow) are left untouched.
 --
 -- This function is used to split apart certain types, such as instance
 -- declaration types, which disallow visible @forall@s. For instance, if GHC
@@ -1280,20 +1267,10 @@ splitLHsSigmaTyInvis ty
   , (ctxt, ty2) <- splitLHsQualTy ty1
   = (tvs, ctxt, ty2)

--- | Decompose a type of the form @forall <tvs>. body@) into its constituent
--- parts.
---
--- Note that this function looks through parentheses, so it will work on types
--- such as @(forall a. <...>)@. The downside to this is that it is not
--- generally possible to take the returned types and reconstruct the original
--- type (parentheses and all) from them.
-splitLHsForAllTy :: LHsType pass -> ([LHsTyVarBndr pass], LHsType pass)
-splitLHsForAllTy (L _ (HsParTy _ ty)) = splitLHsForAllTy ty
-splitLHsForAllTy (L _ (HsForAllTy { hst_bndrs = tvs, hst_body = body })) = (tvs, body)
-splitLHsForAllTy body = ([], body)
-
--- | Like 'splitLHsForAllTy', but only splits type variable binders that
--- were quantified invisibly (e.g., @forall a.@, with a dot).
+-- | Decompose a type of the form @forall <tvs>. body@ into its constituent
+-- parts. Note that only /invisible/ @forall@s
+-- (i.e., @forall a.@, with a dot) are split apart; /visible/ @forall@s
+-- (i.e., @forall a ->@, with an arrow) are left untouched.
 --
 -- This function is used to split apart certain types, such as instance
 -- declaration types, which disallow visible @forall@s. For instance, if GHC
......
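The invisible-versus-visible distinction drawn in the new comments can be illustrated with a toy AST. Everything below (`Vis`, `Ty`, `splitForAllInvis`) is an illustrative stand-in, not GHC's actual `LHsType` machinery:

```haskell
-- A minimal sketch of "split only invisible foralls", using hypothetical
-- types rather than GHC's source AST.
data Vis = Invisible   -- forall a.   (with a dot)
         | Visible     -- forall a -> (with an arrow)
  deriving (Eq, Show)

data Ty = ForAll Vis String Ty   -- a quantified type
        | Var String
  deriving (Eq, Show)

-- Peel invisible binders off the front; stop at the first visible forall.
splitForAllInvis :: Ty -> ([String], Ty)
splitForAllInvis (ForAll Invisible v body) =
  let (vs, rest) = splitForAllInvis body in (v : vs, rest)
splitForAllInvis ty = ([], ty)
```

For instance, `splitForAllInvis (ForAll Invisible "a" (ForAll Visible "n" (Var "a")))` yields `(["a"], ForAll Visible "n" (Var "a"))`: the visible binder stays in place, matching the documented behaviour of `splitLHsForAllTyInvis`.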
@@ -48,7 +48,7 @@ module GHC.Hs.Utils(
   mkChunkified, chunkify,

   -- * Bindings
-  mkFunBind, mkVarBind, mkHsVarBind, mk_easy_FunBind, mkTopFunBind,
+  mkFunBind, mkVarBind, mkHsVarBind, mkSimpleGeneratedFunBind, mkTopFunBind,
   mkPatSynBind,
   isInfixFunBind,
@@ -800,14 +800,15 @@ l
 ************************************************************************
 -}

-mkFunBind :: Located RdrName -> [LMatch GhcPs (LHsExpr GhcPs)]
+mkFunBind :: Origin -> Located RdrName -> [LMatch GhcPs (LHsExpr GhcPs)]
           -> HsBind GhcPs
 -- ^ Not infix, with place holders for coercion and free vars
-mkFunBind fn ms = FunBind { fun_id = fn
-                          , fun_matches = mkMatchGroup Generated ms
-                          , fun_co_fn = idHsWrapper
-                          , fun_ext = noExtField
-                          , fun_tick = [] }
+mkFunBind origin fn ms
+  = FunBind { fun_id = fn
+            , fun_matches = mkMatchGroup origin ms
+            , fun_co_fn = idHsWrapper
+            , fun_ext = noExtField
+            , fun_tick = [] }

 mkTopFunBind :: Origin -> Located Name -> [LMatch GhcRn (LHsExpr GhcRn)]
              -> HsBind GhcRn
@@ -820,7 +821,7 @@ mkTopFunBind origin fn ms = FunBind { fun_id = fn
                                     , fun_tick = [] }

 mkHsVarBind :: SrcSpan -> RdrName -> LHsExpr GhcPs -> LHsBind GhcPs
-mkHsVarBind loc var rhs = mk_easy_FunBind loc var [] rhs
+mkHsVarBind loc var rhs = mkSimpleGeneratedFunBind loc var [] rhs

 mkVarBind :: IdP (GhcPass p) -> LHsExpr (GhcPass p) -> LHsBind (GhcPass p)
 mkVarBind var rhs = cL (getLoc rhs) $
@@ -846,10 +847,12 @@ isInfixFunBind _ = False

 ------------
-mk_easy_FunBind :: SrcSpan -> RdrName -> [LPat GhcPs]
-                -> LHsExpr GhcPs -> LHsBind GhcPs
-mk_easy_FunBind loc fun pats expr
-  = cL loc $ mkFunBind (cL loc fun)
+-- | Convenience function using 'mkFunBind'.
+-- This is for generated bindings only, do not use for user-written code.
+mkSimpleGeneratedFunBind :: SrcSpan -> RdrName -> [LPat GhcPs]
+                         -> LHsExpr GhcPs -> LHsBind GhcPs
+mkSimpleGeneratedFunBind loc fun pats expr
+  = cL loc $ mkFunBind Generated (cL loc fun)
         [mkMatch (mkPrefixFunRhs (cL loc fun)) pats expr
                  (noLoc emptyLocalBinds)]
......
@@ -1236,10 +1236,11 @@ checkAllNonVoid :: RecTcChecker -> Delta -> [Type] -> DsM Bool
 checkAllNonVoid rec_ts amb_cs strict_arg_tys = do
   let definitely_inhabited = definitelyInhabitedType (delta_ty_st amb_cs)
   tys_to_check <- filterOutM definitely_inhabited strict_arg_tys
+  -- See Note [Fuel for the inhabitation test]
   let rec_max_bound | tys_to_check `lengthExceeds` 1
                     = 1
                     | otherwise
-                    = defaultRecTcMaxBound
+                    = 3
       rec_ts' = setRecTcMaxBound rec_max_bound rec_ts
   allM (nonVoid rec_ts' amb_cs) tys_to_check
@@ -1259,6 +1260,7 @@ nonVoid rec_ts amb_cs strict_arg_ty = do
   mb_cands <- inhabitationCandidates amb_cs strict_arg_ty
   case mb_cands of
     Right (tc, _, cands)
+      -- See Note [Fuel for the inhabitation test]
       | Just rec_ts' <- checkRecTc rec_ts tc
       -> anyM (cand_is_inhabitable rec_ts' amb_cs) cands
          -- A strict argument type is inhabitable by a terminating value if
@@ -1307,7 +1309,7 @@ definitelyInhabitedType ty_st ty = do
            null (dataConImplBangs con) -- (2)

 {- Note [Strict argument type constraints]
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 In the ConVar case of clause processing, each conlike K traditionally
 generates two different forms of constraints:
@@ -1337,6 +1339,7 @@ say, `K2 undefined` or `K2 (let x = x in x)`.)
 Since neither the term nor type constraints mentioned above take strict
 argument types into account, we make use of the `nonVoid` function to
 determine whether a strict type is inhabitable by a terminating value or not.
+We call this the "inhabitation test".

 `nonVoid ty` returns True when either:
 1. `ty` has at least one InhabitationCandidate for which both its term and type
@@ -1362,15 +1365,20 @@ determine whether a strict type is inhabitable by a terminating value or not.
    `nonVoid MyVoid` returns False. The InhabitationCandidate for the MkMyVoid
    constructor contains Void as a strict argument type, and since `nonVoid Void`
    returns False, that InhabitationCandidate is discarded, leaving no others.
+* Whether or not a type is inhabited is undecidable in general.
+  See Note [Fuel for the inhabitation test].
+* For some types, inhabitation is evident immediately and we don't need to
+  perform expensive tests. See Note [Types that are definitely inhabitable].

-* Performance considerations
+Note [Fuel for the inhabitation test]
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+Whether or not a type is inhabited is undecidable in general. As a result, we
+can run into infinite loops in `nonVoid`. Therefore, we adopt a fuel-based
+approach to prevent that.

-We must be careful when recursively calling `nonVoid` on the strict argument
-types of an InhabitationCandidate, because doing so naïvely can cause GHC to
-fall into an infinite loop. Consider the following example:
+Consider the following example:

   data Abyss = MkAbyss !Abyss

   stareIntoTheAbyss :: Abyss -> a
   stareIntoTheAbyss x = case x of {}
@@ -1391,7 +1399,6 @@ stareIntoTheAbyss above. Then again, the same problem occurs with recursive
 newtypes, like in the following code:

   newtype Chasm = MkChasm Chasm
-
   gazeIntoTheChasm :: Chasm -> a
   gazeIntoTheChasm x = case x of {} -- Erroneously warned as non-exhaustive
@@ -1415,9 +1422,26 @@ maximum recursion depth to 1 to mitigate the problem. If the branching factor
 is exactly 1 (i.e., we have a linear chain instead of a tree), then it's okay
 to stick with a larger maximum recursion depth.

+In #17977 we saw that the defaultRecTcMaxBound (100 at the time of writing) was
+too large and had a detrimental effect on performance of the coverage checker.
+Given that we only commit to a best effort anyway, we decided to substantially
+decrease the recursion depth to 3, at the cost of precision in some edge cases
+like
+
+  data Nat = Z | S Nat
+  data Down :: Nat -> Type where
+    Down :: !(Down n) -> Down (S n)
+
+  f :: Down (S (S (S (S (S Z))))) -> ()
+  f x = case x of {}
+
+Since the coverage checker won't bother to instantiate Down 4 levels deep to
+see that it is in fact uninhabited, it will emit an inexhaustivity warning for
+the case.
+
+Note [Types that are definitely inhabitable]
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 Another microoptimization applies to data types like this one:

-  data S a = ![a] !T
+  data S a = S ![a] !T

 Even though there is a strict field of type [a], it's quite silly to call
 nonVoid on it, since it's "obvious" that it is inhabitable. To make this
......
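The fuel mechanism described in Note [Fuel for the inhabitation test] can be sketched in isolation. `RecCheck` and `checkRec` below are simplified, hypothetical stand-ins for GHC's `RecTcChecker` and `checkRecTc`, assuming a per-TyCon expansion budget:

```haskell
-- Minimal sketch of fuel-limited recursion: each type constructor may be
-- expanded at most `fuel` times before we give up and assume inhabited.
import qualified Data.Map.Strict as M

type TyConName = String

data RecCheck = RecCheck { fuel :: Int, seen :: M.Map TyConName Int }

initRecCheck :: Int -> RecCheck
initRecCheck f = RecCheck f M.empty

-- Just an updated checker if we may expand tc once more;
-- Nothing once the fuel for tc is spent.
checkRec :: RecCheck -> TyConName -> Maybe RecCheck
checkRec rc tc
  | n < fuel rc = Just rc { seen = M.insert tc (n + 1) (seen rc) }
  | otherwise   = Nothing
  where n = M.findWithDefault 0 tc (seen rc)
```

With a budget of 3, `checkRec` succeeds three times in a row on the same constructor and returns `Nothing` on the fourth attempt, which mirrors why the checker above gives up before unrolling `Down` five levels deep and emits the imprecise warning.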
 {-# LANGUAGE CPP #-}

-module GHC.Platform.ARM64 where
+module GHC.Platform.AArch64 where

 import GhcPrelude

 #define MACHREGS_NO_REGS 0
 #define MACHREGS_aarch64 1
 #include "../../../includes/CodeGen.Platform.hs"
 module GHC.Platform.Regs
        (callerSaves, activeStgRegs, haveRegBase, globalRegMaybe, freeReg)
 where
@@ -10,7 +9,7 @@ import GHC.Platform
 import Reg

 import qualified GHC.Platform.ARM as ARM
-import qualified GHC.Platform.ARM64 as ARM64
+import qualified GHC.Platform.AArch64 as AArch64
 import qualified GHC.Platform.PPC as PPC
 import qualified GHC.Platform.S390X as S390X
 import qualified GHC.Platform.SPARC as SPARC
@@ -26,12 +25,12 @@ callerSaves platform
  | platformUnregisterised platform = NoRegs.callerSaves
  | otherwise
  = case platformArch platform of
      ArchX86     -> X86.callerSaves
      ArchX86_64  -> X86_64.callerSaves
      ArchS390X   -> S390X.callerSaves
      ArchSPARC   -> SPARC.callerSaves
      ArchARM {}  -> ARM.callerSaves
-     ArchARM64   -> ARM64.callerSaves
+     ArchAArch64 -> AArch64.callerSaves
      arch
       | arch `elem` [ArchPPC, ArchPPC_64 ELF_V1, ArchPPC_64 ELF_V2] ->
         PPC.callerSaves
@@ -48,12 +47,12 @@ activeStgRegs platform
  | platformUnregisterised platform = NoRegs.activeStgRegs
  | otherwise
  = case platformArch platform of
      ArchX86     -> X86.activeStgRegs
      ArchX86_64  -> X86_64.activeStgRegs
      ArchS390X   -> S390X.activeStgRegs
      ArchSPARC   -> SPARC.activeStgRegs
      ArchARM {}  -> ARM.activeStgRegs
-     ArchARM64   -> ARM64.activeStgRegs
+     ArchAArch64 -> AArch64.activeStgRegs
      arch
       | arch `elem` [ArchPPC, ArchPPC_64 ELF_V1, ArchPPC_64 ELF_V2] ->
         PPC.activeStgRegs
@@ -65,12 +64,12 @@ haveRegBase platform
  | platformUnregisterised platform = NoRegs.haveRegBase
  | otherwise
  = case platformArch platform of
      ArchX86     -> X86.haveRegBase
      ArchX86_64  -> X86_64.haveRegBase
      ArchS390X   -> S390X.haveRegBase
      ArchSPARC   -> SPARC.haveRegBase
      ArchARM {}  -> ARM.haveRegBase
-     ArchARM64   -> ARM64.haveRegBase
+     ArchAArch64 -> AArch64.haveRegBase
      arch
       | arch `elem` [ArchPPC, ArchPPC_64 ELF_V1, ArchPPC_64 ELF_V2] ->
         PPC.haveRegBase
@@ -82,12 +81,12 @@ globalRegMaybe platform
  | platformUnregisterised platform = NoRegs.globalRegMaybe
  | otherwise
  = case platformArch platform of
      ArchX86     -> X86.globalRegMaybe
      ArchX86_64  -> X86_64.globalRegMaybe
      ArchS390X   -> S390X.globalRegMaybe
      ArchSPARC   -> SPARC.globalRegMaybe
      ArchARM {}  -> ARM.globalRegMaybe
-     ArchARM64   -> ARM64.globalRegMaybe
+     ArchAArch64 -> AArch64.globalRegMaybe
      arch
       | arch `elem` [ArchPPC, ArchPPC_64 ELF_V1, ArchPPC_64 ELF_V2] ->
         PPC.globalRegMaybe
@@ -99,15 +98,14 @@ freeReg platform
  | platformUnregisterised platform = NoRegs.freeReg
  | otherwise
  = case platformArch platform of
      ArchX86     -> X86.freeReg
      ArchX86_64  -> X86_64.freeReg
      ArchS390X   -> S390X.freeReg
      ArchSPARC   -> SPARC.freeReg
      ArchARM {}  -> ARM.freeReg
-     ArchARM64   -> ARM64.freeReg
+     ArchAArch64 -> AArch64.freeReg
      arch
       | arch `elem` [ArchPPC, ArchPPC_64 ELF_V1, ArchPPC_64 ELF_V2] ->
         PPC.freeReg
       | otherwise -> NoRegs.freeReg
@@ -52,6 +52,8 @@ import DynFlags
 import Control.Monad

+import TyCon (PrimRep (..))
+
 ------------------------------------------------------------------------
 --      Top-level bindings
 ------------------------------------------------------------------------
@@ -716,9 +718,9 @@ link_caf node = do
        ; let newCAF_lbl = mkForeignLabel (fsLit "newCAF") Nothing
                                          ForeignLabelInExternalPackage IsFunction
        ; bh <- newTemp (bWord dflags)
-       ; emitRtsCallGen [(bh,AddrHint)] newCAF_lbl
-           [ (baseExpr, AddrHint),
-             (CmmReg (CmmLocal node), AddrHint) ]
+       ; emitRtsCallGen [(bh, AddrRep, AddrHint)] newCAF_lbl
+           [ (baseExpr, AddrRep, AddrHint),
+             (CmmReg (CmmLocal node), AddrRep, AddrHint) ]
            False
         -- see Note [atomic CAF entry] in rts/sm/Storage.c
......
@@ -362,20 +362,19 @@ type DynTag = Int       -- The tag on a *pointer*
 --   * big, otherwise.
 --
 -- Small families can have the constructor tag in the tag bits.
--- Big families only use the tag value 1 to represent evaluatedness.
+-- Big families always use the tag values 1..mAX_PTR_TAG to represent
+-- evaluatedness, the last one lumping together all overflowing ones.
 -- We don't have very many tag bits: for example, we have 2 bits on
 -- x86-32 and 3 bits on x86-64.
+--
+-- Also see Note [Tagging big families] in GHC.StgToCmm.Expr

 isSmallFamily :: DynFlags -> Int -> Bool
 isSmallFamily dflags fam_size = fam_size <= mAX_PTR_TAG dflags

 tagForCon :: DynFlags -> DataCon -> DynTag
-tagForCon dflags con
-  | isSmallFamily dflags fam_size = con_tag
-  | otherwise                     = 1
-  where
-    con_tag  = dataConTag con -- NB: 1-indexed
-    fam_size = tyConFamilySize (dataConTyCon con)
+tagForCon dflags con = min (dataConTag con) (mAX_PTR_TAG dflags)
+                       -- NB: 1-indexed

 tagForArity :: DynFlags -> RepArity -> DynTag
 tagForArity dflags arity
......
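The effect of the new `min`-based `tagForCon` can be seen with plain numbers. `maxPtrTag` below hard-codes the 64-bit value of `mAX_PTR_TAG` for illustration:

```haskell
-- Sketch of the min-based tag assignment (mAX_PTR_TAG is 7 on x86-64,
-- 3 on x86-32; constructor tags are 1-indexed).
maxPtrTag :: Int
maxPtrTag = 7  -- assuming a 64-bit target

tagForCon' :: Int -> Int   -- 1-indexed constructor tag -> pointer tag
tagForCon' conTag = min conTag maxPtrTag
```

For a big family, `map tagForCon' [1..9]` gives `[1,2,3,4,5,6,7,7,7]`: the first six constructors keep a one-to-one pointer tag, while every constructor from the seventh onwards shares tag 7 and needs the info table for disambiguation.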
@@ -84,6 +84,21 @@ cgTopRhsCon dflags id con args =
              nv_args_w_offsets) =
                mkVirtHeapOffsetsWithPadding dflags StdHeader (addArgReps args)

+        ; let
+            -- Decompose padding into units of length 8, 4, 2, or 1 bytes to
+            -- allow the implementation of mk_payload to use widthFromBytes,
+            -- which only handles these cases.
+            fix_padding (x@(Padding n off) : rest)
+              | n == 0             = fix_padding rest
+              | n `elem` [1,2,4,8] = x : fix_padding rest
+              | n > 8              = add_pad 8
+              | n > 4              = add_pad 4
+              | n > 2              = add_pad 2
+              | otherwise          = add_pad 1
+              where add_pad m = Padding m off : fix_padding (Padding (n-m) (off+m) : rest)
+            fix_padding (x : rest) = x : fix_padding rest
+            fix_padding []         = []
+
             mk_payload (Padding len _) = return (CmmInt 0 (widthFromBytes len))
             mk_payload (FieldOff arg _) = do
                 amode <- getArgAmode arg
@@ -99,7 +114,7 @@ cgTopRhsCon dflags id con args =
               info_tbl = mkDataConInfoTable dflags con True ptr_wds nonptr_wds

-        ; payload <- mapM mk_payload nv_args_w_offsets
+        ; payload <- mapM mk_payload (fix_padding nv_args_w_offsets)
                 -- NB1: nv_args_w_offsets is sorted into ptrs then non-ptrs
                 -- NB2: all the amodes should be Lits!
                 --      TODO (osa): Why?
......
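The padding decomposition added above can be exercised on its own. `Field` below is a simplified stand-in for GHC's field-or-padding type:

```haskell
-- Standalone sketch of the padding decomposition: split a Padding of n bytes
-- at offset off into chunks of 8, 4, 2, or 1 bytes, largest first.
data Field = Padding Int Int       -- length, offset
           | FieldOff String Int   -- stand-in for a real field
  deriving (Eq, Show)

fixPadding :: [Field] -> [Field]
fixPadding (x@(Padding n off) : rest)
  | n == 0             = fixPadding rest
  | n `elem` [1,2,4,8] = x : fixPadding rest
  | n > 8              = addPad 8
  | n > 4              = addPad 4
  | n > 2              = addPad 2
  | otherwise          = addPad 1
  where addPad m = Padding m off : fixPadding (Padding (n - m) (off + m) : rest)
fixPadding (x : rest)  = x : fixPadding rest
fixPadding []          = []
```

For example, `fixPadding [Padding 6 0]` yields `[Padding 4 0, Padding 2 4]`, and `fixPadding [Padding 13 0]` yields `[Padding 8 0, Padding 4 8, Padding 1 12]`: every chunk has a length that `widthFromBytes` can handle.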
-{-# LANGUAGE CPP #-}
+{-# LANGUAGE CPP, BangPatterns #-}

 -----------------------------------------------------------------------------
 --
@@ -32,10 +32,11 @@ import StgSyn
 import MkGraph
 import BlockId
-import Cmm
+import Cmm hiding ( succ )
 import CmmInfo
 import CoreSyn
 import DataCon
+import DynFlags ( mAX_PTR_TAG )
 import ForeignCall
 import Id
 import PrimOp
@@ -48,8 +49,9 @@ import Util
 import FastString
 import Outputable

-import Control.Monad (unless,void)
-import Control.Arrow (first)
+import Control.Monad ( unless, void )
+import Control.Arrow ( first )
+import Data.List ( partition )

 ------------------------------------------------------------------------
 --              cgExpr: the main function
@@ -569,7 +571,7 @@ isSimpleScrut _ _ = return False

 isSimpleOp :: StgOp -> [StgArg] -> FCode Bool
 -- True iff the op cannot block or allocate
-isSimpleOp (StgFCallOp (CCall (CCallSpec _ _ safe)) _) _ = return $! not (playSafe safe)
+isSimpleOp (StgFCallOp (CCall (CCallSpec _ _ safe _ _)) _) _ = return $! not (playSafe safe)
 -- dataToTag# evaluates its argument, see Note [dataToTag#] in primops.txt.pp
 isSimpleOp (StgPrimOp DataToTagOp) _ = return False
 isSimpleOp (StgPrimOp op) stg_args = do
@@ -631,29 +633,152 @@ cgAlts gc_plan bndr (AlgAlt tycon) alts
        ; (mb_deflt, branches) <- cgAlgAltRhss gc_plan bndr alts

-       ; let fam_sz   = tyConFamilySize tycon
-             bndr_reg = CmmLocal (idToReg dflags bndr)
-
-             -- Is the constructor tag in the node reg?
-       ; if isSmallFamily dflags fam_sz
-         then do
-               let   -- Yes, bndr_reg has constr. tag in ls bits
-                  tag_expr = cmmConstrTag1 dflags (CmmReg bndr_reg)
-                  branches' = [(tag+1,branch) | (tag,branch) <- branches]
-               emitSwitch tag_expr branches' mb_deflt 1 fam_sz
-
-         else -- No, get tag from info table
-           let -- Note that ptr _always_ has tag 1
-               -- when the family size is big enough
-               untagged_ptr = cmmRegOffB bndr_reg (-1)
-               tag_expr = getConstrTag dflags (untagged_ptr)
-           in emitSwitch tag_expr branches mb_deflt 0 (fam_sz - 1)
+       ; let !fam_sz    = tyConFamilySize tycon
+             !bndr_reg  = CmmLocal (idToReg dflags bndr)
+             !ptag_expr = cmmConstrTag1 dflags (CmmReg bndr_reg)
+             !branches' = first succ <$> branches
+             !maxpt     = mAX_PTR_TAG dflags
+             (!via_ptr, !via_info) = partition ((< maxpt) . fst) branches'
+             !small     = isSmallFamily dflags fam_sz
+
+             -- Is the constructor tag in the node reg?
+             -- See Note [Tagging big families]
+       ; if small || null via_info
+         then -- Yes, bndr_reg has constructor tag in ls bits
+           emitSwitch ptag_expr branches' mb_deflt 1
+             (if small then fam_sz else maxpt)
+
+         else -- No, get the exact tag from the info table when mAX_PTR_TAG
+              -- See Note [Double switching for big families]
+           do
+             let !untagged_ptr = cmmUntag dflags (CmmReg bndr_reg)
+                 !itag_expr    = getConstrTag dflags untagged_ptr
+                 !info0        = first pred <$> via_info
+             if null via_ptr then
+               emitSwitch itag_expr info0 mb_deflt 0 (fam_sz - 1)
+             else do
+               infos_lbl <- newBlockId
+               infos_scp <- getTickScope
+
+               let spillover = (maxpt, (mkBranch infos_lbl, infos_scp))
+
+               (mb_shared_deflt, mb_shared_branch) <- case mb_deflt of
+                  (Just (stmts, scp)) ->
+                      do lbl <- newBlockId
+                         return ( Just (mkLabel lbl scp <*> stmts, scp)
+                                , Just (mkBranch lbl, scp))
+                  _ -> return (Nothing, Nothing)
+               -- Switch on pointer tag
+               emitSwitch ptag_expr (spillover : via_ptr) mb_shared_deflt 1 maxpt
+               join_lbl <- newBlockId
+               emit (mkBranch join_lbl)
+               -- Switch on info table tag
+               emitLabel infos_lbl
+               emitSwitch itag_expr info0 mb_shared_branch
+                 (maxpt - 1) (fam_sz - 1)
+               emitLabel join_lbl

        ; return AssignedDirectly }

 cgAlts _ _ _ _ = panic "cgAlts"
        -- UbxTupAlt and PolyAlt have only one alternative
-- Note [Double switching for big families]
-- ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
--
-- An algebraic data type can have n >= 0 summands
-- (or alternatives), which are identified (labeled) by
-- constructors. In memory they are kept apart by tags
-- (see Note [Data constructor dynamic tags] in GHC.StgToCmm.Closure).
-- Due to the characteristics of the platform that
-- contribute to the alignment of memory objects, there
-- is a natural limit of information about constructors
-- that can be encoded in the pointer tag. When the mapping
-- of constructors to the pointer tag range 1..mAX_PTR_TAG
-- is not injective, then we have a "big data type", also
-- called a "big (constructor) family" in the literature.
-- Constructor tags residing in the info table are injective,
-- but considerably more expensive to obtain, due to additional
-- memory access(es).
--
-- When doing case analysis on a value of a "big data type"
-- we need two nested switch statements to make up for the lack
-- of injectivity of pointer tagging, also taking the info
-- table tag into account. The exact mechanism is described next.
--
-- In the general case, switching on big family alternatives
-- is done by two nested switch statements. According to
-- Note [Tagging big families], the outer switch
-- looks at the pointer tag and the inner dereferences the
-- pointer and switches on the info table tag.
--
-- We can handle a simple case first, namely when none
-- of the case alternatives mention a constructor having
-- a pointer tag of 1..mAX_PTR_TAG-1. In this case we
-- simply emit a switch on the info table tag.
-- Note that the other simple case is when all mentioned
-- alternatives lie in 1..mAX_PTR_TAG-1, in which case we can
-- switch on the ptr tag only, just like in the small family case.
--
-- There is a single intricacy with a nested switch:
-- Both should branch to the same default alternative, and as such
-- avoid duplicate codegen of potentially heavy code. The outer
-- switch generates the actual code with a prepended fresh label,
-- while the inner one only generates a jump to that label.
--
-- For example, let's assume a 64-bit architecture, so that all
-- heap objects are 8-byte aligned, and hence the address of a
-- heap object ends in `000` (three zero bits).
--
-- Then consider the following data type
--
-- > data Big = T0 | T1 | T2 | T3 | T4 | T5 | T6 | T7 | T8
-- Ptr tag: 1 2 3 4 5 6 7 7 7
-- As bits: 001 010 011 100 101 110 111 111 111
-- Info pointer tag (zero based):
-- 0 1 2 3 4 5 6 7 8
--
-- Then \case T2 -> True; T8 -> True; _ -> False
-- will result in following code (slightly cleaned-up and
-- commented -ddump-cmm-from-stg):
{-
R1 = _sqI::P64; -- scrutinee
if (R1 & 7 != 0) goto cqO; else goto cqP;
cqP: // global -- enter
call (I64[R1])(R1) returns to cqO, args: 8, res: 8, upd: 8;
cqO: // global -- already WHNF
_sqJ::P64 = R1;
_cqX::P64 = _sqJ::P64 & 7; -- extract pointer tag
switch [1 .. 7] _cqX::P64 {
case 3 : goto cqW;
case 7 : goto cqR;
default: {goto cqS;}
}
cqR: // global
_cr2 = I32[I64[_sqJ::P64 & (-8)] - 4]; -- tag from info pointer
switch [6 .. 8] _cr2::I64 {
case 8 : goto cr1;
default: {goto cr0;}
}
cr1: // global
R1 = GHC.Types.True_closure+2;
call (P64[(old + 8)])(R1) args: 8, res: 0, upd: 8;
cr0: // global -- technically necessary label
goto cqS;
cqW: // global
R1 = GHC.Types.True_closure+2;
call (P64[(old + 8)])(R1) args: 8, res: 0, upd: 8;
cqS: // global
R1 = GHC.Types.False_closure+1;
call (P64[(old + 8)])(R1) args: 8, res: 0, upd: 8;
-}
--
-- For 32-bit systems we only have 2 tag bits in the pointers at our disposal,
-- so the performance win is dubious, especially in face of the increased code
-- size due to double switching. But we can take the viewpoint that 32-bit
-- architectures are not relevant for performance any more, so this can be
-- considered as moot.
 -- Note [alg-alt heap check]
 --
@@ -675,6 +800,55 @@ cgAlts _ _ _ _ = panic "cgAlts"
 --        x = R1
 --        goto L1
-- Note [Tagging big families]
-- ~~~~~~~~~~~~~~~~~~~~~~~~~~~
--
-- Both the big and the small constructor families are tagged,
-- that is, even unions which overflow the tag space of TAG_BITS
-- (i.e. more than 3 constructors on 32-bit resp. 7 on 64-bit archs).
--
-- For example, let's assume a 64-bit architecture, so that all
-- heap objects are 8-byte aligned, and hence the address of a
-- heap object ends in `000` (three zero bits). Then consider
-- > data Maybe a = Nothing | Just a
-- > data Day a = Mon | Tue | Wed | Thu | Fri | Sat | Sun
-- > data Grade = G1 | G2 | G3 | G4 | G5 | G6 | G7 | G8 | G9 | G10
--
-- Since `Grade` has more than 7 constructors, it counts as a
-- "big data type" (also referred to as "big constructor family" in papers).
-- On the other hand, `Maybe` and `Day` have 7 constructors or fewer, so they
-- are "small data types".
--
-- Then
-- * A pointer to an unevaluated thunk of type `Maybe Int`, `Day` or `Grade` will end in `000`
-- * A tagged pointer to a `Nothing`, `Mon` or `G1` will end in `001`
-- * A tagged pointer to a `Just x`, `Tue` or `G2` will end in `010`
-- * A tagged pointer to `Wed` or `G3` will end in `011`
-- ...
-- * A tagged pointer to `Sat` or `G6` will end in `110`
-- * A tagged pointer to `Sun` or `G7` or `G8` or `G9` or `G10` will end in `111`
--
-- For big families we employ a mildly clever way of combining pointer and
-- info-table tagging. We use 1..MAX_PTR_TAG-1 as pointer-resident tags where
-- the tags in the pointer and the info table are in a one-to-one
-- relation, whereas tag MAX_PTR_TAG is used as "spill over", signifying
-- we have to fall back and get the precise constructor tag from the
-- info-table.
--
-- Consequently we now cascade switches, because we have to check
-- the pointer tag first, and when it is MAX_PTR_TAG, fetch the precise
-- tag from the info table, and switch on that. The only technically
-- tricky part is that the default case needs (logical) duplication.
-- To do this we emit an extra label for it and branch to that from
-- the second switch. This avoids duplicated codegen. See Trac #14373.
-- See note [Double switching for big families] for the mechanics
-- involved.
--
-- Also see note [Data constructor dynamic tags]
-- and the wiki https://gitlab.haskell.org/ghc/ghc/wikis/commentary/rts/haskell-execution/pointer-tagging
--
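The pointer-tag arithmetic used throughout the Note boils down to two bit operations, shown here for a 64-bit target (tag mask 7). This is an illustrative sketch mirroring what `cmmConstrTag1` and `cmmUntag` emit as Cmm, not the actual GHC code:

```haskell
import Data.Bits ((.&.), complement)

ptrTag, untag :: Int -> Int
ptrTag p = p .&. 7              -- low three bits: the pointer tag
untag  p = p .&. complement 7   -- clear the tag to recover the heap address
```

A closure at the 8-byte-aligned address `0x7F40` tagged as the fifth constructor is represented as `0x7F45`; `ptrTag 0x7F45` gives back 5 and `untag 0x7F45` gives `0x7F40`, matching the `_cqX = _sqJ & 7` and `I64[_sqJ & (-8)]` lines in the Cmm dump above.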
 -------------------
 cgAlgAltRhss :: (GcPlan,ReturnKind) -> NonVoid Id -> [CgStgAlt]
              -> FCode ( Maybe CmmAGraphScoped
......