Compare revisions

Changes are shown as if the source revision was being merged into the target revision.

Commits on Source (5581)
Showing with 1617 additions and 278 deletions
---command sh ./hadrian/ghci
+--command sh -c "HADRIAN_ARGS=-j ./hadrian/ghci -j"
 --reload compiler
 --reload ghc
 --reload includes
---restart hadrian/
+--restart hadrian/ghci
5eecb20a0368b599d03930e2dbb0e91540de4cb2
@@ -60,6 +60,7 @@ _*
 */ghc-stage1
 .shake.*
 .hadrian_ghci
+.hadrian_ghci_multi/
 .hie-bios
 # -----------------------------------------------------------------------------
@@ -97,6 +98,7 @@ _darcs/
 # -----------------------------------------------------------------------------
 # specific generated files
+/.gitlab/jobs-metadata.json
 /bindist-list
 /bindist-list.uniq
 /bindistprep/
@@ -110,7 +112,8 @@ _darcs/
 /compiler/ClosureTypes.h
 /compiler/FunTypes.h
 /compiler/MachRegs.h
-/compiler/ghc-llvm-version.h
+/compiler/MachRegs
+/compiler/GHC/CmmToLlvm/Version/Bounds.hs
 /compiler/ghc.cabal
 /compiler/ghc.cabal.old
 /distrib/configure.ac
@@ -163,6 +166,7 @@ _darcs/
 /libraries/ghc-boot/ghc-boot.cabal
 /libraries/ghc-boot-th/GNUmakefile
 /libraries/ghc-boot-th/ghc-boot-th.cabal
+/libraries/ghc-boot-th-next/ghc-boot-th-next.cabal
 /libraries/ghc-boot-th/ghc.mk
 /libraries/ghc-heap/ghc-heap.cabal
 /libraries/ghci/GNUmakefile
@@ -182,8 +186,6 @@ _darcs/
 /linter.log
 /mk/are-validating.mk
 /mk/build.mk
-/mk/config.h
-/mk/config.h.in
 /mk/config.mk
 /mk/config.mk.old
 /mk/system-cxx-std-lib-1.0.conf
@@ -202,10 +204,9 @@ _darcs/
 /utils/runghc/runghc.cabal
 /utils/gen-dll/gen-dll.cabal
 /utils/ghc-pkg/ghc-pkg.cabal
-utils/lndir/fs.*
 utils/unlit/fs.*
-libraries/base/include/fs.h
-libraries/base/cbits/fs.c
+libraries/ghc-internal/include/fs.h
+libraries/ghc-internal/cbits/fs.c
 missing-win32-tarballs
 /extra-gcc-opts
...
@@ -2,11 +2,11 @@ variables:
   GIT_SSL_NO_VERIFY: "1"
   # Commit of ghc/ci-images repository from which to pull Docker images
-  DOCKER_REV: ae60a90db673e679399286e3b63c21c8e7a9a9b9
+  DOCKER_REV: 2e2497036a91104be281a0eb24b37889aaf98341
   # Sequential version number of all cached things.
   # Bump to invalidate GitLab CI cache.
-  CACHE_REV: 10
+  CACHE_REV: 11
   # Disable shallow clones; they break our linting rules
   GIT_DEPTH: 0
@@ -57,37 +57,54 @@ stages:
 # Note [The CI Story]
 # ~~~~~~~~~~~~~~~~~~~
 #
-# There are two different types of pipelines:
+# There are a few different types of pipelines. Among them:
 #
-# - marge-bot merges to `master`. Here we perform an exhaustive validation
+# 1. marge-bot merges to `master`. Here we perform an exhaustive validation
 #    across all of the platforms which we support. In addition, we push
 #    performance metric notes upstream, providing a persistent record of the
 #    performance characteristics of the compiler.
 #
-# - merge requests. Here we perform a slightly less exhaustive battery of
+# 2. merge requests. Here we perform a slightly less exhaustive battery of
 #    testing. Namely we omit some configurations (e.g. the unregisterised job).
 #    These use the merge request's base commit for performance metric
 #    comparisons.
 #
+# These and other pipelines are defined implicitly by the rules of individual
+# jobs.
+#
+# At the top level, however, we can declare that pipelines (of whatever type)
+# only run when:
+#
+# 1. Processing a merge request (as mentioned above)
+#
+# 2. Processing a tag
+#
+# 3. Pushing to master on the root ghc/ghc repo (as mentioned above)
+#
+# 4. Pushing to a release branch on the root ghc/ghc repo
+#
+# 5. Somebody manually triggers a pipeline from the GitLab UI
+#
+# In particular, note that pipelines don't automatically run just when changes
+# are pushed to a feature branch.
 workflow:
-  # N.B. Don't run on wip/ branches, instead on run on merge requests.
   rules:
     - if: $CI_MERGE_REQUEST_ID
     - if: $CI_COMMIT_TAG
-    - if: '$CI_COMMIT_BRANCH == "master"'
-    - if: '$CI_COMMIT_BRANCH =~ /ghc-[0-9]+\.[0-9]+/'
+    # N.B.: If we weren't explicit about CI_PROJECT_ID, the following rule would
+    # cause a duplicate pipeline for merge requests coming from the master
+    # branch of a fork.
+    - if: $CI_PROJECT_ID == "1" && $CI_COMMIT_BRANCH == "master"
+    - if: $CI_PROJECT_ID == "1" && $CI_COMMIT_BRANCH =~ /ghc-[0-9]+\.[0-9]+/
     - if: '$CI_PIPELINE_SOURCE == "web"'

 # which versions of GHC to allow bootstrap with
 .bootstrap_matrix : &bootstrap_matrix
   matrix:
-    - GHC_VERSION: 9.2.5
-      DOCKER_IMAGE: "registry.gitlab.haskell.org/ghc/ci-images/x86_64-linux-deb10-ghc9_2:$DOCKER_REV"
-    - GHC_VERSION: 9.4.3
-      DOCKER_IMAGE: "registry.gitlab.haskell.org/ghc/ci-images/x86_64-linux-deb10:$DOCKER_REV"
-    - GHC_VERSION: 9.6.1
+    - GHC_VERSION: 9.6.4
       DOCKER_IMAGE: "registry.gitlab.haskell.org/ghc/ci-images/x86_64-linux-deb10-ghc9_6:$DOCKER_REV"
+    - GHC_VERSION: 9.8.1
+      DOCKER_IMAGE: "registry.gitlab.haskell.org/ghc/ci-images/x86_64-linux-deb10-ghc9_8:$DOCKER_REV"

 # Allow linters to fail on draft MRs.
 # This must be explicitly transcluded in lint jobs which
@@ -125,6 +142,12 @@ workflow:
   rules:
     - if: '$RELEASE_JOB == "yes"'

+.full-ci: &full-ci
+  - if: '$CI_MERGE_REQUEST_LABELS =~ /.*full-ci.*/'
+  - if: '$CI_MERGE_REQUEST_LABELS =~ /.*marge_bot_batch_merge_job.*/'
+  - if: '$CI_COMMIT_BRANCH == "master"'
+  - if: '$CI_COMMIT_BRANCH =~ /ghc-[0-9]+\.[0-9]+/'
+
 ############################################################
 # Runner Tags
 ############################################################
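The `.full-ci` block added above is a plain YAML anchor: `&full-ci` names the rule list once, and jobs later splice it back in with the alias `*full-ci`, which GitLab flattens into the surrounding `rules:` list. A minimal sketch of the mechanism (the job name here is illustrative, not from the repo):

```yaml
.full-ci: &full-ci
  - if: '$CI_COMMIT_BRANCH == "master"'
  - if: '$CI_MERGE_REQUEST_LABELS =~ /.*full-ci.*/'

example-job:        # hypothetical job
  script: [echo ok]
  rules:
    - *full-ci      # same as writing the two if: rules inline
```

This keeps the "when does full CI run" policy in one place, so later hunks in this diff can replace scattered `fast-ci` label checks with a single `- *full-ci` line.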
@@ -274,13 +297,23 @@ lint-ci-config:
     GIT_SUBMODULE_STRATEGY: none
   before_script:
     - echo "experimental-features = nix-command flakes" >> /etc/nix/nix.conf
-    - nix-channel --update
+    # Note [Nix-in-Docker]
+    # ~~~~~~~~~~~~~~~~~~~~
+    # FIXME: This is a workaround for a Nix-in-Docker issue. See
+    # https://gitlab.haskell.org/ghc/head.hackage/-/issues/38#note_560487 for
+    # discussion.
+    - nix-shell -p gnused --run "sed -i -e 's/nixbld//' /etc/nix/nix.conf"
   script:
-    - .gitlab/generate_jobs
+    - nix run .gitlab/generate-ci#generate-jobs
     # 1 if .gitlab/generate_jobs changed the output of the generated config
     - nix shell nixpkgs#git -c git diff --exit-code
-    # And run this to just make sure that works
-    - .gitlab/generate_job_metadata
+    # And run this to generate the .gitlab/jobs-metadata.json
+    - nix run .gitlab/generate-ci#generate-job-metadata
+  artifacts:
+    when: always
+    paths:
+      - .gitlab/jobs-metadata.json
+      - .gitlab/jobs.yaml
   dependencies: []

 lint-submods:
@@ -342,6 +375,8 @@ lint-submods-branch:
   script:
     - .gitlab/ci.sh setup
     - .gitlab/ci.sh configure
+    - .gitlab/ci.sh run_hadrian lint:ghc-internal
+    - .gitlab/ci.sh run_hadrian lint:ghc-experimental
     - .gitlab/ci.sh run_hadrian lint:base
     - .gitlab/ci.sh run_hadrian lint:compiler
@@ -367,8 +402,12 @@ hadrian-ghc-in-ghci:
     - git clean -xdf && git submodule foreach git clean -xdf
     - .gitlab/ci.sh setup
     - .gitlab/ci.sh configure
+    # Enable -Werror when building hadrian
+    - "echo 'package hadrian' > hadrian/cabal.project.local"
+    - "echo '  ghc-options: -Werror' >> hadrian/cabal.project.local"
     # Load ghc-in-ghci then immediately exit and check the modules loaded
-    - echo ":q" | hadrian/ghci -j`mk/detect-cpu-count.sh`| tail -n2 | grep "Ok,"
+    - export CORES="$(mk/detect-cpu-count.sh)"
+    - echo ":q" | HADRIAN_ARGS=-j$CORES hadrian/ghci -j$CORES | tail -n2 | grep "Ok,"
   after_script:
     - .gitlab/ci.sh save_cache
     - cat ci-timings
@@ -396,7 +435,7 @@ hadrian-multi:
     # workaround for docker permissions
     - sudo chown ghc:ghc -R .
   variables:
-    GHC_FLAGS: -Werror
+    GHC_FLAGS: "-Werror -Wwarn=deprecations"
     CONFIGURE_ARGS: --enable-bootstrap-with-devel-snapshot
   tags:
     - x86_64-linux
@@ -419,8 +458,9 @@ hadrian-multi:
     - .gitlab/ci.sh configure
     # Now GHC means, use this GHC for hadrian
     - export GHC=$BOOT_HC
+    - export CORES="$(mk/detect-cpu-count.sh)"
     # Load hadrian-multi then immediately exit and check the modules loaded
-    - echo ":q" | hadrian/ghci-multi -j`mk/detect-cpu-count.sh`| tail -n2 | grep "Ok,"
+    - echo ":q" | HADRIAN_ARGS=-j$CORES hadrian/ghci-multi -j$CORES | tail -n2 | grep "Ok,"
   after_script:
     - .gitlab/ci.sh save_cache
   cache:
@@ -428,7 +468,7 @@ hadrian-multi:
     paths:
      - cabal-cache
   rules:
-    - if: '$CI_MERGE_REQUEST_LABELS !~ /.*fast-ci.*/'
+    - *full-ci
 ############################################################
 # stack-hadrian-build
@@ -449,16 +489,21 @@ stack-hadrian-build:
 # Testing reinstallable ghc codepath
 ####################################

-test-cabal-reinstall-x86_64-linux-deb10:
-  extends: nightly-x86_64-linux-deb10-validate
-  stage: full-build
-  variables:
-    REINSTALL_GHC: "yes"
-    BUILD_FLAVOUR: validate
-    TEST_ENV: "x86_64-linux-deb10-cabal-install"
-  rules:
-    - if: $NIGHTLY
-    - if: '$CI_MERGE_REQUEST_LABELS =~ /.*test-reinstall.*/'
+# As documented on the original ticket #19896, this feature already has a long
+# way to go before it can actually be used. Meanwhile, parts of it have
+# bit-rotted, possibly related to some Cabal change. The job is disabled for
+# now.
+#
+# test-cabal-reinstall-x86_64-linux-deb10:
+#   extends: nightly-x86_64-linux-deb10-validate
+#   stage: full-build
+#   variables:
+#     REINSTALL_GHC: "yes"
+#     BUILD_FLAVOUR: validate
+#     TEST_ENV: "x86_64-linux-deb10-cabal-install"
+#   rules:
+#     - if: $NIGHTLY
+#     - if: '$CI_MERGE_REQUEST_LABELS =~ /.*test-reinstall.*/'

 ########################################
 # Testing ABI is invariant across builds
@@ -507,7 +552,7 @@ doc-tarball:
       optional: true
     - job: nightly-x86_64-windows-validate
       optional: true
-    - job: release-x86_64-windows-release+no_split_sections
+    - job: release-x86_64-windows-release
       optional: true
   tags:
@@ -518,12 +563,17 @@ doc-tarball:
     LINUX_BINDIST: "ghc-x86_64-linux-deb10.tar.xz"
     WINDOWS_BINDIST: "ghc-x86_64-windows.tar.xz"
   artifacts:
+    expose_as: "Documentation Preview"
     paths:
       - haddock.html.tar.xz
+      - docs/haddock/
       - libraries.html.tar.xz
+      - docs/libraries/
       - users_guide.html.tar.xz
-      - index.html
-      - "*.pdf"
+      - docs/users_guide/
+      - docs/index.html
+      - Haddock.pdf
+      - users_guide.pdf
   script:
     - |
       mv "ghc-x86_64-linux-deb10-numa-slow-validate.tar.xz" "$LINUX_BINDIST" \
@@ -531,7 +581,7 @@ doc-tarball:
       || mv "ghc-x86_64-linux-deb10-release.tar.xz" "$LINUX_BINDIST" \
       || true
       mv "ghc-x86_64-windows-validate.tar.xz" "$WINDOWS_BINDIST" \
-        || mv "ghc-x86_64-windows-release+no_split_sections.tar.xz" "$WINDOWS_BINDIST" \
+        || mv "ghc-x86_64-windows-release.tar.xz" "$WINDOWS_BINDIST" \
       || true
       if [ ! -f "$LINUX_BINDIST" ]; then
         echo "Error: $LINUX_BINDIST does not exist. Did the Debian 9 job fail?"
@@ -543,8 +593,8 @@ doc-tarball:
       fi
     - rm -Rf docs
     - bash -ex distrib/mkDocs/mkDocs $LINUX_BINDIST $WINDOWS_BINDIST
-    - mv docs/*.tar.xz docs/*.pdf .
     - ls -lh
+    - mv docs/*.tar.xz docs/index.html .

 hackage-doc-tarball:
   stage: packaging
@@ -599,6 +649,7 @@ source-tarball:
     - if: $NIGHTLY
     - if: '$RELEASE_JOB == "yes"'
     - if: '$CI_MERGE_REQUEST_LABELS =~ /.*test-bootstrap.*/'
+    - *full-ci

 generate-hadrian-bootstrap-sources:
   stage: full-build
@@ -617,6 +668,7 @@ generate-hadrian-bootstrap-sources:
     - if: $NIGHTLY
     - if: '$RELEASE_JOB == "yes"'
     - if: '$CI_MERGE_REQUEST_LABELS =~ /.*test-bootstrap.*/'
+    - *full-ci

 package-hadrian-bootstrap-sources:
@@ -634,6 +686,7 @@ package-hadrian-bootstrap-sources:
     - if: $NIGHTLY
     - if: '$RELEASE_JOB == "yes"'
     - if: '$CI_MERGE_REQUEST_LABELS =~ /.*test-bootstrap.*/'
+    - *full-ci

 test-bootstrap:
   stage: full-build
@@ -670,6 +723,7 @@ test-bootstrap:
   rules:
     - if: $NIGHTLY
     - if: '$CI_MERGE_REQUEST_LABELS =~ /.*test-bootstrap.*/'
+    - *full-ci
     - if: '$RELEASE_JOB == "yes"'
       when: always
   variables:
@@ -718,6 +772,12 @@ hackage-lint:
     - job: nightly-x86_64-linux-deb10-numa-slow-validate
       optional: true
       artifacts: false
+    - job: nightly-aarch64-linux-deb10-validate
+      optional: true
+      artifacts: false
+    - job: aarch64-linux-deb10-validate
+      optional: true
+      artifacts: false
   extends: .hackage
   variables:
     SLOW_VALIDATE: 1
@@ -733,6 +793,9 @@ hackage-label-lint:
     - job: x86_64-linux-deb10-numa-slow-validate
       optional: true
       artifacts: false
+    - job: aarch64-linux-deb10-validate
+      optional: true
+      artifacts: false
   extends: .hackage
   variables:
     SLOW_VALIDATE: 1
@@ -747,6 +810,9 @@ nightly-hackage-lint:
     - job: nightly-x86_64-linux-deb10-numa-slow-validate
       optional: true
       artifacts: false
+    - job: nightly-aarch64-linux-deb10-validate
+      optional: true
+      artifacts: false
   rules:
     - if: $NIGHTLY
   variables:
@@ -761,6 +827,9 @@ nightly-hackage-perf:
     - job: nightly-x86_64-linux-fedora33-release
       optional: true
       artifacts: false
+    - job: nightly-aarch64-linux-deb10-validate
+      optional: true
+      artifacts: false
   rules:
     - if: $NIGHTLY
   variables:
...@@ -777,13 +846,73 @@ release-hackage-lint: ...@@ -777,13 +846,73 @@ release-hackage-lint:
- job: release-x86_64-linux-fedora33-release - job: release-x86_64-linux-fedora33-release
optional: true optional: true
artifacts: false artifacts: false
- job: release-aarch64-linux-deb10-release+no_split_sections
optional: true
artifacts: false
rules: rules:
- if: '$RELEASE_JOB == "yes"' - if: '$RELEASE_JOB == "yes"'
extends: .hackage extends: .hackage
# The ghcup metadata pipeline requires all prior jobs to
# pass. The hackage job can easily fail due to API changes
# or similar - so we allow it to fail.
allow_failure: true
variables: variables:
# No slow-validate bindist on release pipeline # No slow-validate bindist on release pipeline
EXTRA_HC_OPTS: "-dlint" EXTRA_HC_OPTS: "-dlint"
############################################################
# Testing via test-primops
############################################################
# Triggering jobs in the ghc/test-primops project
.test-primops:
stage: testing
variables:
UPSTREAM_PROJECT_PATH: "$CI_PROJECT_PATH"
UPSTREAM_PROJECT_ID: "$CI_PROJECT_ID"
UPSTREAM_PIPELINE_ID: "$CI_PIPELINE_ID"
trigger:
project: "ghc/test-primops"
branch: "upstream-testing"
strategy: "depend"
.test-primops-validate-template:
needs:
- job: x86_64-linux-deb10-validate+debug_info
artifacts: false
- job: aarch64-linux-deb10-validate
artifacts: false
- job: aarch64-darwin-validate
artifacts: false
- job: x86_64-darwin-validate
artifacts: false
extends: .test-primops
test-primops-label:
extends: .test-primops-validate-template
rules:
- if: '$CI_MERGE_REQUEST_LABELS =~ /.*test-primops.*/'
test-primops-nightly:
extends: .test-primops
needs:
- job: nightly-x86_64-linux-deb10-validate
artifacts: false
- job: nightly-aarch64-linux-deb10-validate
artifacts: false
- job: nightly-aarch64-darwin-validate
artifacts: false
- job: nightly-x86_64-darwin-validate
artifacts: false
rules:
- if: $NIGHTLY
test-primops-release:
extends: .test-primops
rules:
- if: '$RELEASE_JOB == "yes"'
############################################################
# Nofib testing
# (Disabled: See #21859)
@@ -805,10 +934,7 @@ perf-nofib:
image: "registry.gitlab.haskell.org/ghc/ci-images/x86_64-linux-fedora33:$DOCKER_REV"
rules:
- - when: never
+ - *full-ci
- - if: $CI_MERGE_REQUEST_ID
- - if: '$CI_COMMIT_BRANCH == "master"'
- - if: '$CI_COMMIT_BRANCH =~ /ghc-[0.9]+\.[0-9]+/'
- - if: '$CI_MERGE_REQUEST_LABELS !~ /.*fast-ci.*/'
tags:
- x86_64-linux
before_script:
@@ -850,10 +976,6 @@ perf:
optional: true
dependencies: null
image: "registry.gitlab.haskell.org/ghc/ci-images/x86_64-linux-fedora33:$DOCKER_REV"
- rules:
- - if: $CI_MERGE_REQUEST_ID
- - if: '$CI_COMMIT_BRANCH == "master"'
- - if: '$CI_COMMIT_BRANCH =~ /ghc-[0.9]+\.[0-9]+/'
tags:
- x86_64-linux-perf
script:
@@ -875,7 +997,7 @@ perf:
paths:
- out
rules:
- - if: '$CI_MERGE_REQUEST_LABELS !~ /.*fast-ci.*/'
+ - *full-ci
############################################################
# ABI testing
@@ -915,7 +1037,7 @@ abi-test:
paths:
- out
rules:
- - if: '$CI_MERGE_REQUEST_LABELS !~ /.*fast-ci.*/'
+ - *full-ci
############################################################
@@ -943,7 +1065,7 @@ pages:
<meta charset="UTF-8">
<meta http-equiv="refresh" content="1; url=doc/">
EOF
- - cp -f index.html public/doc
+ - cp -f docs/index.html public/doc
rules:
# N.B. only run this on ghc/ghc since the deployed pages are quite large
# and we only serve GitLab Pages for ghc/ghc.
@@ -978,9 +1100,6 @@ project-version:
artifacts:
paths:
- version.sh
- rules:
- - if: '$NIGHTLY'
- - if: '$RELEASE_JOB == "yes"'
.ghcup-metadata:
stage: deploy
@@ -993,6 +1112,8 @@ project-version:
GIT_SUBMODULE_STRATEGY: "none"
before_script:
- echo "experimental-features = nix-command flakes" >> /etc/nix/nix.conf
+ # FIXME: See Note [Nix-in-Docker]
+ - nix-shell -p gnused --run "sed -i -e 's/nixbld//' /etc/nix/nix.conf"
- nix-channel --update
- cat version.sh
# Calculate the project version
@@ -1002,7 +1123,7 @@ project-version:
- PipelineYear="$(date -d $CI_PIPELINE_CREATED_AT +%Y)"
- nix shell nixpkgs#wget -c wget "https://ghc.gitlab.haskell.org/ghcup-metadata/ghcup-nightlies-$PipelineYear-0.0.7.yaml" -O ghcup-0.0.7.yaml
- - .gitlab/generate_job_metadata
+ - nix run .gitlab/generate-ci#generate-job-metadata
artifacts:
paths:
@@ -1029,11 +1150,11 @@ ghcup-metadata-nightly:
artifacts: false
- job: nightly-x86_64-windows-validate
artifacts: false
- - job: nightly-x86_64-linux-alpine3_12-int_native-validate+fully_static
+ - job: nightly-x86_64-linux-alpine3_12-validate
artifacts: false
- job: nightly-x86_64-linux-deb9-validate
artifacts: false
- - job: nightly-i386-linux-deb9-validate
+ - job: nightly-i386-linux-deb10-validate
artifacts: false
- job: nightly-x86_64-linux-deb10-validate
artifacts: false
@@ -1045,7 +1166,7 @@ ghcup-metadata-nightly:
artifacts: false
- job: project-version
script:
- - nix shell --extra-experimental-features nix-command -f .gitlab/rel_eng -c ghcup-metadata --metadata ghcup-0.0.7.yaml --date="$(date -d $CI_PIPELINE_CREATED_AT +%Y-%M-%d)" --pipeline-id="$CI_PIPELINE_ID" --version="$ProjectVersion" > "metadata_test.yaml"
+ - nix shell --extra-experimental-features nix-command -f .gitlab/rel_eng -c ghcup-metadata --metadata ghcup-0.0.7.yaml --date="$(date -d $CI_PIPELINE_CREATED_AT +%Y-%m-%d)" --pipeline-id="$CI_PIPELINE_ID" --version="$ProjectVersion" > "metadata_test.yaml"
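The `%Y-%M-%d` to `%Y-%m-%d` change above fixes a classic `date(1)` format-string mix-up: `%M` is the minute, `%m` is the month. A quick check (assumes GNU `date`, whose `-d` flag is used throughout these scripts):

```shell
ts="2024-05-13T15:04:38Z"
wrong=$(date -u -d "$ts" +%Y-%M-%d)   # %M = minute -> "2024-04-13"
right=$(date -u -d "$ts" +%Y-%m-%d)   # %m = month  -> "2024-05-13"
echo "$wrong vs $right"
```

With the old format, nightly metadata dates silently carried the pipeline's minute field in the month position.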
rules:
- if: $NIGHTLY
@@ -1066,6 +1187,7 @@ ghcup-metadata-nightly-push:
- git clone https://gitlab.haskell.org/ghc/ghcup-metadata.git
- PipelineYear="$(date -d $CI_PIPELINE_CREATED_AT +%Y)"
- cp metadata_test.yaml "ghcup-metadata/ghcup-nightlies-$PipelineYear-0.0.7.yaml"
+ - cp metadata_test.yaml "ghcup-metadata/ghcup-nightlies-0.0.7.yaml"
- cd ghcup-metadata
- git config user.email "ghc-ci@gitlab-haskell.org"
- git config user.name "GHC GitLab CI"
@@ -1082,7 +1204,7 @@ ghcup-metadata-release:
# No explicit needs for release pipeline as we assume we need everything and everything will pass.
extends: .ghcup-metadata
script:
- - nix shell --extra-experimental-features nix-command -f .gitlab/rel_eng -c ghcup-metadata --release-mode --metadata ghcup-0.0.7.yaml --date="$(date -d $CI_PIPELINE_CREATED_AT +%Y-%M-%d)" --pipeline-id="$CI_PIPELINE_ID" --version="$ProjectVersion" > "metadata_test.yaml"
+ - nix shell --extra-experimental-features nix-command -f .gitlab/rel_eng -c ghcup-metadata --release-mode --metadata ghcup-0.0.7.yaml --date="$(date -d $CI_PIPELINE_CREATED_AT +%Y-%m-%d)" --pipeline-id="$CI_PIPELINE_ID" --version="$ProjectVersion" > "metadata_test.yaml"
rules:
- if: '$RELEASE_JOB == "yes"'
...
@@ -7,7 +7,7 @@
set -Eeuo pipefail
# Configuration:
- HACKAGE_INDEX_STATE="2020-12-21T14:48:20Z"
+ HACKAGE_INDEX_STATE="2024-05-13T15:04:38Z"
MIN_HAPPY_VERSION="1.20"
MIN_ALEX_VERSION="3.2.6"
@@ -43,12 +43,13 @@ $0 - GHC continuous integration driver
Common Modes:
usage Show this usage message.
setup Prepare environment for a build.
configure Run ./configure.
clean Clean the tree
shell Run an interactive shell with a configured build environment.
+ save_test_output Generate unexpected-test-output.tar.gz
save_cache Preserve the cabal cache
Hadrian build system
build_hadrian Build GHC via the Hadrian build system
@@ -74,16 +75,6 @@ Environment variables affecting both build systems:
(either "x86-64-darwin" or "aarch-darwin")
NO_BOOT Whether to run ./boot or not, used when testing the source dist
- Environment variables determining build configuration of Make system:
- BUILD_FLAVOUR Which flavour to build.
- BUILD_SPHINX_HTML Whether to build Sphinx HTML documentation.
- BUILD_SPHINX_PDF Whether to build Sphinx PDF documentation.
- INTEGER_LIBRARY Which integer library to use (integer-simple or integer-gmp).
- HADDOCK_HYPERLINKED_SOURCES
- Whether to build hyperlinked Haddock sources.
- TEST_TYPE Which test rule to run.
Environment variables determining build configuration of Hadrian system:
BUILD_FLAVOUR Which flavour to build.
@@ -160,6 +151,8 @@ function mingw_init() {
# We always use mingw64 Python to avoid path length issues like #17483.
export PYTHON="/mingw64/bin/python3"
+ # And need to use sphinx-build from the environment
+ export SPHINXBUILD="/mingw64/bin/sphinx-build.exe"
}
# This will contain GHC's local native toolchain
@@ -211,7 +204,6 @@ function set_toolchain_paths() {
esac
info "Building toolchain for $NIX_SYSTEM"
nix-build --quiet .gitlab/darwin/toolchain.nix --argstr system "$NIX_SYSTEM" -o toolchain.sh
- cat toolchain.sh
fi
source toolchain.sh
;;
@@ -219,10 +211,10 @@ function set_toolchain_paths() {
# These are generally set by the Docker image but
# we provide these handy fallbacks in case the
# script isn't run from within a GHC CI docker image.
- if [ -z "$GHC" ]; then GHC="$(which ghc)"; fi
+ : ${GHC:=$(which ghc)}
- if [ -z "$CABAL" ]; then CABAL="$(which cabal)"; fi
+ : ${CABAL:=$(which cabal)}
- if [ -z "$HAPPY" ]; then HAPPY="$(which happy)"; fi
+ : ${HAPPY:=$(which happy)}
- if [ -z "$ALEX" ]; then ALEX="$(which alex)"; fi
+ : ${ALEX:=$(which alex)}
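The `: ${VAR:=default}` rewrite above is the standard shell idiom for defaulting a variable: `:` is a no-op command, and the `${VAR:=word}` expansion assigns `word` only when `VAR` is unset or empty. Unlike the old `if [ -z "$VAR" ]` form it also behaves under `set -u` (which this script enables via `set -Eeuo pipefail`). A minimal sketch with a made-up variable name:

```shell
# Unset: the expansion assigns the fallback.
unset TOOL
: ${TOOL:=/usr/local/bin/ghc}
echo "$TOOL"                   # /usr/local/bin/ghc

# Already set (e.g. exported by the Docker image): the fallback is ignored.
TOOL=/opt/ghc/bin/ghc
: ${TOOL:=/usr/local/bin/ghc}
echo "$TOOL"                   # /opt/ghc/bin/ghc
```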
;;
*) fail "bad toolchain_source"
esac
@@ -240,7 +232,7 @@ function set_toolchain_paths() {
function cabal_update() {
# In principle -w shouldn't be necessary here but with
# cabal-install 3.8.1.0 it is, due to cabal#8447.
- run "$CABAL" update -w "$GHC" --index="$HACKAGE_INDEX_STATE"
+ run "$CABAL" update -w "$GHC" "hackage.haskell.org,${HACKAGE_INDEX_STATE}"
}
@@ -315,7 +307,7 @@ function fetch_cabal() {
fail "neither CABAL nor CABAL_INSTALL_VERSION is set"
fi
- start_section "fetch GHC"
+ start_section "fetch cabal"
case "$(uname)" in
# N.B. Windows uses zip whereas all others use .tar.xz
MSYS_*|MINGW*)
@@ -342,7 +334,7 @@ function fetch_cabal() {
mv cabal "$toolchain/bin"
;;
esac
- end_section "fetch GHC"
+ end_section "fetch cabal"
fi
}
@@ -390,26 +382,6 @@ function cleanup_submodules() {
end_section "clean submodules"
}
- function prepare_build_mk() {
- if [[ -z "$BUILD_FLAVOUR" ]]; then fail "BUILD_FLAVOUR is not set"; fi
- if [[ -z ${BUILD_SPHINX_HTML:-} ]]; then BUILD_SPHINX_HTML=YES; fi
- if [[ -z ${BUILD_SPHINX_PDF:-} ]]; then BUILD_SPHINX_PDF=YES; fi
- cat > mk/build.mk <<EOF
- BIGNUM_BACKEND=${BIGNUM_BACKEND}
- include mk/flavours/${BUILD_FLAVOUR}.mk
- GhcLibHcOpts+=-haddock
- EOF
- if [ -n "${HADDOCK_HYPERLINKED_SOURCES:-}" ]; then
- echo "EXTRA_HADDOCK_OPTS += --hyperlinked-source --quickjump" >> mk/build.mk
- fi
- info "build.mk is:"
- cat mk/build.mk
- }
function configure() {
case "${CONFIGURE_WRAPPER:-}" in
emconfigure) source "$EMSDK/emsdk_env.sh" ;;
@@ -510,6 +482,9 @@ function build_hadrian() {
check_release_build
+ # Just to be sure, use the same hackage index state when building Hadrian.
+ echo "index-state: $HACKAGE_INDEX_STATE" > hadrian/cabal.project.local
# We can safely enable parallel compression for x64. By the time
# hadrian calls tar/xz to produce bindist, there's no other build
# work taking place.
@@ -520,8 +495,16 @@ function build_hadrian() {
if [[ -n "${REINSTALL_GHC:-}" ]]; then
run_hadrian build-cabal -V
else
- run_hadrian test:all_deps binary-dist -V
- mv _build/bindist/ghc*.tar.xz "$BIN_DIST_NAME.tar.xz"
+ case "$(uname)" in
+ MSYS_*|MINGW*)
+ run_hadrian test:all_deps reloc-binary-dist -V
+ mv _build/reloc-bindist/ghc*.tar.xz "$BIN_DIST_NAME.tar.xz"
+ ;;
+ *)
+ run_hadrian test:all_deps binary-dist -V
+ mv _build/bindist/ghc*.tar.xz "$BIN_DIST_NAME.tar.xz"
+ ;;
+ esac
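The new `case "$(uname)" in` dispatch above selects a relocatable bindist on Windows (where `uname` reports an `MSYS_*` or `MINGW*` name) and the regular bindist elsewhere. A standalone sketch of the same pattern-matching, with a hypothetical helper name:

```shell
# Map a uname string to the Hadrian bindist target, as build_hadrian does.
bindist_target() {
  case "$1" in
    MSYS_*|MINGW*) echo "reloc-binary-dist" ;;  # Windows toolchains
    *)             echo "binary-dist" ;;        # everything else
  esac
}

bindist_target "MINGW64_NT-10.0-19045"   # reloc-binary-dist
bindist_target "Linux"                   # binary-dist
```

Note that `case` patterns are glob patterns, not regexes, so `MINGW*` matches any string with that prefix.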
fi
}
@@ -575,6 +558,8 @@ function install_bindist() {
--prefix="$instdir" \
"${args[@]+"${args[@]}"}"
make_install_destdir "$TOP"/destdir "$instdir"
+ # And check the `--info` of the installed compiler, sometimes useful in CI log.
+ "$instdir"/bin/ghc --info
;;
esac
popd
@@ -615,12 +600,16 @@ function test_hadrian() {
--summary-junit=./junit.xml \
--test-have-intree-files \
--docs=none \
- "runtest.opts+=${RUNTEST_ARGS:-}" || fail "cross-compiled hadrian main testsuite"
+ "runtest.opts+=${RUNTEST_ARGS:-}" \
+ "runtest.opts+=--unexpected-output-dir=$TOP/unexpected-test-output" \
+ || fail "cross-compiled hadrian main testsuite"
elif [[ -n "${CROSS_TARGET:-}" ]] && [[ "${CROSS_TARGET:-}" == *"wasm"* ]]; then
run_hadrian \
test \
--summary-junit=./junit.xml \
- "runtest.opts+=${RUNTEST_ARGS:-}" || fail "hadrian main testsuite targetting $CROSS_TARGET"
+ "runtest.opts+=${RUNTEST_ARGS:-}" \
+ "runtest.opts+=--unexpected-output-dir=$TOP/unexpected-test-output" \
+ || fail "hadrian main testsuite targetting $CROSS_TARGET"
elif [ -n "${CROSS_TARGET:-}" ]; then
local instdir="$TOP/_build/install"
local test_compiler="$instdir/bin/${cross_prefix}ghc$exe"
@@ -636,7 +625,9 @@ function test_hadrian() {
--test-compiler=stage-cabal \
--test-root-dirs=testsuite/tests/perf \
--test-root-dirs=testsuite/tests/typecheck \
- "runtest.opts+=${RUNTEST_ARGS:-}" || fail "hadrian cabal-install test"
+ "runtest.opts+=${RUNTEST_ARGS:-}" \
+ "runtest.opts+=--unexpected-output-dir=$TOP/unexpected-test-output" \
+ || fail "hadrian cabal-install test"
else
local instdir="$TOP/_build/install"
local test_compiler="$instdir/bin/${cross_prefix}ghc$exe"
@@ -658,7 +649,7 @@ function test_hadrian() {
then
test_compiler_backend=$(${test_compiler} -e "GHC.Num.Backend.backendName")
if [ $test_compiler_backend != "\"$BIGNUM_BACKEND\"" ]; then
- fail "Test compiler has a different BIGNUM_BACKEND ($test_compiler_backend) thean requested ($BIGNUM_BACKEND)"
+ fail "Test compiler has a different BIGNUM_BACKEND ($test_compiler_backend) than requested ($BIGNUM_BACKEND)"
fi
fi
@@ -674,12 +665,13 @@ function test_hadrian() {
--summary-junit=./junit.xml \
--test-have-intree-files \
--test-compiler="${test_compiler}" \
- "runtest.opts+=${RUNTEST_ARGS:-}" || fail "hadrian main testsuite"
+ "runtest.opts+=${RUNTEST_ARGS:-}" \
+ "runtest.opts+=--unexpected-output-dir=$TOP/unexpected-test-output" \
+ || fail "hadrian main testsuite"
info "STAGE2_TEST=$?"
fi
}
+ function save_test_output() {
+ tar -czf unexpected-test-output.tar.gz unexpected-test-output
+ }
function save_cache () {
info "Storing cabal cache from $CABAL_DIR to $CABAL_CACHE..."
rm -Rf "$CABAL_CACHE"
@@ -806,7 +802,7 @@ function shell() {
if [ -z "$cmd" ]; then
cmd="bash -i"
fi
- run "$cmd"
+ run $cmd
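The `run "$cmd"` to `run $cmd` change matters because `cmd` holds a whole command line such as `bash -i`: quoted, it is passed as a single 7-character command name that cannot be found; unquoted, field splitting breaks it into separate argv words. A sketch of the difference (unquoted expansion also performs globbing, which is acceptable here since the string is a command line):

```shell
# run executes its arguments as a command, like the helper in ci.sh.
run() { "$@"; }

cmd="echo hello world"
run $cmd                                     # splits into: echo hello world
run "$cmd" 2>/dev/null || echo "not found"   # one literal command name; fails
```

The first call prints `hello world`; the second prints `not found` because no executable named `echo hello world` exists.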
}
function lint_author(){
@@ -915,8 +911,8 @@ determine_metric_baseline
set_toolchain_paths
- case $1 in
+ case ${1:-help} in
- usage) usage ;;
+ help|usage) usage ;;
setup) setup && cleanup_submodules ;;
configure) time_it "configure" configure ;;
build_hadrian) time_it "build" build_hadrian ;;
@@ -936,6 +932,7 @@ case $1 in
lint_author) shift; lint_author "$@" ;;
compare_interfaces_of) shift; compare_interfaces_of "$@" ;;
clean) clean ;;
+ save_test_output) save_test_output ;;
save_cache) save_cache ;;
shell) shift; shell "$@" ;;
*) fail "unknown mode $1" ;;
...
#!/usr/bin/env sh

# Circle CI "backend" for Gitlab CI
# =================================
#
# Usage example:
# .gitlab/circle-ci-job.sh validate-x86_64-linux
#
# There are two things to configure to get artifacts to be
# uploaded to gitlab properly:
#
# - At https://<gitlab host>/admin/application_settings, expand the
# Continuous Integration and Deployment section and set the
# "Maximum artifacts size (MB)" field to something large enough
# to contain the bindists (the test reports are tiny in comparison).
# 500MB seems to work fine, but 200MB might be sufficient.
#
# - If gitlab is exposed behind some form of proxy (e.g nginx), make sure
# the maximum client request body size is large enough to contain all the
# artifacts of a build. For nginx, this would be the following configuration
# option: https://nginx.org/en/docs/http/ngx_http_core_module.html#client_max_body_size
# (which can be set with services.nginx.clientMaxBodySize on nixos).

set -e
GHCCI_URL="localhost:8888"
[ $# -gt 0 ] || (echo You need to pass the Circle CI job type as argument to this script; exit 1)
[ ${CI_RUNNER_ID:-} ] || (echo "CI_RUNNER_ID is not set"; exit 1)
[ ${CI_JOB_ID:-} ] || (echo "CI_JOB_ID is not set"; exit 1)
[ ${CI_COMMIT_SHA:-} ] || (echo "CI_COMMIT_SHA is not set"; exit 1)
[ ${CI_REPOSITORY_URL:-} ] || (echo "CI_REPOSITORY_URL is not set"; exit 1)
[ ${CI_PIPELINE_ID:-} ] || (echo "CI_PIPELINE_ID is not set"; exit 1)
# the first argument to this script is the Circle CI job type:
# validate-x86_64-linux, validate-i386-linux, ...
CIRCLE_JOB="circleci-$1"
gitlab_user=$(echo $CI_REPOSITORY_URL | cut -d/ -f4)
gitlab_repo=$(echo $CI_REPOSITORY_URL | cut -d/ -f5 | cut -d. -f1)
BODY="{ \"jobType\": \"$CIRCLE_JOB\", \"source\": { \"user\": \"$gitlab_user\", \"project\":\"$gitlab_repo\", \"commit\":\"$CI_COMMIT_SHA\" }, \"pipelineID\": $CI_PIPELINE_ID, \"runnerID\": $CI_RUNNER_ID, \"jobID\": $CI_JOB_ID }"
RESP=$(curl -s -XPOST -H "Content-Type: application/json" -d "$BODY" \
http://${GHCCI_URL}/job)
if [ $? -eq 0 ]; then
build_num=$(echo $RESP | jq '.build_num')
circle_url=$(echo $RESP | jq '.url')
else
echo "Couldn't submit job"
echo $RESP
exit 1
fi
echo Circle CI build number: $build_num
echo Circle CI build page: $circle_url
outcome="null"
STATUS_URL="http://${GHCCI_URL}/job/${build_num}"
STATUS_RESP=""
while [ "$outcome" == "null" ]; do
sleep 30s
STATUS_RESP=$(curl -s $STATUS_URL)
if [ $? -eq 0 ]; then
new_outcome=$(echo $STATUS_RESP | jq '.outcome')
jq_exitcode=$?
if [ "$new_outcome" == "null" ] && [ $jq_exitcode -ne 0 ]; then
echo "Couldn't read 'outcome' field in JSON:"
echo $STATUS_RESP
echo "Skipping"
else
outcome="$new_outcome"
fi
else
echo "curl failed:"
echo $STATUS_RESP
echo "Skipping"
fi
done
if [ "$outcome" == "\"success\"" ]; then
echo The build passed
artifactsBody=$(curl -s http://${GHCCI_URL}/job/${build_num}/artifacts)
(echo $artifactsBody | jq '.[] | .url' | xargs wget -q) || echo "No artifacts"
exit 0
else
echo The build failed
artifactsBody=$(curl -s http://${GHCCI_URL}/job/${build_num}/artifacts)
(echo $artifactsBody | jq '.[] | .url' | xargs wget -q) || echo "No artifacts"
failing_step=$(echo $STATUS_RESP | jq '.steps | .[] | .actions | .[] | select(.status != "success")')
failing_step_name=$(echo $failing_step | jq '.name' | sed -e 's/^"//' -e 's/"$//' -e 's/\\r\\n/\n/')
echo "Failing step: $failing_step_name"
failing_cmds=$(echo $failing_step | jq '.bash_command' | sed -e 's/^"//' -e 's/"$//' -e 's/\\r\\n/\n/')
echo "Failing command(s):"
echo $failing_cmds
log_url=$(echo $failing_step | jq '.output_url' | sed -e 's/^"//' -e 's/"$//' -e 's/\\r\\n/\n/')
echo "Log url: $log_url"
last_log_lines=$(curl -s $log_url | gunzip | jq '.[] | select(.type == "out") | .message' | sed -e 's/^"//' -e 's/"$//' -e 's/\\r\\n/\n/' | tail -50)
echo End of the build log:
echo $last_log_lines
exit 1
fi
@@ -12,15 +12,15 @@
"url_template": "https://github.com/<owner>/<repo>/archive/<rev>.tar.gz"
},
"nixpkgs": {
- "branch": "master",
+ "branch": "nixos-unstable",
"description": "Nix Packages collection",
"homepage": "",
"owner": "nixos",
"repo": "nixpkgs",
- "rev": "ce1aa29621356706746c53e2d480da7c68f6c972",
+ "rev": "73de017ef2d18a04ac4bfd0c02650007ccb31c2a",
- "sha256": "sha256:1sbs3gi1nf4rcbmnw69fw0fpvb3qvlsa84hqimv78vkpd6xb0bgg",
+ "sha256": "1v9sy2i2dy3qksx4mf81gwzfl0jzpqccfkzq7fjxgq832f9d255i",
"type": "tarball",
- "url": "https://github.com/nixos/nixpkgs/archive/ce1aa29621356706746c53e2d480da7c68f6c972.tar.gz",
+ "url": "https://github.com/nixos/nixpkgs/archive/73de017ef2d18a04ac4bfd0c02650007ccb31c2a.tar.gz",
"url_template": "https://github.com/<owner>/<repo>/archive/<rev>.tar.gz"
}
}
@@ -4,6 +4,7 @@ let
sources = import ./nix/sources.nix;
nixpkgsSrc = sources.nixpkgs;
pkgs = import nixpkgsSrc { inherit system; };
+ hostPkgs = import nixpkgsSrc { };
in
let
@@ -13,23 +14,26 @@ let
targetTriple = pkgs.stdenv.targetPlatform.config;
ghcBindists = let version = ghc.version; in {
- aarch64-darwin = pkgs.fetchurl {
+ aarch64-darwin = hostPkgs.fetchurl {
url = "https://downloads.haskell.org/ghc/${version}/ghc-${version}-aarch64-apple-darwin.tar.xz";
- sha256 = "sha256-tQUHsingxBizLktswGAoi6lJf92RKWLjsHB9CisANlg=";
+ sha256 = "sha256-c1GTMJf3/yiW/t4QL532EswD5JVlgA4getkfsxj4TaA=";
};
- x86_64-darwin = pkgs.fetchurl {
+ x86_64-darwin = hostPkgs.fetchurl {
url = "https://downloads.haskell.org/ghc/${version}/ghc-${version}-x86_64-apple-darwin.tar.xz";
- sha256 = "sha256-OjXjVe+ZODDCc/hqtihqqz6CX25TKI0ZgORzkR5O3pQ=";
+ sha256 = "sha256-LrYniMG0phsvyW6dhQC+3ompvzcxnwAe6GezEqqzoTQ=";
};
};
ghc = pkgs.stdenv.mkDerivation rec {
- version = "9.4.4";
+ # Using 9.6.2 because of #24050
+ version = "9.6.2";
name = "ghc";
src = ghcBindists.${pkgs.stdenv.hostPlatform.system};
configureFlags = [
"CC=/usr/bin/clang"
"CLANG=/usr/bin/clang"
+ "AR=/usr/bin/ar"
"LLC=${llvm}/bin/llc"
"OPT=${llvm}/bin/opt"
"CONF_CC_OPTS_STAGE2=--target=${targetTriple}"
@@ -92,7 +96,7 @@ let
};
fonts = with pkgs; makeFontsConf { fontDirectories = [ dejavu_fonts ]; };
- llvm = pkgs.llvm_11;
+ llvm = pkgs.llvm_15;
in
pkgs.writeTextFile {
name = "toolchain";
@@ -113,6 +117,8 @@ pkgs.writeTextFile {
export CABAL="$CABAL_INSTALL"
sdk_path="$(xcrun --sdk macosx --show-sdk-path)"
- export CONFIGURE_ARGS="$CONFIGURE_ARGS --with-ffi-libraries=$sdk_path/usr/lib --with-ffi-includes=$sdk_path/usr/include/ffi --build=${targetTriple}"
+ : ''${CONFIGURE_ARGS:=}
+ CONFIGURE_ARGS+="''${CONFIGURE_ARGS:+ }--with-ffi-libraries=$sdk_path/usr/lib --with-ffi-includes=$sdk_path/usr/include/ffi --build=${targetTriple}"
+ export CONFIGURE_ARGS
'';
}
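In the generated script above, `''$` is Nix's escape for a literal `$`, so at shell run time the appended chunk is `${CONFIGURE_ARGS:+ }...`: the `:+` expansion yields a single space only when `CONFIGURE_ARGS` is already non-empty, avoiding a stray leading space. A POSIX rendering of the same append (the original uses bash's `+=`; paths and target triple here are illustrative only):

```shell
CONFIGURE_ARGS=""
# First append: variable is empty, so ${CONFIGURE_ARGS:+ } expands to nothing.
CONFIGURE_ARGS="$CONFIGURE_ARGS${CONFIGURE_ARGS:+ }--with-ffi-libraries=/sdk/usr/lib"
# Second append: variable is non-empty, so a separating space is inserted.
CONFIGURE_ARGS="$CONFIGURE_ARGS${CONFIGURE_ARGS:+ }--build=aarch64-apple-darwin"
echo "$CONFIGURE_ARGS"
```

This prints `--with-ffi-libraries=/sdk/usr/lib --build=aarch64-apple-darwin` with no leading space, regardless of whether the variable started out set.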
cabal-version: 3.0
name: gen-ci
version: 0.1.0.0
build-type: Simple
common warnings
ghc-options: -Wall
executable gen_ci
import: warnings
main-is: gen_ci.hs
build-depends:
, aeson >=1.8.1
, base
, bytestring
, containers
default-language: Haskell2010
Copyright (c) 2023, The GHC Developers
All rights reserved.
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:
* Redistributions of source code must retain the above copyright
notice, this list of conditions and the following disclaimer.
* Redistributions in binary form must reproduce the above
copyright notice, this list of conditions and the following
disclaimer in the documentation and/or other materials provided
with the distribution.
* Neither the name of The GHC Developers nor the names of other
contributors may be used to endorse or promote products derived
from this software without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
# generate-ci
This is the generator for GHC's GitLab CI infrastructure. In particular, this
generates two outputs:
* `.gitlab/jobs.yaml`, which is a YAML (or, strictly speaking, JSON)
file which defines the bulk of the validation, nightly, and release jobs of
GHC's CI. This is committed to the GHC repository and must be updated
whenever `gen_ci.hs` is modified.
* `.gitlab/jobs-metadata.json`, which is a mapping between platforms and
produced binary distribution names used when producing `ghcup` metadata
for nightly pipeline artifacts (see the `.ghcup-metadata` job in
`/.gitlab-ci.yaml`).
## Modifying the CI configuration (nix)
The jobs are defined in `gen_ci.hs`. After modifying this you can run
```sh
nix run .gitlab/generate-ci#generate-jobs
```
from the top of the GHC repository to update the generated configuration.
## Modifying the CI configuration (without nix)
One can run the job generator without Nix as follows (assuming `jq`,
`cabal-install`, and GHC are installed):
```sh
$ cabal build generate-ci
$ PATH="$(dirname $(cabal list-bin generate-ci)):$PATH"
$ ./generate-jobs
```
{
"nodes": {
"flake-utils": {
"inputs": {
"systems": "systems"
},
"locked": {
"lastModified": 1687709756,
"narHash": "sha256-Y5wKlQSkgEK2weWdOu4J3riRd+kV/VCgHsqLNTTWQ/0=",
"owner": "numtide",
"repo": "flake-utils",
"rev": "dbabf0ca0c0c4bce6ea5eaf65af5cb694d2082c7",
"type": "github"
},
"original": {
"owner": "numtide",
"repo": "flake-utils",
"type": "github"
}
},
"nixpkgs": {
"locked": {
"lastModified": 1687886075,
"narHash": "sha256-PeayJDDDy+uw1Ats4moZnRdL1OFuZm1Tj+KiHlD67+o=",
"owner": "NixOS",
"repo": "nixpkgs",
"rev": "a565059a348422af5af9026b5174dc5c0dcefdae",
"type": "github"
},
"original": {
"id": "nixpkgs",
"type": "indirect"
}
},
"root": {
"inputs": {
"flake-utils": "flake-utils",
"nixpkgs": "nixpkgs"
}
},
"systems": {
"locked": {
"lastModified": 1681028828,
"narHash": "sha256-Vy1rq5AaRuLzOxct8nz4T6wlgyUR7zLU309k9mBC768=",
"owner": "nix-systems",
"repo": "default",
"rev": "da67096a3b9bf56a91d16901293e51ba5b49a27e",
"type": "github"
},
"original": {
"owner": "nix-systems",
"repo": "default",
"type": "github"
}
}
},
"root": "root",
"version": 7
}
The flake definition (`flake.nix`):
```nix
{
  description = "GHC CI Generator";
  inputs.flake-utils.url = "github:numtide/flake-utils";
  outputs = { self, nixpkgs, flake-utils }:
    flake-utils.lib.eachDefaultSystem (system:
      let pkgs = nixpkgs.legacyPackages.${system}; in
      {
        packages = rec {
          # The Haskell generator executable
          generate-ci = pkgs.haskellPackages.callCabal2nix "generate-ci" ./. {};

          # Wrapper scripts
          generate-job-metadata = pkgs.runCommand "generate-job-metadata" {
            nativeBuildInputs = with pkgs; [ makeWrapper ];
          } ''
            mkdir -p $out/bin
            makeWrapper ${./generate-job-metadata} $out/bin/generate-job-metadata \
              --prefix PATH : ${with pkgs; lib.makeBinPath [ generate-ci gitMinimal ]}
          '';

          generate-jobs = pkgs.runCommand "generate-jobs" {
            nativeBuildInputs = with pkgs; [ makeWrapper ];
          } ''
            mkdir -p $out/bin
            makeWrapper ${./generate-jobs} $out/bin/generate-jobs \
              --prefix PATH : ${with pkgs; lib.makeBinPath [ generate-ci jq gitMinimal ]}
          '';

          default = generate-jobs;
        };

        apps = rec {
          generate-jobs = flake-utils.lib.mkApp {
            drv = self.packages.${system}.generate-jobs;
          };
          generate-job-metadata = flake-utils.lib.mkApp {
            drv = self.packages.${system}.generate-job-metadata;
          };
          default = generate-jobs;
        };
      }
    );
}
```
The corresponding change to `gen_ci.hs`:
```diff
@@ -3,9 +3,7 @@
 {-# LANGUAGE OverloadedStrings #-}
 {-# LANGUAGE DeriveFunctor #-}
 {-# LANGUAGE GeneralizedNewtypeDeriving #-}
-{- cabal:
-build-depends: base, aeson >= 1.8.1, containers, bytestring
--}
+{-# LANGUAGE ViewPatterns #-}
 import Data.Aeson as A
 import qualified Data.Map as Map
@@ -13,10 +11,9 @@ import Data.Map (Map)
 import Data.Maybe
 import qualified Data.ByteString.Lazy as B
 import qualified Data.ByteString.Lazy.Char8 as B
-import Data.List (intercalate)
-import Data.Set (Set)
 import qualified Data.Set as S
 import System.Environment
+import Data.List
 {-
 Note [Generating the CI pipeline]
```
```diff
@@ -108,12 +105,18 @@ data Opsys
   | Windows deriving (Eq)
 data LinuxDistro
-  = Debian11 | Debian10 | Debian9
+  = Debian12
+  | Debian11
+  | Debian11Js
+  | Debian10
+  | Debian9
   | Fedora33
+  | Fedora38
   | Ubuntu2004
   | Ubuntu1804
   | Centos7
-  | Alpine
+  | Alpine312
+  | Alpine318
   | AlpineWasm
   | Rocky8
   deriving (Eq)
@@ -141,6 +144,7 @@ data BuildConfig
   , llvmBootstrap :: Bool
   , withAssertions :: Bool
   , withNuma :: Bool
+  , withZstd :: Bool
   , crossTarget :: Maybe String
   , crossEmulator :: CrossEmulator
   , configureWrapper :: Maybe String
```
```diff
@@ -149,15 +153,18 @@
   , threadSanitiser :: Bool
   , noSplitSections :: Bool
   , validateNonmovingGc :: Bool
+  , textWithSIMDUTF :: Bool
   }
 -- Extra arguments to pass to ./configure due to the BuildConfig
 configureArgsStr :: BuildConfig -> String
 configureArgsStr bc = unwords $
     ["--enable-unregisterised"| unregisterised bc ]
  ++ ["--disable-tables-next-to-code" | not (tablesNextToCode bc) ]
  ++ ["--with-intree-gmp" | Just _ <- pure (crossTarget bc) ]
  ++ ["--with-system-libffi" | crossTarget bc == Just "wasm32-wasi" ]
+ ++ ["--enable-ipe-data-compression" | withZstd bc ]
+ ++ ["--enable-strict-ghc-toolchain-check"]
 -- Compute the hadrian flavour from the BuildConfig
 mkJobFlavour :: BuildConfig -> Flavour
```
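The extended `configureArgsStr` relies on a Haskell idiom worth spelling out: boolean-guarded list comprehensions that emit a flag only when the corresponding field is set, while unguarded elements are always included. A self-contained sketch (simplified types and field names, not the GHC source):

```haskell
-- Sketch of guard-driven flag assembly, as used by configureArgsStr.
-- Config is a made-up stand-in for BuildConfig.
data Config = Config
  { unregisterised :: Bool
  , withZstd       :: Bool
  }

configureFlags :: Config -> String
configureFlags c = unwords $
     ["--enable-unregisterised"            | unregisterised c]  -- only when set
  ++ ["--enable-ipe-data-compression"      | withZstd c]        -- only when set
  ++ ["--enable-strict-ghc-toolchain-check"]                    -- unconditional

main :: IO ()
main = putStrLn (configureFlags (Config True False))
-- prints "--enable-unregisterised --enable-strict-ghc-toolchain-check"
```

A comprehension `[x | cond]` yields `[x]` when `cond` holds and `[]` otherwise, so each flag list contributes at most one element before `unwords` joins them.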
```diff
@@ -168,13 +175,19 @@ mkJobFlavour BuildConfig{..} = Flavour buildFlavour opts
     [FullyStatic | fullyStatic] ++
     [ThreadSanitiser | threadSanitiser] ++
     [NoSplitSections | noSplitSections, buildFlavour == Release ] ++
-    [BootNonmovingGc | validateNonmovingGc ]
+    [BootNonmovingGc | validateNonmovingGc ] ++
+    [TextWithSIMDUTF | textWithSIMDUTF]
 data Flavour = Flavour BaseFlavour [FlavourTrans]
-data FlavourTrans
-  = Llvm | Dwarf | FullyStatic | ThreadSanitiser | NoSplitSections
+data FlavourTrans =
+    Llvm
+  | Dwarf
+  | FullyStatic
+  | ThreadSanitiser
+  | NoSplitSections
   | BootNonmovingGc
+  | TextWithSIMDUTF
 data BaseFlavour = Release | Validate | SlowValidate deriving Eq
@@ -192,6 +205,7 @@ vanilla = BuildConfig
   , llvmBootstrap = False
   , withAssertions = False
   , withNuma = False
+  , withZstd = False
   , crossTarget = Nothing
   , crossEmulator = NoEmulator
   , configureWrapper = Nothing
@@ -200,6 +214,7 @@ vanilla = BuildConfig
   , threadSanitiser = False
   , noSplitSections = False
   , validateNonmovingGc = False
+  , textWithSIMDUTF = False
   }
 splitSectionsBroken :: BuildConfig -> BuildConfig
@@ -224,6 +239,9 @@ debug = vanilla { buildFlavour = SlowValidate
   , withNuma = True
   }
+zstdIpe :: BuildConfig
+zstdIpe = vanilla { withZstd = True }
 static :: BuildConfig
 static = vanilla { fullyStatic = True }
```
```diff
@@ -272,15 +290,19 @@ tags arch opsys _bc = [runnerTag arch opsys] -- Tag for which runners we can use
 -- These names are used to find the docker image so they have to match what is
 -- in the docker registry.
 distroName :: LinuxDistro -> String
-distroName Debian11 = "deb11"
+distroName Debian12 = "deb12"
+distroName Debian11 = "deb11"
+distroName Debian11Js = "deb11-emsdk-closure"
 distroName Debian10 = "deb10"
 distroName Debian9 = "deb9"
 distroName Fedora33 = "fedora33"
+distroName Fedora38 = "fedora38"
 distroName Ubuntu1804 = "ubuntu18_04"
 distroName Ubuntu2004 = "ubuntu20_04"
 distroName Centos7 = "centos7"
-distroName Alpine = "alpine3_12"
-distroName AlpineWasm = "alpine3_17-wasm"
+distroName Alpine312 = "alpine3_12"
+distroName Alpine318 = "alpine3_18"
+distroName AlpineWasm = "alpine3_18-wasm"
 distroName Rocky8 = "rocky8"
 opsysName :: Opsys -> String
@@ -307,24 +329,26 @@ testEnv arch opsys bc = intercalate "-" $
  ++ ["int_" ++ bignumString (bignumBackend bc) | bignumBackend bc /= Gmp]
  ++ ["unreg" | unregisterised bc ]
  ++ ["numa" | withNuma bc ]
+ ++ ["zstd" | withZstd bc ]
  ++ ["no_tntc" | not (tablesNextToCode bc) ]
  ++ ["cross_"++triple | Just triple <- pure $ crossTarget bc ]
  ++ [flavourString (mkJobFlavour bc)]
 -- | The hadrian flavour string we are going to use for this build
 flavourString :: Flavour -> String
-flavourString (Flavour base trans) = baseString base ++ concatMap (("+" ++) . flavourString) trans
+flavourString (Flavour base trans) = base_string base ++ concatMap (("+" ++) . flavour_string) trans
   where
-    baseString Release = "release"
-    baseString Validate = "validate"
-    baseString SlowValidate = "slow-validate"
-    flavourString Llvm = "llvm"
-    flavourString Dwarf = "debug_info"
-    flavourString FullyStatic = "fully_static"
-    flavourString ThreadSanitiser = "thread_sanitizer"
-    flavourString NoSplitSections = "no_split_sections"
-    flavourString BootNonmovingGc = "boot_nonmoving_gc"
+    base_string Release = "release"
+    base_string Validate = "validate"
+    base_string SlowValidate = "slow-validate"
+    flavour_string Llvm = "llvm"
+    flavour_string Dwarf = "debug_info"
+    flavour_string FullyStatic = "fully_static"
+    flavour_string ThreadSanitiser = "thread_sanitizer_cmm"
+    flavour_string NoSplitSections = "no_split_sections"
+    flavour_string BootNonmovingGc = "boot_nonmoving_gc"
+    flavour_string TextWithSIMDUTF = "text_simdutf"
 -- The path to the docker image (just for linux builders)
 dockerImage :: Arch -> Opsys -> Maybe String
```
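The renamed helpers keep the flavour-string scheme unchanged: a base flavour name followed by `+`-separated transformer suffixes. A minimal self-contained sketch with made-up constructors (not the GHC source):

```haskell
-- Sketch of the flavourString scheme: base name plus "+"-joined suffixes.
data Base  = Release | Validate
data Trans = Llvm | NoSplitSections

flavourString :: Base -> [Trans] -> String
flavourString b ts = base_string b ++ concatMap (("+" ++) . trans_string) ts
  where
    base_string Release  = "release"
    base_string Validate = "validate"
    trans_string Llvm            = "llvm"
    trans_string NoSplitSections = "no_split_sections"

main :: IO ()
main = putStrLn (flavourString Validate [Llvm, NoSplitSections])
-- prints "validate+llvm+no_split_sections"
```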
```diff
@@ -380,8 +404,8 @@ opsysVariables _ FreeBSD13 = mconcat
   -- [1] https://www.freebsd.org/doc/en/books/porters-handbook/using-iconv.html)
     "CONFIGURE_ARGS" =: "--with-gmp-includes=/usr/local/include --with-gmp-libraries=/usr/local/lib --with-iconv-includes=/usr/local/include --with-iconv-libraries=/usr/local/lib"
   , "HADRIAN_ARGS" =: "--docs=no-sphinx"
-  , "GHC_VERSION" =: "9.4.3"
-  , "CABAL_INSTALL_VERSION" =: "3.8.1.0"
+  , "GHC_VERSION" =: "9.6.4"
+  , "CABAL_INSTALL_VERSION" =: "3.10.2.0"
   ]
 opsysVariables _ (Linux distro) = distroVariables distro
 opsysVariables AArch64 (Darwin {}) =
@@ -390,47 +414,47 @@ opsysVariables AArch64 (Darwin {}) =
   , "LANG" =: "en_US.UTF-8"
   , "CONFIGURE_ARGS" =: "--with-intree-gmp --with-system-libffi"
   -- Fonts can't be installed on darwin
-  , "HADRIAN_ARGS" =: "--docs=no-sphinx"
+  , "HADRIAN_ARGS" =: "--docs=no-sphinx-pdfs"
   ]
 opsysVariables Amd64 (Darwin {}) =
   mconcat [ "NIX_SYSTEM" =: "x86_64-darwin"
-          , "MACOSX_DEPLOYMENT_TARGET" =: "10.10"
+          , "MACOSX_DEPLOYMENT_TARGET" =: "10.13"
          -- "# Only Sierra and onwards supports clock_gettime. See #12858"
          , "ac_cv_func_clock_gettime" =: "no"
          -- # Only newer OS Xs support utimensat. See #17895
          , "ac_cv_func_utimensat" =: "no"
+         -- # Only newer OS Xs support futimens. See #22938
+         , "ac_cv_func_futimens" =: "no"
          , "LANG" =: "en_US.UTF-8"
          , "CONFIGURE_ARGS" =: "--with-intree-gmp --with-system-libffi"
          -- Fonts can't be installed on darwin
-         , "HADRIAN_ARGS" =: "--docs=no-sphinx"
+         , "HADRIAN_ARGS" =: "--docs=no-sphinx-pdfs"
          ]
 opsysVariables _ (Windows {}) =
   mconcat [ "MSYSTEM" =: "CLANG64"
-          , "HADRIAN_ARGS" =: "--docs=no-sphinx"
           , "LANG" =: "en_US.UTF-8"
-          , "CABAL_INSTALL_VERSION" =: "3.8.1.0"
-          , "GHC_VERSION" =: "9.4.3" ]
+          , "CABAL_INSTALL_VERSION" =: "3.10.2.0"
+          , "HADRIAN_ARGS" =: "--docs=no-sphinx-pdfs"
+          , "GHC_VERSION" =: "9.6.4" ]
 opsysVariables _ _ = mempty
-distroVariables :: LinuxDistro -> Variables
-distroVariables Alpine = mconcat
+alpineVariables = mconcat
   [ -- Due to #20266
    "CONFIGURE_ARGS" =: "--disable-ld-override"
  , "INSTALL_CONFIGURE_ARGS" =: "--disable-ld-override"
+ , "HADRIAN_ARGS" =: "--docs=no-sphinx"
  -- encoding004: due to lack of locale support
  -- T10458, ghcilink002: due to #17869
- -- linker_unload_native: due to musl not supporting any means of probing dynlib dependencies
- -- (see Note [Object unloading]).
- , "BROKEN_TESTS" =: "encoding004 T10458 linker_unload_native"
+ , "BROKEN_TESTS" =: "encoding004 T10458"
  ]
+distroVariables :: LinuxDistro -> Variables
+distroVariables Alpine312 = alpineVariables
+distroVariables Alpine318 = alpineVariables
 distroVariables Centos7 = mconcat [
   "HADRIAN_ARGS" =: "--docs=no-sphinx"
-  ]
-distroVariables Rocky8 = mconcat [
-  "HADRIAN_ARGS" =: "--docs=no-sphinx"
+  , "BROKEN_TESTS" =: "T22012" -- due to #23979
   ]
 distroVariables Fedora33 = mconcat
   -- LLC/OPT do not work for some reason in our fedora images
```
```diff
@@ -494,21 +518,34 @@ instance ToJSON ArtifactsWhen where
 -----------------------------------------------------------------------------
 -- Data structure which records the condition when a job is run.
-data OnOffRules = OnOffRules { rule_set :: Set Rule -- ^ The set of enabled rules
+data OnOffRules = OnOffRules { rule_set :: Rule -- ^ The enabled rules
                              , when :: ManualFlag -- ^ The additional condition about when to run this job.
                              }
--- The initial set of rules where all rules are disabled and the job is always run.
+-- The initial set of rules, which assumes a Validate pipeline which is run with FullCI.
 emptyRules :: OnOffRules
-emptyRules = OnOffRules S.empty OnSuccess
+emptyRules = OnOffRules (ValidateOnly (S.singleton FullCI)) OnSuccess
 -- When to run the job
 data ManualFlag = Manual -- ^ Only run the job when explicitly triggered by a user
                 | OnSuccess -- ^ Always run it, if the rules pass (the default)
   deriving Eq
-enableRule :: Rule -> OnOffRules -> OnOffRules
-enableRule r (OnOffRules o m) = OnOffRules (S.insert r o) m
+setRule :: Rule -> OnOffRules -> OnOffRules
+setRule r (OnOffRules _ m) = OnOffRules r m
+enableValidateRule :: ValidateRule -> OnOffRules -> OnOffRules
+enableValidateRule r = modifyValidateRules (S.insert r)
+onlyValidateRule :: ValidateRule -> OnOffRules -> OnOffRules
+onlyValidateRule r = modifyValidateRules (const (S.singleton r))
+removeValidateRule :: ValidateRule -> OnOffRules -> OnOffRules
+removeValidateRule r = modifyValidateRules (S.delete r)
+modifyValidateRules :: (S.Set ValidateRule -> S.Set ValidateRule) -> OnOffRules -> OnOffRules
+modifyValidateRules f (OnOffRules (ValidateOnly rs) m) = OnOffRules (ValidateOnly (f rs)) m
+modifyValidateRules _ r = error $ "Applying validate rule to nightly/release job:" ++ show (rule_set r)
 manualRule :: OnOffRules -> OnOffRules
 manualRule rules = rules { when = Manual }
```
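The reworked `OnOffRules` machinery distinguishes release and nightly jobs from validate jobs, the latter gated by a set of label-driven `ValidateRule`s. A simplified self-contained model of that idea (made-up constructors, not the GHC source):

```haskell
import qualified Data.Set as S

-- Sketch of the rule model: a job runs either only in release/nightly
-- pipelines, or in validate pipelines when any of a set of rules fires.
data ValidateRule = FullCI | WasmBackend deriving (Eq, Ord, Show)

data Rule
  = ReleaseOnly
  | Nightly
  | ValidateOnly (S.Set ValidateRule)
  deriving (Eq, Show)

-- Enabling a validate rule only makes sense for validate jobs; the real
-- generator raises an error when applied to nightly/release jobs, while
-- this sketch leaves such rules unchanged.
enableValidateRule :: ValidateRule -> Rule -> Rule
enableValidateRule v (ValidateOnly vs) = ValidateOnly (S.insert v vs)
enableValidateRule _ r = r

rulesOf :: Rule -> [ValidateRule]
rulesOf (ValidateOnly vs) = S.toList vs
rulesOf _ = []

main :: IO ()
main = print (rulesOf (enableValidateRule WasmBackend (ValidateOnly (S.singleton FullCI))))
-- prints [FullCI,WasmBackend]
```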
```diff
@@ -517,10 +554,19 @@ manualRule rules = rules { when = Manual }
 -- For example, even if you don't explicitly disable a rule it will end up in the
 -- rule list with the OFF state.
 enumRules :: OnOffRules -> [OnOffRule]
-enumRules o = map lkup rules
+enumRules (OnOffRules r _) = rulesList
   where
-    enabled_rules = rule_set o
-    lkup r = OnOffRule (if S.member r enabled_rules then On else Off) r
+    rulesList = case r of
+      ValidateOnly rs -> [ OnOffRule On (ValidateOnly rs)
+                         , OnOffRule Off ReleaseOnly
+                         , OnOffRule Off Nightly ]
+      Nightly -> [ OnOffRule Off (ValidateOnly S.empty)
+                 , OnOffRule Off ReleaseOnly
+                 , OnOffRule On Nightly ]
+      ReleaseOnly -> [ OnOffRule Off (ValidateOnly S.empty)
+                     , OnOffRule On ReleaseOnly
+                     , OnOffRule Off Nightly ]
 data OnOffRule = OnOffRule OnOff Rule
@@ -542,19 +588,32 @@ instance ToJSON OnOffRules where
   where
     one_rule (OnOffRule onoff r) = ruleString onoff r
-    parens s = "(" ++ s ++ ")"
-    and_all rs = intercalate " && " (map parens rs)
+parens :: [Char] -> [Char]
+parens s = "(" ++ s ++ ")"
+and_all :: [[Char]] -> [Char]
+and_all rs = intercalate " && " (map parens rs)
+or_all :: [[Char]] -> [Char]
+or_all rs = intercalate " || " (map parens rs)
 -- | A Rule corresponds to some condition which must be satisifed in order to
 -- run the job.
-data Rule = FastCI -- ^ Run this job when the fast-ci label is set
-          | ReleaseOnly -- ^ Only run this job in a release pipeline
+data Rule = ReleaseOnly -- ^ Only run this job in a release pipeline
           | Nightly -- ^ Only run this job in the nightly pipeline
-          | LLVMBackend -- ^ Only run this job when the "LLVM backend" label is present
-          | FreeBSDLabel -- ^ Only run this job when the "FreeBSD" label is set.
-          | NonmovingGc -- ^ Only run this job when the "non-moving GC" label is set.
-          | Disable -- ^ Don't run this job.
-          deriving (Bounded, Enum, Ord, Eq)
+          | ValidateOnly (S.Set ValidateRule) -- ^ Only run this job in a validate pipeline, when any of these rules are enabled.
+          deriving (Show, Ord, Eq)
+data ValidateRule =
+            FullCI -- ^ Run this job when the "full-ci" label is present.
+          | LLVMBackend -- ^ Run this job when the "LLVM backend" label is present
+          | JSBackend -- ^ Run this job when the "javascript" label is present
+          | WasmBackend -- ^ Run this job when the "wasm" label is present
+          | FreeBSDLabel -- ^ Run this job when the "FreeBSD" label is set.
+          | NonmovingGc -- ^ Run this job when the "non-moving GC" label is set.
+          | IpeData -- ^ Run this job when the "IPE" label is set
+          | TestPrimops -- ^ Run this job when "test-primops" label is set
+          deriving (Show, Enum, Bounded, Ord, Eq)
 -- A constant evaluating to True because gitlab doesn't support "true" in the
 -- expression language.
```
```diff
@@ -562,29 +621,45 @@ true :: String
 true = "\"true\" == \"true\""
 -- A constant evaluating to False because gitlab doesn't support "true" in the
 -- expression language.
-false :: String
-false = "\"disabled\" != \"disabled\""
+_false :: String
+_false = "\"disabled\" != \"disabled\""
 -- Convert the state of the rule into a string that gitlab understand.
 ruleString :: OnOff -> Rule -> String
-ruleString On FastCI = true
-ruleString Off FastCI = "$CI_MERGE_REQUEST_LABELS !~ /.*fast-ci.*/"
-ruleString On LLVMBackend = "$CI_MERGE_REQUEST_LABELS =~ /.*LLVM backend.*/"
-ruleString Off LLVMBackend = true
-ruleString On FreeBSDLabel = "$CI_MERGE_REQUEST_LABELS =~ /.*FreeBSD.*/"
-ruleString Off FreeBSDLabel = true
-ruleString On NonmovingGc = "$CI_MERGE_REQUEST_LABELS =~ /.*non-moving GC.*/"
-ruleString Off NonmovingGc = true
+ruleString On (ValidateOnly vs) =
+  case S.toList vs of
+    [] -> true
+    conds -> or_all (map validateRuleString conds)
+ruleString Off (ValidateOnly {}) = true
 ruleString On ReleaseOnly = "$RELEASE_JOB == \"yes\""
 ruleString Off ReleaseOnly = "$RELEASE_JOB != \"yes\""
 ruleString On Nightly = "$NIGHTLY"
 ruleString Off Nightly = "$NIGHTLY == null"
-ruleString On Disable = false
-ruleString Off Disable = true
--- Enumeration of all the rules
-rules :: [Rule]
-rules = [minBound .. maxBound]
+labelString :: String -> String
+labelString s = "$CI_MERGE_REQUEST_LABELS =~ /.*" ++ s ++ ".*/"
+branchStringExact :: String -> String
+branchStringExact s = "$CI_COMMIT_BRANCH == \"" ++ s ++ "\""
+branchStringLike :: String -> String
+branchStringLike s = "$CI_COMMIT_BRANCH =~ /" ++ s ++ "/"
+validateRuleString :: ValidateRule -> String
+validateRuleString FullCI = or_all ([ labelString "full-ci"
+                                    , labelString "marge_bot_batch_merge_job"
+                                    , branchStringExact "master"
+                                    , branchStringLike "ghc-[0-9]+\\.[0-9]+"
+                                    ])
+validateRuleString LLVMBackend = labelString "LLVM backend"
+validateRuleString JSBackend = labelString "javascript"
+validateRuleString WasmBackend = labelString "wasm"
+validateRuleString FreeBSDLabel = labelString "FreeBSD"
+validateRuleString NonmovingGc = labelString "non-moving GC"
+validateRuleString IpeData = labelString "IPE"
+validateRuleString TestPrimops = labelString "test-primops"
 -- | A 'Job' is the description of a single job in a gitlab pipeline. The
 -- job contains all the information about how to do the build but can be further
```
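The new `labelString` and `or_all` helpers assemble GitLab `rules:` expressions as plain strings: each label check is a regex match against `CI_MERGE_REQUEST_LABELS`, and alternatives are parenthesised and joined with `||`. A self-contained sketch of the pattern (simplified, not the GHC source):

```haskell
import Data.List (intercalate)

-- Sketch of string-level GitLab rule construction, mirroring
-- labelString / parens / or_all in the diff.
labelString :: String -> String
labelString s = "$CI_MERGE_REQUEST_LABELS =~ /.*" ++ s ++ ".*/"

parens :: String -> String
parens s = "(" ++ s ++ ")"

or_all :: [String] -> String
or_all = intercalate " || " . map parens

main :: IO ()
main = putStrLn (or_all [labelString "full-ci", labelString "wasm"])
-- prints "($CI_MERGE_REQUEST_LABELS =~ /.*full-ci.*/) || ($CI_MERGE_REQUEST_LABELS =~ /.*wasm.*/)"
```

Building the expression as a string (rather than structured data) works here because GitLab only ever consumes the final rendered rule.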
```diff
@@ -658,10 +733,12 @@ job arch opsys buildConfig = NamedJob { name = jobName, jobInfo = Job {..} }
     jobAfterScript
       | Windows <- opsys =
         [ "bash .gitlab/ci.sh save_cache"
+        , "bash .gitlab/ci.sh save_test_output"
        , "bash .gitlab/ci.sh clean"
        ]
      | otherwise =
        [ ".gitlab/ci.sh save_cache"
+       , ".gitlab/ci.sh save_test_output"
        , ".gitlab/ci.sh clean"
        , "cat ci_timings" ]
@@ -684,16 +761,19 @@ job arch opsys buildConfig = NamedJob { name = jobName, jobInfo = Job {..} }
            Emulator s -> "CROSS_EMULATOR" =: s
            NoEmulatorNeeded -> mempty
        , if withNuma buildConfig then "ENABLE_NUMA" =: "1" else mempty
-       , if validateNonmovingGc buildConfig
-           then "RUNTEST_ARGS" =: "--way=nonmoving --way=nonmoving_thr --way=nonmoving_thr_sanity"
-           else mempty
+       , let runtestArgs =
+               [ "--way=nonmoving --way=nonmoving_thr --way=nonmoving_thr_sanity"
+               | validateNonmovingGc buildConfig
+               ]
+         in "RUNTEST_ARGS" =: unwords runtestArgs
        ]
     jobArtifacts = Artifacts
       { junitReport = "junit.xml"
       , expireIn = "2 weeks"
       , artifactPaths = [binDistName arch opsys buildConfig ++ ".tar.xz"
-                        ,"junit.xml"]
+                        ,"junit.xml"
+                        ,"unexpected-test-output.tar.gz"]
       , artifactsWhen = ArtifactsAlways
       }
```
```diff
@@ -724,16 +804,28 @@ modifyJobs = fmap
 -- | Modify just the validate jobs in a 'JobGroup'
 modifyValidateJobs :: (a -> a) -> JobGroup a -> JobGroup a
-modifyValidateJobs f jg = jg { v = f <$> v jg }
+modifyValidateJobs f jg = jg { v = fmap f <$> v jg }
 -- | Modify just the nightly jobs in a 'JobGroup'
 modifyNightlyJobs :: (a -> a) -> JobGroup a -> JobGroup a
-modifyNightlyJobs f jg = jg { n = f <$> n jg }
+modifyNightlyJobs f jg = jg { n = fmap f <$> n jg }
 -- Generic helpers
-addJobRule :: Rule -> Job -> Job
-addJobRule r j = j { jobRules = enableRule r (jobRules j) }
+setJobRule :: Rule -> Job -> Job
+setJobRule r j = j { jobRules = setRule r (jobRules j) }
+addValidateJobRule :: ValidateRule -> Job -> Job
+addValidateJobRule r = modifyValidateJobRule (enableValidateRule r)
+onlyValidateJobRule :: ValidateRule -> Job -> Job
+onlyValidateJobRule r = modifyValidateJobRule (onlyValidateRule r)
+removeValidateJobRule :: ValidateRule -> Job -> Job
+removeValidateJobRule r = modifyValidateJobRule (removeValidateRule r)
+modifyValidateJobRule :: (OnOffRules -> OnOffRules) -> Job -> Job
+modifyValidateJobRule f j = j { jobRules = f (jobRules j) }
 addVariable :: String -> String -> Job -> Job
 addVariable k v j = j { jobVariables = mminsertWith (++) k [v] (jobVariables j) }
@@ -750,17 +842,25 @@ delVariable k j = j { jobVariables = MonoidalMap $ Map.delete k $ unMonoidalMap
 validate :: Arch -> Opsys -> BuildConfig -> NamedJob Job
 validate = job
+-- Nightly and release apply the FastCI configuration to all jobs so that they all run in
+-- the pipeline (not conditional on the full-ci label)
+nightlyRule :: Job -> Job
+nightlyRule = setJobRule Nightly
+releaseRule :: Job -> Job
+releaseRule = setJobRule ReleaseOnly
 -- | Make a normal nightly CI job
 nightly :: Arch -> Opsys -> BuildConfig -> NamedJob Job
 nightly arch opsys bc =
   let NamedJob n j = job arch opsys bc
-  in NamedJob { name = "nightly-" ++ n, jobInfo = addJobRule Nightly . keepArtifacts "8 weeks" . highCompression $ j}
+  in NamedJob { name = "nightly-" ++ n, jobInfo = nightlyRule . keepArtifacts "8 weeks" . highCompression $ j}
 -- | Make a normal release CI job
 release :: Arch -> Opsys -> BuildConfig -> NamedJob Job
 release arch opsys bc =
   let NamedJob n j = job arch opsys (bc { buildFlavour = Release })
-  in NamedJob { name = "release-" ++ n, jobInfo = addJobRule ReleaseOnly . keepArtifacts "1 year" . ignorePerfFailures . useHashUnitIds . highCompression $ j}
+  in NamedJob { name = "release-" ++ n, jobInfo = releaseRule . keepArtifacts "1 year" . ignorePerfFailures . useHashUnitIds . highCompression $ j}
 -- Specific job modification functions
```
```diff
@@ -789,8 +889,9 @@ useHashUnitIds :: Job -> Job
 useHashUnitIds = addVariable "HADRIAN_ARGS" "--hash-unit-ids"
 -- | Mark the validate job to run in fast-ci mode
+-- This is default way, to enable all jobs you have to apply the `full-ci` label.
 fastCI :: JobGroup Job -> JobGroup Job
-fastCI = modifyValidateJobs (addJobRule FastCI)
+fastCI = modifyValidateJobs (removeValidateJobRule FullCI)
 -- | Mark a group of jobs as allowed to fail.
 allowFailureGroup :: JobGroup Job -> JobGroup Job
@@ -798,15 +899,19 @@ allowFailureGroup = modifyJobs allowFailure
 -- | Add a 'Rule' to just the validate job, for example, only run a job if a certain
 -- label is set.
-addValidateRule :: Rule -> JobGroup Job -> JobGroup Job
-addValidateRule t = modifyValidateJobs (addJobRule t)
+addValidateRule :: ValidateRule -> JobGroup Job -> JobGroup Job
+addValidateRule t = modifyValidateJobs (addValidateJobRule t)
+-- | Only run a validate job if a certain rule is enabled
+onlyRule :: ValidateRule -> JobGroup Job -> JobGroup Job
+onlyRule t = modifyValidateJobs (onlyValidateJobRule t)
 -- | Don't run the validate job, normally used to alleviate CI load by marking
 -- jobs which are unlikely to fail (ie different linux distros)
 disableValidate :: JobGroup Job -> JobGroup Job
-disableValidate = addValidateRule Disable
+disableValidate st = st { v = Nothing }
-data NamedJob a = NamedJob { name :: String, jobInfo :: a } deriving Functor
+data NamedJob a = NamedJob { name :: String, jobInfo :: a } deriving (Show, Functor)
 renameJob :: (String -> String) -> NamedJob a -> NamedJob a
 renameJob f (NamedJob n i) = NamedJob (f n) i
```
...@@ -816,31 +921,32 @@ instance ToJSON a => ToJSON (NamedJob a) where ...@@ -816,31 +921,32 @@ instance ToJSON a => ToJSON (NamedJob a) where
[ "name" A..= name nj [ "name" A..= name nj
, "jobInfo" A..= jobInfo nj ] , "jobInfo" A..= jobInfo nj ]
--data NamedJobGroup a = NamedJobGroup { platform :: String, jg :: JobGroup a }
-- Jobs are grouped into either triples or pairs depending on whether the -- Jobs are grouped into either triples or pairs depending on whether the
-- job is just validate and nightly, or also release. -- job is just validate and nightly, or also release.
data JobGroup a = StandardTriple { v :: NamedJob a data JobGroup a = StandardTriple { v :: Maybe (NamedJob a)
, n :: NamedJob a , n :: Maybe (NamedJob a)
, r :: NamedJob a } , r :: Maybe (NamedJob a) } deriving (Functor, Show)
| ValidateOnly { v :: NamedJob a
, n :: NamedJob a } deriving Functor
instance ToJSON a => ToJSON (JobGroup a) where instance ToJSON a => ToJSON (JobGroup a) where
toJSON jg = object toJSON StandardTriple{..} = object
[ "n" A..= n jg [ "v" A..= v
, "r" A..= r jg , "n" A..= n
, "r" A..= r
] ]
rename :: (String -> String) -> JobGroup a -> JobGroup a rename :: (String -> String) -> JobGroup a -> JobGroup a
rename f (StandardTriple nv nn nr) = StandardTriple (renameJob f nv) (renameJob f nn) (renameJob f nr) rename f (StandardTriple nv nn nr) = StandardTriple (renameJob f <$> nv) (renameJob f <$> nn) (renameJob f <$> nr)
rename f (ValidateOnly nv nn) = ValidateOnly (renameJob f nv) (renameJob f nn)
-- | Construct a 'JobGroup' which consists of a validate, nightly and release build with
-- a specific config.
standardBuildsWithConfig :: Arch -> Opsys -> BuildConfig -> JobGroup Job
standardBuildsWithConfig a op bc =
  StandardTriple (Just (validate a op bc))
                 (Just (nightly a op bc))
                 (Just (release a op bc))
-- | Construct a 'JobGroup' which consists of validate, nightly and release builds with
-- the 'vanilla' config.
@@ -850,11 +956,12 @@ standardBuilds a op = standardBuildsWithConfig a op vanilla
-- | Construct a 'JobGroup' which just consists of a validate and nightly build. We don't
-- produce releases for these jobs.
validateBuilds :: Arch -> Opsys -> BuildConfig -> JobGroup Job
validateBuilds a op bc = StandardTriple { v = Just (validate a op bc)
                                        , n = Just (nightly a op bc)
                                        , r = Nothing }
flattenJobGroup :: JobGroup a -> [(String, a)]
flattenJobGroup (StandardTriple a b c) = map flattenNamedJob (catMaybes [a, b, c])
flattenNamedJob :: NamedJob a -> (String, a)
flattenNamedJob (NamedJob n i) = (n, i)
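As a quick illustration of the grouping scheme above, here is a minimal, self-contained sketch (types copied in simplified form, job names made up for the example, not taken from the real job set): a group whose release slot is `Nothing` flattens to only its validate and nightly entries, because `catMaybes` drops the empty slot before the names are paired with their payloads.

```haskell
import Data.Maybe (catMaybes)

data NamedJob a = NamedJob { name :: String, jobInfo :: a }

data JobGroup a = StandardTriple { v :: Maybe (NamedJob a)
                                 , n :: Maybe (NamedJob a)
                                 , r :: Maybe (NamedJob a) }

flattenNamedJob :: NamedJob a -> (String, a)
flattenNamedJob (NamedJob nm i) = (nm, i)

-- catMaybes removes the unset slots, so a validate-only group yields two pairs.
flattenJobGroup :: JobGroup a -> [(String, a)]
flattenJobGroup (StandardTriple a b c) = map flattenNamedJob (catMaybes [a, b, c])

main :: IO ()
main = print (map fst (flattenJobGroup validateOnlyGroup))
  where
    -- Hypothetical group with no release job (r = Nothing).
    validateOnlyGroup = StandardTriple (Just (NamedJob "x86_64-linux-deb10-validate" ()))
                                       (Just (NamedJob "nightly-x86_64-linux-deb10" ()))
                                       Nothing
```

This is why `jobs` below can simply `concatMap flattenJobGroup` over all groups without special-casing groups that lack a release build.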
@@ -862,26 +969,24 @@ flattenNamedJob (NamedJob n i) = (n, i)
-- | Specification for all the jobs we want to build.
jobs :: Map String Job
jobs = Map.fromList $ concatMap flattenJobGroup job_groups
job_groups :: [JobGroup Job]
job_groups =
  [ disableValidate (standardBuilds Amd64 (Linux Debian10))
  , addValidateRule TestPrimops (standardBuildsWithConfig Amd64 (Linux Debian10) dwarf)
  , validateBuilds Amd64 (Linux Debian10) nativeInt
  , validateBuilds Amd64 (Linux Debian10) unreg
  , fastCI (validateBuilds Amd64 (Linux Debian10) debug)
  , -- More work is needed to address TSAN failures: #22520
    modifyNightlyJobs allowFailure
      (modifyValidateJobs (allowFailure . manual) tsan_jobs)
  , -- Nightly allowed to fail: #22343
    modifyNightlyJobs allowFailure
      (modifyValidateJobs manual (validateBuilds Amd64 (Linux Debian10) noTntc))
  , onlyRule LLVMBackend (validateBuilds Amd64 (Linux Debian12) llvm)
  , disableValidate (standardBuilds Amd64 (Linux Debian11))
  , disableValidate (standardBuilds Amd64 (Linux Debian12))
  -- We still build Deb9 bindists for now due to Ubuntu 18 and Linux Mint 19
  -- not being at EOL until April 2023 and they still need tinfo5.
  , disableValidate (standardBuildsWithConfig Amd64 (Linux Debian9) (splitSectionsBroken vanilla))
@@ -891,42 +996,50 @@ job_groups =
  , disableValidate (standardBuildsWithConfig Amd64 (Linux Centos7) (splitSectionsBroken vanilla))
  -- Fedora33 job is always built with perf so there's one job in the normal
  -- validate pipeline which is built with perf.
  , fastCI (standardBuildsWithConfig Amd64 (Linux Fedora33) releaseConfig)
  -- This job is only for generating head.hackage docs
  , hackage_doc_job (disableValidate (standardBuildsWithConfig Amd64 (Linux Fedora33) releaseConfig))
  , disableValidate (standardBuildsWithConfig Amd64 (Linux Fedora33) dwarf)
  , disableValidate (standardBuilds Amd64 (Linux Fedora38))
  , fastCI (standardBuildsWithConfig Amd64 Windows vanilla)
  , disableValidate (standardBuildsWithConfig Amd64 Windows nativeInt)
  , addValidateRule TestPrimops (standardBuilds Amd64 Darwin)
  , fastCI (standardBuilds AArch64 Darwin)
  , fastCI (standardBuildsWithConfig AArch64 (Linux Debian10) (splitSectionsBroken vanilla))
  , disableValidate (standardBuildsWithConfig AArch64 (Linux Debian11) (splitSectionsBroken vanilla))
  , onlyRule LLVMBackend (validateBuilds AArch64 (Linux Debian12) llvm)
  , standardBuildsWithConfig I386 (Linux Debian10) (splitSectionsBroken vanilla)
  -- Fully static build, in theory usable on any linux distribution.
  , fullyStaticBrokenTests (standardBuildsWithConfig Amd64 (Linux Alpine312) (splitSectionsBroken static))
  -- Dynamically linked build, suitable for building your own static executables on alpine
  , disableValidate (standardBuildsWithConfig Amd64 (Linux Alpine312) (splitSectionsBroken vanilla))
  , disableValidate (standardBuildsWithConfig AArch64 (Linux Alpine318) (splitSectionsBroken vanilla))
  , disableValidate (standardBuildsWithConfig Amd64 (Linux Alpine318) (splitSectionsBroken vanilla))
  , fullyStaticBrokenTests (disableValidate (allowFailureGroup (standardBuildsWithConfig Amd64 (Linux Alpine312) staticNativeInt)))
  , validateBuilds Amd64 (Linux Debian11) (crossConfig "aarch64-linux-gnu" (Emulator "qemu-aarch64 -L /usr/aarch64-linux-gnu") Nothing)
  , addValidateRule JSBackend (validateBuilds Amd64 (Linux Debian11Js) javascriptConfig)
  , make_wasm_jobs wasm_build_config
  , modifyValidateJobs manual $
      make_wasm_jobs wasm_build_config {bignumBackend = Native}
  , modifyValidateJobs manual $
      make_wasm_jobs wasm_build_config {unregisterised = True}
  , onlyRule NonmovingGc (validateBuilds Amd64 (Linux Debian11) vanilla {validateNonmovingGc = True})
  , onlyRule IpeData (validateBuilds Amd64 (Linux Debian10) zstdIpe)
  ]
  where
    javascriptConfig = (crossConfig "javascript-unknown-ghcjs" (Emulator "js-emulator") (Just "emconfigure"))
                         { bignumBackend = Native }

    -- ghcilink002 broken due to #17869
    --
    -- linker_unload_native: due to musl not supporting any means of probing dynlib dependencies
    -- (see Note [Object unloading]).
    fullyStaticBrokenTests = modifyJobs (addVariable "BROKEN_TESTS" "ghcilink002 linker_unload_native")

    hackage_doc_job = rename (<> "-hackage") . modifyJobs (addVariable "HADRIAN_ARGS" "--haddock-for-hackage")

    tsan_jobs =
      modifyJobs
@@ -934,7 +1047,7 @@ job_groups =
        -- Haddock is large enough to make TSAN choke without massive quantities of
        -- memory.
        . addVariable "HADRIAN_ARGS" "--docs=none") $
      validateBuilds Amd64 (Linux Debian12) tsan
    make_wasm_jobs cfg =
      modifyJobs
@@ -942,13 +1055,14 @@ job_groups =
        . setVariable "HADRIAN_ARGS" "--docs=none"
        . delVariable "INSTALL_CONFIGURE_ARGS"
        )
        $ addValidateRule WasmBackend
        $ validateBuilds Amd64 (Linux AlpineWasm) cfg

    wasm_build_config =
      (crossConfig "wasm32-wasi" NoEmulatorNeeded Nothing)
        { fullyStatic = True
        , buildFlavour = Release -- TODO: This needs to be validate but wasm backend doesn't pass yet
        , textWithSIMDUTF = True
        }
@@ -965,27 +1079,55 @@ mkPlatform arch opsys = archName arch <> "-" <> opsysName opsys
-- * Prefer jobs which have a corresponding release pipeline
-- * Explicitly require tie-breaking for other cases.
platform_mapping :: Map String (JobGroup BindistInfo)
platform_mapping = Map.map go combined_result
  where
    whitelist = [ "x86_64-linux-alpine3_12-validate"
                , "x86_64-linux-deb11-validate"
                , "x86_64-linux-deb12-validate"
                , "x86_64-linux-deb10-validate+debug_info"
                , "x86_64-linux-fedora33-release"
                , "x86_64-windows-validate"
                , "nightly-x86_64-linux-alpine3_18-wasm-cross_wasm32-wasi-release+fully_static+text_simdutf"
                , "nightly-x86_64-linux-deb11-validate"
                , "nightly-x86_64-linux-deb12-validate"
                , "x86_64-linux-alpine3_18-wasm-cross_wasm32-wasi-release+fully_static+text_simdutf"
                , "x86_64-linux-deb12-validate+thread_sanitizer_cmm"
                , "nightly-aarch64-linux-deb10-validate"
                , "nightly-x86_64-linux-alpine3_12-validate"
                , "nightly-x86_64-linux-deb10-validate"
                , "nightly-x86_64-linux-fedora33-release"
                , "nightly-x86_64-windows-validate"
                , "release-x86_64-linux-alpine3_12-release+no_split_sections"
                , "release-x86_64-linux-deb10-release"
                , "release-x86_64-linux-deb11-release"
                , "release-x86_64-linux-deb12-release"
                , "release-x86_64-linux-fedora33-release"
                , "release-x86_64-windows-release"
                ]
    process sel = Map.fromListWith combine
                    [ (uncurry mkPlatform (jobPlatform (jobInfo j)), j) | (sel -> Just j) <- job_groups ]

    vs = process v
    ns = process n
    rs = process r

    all_platforms = Map.keysSet vs <> Map.keysSet ns <> Map.keysSet rs

    combined_result = Map.fromList [ (p, StandardTriple { v = Map.lookup p vs
                                                        , n = Map.lookup p ns
                                                        , r = Map.lookup p rs })
                                   | p <- S.toList all_platforms ]
    combine a b
      | name a `elem` whitelist = a -- Explicitly selected
      | name b `elem` whitelist = b
      | otherwise = error ("Ambiguous platform mapping: " ++ show (name a) ++ " vs " ++ show (name b))

    go = fmap (BindistInfo . unwords . fromJust . mmlookup "BIN_DIST_NAME" . jobVariables)
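The tie-breaking scheme above can be sketched in isolation. This is a minimal, hypothetical example (plain `String`s stand in for `NamedJob`, and the platform and job names are illustrative, not the real job set): `Map.fromListWith` funnels every key collision through `combine`, which keeps whichever name is whitelisted and fails loudly otherwise, forcing an explicit decision for each platform.

```haskell
import qualified Data.Map as Map

-- Hypothetical whitelist: the job we prefer when two jobs share a platform.
whitelist :: [String]
whitelist = ["x86_64-linux-deb11-validate"]

-- Keep the whitelisted name; any unresolved collision is a hard error.
combine :: String -> String -> String
combine a b
  | a `elem` whitelist = a
  | b `elem` whitelist = b
  | otherwise = error ("Ambiguous platform mapping: " ++ show a ++ " vs " ++ show b)

main :: IO ()
main = print (Map.toList (Map.fromListWith combine
          [ ("x86_64-linux", "x86_64-linux-deb11-int_native-validate")
          , ("x86_64-linux", "x86_64-linux-deb11-validate") ]))
```

The error branch is deliberate: adding a new job that collides on a platform breaks CI generation until someone extends the whitelist, rather than silently picking an arbitrary bindist.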
data BindistInfo = BindistInfo { bindistName :: String } deriving Show
instance ToJSON BindistInfo where
  toJSON (BindistInfo n) = object [ "bindistName" A..= n ]
@@ -1000,6 +1142,7 @@ main = do
    ("metadata":as) -> write_result as platform_mapping
    _ -> error "gen_ci.hs <gitlab|metadata> [file.json]"

write_result :: ToJSON a => [FilePath] -> a -> IO ()
write_result as obj =
  (case as of
    [] -> B.putStrLn
...
cabal-version: 3.0
name: generate-ci
version: 0.1.0.0
license: BSD-3-Clause
license-file: LICENSE
build-type: Simple
executable generate-ci
main-is: gen_ci.hs
ghc-options: -Wall
build-depends: base,
containers,
bytestring,
aeson >= 1.8.1
default-language: Haskell2010
#!/usr/bin/env bash
set -e
out_dir="$(git rev-parse --show-toplevel)/.gitlab"
# Update job metadata for ghcup
generate-ci metadata "$out_dir/jobs-metadata.json"
echo "Updated $out_dir/jobs-metadata.json"
#!/usr/bin/env bash
set -e
out_dir="$(git rev-parse --show-toplevel)/.gitlab"
tmp="$(mktemp)"
generate-ci gitlab "$tmp"
rm -f "$out_dir/jobs.yaml"
echo "### THIS IS A GENERATED FILE, DO NOT MODIFY DIRECTLY" > "$out_dir/jobs.yaml"
jq . "$tmp" >> "$out_dir/jobs.yaml"
rm "$tmp"
echo "Updated $out_dir/jobs.yaml"
#! /usr/bin/env nix-shell
#!nix-shell -i bash -p cabal-install "haskell.packages.ghc92.ghcWithPackages (pkgs: with pkgs; [aeson])" git jq
cd "$(dirname "${BASH_SOURCE[0]}")"
cabal run gen_ci -- metadata jobs-metadata.json
#!/usr/bin/env nix-shell
#!nix-shell -i bash -p cabal-install "haskell.packages.ghc92.ghcWithPackages (pkgs: with pkgs; [aeson])" git jq
# shellcheck shell=bash
set -euo pipefail
cd "$(dirname "${BASH_SOURCE[0]}")"
tmp=$(mktemp)
cabal run gen_ci -- gitlab "$tmp"
rm -f jobs.yaml
echo "### THIS IS A GENERATED FILE, DO NOT MODIFY DIRECTLY" > jobs.yaml
jq . "$tmp" | tee -a jobs.yaml