
Compare revisions

Changes are shown as if the source revision was being merged into the target revision.

Commits on Source (333)
Showing changes with 1605 additions and 925 deletions
......@@ -2,11 +2,11 @@ variables:
GIT_SSL_NO_VERIFY: "1"
# Commit of ghc/ci-images repository from which to pull Docker images
DOCKER_REV: 205afce5c7ebdb8666b96638f6758fe527f40a7f
DOCKER_REV: dd01591a50ea4e2aa3c106cf50ca54d38663f912
# Sequential version number of all cached things.
# Bump to invalidate GitLab CI cache.
CACHE_REV: 8
CACHE_REV: 9
# Disable shallow clones; they break our linting rules
GIT_DEPTH: 0
......@@ -17,6 +17,9 @@ variables:
# Overridden by individual jobs
CONFIGURE_ARGS: ""
# Overridden by individual jobs
CONFIGURE_WRAPPER: ""
GIT_SUBMODULE_STRATEGY: "normal"
# Makes ci.sh isolate CABAL_DIR
......@@ -79,9 +82,9 @@ workflow:
# which versions of GHC to allow bootstrap with
.bootstrap_matrix : &bootstrap_matrix
matrix:
- GHC_VERSION: 9.0.2
DOCKER_IMAGE: "registry.gitlab.haskell.org/ghc/ci-images/x86_64-linux-deb10-ghc9_0:$DOCKER_REV"
- GHC_VERSION: 9.2.2
- GHC_VERSION: 9.2.5
DOCKER_IMAGE: "registry.gitlab.haskell.org/ghc/ci-images/x86_64-linux-deb10-ghc9_2:$DOCKER_REV"
- GHC_VERSION: 9.4.3
DOCKER_IMAGE: "registry.gitlab.haskell.org/ghc/ci-images/x86_64-linux-deb10:$DOCKER_REV"
# Allow linters to fail on draft MRs.
......@@ -163,7 +166,9 @@ not-interruptible:
stage: not-interruptible
script: "true"
interruptible: false
image: "registry.gitlab.haskell.org/ghc/ci-images/x86_64-linux-deb10:$DOCKER_REV"
image: "debian:10"
variables:
GIT_STRATEGY: none
tags:
- lint
rules:
......@@ -259,26 +264,20 @@ lint-author:
- *drafts-can-fail-lint
lint-ci-config:
image: "nixos/nix:2.8.0"
image: nixos/nix:2.12.0
extends: .lint
# We don't need history/submodules in this job
variables:
BUILD_FLAVOUR: default
GIT_DEPTH: 1
GIT_SUBMODULE_STRATEGY: none
before_script:
- mkdir -p ~/.cabal
- cp -Rf cabal-cache/* ~/.cabal || true
- echo "experimental-features = nix-command flakes" >> /etc/nix/nix.conf
- nix-channel --update
script:
- nix shell --extra-experimental-features nix-command --extra-experimental-features flakes nixpkgs#cabal-install nixpkgs#ghc -c cabal update
- .gitlab/generate_jobs
# 1 if .gitlab/generate_jobs changed the output of the generated config
- nix shell --extra-experimental-features nix-command --extra-experimental-features flakes nixpkgs#git -c git diff --exit-code
after_script:
- rm -Rf cabal-cache
- cp -Rf ~/.cabal cabal-cache
# 1 if .gitlab/generate_jobs changed the output of the generated config
- nix shell nixpkgs#git -c git diff --exit-code
dependencies: []
cache:
key: lint-ci-$CACHE_REV
paths:
- cabal-cache
lint-submods:
extends: .lint-submods
......@@ -373,6 +372,59 @@ hadrian-ghc-in-ghci:
paths:
- cabal-cache
############################################################
# Hadrian Multi-Repl
############################################################
hadrian-multi:
stage: testing
needs:
- job: x86_64-linux-fedora33-release
optional: true
- job: nightly-x86_64-linux-fedora33-release
optional: true
- job: release-x86_64-linux-fedora33-release
optional: true
dependencies: null
image: "registry.gitlab.haskell.org/ghc/ci-images/x86_64-linux-fedora33:$DOCKER_REV"
before_script:
# workaround for docker permissions
- sudo chown ghc:ghc -R .
variables:
GHC_FLAGS: -Werror
CONFIGURE_ARGS: --enable-bootstrap-with-devel-snapshot
tags:
- x86_64-linux
script:
- export BOOT_HC=$GHC
- root=$(pwd)/ghc
- ls
- |
mkdir tmp
tar -xf ghc-x86_64-linux-fedora33-release.tar.xz -C tmp
pushd tmp/ghc-*/
./configure --prefix=$root
make install
popd
rm -Rf tmp
- export HC=$root/bin/ghc
# This GHC means, use this GHC to configure with
- export GHC=$root/bin/ghc
- .gitlab/ci.sh setup
- .gitlab/ci.sh configure
# Now GHC means, use this GHC for hadrian
- export GHC=$BOOT_HC
# Load hadrian-multi then immediately exit and check the modules loaded
- echo ":q" | hadrian/ghci-multi -j`mk/detect-cpu-count.sh`| tail -n2 | grep "Ok,"
after_script:
- .gitlab/ci.sh save_cache
cache:
key: hadrian-ghci-$CACHE_REV
paths:
- cabal-cache
rules:
- if: '$CI_MERGE_REQUEST_LABELS !~ /.*fast-ci.*/'
############################################################
# stack-hadrian-build
############################################################
......@@ -402,6 +454,32 @@ test-cabal-reinstall-x86_64-linux-deb10:
rules:
- if: $NIGHTLY
########################################
# Testing ABI is invariant across builds
########################################
abi-test-nightly:
stage: full-build
needs:
- job: nightly-x86_64-linux-fedora33-release-hackage
- job: nightly-x86_64-linux-fedora33-release
tags:
- x86_64-linux
image: "registry.gitlab.haskell.org/ghc/ci-images/x86_64-linux-fedora33:$DOCKER_REV"
dependencies: null
before_script:
- mkdir -p normal
- mkdir -p hackage
- tar -xf ghc-x86_64-linux-fedora33-release.tar.xz -C normal/
- tar -xf ghc-x86_64-linux-fedora33-release-hackage_docs.tar.xz -C hackage/
script:
- .gitlab/ci.sh compare_interfaces_of "normal/ghc-*" "hackage/ghc-*"
artifacts:
paths:
- out
rules:
- if: $NIGHTLY
############################################################
# Packaging
############################################################
......@@ -420,7 +498,7 @@ doc-tarball:
optional: true
- job: nightly-x86_64-windows-validate
optional: true
- job: release-x86_64-windows-release
- job: release-x86_64-windows-release+no_split_sections
optional: true
tags:
......@@ -444,7 +522,7 @@ doc-tarball:
|| mv "ghc-x86_64-linux-deb10-release.tar.xz" "$LINUX_BINDIST" \
|| true
mv "ghc-x86_64-windows-validate.tar.xz" "$WINDOWS_BINDIST" \
|| mv "ghc-x86_64-windows-release.tar.xz" "$WINDOWS_BINDIST" \
|| mv "ghc-x86_64-windows-release+no_split_sections.tar.xz" "$WINDOWS_BINDIST" \
|| true
if [ ! -f "$LINUX_BINDIST" ]; then
echo "Error: $LINUX_BINDIST does not exist. Did the Debian 9 job fail?"
......@@ -462,8 +540,6 @@ doc-tarball:
hackage-doc-tarball:
stage: packaging
needs:
- job: x86_64-linux-fedora33-release-hackage
optional: true
- job: nightly-x86_64-linux-fedora33-release-hackage
optional: true
- job: release-x86_64-linux-fedora33-release-hackage
......@@ -486,7 +562,7 @@ hackage-doc-tarball:
- tar -xf ghc-x86_64-linux-fedora33-release.tar.xz -C ghc*/
script:
- cd ghc*/
- mv .gitlab/upload_ghc_libs.py .
- mv .gitlab/rel_eng/upload_ghc_libs.py .
- .gitlab/ci.sh setup
- .gitlab/ci.sh configure
- ./upload_ghc_libs.py prepare --bindist ghc*linux/
......@@ -505,6 +581,7 @@ source-tarball:
paths:
- ghc-*.tar.xz
script:
- sudo chown ghc:ghc -R .
- ./boot
- ./configure
- ./hadrian/build source-dist
......@@ -558,6 +635,7 @@ test-bootstrap:
parallel: *bootstrap_matrix
dependencies: null
script:
- sudo chown ghc:ghc -R .
- mkdir test-bootstrap
- tar -xf ghc-*[0-9]-src.tar.xz -C test-bootstrap
- tar -xf ghc-*-testsuite.tar.xz -C test-bootstrap
......@@ -598,19 +676,21 @@ test-bootstrap:
# access to an unprivileged access token with the ability to query the ghc/ghc
# project such that it can find the job ID of the fedora33 job for the current
# pipeline.
#
# hackage-lint: Can be triggered on any MR, normal validate pipeline or nightly build.
# Runs head.hackage with -dlint and a slow-validate bindist
#
# hackage-label-lint: Triggered on MRs with "user-facing" label, runs the slow-validate
# head.hackage build with -dlint.
#
# nightly-hackage-lint: Runs automatically on nightly pipelines with slow-validate + dlint config.
#
# nightly-hackage-perf: Runs automatically on nightly pipelines with release build and eventlogging enabled.
#
# release-hackage-lint: Runs automatically on release pipelines with -dlint on a release bindist.
.hackage:
stage: testing
needs:
- job: x86_64-linux-fedora33-release
optional: true
artifacts: false
- job: nightly-x86_64-linux-fedora33-release
optional: true
artifacts: false
- job: release-x86_64-linux-fedora33-release
optional: true
artifacts: false
variables:
UPSTREAM_PROJECT_PATH: "$CI_PROJECT_PATH"
UPSTREAM_PROJECT_ID: "$CI_PROJECT_ID"
......@@ -618,34 +698,60 @@ test-bootstrap:
RELEASE_JOB: "$RELEASE_JOB"
trigger:
project: "ghc/head.hackage"
branch: "master"
branch: "upstream-testing"
strategy: "depend"
hackage-lint:
needs:
- job: x86_64-linux-deb10-numa-slow-validate
optional: true
artifacts: false
- job: nightly-x86_64-linux-deb10-numa-slow-validate
optional: true
artifacts: false
extends: .hackage
variables:
EXTRA_HC_OPTS: "-dcore-lint"
SLOW_VALIDATE: 1
EXTRA_HC_OPTS: "-dlint"
# Not for release jobs because there isn't a slow-validate bindist. There is an
# automatic pipeline for release bindists (see release-hackage-lint)
rules:
- if: '$RELEASE_JOB != "yes"'
when: manual
hackage-label-lint:
needs:
- job: x86_64-linux-deb10-numa-slow-validate
optional: true
artifacts: false
extends: .hackage
variables:
EXTRA_HC_OPTS: "-dcore-lint"
SLOW_VALIDATE: 1
EXTRA_HC_OPTS: "-dlint"
rules:
- if: '$CI_MERGE_REQUEST_LABELS =~ /.*user-facing.*/'
# The head.hackage job is split into two jobs because enabling `-dcore-lint`
# The head.hackage job is split into two jobs because enabling `-dlint`
# affects the total allocation numbers for the simplifier portion significantly.
nightly-hackage-lint:
needs:
- job: nightly-x86_64-linux-deb10-numa-slow-validate
optional: true
artifacts: false
rules:
- if: $NIGHTLY
variables:
NIGHTLY: "$NIGHTLY"
extends: .hackage
variables:
EXTRA_HC_OPTS: "-dcore-lint"
SLOW_VALIDATE: 1
EXTRA_HC_OPTS: "-dlint"
nightly-hackage-perf:
needs:
- job: nightly-x86_64-linux-fedora33-release
optional: true
artifacts: false
rules:
- if: $NIGHTLY
variables:
......@@ -657,6 +763,18 @@ nightly-hackage-perf:
# Ask head.hackage to generate eventlogs
EVENTLOGGING: 1
release-hackage-lint:
needs:
- job: release-x86_64-linux-fedora33-release
optional: true
artifacts: false
rules:
- if: '$RELEASE_JOB == "yes"'
extends: .hackage
variables:
# No slow-validate bindist on release pipeline
EXTRA_HC_OPTS: "-dlint"
############################################################
# Nofib testing
# (Disabled: See #21859)
......@@ -750,6 +868,46 @@ perf:
rules:
- if: '$CI_MERGE_REQUEST_LABELS !~ /.*fast-ci.*/'
############################################################
# ABI testing
############################################################
abi-test:
stage: testing
needs:
- job: x86_64-linux-fedora33-release
optional: true
- job: nightly-x86_64-linux-fedora33-release
optional: true
- job: release-x86_64-linux-fedora33-release
optional: true
dependencies: null
image: "registry.gitlab.haskell.org/ghc/ci-images/x86_64-linux-fedora33:$DOCKER_REV"
rules:
- if: $CI_MERGE_REQUEST_ID
- if: '$CI_COMMIT_BRANCH == "master"'
- if: '$CI_COMMIT_BRANCH =~ /ghc-[0.9]+\.[0-9]+/'
tags:
- x86_64-linux
script:
- root=$(pwd)/ghc
- |
mkdir tmp
tar -xf ghc-x86_64-linux-fedora33-release.tar.xz -C tmp
pushd tmp/ghc-*/
./configure --prefix=$root
make install
popd
rm -Rf tmp
- export BOOT_HC=$(which ghc)
- export HC=$root/bin/ghc
- .gitlab/ci.sh abi_test
artifacts:
paths:
- out
rules:
- if: '$CI_MERGE_REQUEST_LABELS !~ /.*fast-ci.*/'
############################################################
# Documentation deployment via GitLab Pages
......@@ -787,62 +945,137 @@ pages:
paths:
- public
.x86_64-linux-ubuntu20_04-cross_wasm32-wasi-release:
stage: full-build
rules:
- when: always
#############################################################
# Generation of GHCUp metadata
#############################################################
# TODO: MP: This way of determining the project version is sadly very slow.
# It seems overkill to have to set up a complete environment and build hadrian just
# to generate a single file containing the version information.
project-version:
stage: packaging
image: "registry.gitlab.haskell.org/ghc/ci-images/x86_64-linux-deb10:$DOCKER_REV"
tags:
- x86_64-linux
image: registry.gitlab.haskell.org/ghc/ci-images/x86_64-linux-ubuntu20_04:$DOCKER_REV
before_script:
- sudo chown ghc:ghc -R .
variables:
BIN_DIST_NAME: ghc-x86_64-linux-ubuntu20_04-cross_wasm32-wasi-int_$BIGNUM_BACKEND-release
BUILD_FLAVOUR: perf
CONFIGURE_ARGS: --with-intree-gmp --with-system-libffi
XZ_OPT: "-9"
CONF_CC_OPTS_STAGE2: -Wno-int-conversion -Wno-strict-prototypes -mnontrapping-fptoint -msign-ext -mbulk-memory -mmutable-globals -mreference-types
CONF_CXX_OPTS_STAGE2: -fno-exceptions -Wno-int-conversion -Wno-strict-prototypes -mnontrapping-fptoint -msign-ext -mbulk-memory -mmutable-globals -mreference-types
CONF_GCC_LINKER_OPTS_STAGE2: -Wl,--error-limit=0,--growable-table,--stack-first -Wno-unused-command-line-argument
CROSS_EMULATOR: wasmtime
CROSS_TARGET: wasm32-wasi
HADRIAN_ARGS: --docs=none
TEST_ENV: x86_64-linux-ubuntu20_04-cross_wasm32-wasi-int_$BIGNUM_BACKEND-release
BUILD_FLAVOUR: default
script:
- |
pushd libraries/process
curl https://patch-diff.githubusercontent.com/raw/haskell/process/pull/240.diff | git apply
popd
pushd utils/hsc2hs
curl https://patch-diff.githubusercontent.com/raw/haskell/hsc2hs/pull/68.diff | git apply
popd
# Calculate the project version
- sudo chown ghc:ghc -R .
- .gitlab/ci.sh setup
- .gitlab/ci.sh configure
- .gitlab/ci.sh run_hadrian VERSION
- echo "ProjectVersion=$(cat VERSION)" > version.sh
pushd "$(mktemp -d)"
curl -L https://gitlab.haskell.org/ghc/ghc-wasm-meta/-/archive/master/ghc-wasm-meta-master.tar.gz | tar xz --strip-components=1
PREFIX=/tmp/.ghc-wasm SKIP_GHC=1 ./setup.sh
source /tmp/.ghc-wasm/env
popd
needs: []
dependencies: []
artifacts:
paths:
- version.sh
rules:
- if: '$NIGHTLY'
- if: '$RELEASE_JOB == "yes"'
.gitlab/ci.sh setup
.gitlab/ci.sh configure
.gitlab/ci.sh build_hadrian
.gitlab/ci.sh test_hadrian
.ghcup-metadata:
stage: deploy
image: "nixos/nix:2.12.0"
dependencies: null
tags:
- x86_64-linux
variables:
BUILD_FLAVOUR: default
GIT_SUBMODULE_STRATEGY: "none"
before_script:
- echo "experimental-features = nix-command flakes" >> /etc/nix/nix.conf
- nix-channel --update
- cat version.sh
# Calculate the project version
- . ./version.sh
after_script:
- cat ci-timings
# Download existing ghcup metadata
- nix shell --extra-experimental-features nix-command --extra-experimental-features flakes nixpkgs#wget -c wget "https://raw.githubusercontent.com/haskell/ghcup-metadata/develop/ghcup-0.0.7.yaml"
- .gitlab/generate_job_metadata
artifacts:
expire_in: 1 year
paths:
- ghc-x86_64-linux-ubuntu20_04-cross_wasm32-wasi-int_$BIGNUM_BACKEND-release.tar.xz
when: always
- metadata_test.yaml
- version.sh
x86_64-linux-ubuntu20_04-cross_wasm32-wasi-int_gmp-release:
extends: .x86_64-linux-ubuntu20_04-cross_wasm32-wasi-release
ghcup-metadata-nightly:
extends: .ghcup-metadata
# Explicit needs for validate pipeline because we only need certain bindists
needs:
- job: nightly-x86_64-linux-fedora33-release
artifacts: false
- job: nightly-x86_64-linux-centos7-validate
artifacts: false
- job: nightly-x86_64-darwin-validate
artifacts: false
- job: nightly-aarch64-darwin-validate
artifacts: false
- job: nightly-x86_64-windows-validate
artifacts: false
- job: nightly-x86_64-linux-alpine3_12-int_native-validate+fully_static
artifacts: false
- job: nightly-x86_64-linux-deb9-validate
artifacts: false
- job: nightly-i386-linux-deb9-validate
artifacts: false
- job: nightly-x86_64-linux-deb10-validate
artifacts: false
- job: nightly-aarch64-linux-deb10-validate
artifacts: false
- job: nightly-x86_64-linux-deb11-validate
artifacts: false
- job: source-tarball
artifacts: false
- job: project-version
script:
- nix shell --extra-experimental-features nix-command -f .gitlab/rel_eng -c ghcup-metadata --metadata ghcup-0.0.7.yaml --pipeline-id="$CI_PIPELINE_ID" --version="$ProjectVersion" > "metadata_test.yaml"
rules:
- if: $NIGHTLY
ghcup-metadata-release:
# No explicit needs for release pipeline as we assume we need everything and everything will pass.
extends: .ghcup-metadata
script:
- nix shell --extra-experimental-features nix-command -f .gitlab/rel_eng -c ghcup-metadata --release-mode --metadata ghcup-0.0.7.yaml --pipeline-id="$CI_PIPELINE_ID" --version="$ProjectVersion" > "metadata_test.yaml"
rules:
- if: '$RELEASE_JOB == "yes"'
.ghcup-metadata-testing:
stage: deploy
variables:
UPSTREAM_PROJECT_PATH: "$CI_PROJECT_PATH"
UPSTREAM_PROJECT_ID: "$CI_PROJECT_ID"
UPSTREAM_PIPELINE_ID: "$CI_PIPELINE_ID"
RELEASE_JOB: "$RELEASE_JOB"
trigger:
project: "ghc/ghcup-ci"
branch: "upstream-testing"
strategy: "depend"
ghcup-metadata-testing-nightly:
needs:
- job: ghcup-metadata-nightly
artifacts: false
extends: .ghcup-metadata-testing
variables:
BIGNUM_BACKEND: gmp
NIGHTLY: "$NIGHTLY"
UPSTREAM_JOB_NAME: "ghcup-metadata-nightly"
rules:
- if: '$NIGHTLY == "1"'
x86_64-linux-ubuntu20_04-cross_wasm32-wasi-int_native-release:
extends: .x86_64-linux-ubuntu20_04-cross_wasm32-wasi-release
ghcup-metadata-testing-release:
needs:
- job: ghcup-metadata-release
artifacts: false
extends: .ghcup-metadata-testing
variables:
BIGNUM_BACKEND: native
UPSTREAM_JOB_NAME: "ghcup-metadata-release"
rules:
- if: '$RELEASE_JOB == "yes"'
when: manual
......@@ -59,12 +59,13 @@ Environment variables affecting both build systems:
CROSS_TARGET Triple of cross-compilation target.
VERBOSE Set to non-empty for verbose build output
RUNTEST_ARGS Arguments passed to runtest.py
MSYSTEM (Windows-only) Which platform to build form (MINGW64 or MINGW32).
MSYSTEM (Windows-only) Which platform to build from (CLANG64).
IGNORE_PERF_FAILURES
Whether to ignore perf failures (one of "increases",
"decreases", or "all")
HERMETIC Take measures to avoid looking at anything in \$HOME
CONFIGURE_ARGS Arguments passed to configure script.
CONFIGURE_WRAPPER Wrapper for the configure script (e.g. Emscripten's emconfigure).
ENABLE_NUMA Whether to enable numa support for the build (disabled by default)
INSTALL_CONFIGURE_ARGS
Arguments passed to the binary distribution configure script
......@@ -140,11 +141,7 @@ function setup_locale() {
function mingw_init() {
case "$MSYSTEM" in
MINGW32)
target_triple="i386-unknown-mingw32"
boot_triple="i386-unknown-mingw32" # triple of bootstrap GHC
;;
MINGW64)
CLANG64)
target_triple="x86_64-unknown-mingw32"
boot_triple="x86_64-unknown-mingw32" # triple of bootstrap GHC
;;
......@@ -212,10 +209,14 @@ function set_toolchain_paths() {
x86_64-darwin|aarch64-darwin) ;;
*) fail "unknown NIX_SYSTEM" ;;
esac
nix build -f .gitlab/darwin/toolchain.nix --argstr system "$NIX_SYSTEM" -o toolchain.sh
info "Building toolchain for $NIX_SYSTEM"
nix-build .gitlab/darwin/toolchain.nix --argstr system "$NIX_SYSTEM" -o toolchain.sh
cat toolchain.sh
fi
source toolchain.sh ;;
source toolchain.sh
info "--info for GHC for $NIX_SYSTEM"
$GHC --info
;;
env)
# These are generally set by the Docker image but
# we provide these handy fallbacks in case the
......@@ -232,10 +233,16 @@ function set_toolchain_paths() {
export CABAL
export HAPPY
export ALEX
if [[ "${CROSS_TARGET:-}" == *"wasm"* ]]; then
source "/home/ghc/.ghc-wasm/env"
fi
}
function cabal_update() {
run "$CABAL" update --index="$HACKAGE_INDEX_STATE"
# In principle -w shouldn't be necessary here but with
# cabal-install 3.8.1.0 it is, due to cabal#8447.
run "$CABAL" update -w "$GHC" --index="$HACKAGE_INDEX_STATE"
}
......@@ -310,11 +317,10 @@ function fetch_cabal() {
# N.B. Windows uses zip whereas all others use .tar.xz
MSYS_*|MINGW*)
case "$MSYSTEM" in
MINGW32) cabal_arch="i386" ;;
MINGW64) cabal_arch="x86_64" ;;
CLANG64) cabal_arch="x86_64" ;;
*) fail "unknown MSYSTEM $MSYSTEM" ;;
esac
url="https://downloads.haskell.org/~cabal/cabal-install-$v/cabal-install-$v-$cabal_arch-unknown-mingw32.zip"
url="https://downloads.haskell.org/~cabal/cabal-install-$v/cabal-install-$v-$cabal_arch-windows.zip"
info "Fetching cabal binary distribution from $url..."
curl "$url" > "$TMP/cabal.zip"
unzip "$TMP/cabal.zip"
......@@ -402,6 +408,11 @@ EOF
}
function configure() {
case "${CONFIGURE_WRAPPER:-}" in
emconfigure) source "$EMSDK/emsdk_env.sh" ;;
*) ;;
esac
if [[ -z "${NO_BOOT:-}" ]]; then
start_section "booting"
run python3 boot
......@@ -421,7 +432,7 @@ function configure() {
start_section "configuring"
# See https://stackoverflow.com/questions/7577052 for a rationale for the
# args[@] symbol-soup below.
run ./configure \
run ${CONFIGURE_WRAPPER:-} ./configure \
--enable-tarballs-autodownload \
"${args[@]+"${args[@]}"}" \
GHC="$GHC" \
......@@ -496,10 +507,17 @@ function build_hadrian() {
check_release_build
# We can safely enable parallel compression for x64. By the time
# hadrian calls tar/xz to produce bindist, there's no other build
# work taking place.
if [[ "${CI_JOB_NAME:-}" != *"i386"* ]]; then
XZ_OPT="${XZ_OPT:-} -T$cores"
fi
if [[ -n "${REINSTALL_GHC:-}" ]]; then
run_hadrian build-cabal -V
else
run_hadrian binary-dist -V
run_hadrian test:all_deps binary-dist -V
mv _build/bindist/ghc*.tar.xz "$BIN_DIST_NAME.tar.xz"
fi
......@@ -528,6 +546,11 @@ function make_install_destdir() {
# install the binary distribution in directory $1 to $2.
function install_bindist() {
case "${CONFIGURE_WRAPPER:-}" in
emconfigure) source "$EMSDK/emsdk_env.sh" ;;
*) ;;
esac
local bindist="$1"
local instdir="$2"
pushd "$bindist"
......@@ -545,7 +568,7 @@ function install_bindist() {
args+=( "--target=$CROSS_TARGET" "--host=$CROSS_TARGET" )
fi
run ./configure \
run ${CONFIGURE_WRAPPER:-} ./configure \
--prefix="$instdir" \
"${args[@]+"${args[@]}"}"
make_install_destdir "$TOP"/destdir "$instdir"
......@@ -575,19 +598,21 @@ function test_hadrian() {
fi
if [ -n "${CROSS_TARGET:-}" ]; then
if [ -n "${CROSS_EMULATOR:-}" ]; then
local instdir="$TOP/_build/install"
local test_compiler="$instdir/bin/${cross_prefix}ghc$exe"
install_bindist _build/bindist/ghc-*/ "$instdir"
echo 'main = putStrLn "hello world"' > expected
run "$test_compiler" -package ghc "$TOP/.gitlab/hello.hs" -o hello
$CROSS_EMULATOR ./hello > actual
run diff expected actual
else
info "Cannot test cross-compiled build without CROSS_EMULATOR being set."
return
fi
if [[ "${CROSS_EMULATOR:-}" == "NOT_SET" ]]; then
info "Cannot test cross-compiled build without CROSS_EMULATOR being set."
return
elif [ -n "${CROSS_TARGET:-}" ]; then
local instdir="$TOP/_build/install"
local test_compiler="$instdir/bin/${cross_prefix}ghc$exe"
install_bindist _build/bindist/ghc-*/ "$instdir"
echo 'main = putStrLn "hello world"' > expected
run "$test_compiler" -package ghc "$TOP/.gitlab/hello.hs" -o hello
# Despite "-o hello", ghc may output something like hello.exe or
# hello.wasm depending on the backend. For the time being let's
# just move it to hello before running it.
mv hello.wasm hello || true
${CROSS_EMULATOR:-} ./hello > actual
run diff expected actual
elif [[ -n "${REINSTALL_GHC:-}" ]]; then
run_hadrian \
test \
......@@ -598,10 +623,11 @@ function test_hadrian() {
"runtest.opts+=${RUNTEST_ARGS:-}" || fail "hadrian cabal-install test"
else
local instdir="$TOP/_build/install"
local test_compiler="$instdir/bin/ghc$exe"
local test_compiler="$instdir/bin/${cross_prefix}ghc$exe"
install_bindist _build/bindist/ghc-*/ "$instdir"
if [[ "${WINDOWS_HOST}" == "no" ]]; then
if [[ "${WINDOWS_HOST}" == "no" ]] && [ -z "${CROSS_TARGET:-}" ]
then
run_hadrian \
test \
--test-root-dirs=testsuite/tests/stage1 \
......@@ -610,10 +636,14 @@ function test_hadrian() {
info "STAGE1_TEST=$?"
fi
# Ensure the resulting compiler has the correct bignum-flavour
test_compiler_backend=$(${test_compiler} -e "GHC.Num.Backend.backendName")
if [ $test_compiler_backend != "\"$BIGNUM_BACKEND\"" ]; then
fail "Test compiler has a different BIGNUM_BACKEND ($test_compiler_backend) thean requested ($BIGNUM_BACKEND)"
# Ensure the resulting compiler has the correct bignum-flavour,
# except for cross-compilers as they may not support the interpreter
if [ -z "${CROSS_TARGET:-}" ]
then
test_compiler_backend=$(${test_compiler} -e "GHC.Num.Backend.backendName")
if [ $test_compiler_backend != "\"$BIGNUM_BACKEND\"" ]; then
fail "Test compiler has a different BIGNUM_BACKEND ($test_compiler_backend) thean requested ($BIGNUM_BACKEND)"
fi
fi
# If we are doing a release job, check the compiler can build a profiled executable
......@@ -636,6 +666,36 @@ function test_hadrian() {
}
function summarise_hi_files() {
for iface in $(find . -type f -name "*.hi" | sort); do echo "$iface $($HC --show-iface $iface | grep " ABI hash:")"; done | tee $OUT/abis
for iface in $(find . -type f -name "*.hi" | sort); do echo "$iface $($HC --show-iface $iface | grep " interface hash:")"; done | tee $OUT/interfaces
for iface in $(find . -type f -name "*.hi" | sort); do
fname="$OUT/$(dirname $iface)"
mkdir -p $fname
$HC --show-iface $iface > "$OUT/$iface"
done
}
function cabal_abi_test() {
if [ -z "$OUT" ]; then
fail "OUT not set"
fi
cp -r libraries/Cabal $DIR
pushd $DIR
echo $PWD
start_section "Cabal test: $OUT"
mkdir -p "$OUT"
run "$HC" \
-hidir tmp -odir tmp -fforce-recomp -haddock \
-iCabal/Cabal/src -XNoPolyKinds Distribution.Simple -j"$cores" \
"$@" 2>&1 | tee $OUT/log
summarise_hi_files
popd
end_section "Cabal test: $OUT"
}
function cabal_test() {
if [ -z "$OUT" ]; then
fail "OUT not set"
......@@ -666,6 +726,35 @@ function run_perf_test() {
OUT=out/Cabal-O2 cabal_test -O2
}
function check_interfaces(){
difference=$(diff "$1/$3" "$2/$3") || warn "diff failed"
if [ -z "$difference" ]
then
info "$1 and $2 $3 match"
else
echo $difference
for line in $(echo "$difference" | tr ' ' '\n' | grep ".hi" | sort | uniq); do
diff "$1/$line" "$2/$line"
done
fail "$3"
fi
}
function abi_test() {
for i in {1..20}; do info "iteration $i"; run_abi_test; done
}
function run_abi_test() {
if [ -z "$HC" ]; then
fail "HC not set"
fi
mkdir -p out
OUT="$PWD/out/run1" DIR=$(mktemp -d XXXX-looooooooong) cabal_abi_test -O0
OUT="$PWD/out/run2" DIR=$(mktemp -d XXXX-short) cabal_abi_test -O0
check_interfaces out/run1 out/run2 abis "Mismatched ABI hash"
check_interfaces out/run1 out/run2 interfaces "Mismatched interface hashes"
}
function save_cache () {
info "Storing cabal cache from $CABAL_DIR to $CABAL_CACHE..."
rm -Rf "$CABAL_CACHE"
......@@ -715,6 +804,22 @@ function lint_author(){
done
}
function abi_of(){
DIR=$(realpath $1)
mkdir -p "$OUT"
pushd $DIR
summarise_hi_files
popd
}
# Checks that the interfaces in folder $1 match the interfaces in folder $2
function compare_interfaces_of(){
OUT=$PWD/out/run1 abi_of $1
OUT=$PWD/out/run2 abi_of $2
check_interfaces out/run1 out/run2 abis "Mismatched ABI hash"
check_interfaces out/run1 out/run2 interfaces "Mismatched interface hashes"
}
setup_locale
......@@ -810,8 +915,10 @@ case $1 in
exit $res ;;
run_hadrian) shift; run_hadrian "$@" ;;
perf_test) run_perf_test ;;
abi_test) abi_test ;;
cabal_test) cabal_test ;;
lint_author) shift; lint_author "$@" ;;
compare_interfaces_of) shift; compare_interfaces_of "$@" ;;
clean) clean ;;
save_cache) save_cache ;;
shell) shift; shell "$@" ;;
......
......@@ -12,15 +12,15 @@
"url_template": "https://github.com/<owner>/<repo>/archive/<rev>.tar.gz"
},
"nixpkgs": {
"branch": "wip/ghc-8.10.7-darwin",
"branch": "master",
"description": "Nix Packages collection",
"homepage": "",
"owner": "bgamari",
"owner": "nixos",
"repo": "nixpkgs",
"rev": "37c60356e3f83c708a78a96fdd914b5ffc1f551c",
"sha256": "0i5j7nwk4ky0fg4agla3aznadpxz0jyrdwp2q92hyxidra987syn",
"rev": "ce1aa29621356706746c53e2d480da7c68f6c972",
"sha256": "sha256:1sbs3gi1nf4rcbmnw69fw0fpvb3qvlsa84hqimv78vkpd6xb0bgg",
"type": "tarball",
"url": "https://github.com/bgamari/nixpkgs/archive/37c60356e3f83c708a78a96fdd914b5ffc1f551c.tar.gz",
"url": "https://github.com/nixos/nixpkgs/archive/ce1aa29621356706746c53e2d480da7c68f6c972.tar.gz",
"url_template": "https://github.com/<owner>/<repo>/archive/<rev>.tar.gz"
}
}
......@@ -15,16 +15,16 @@ let
ghcBindists = let version = ghc.version; in {
aarch64-darwin = pkgs.fetchurl {
url = "https://downloads.haskell.org/ghc/${version}/ghc-${version}-aarch64-apple-darwin.tar.xz";
sha256 = "sha256:0p2f35pihlnmkm7x73b5xm3dyhiczrywc19khr7i7vb2q1y4zw6i";
sha256 = "sha256-tQUHsingxBizLktswGAoi6lJf92RKWLjsHB9CisANlg=";
};
x86_64-darwin = pkgs.fetchurl {
url = "https://downloads.haskell.org/ghc/${version}/ghc-${version}-x86_64-apple-darwin.tar.xz";
sha256 = "sha256:0gzq0vfjbhr9n8z63capvdwrw7bisy15d5c1y1gynfix13bbnjlk";
sha256 = "sha256-OjXjVe+ZODDCc/hqtihqqz6CX25TKI0ZgORzkR5O3pQ=";
};
};
ghc = pkgs.stdenv.mkDerivation rec {
version = "9.2.2";
version = "9.4.4";
name = "ghc";
src = ghcBindists.${pkgs.stdenv.hostPlatform.system};
configureFlags = [
......@@ -38,6 +38,21 @@ let
];
buildPhase = "true";
# This is a horrible hack because the configure script invokes /usr/bin/clang
# without a `--target` flag. Depending on whether the `nix` binary itself is
# a native x86 or arm64 binary, /usr/bin/clang then thinks it needs to run in
# x86 or arm64 mode.
# The correct answer for the check in question is the first one we try, so by replacing
# the condition with true we still select the right C++ standard library.
preConfigure = ''
sed "s/\"\$CC\" -o actest actest.o \''${1} 2>\/dev\/null/true/i" configure > configure.new
mv configure.new configure
chmod +x configure
cat configure
'';
# N.B. Work around #20253.
nativeBuildInputs = [ pkgs.gnused ];
postInstallPhase = ''
......@@ -98,6 +113,6 @@ pkgs.writeTextFile {
export CABAL="$CABAL_INSTALL"
sdk_path="$(xcrun --sdk macosx --show-sdk-path)"
export CONFIGURE_ARGS="$CONFIGURE_ARGS --with-ffi-libraries=$sdk_path/usr/lib --with-ffi-includes=$sdk_path/usr/include/ffi"
export CONFIGURE_ARGS="$CONFIGURE_ARGS --with-ffi-libraries=$sdk_path/usr/lib --with-ffi-includes=$sdk_path/usr/include/ffi --build=${targetTriple}"
'';
}
cabal-version: 3.0
name: gen-ci
version: 0.1.0.0
build-type: Simple
common warnings
ghc-options: -Wall
executable gen_ci
import: warnings
main-is: gen_ci.hs
build-depends:
, aeson >=1.8.1
, base
, bytestring
, containers
default-language: Haskell2010
......@@ -7,17 +7,17 @@
build-depends: base, aeson >= 1.8.1, containers, bytestring
-}
import Data.Coerce
import Data.String (String)
import Data.Aeson as A
import qualified Data.Map as Map
import Data.Map (Map)
import qualified Data.ByteString.Lazy as B hiding (putStrLn)
import Data.Maybe
import qualified Data.ByteString.Lazy as B
import qualified Data.ByteString.Lazy.Char8 as B
import Data.List (intercalate)
import Data.Set (Set)
import qualified Data.Set as S
import System.Environment
import Data.Maybe
{-
Note [Generating the CI pipeline]
......@@ -85,6 +85,16 @@ names of jobs to update these other places.
3. The ghc-head-from script downloads release artifacts based on a pipeline change.
4. Some subsequent CI jobs have explicit dependencies (for example docs-tarball, perf, perf-nofib)
Note [Generation Modes]
~~~~~~~~~~~~~~~~~~~~~~~
There are two different modes this script can operate in:
* `gitlab`: Generates a job.yaml which defines all the pipelines for the platforms
* `metadata`: Generates a file which maps a platform to the "default" validate and
nightly pipeline. This file is intended to be used when generating
ghcup metadata.
-}
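-- Editor's note: an illustrative sketch, not part of this diff, of how the two
-- modes described in Note [Generation Modes] could be dispatched. The mode
-- names and the 'platformMapping' placeholder are assumptions for illustration;
-- the real 'main' of gen_ci.hs lies outside this hunk and may differ.
sketchMain :: IO ()
sketchMain = do
  args <- getArgs
  case args of
    ["gitlab"]   -> B.putStrLn (A.encode jobs)             -- the full job definitions
    ["metadata"] -> B.putStrLn (A.encode platformMapping)  -- platform -> default validate/nightly jobs
    _            -> error "usage: gen_ci (gitlab|metadata)"
  where
    -- Placeholder only; the real script derives this mapping from job_groups.
    platformMapping :: Map String [String]
    platformMapping = Map.empty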
-----------------------------------------------------------------------------
......@@ -99,9 +109,15 @@ data Opsys
| Windows deriving (Eq)
data LinuxDistro
= Debian11 | Debian10 | Debian9 | Fedora33 | Ubuntu2004 | Centos7 | Alpine deriving (Eq)
= Debian11 | Debian10 | Debian9
| Fedora33
| Ubuntu2004
| Centos7
| Alpine
| Rocky8
deriving (Eq)
data Arch = Amd64 | AArch64 | ARMv7 | I386
data Arch = Amd64 | AArch64 | I386
data BignumBackend = Native | Gmp deriving Eq
......@@ -109,6 +125,11 @@ bignumString :: BignumBackend -> String
bignumString Gmp = "gmp"
bignumString Native = "native"
data CrossEmulator
  = NoEmulator        -- ^ No emulator configured; for a cross build this sets CROSS_EMULATOR to NOT_SET and the testsuite is skipped
  | NoEmulatorNeeded  -- ^ No CROSS_EMULATOR needs to be set for this target
  | Emulator String   -- ^ Run testsuite binaries through the given emulator (e.g. wasmtime)
-- | A BuildConfig records all the options which can be modified to affect the
-- bindists produced by the compiler.
data BuildConfig
......@@ -120,18 +141,21 @@ data BuildConfig
, withAssertions :: Bool
, withNuma :: Bool
, crossTarget :: Maybe String
, crossEmulator :: Maybe String
, crossEmulator :: CrossEmulator
, configureWrapper :: Maybe String
, fullyStatic :: Bool
, tablesNextToCode :: Bool
, threadSanitiser :: Bool
, noSplitSections :: Bool
}
-- Extra arguments to pass to ./configure due to the BuildConfig
configureArgsStr :: BuildConfig -> String
configureArgsStr bc = intercalate " " $
configureArgsStr bc = unwords $
["--enable-unregisterised"| unregisterised bc ]
++ ["--disable-tables-next-to-code" | not (tablesNextToCode bc) ]
++ ["--with-intree-gmp" | Just _ <- pure (crossTarget bc) ]
++ ["--with-system-libffi" | crossTarget bc == Just "wasm32-wasi" ]
-- Compute the hadrian flavour from the BuildConfig
mkJobFlavour :: BuildConfig -> Flavour
......@@ -140,13 +164,14 @@ mkJobFlavour BuildConfig{..} = Flavour buildFlavour opts
opts = [Llvm | llvmBootstrap] ++
[Dwarf | withDwarf] ++
[FullyStatic | fullyStatic] ++
[ThreadSanitiser | threadSanitiser]
[ThreadSanitiser | threadSanitiser] ++
[NoSplitSections | noSplitSections, buildFlavour == Release ]
data Flavour = Flavour BaseFlavour [FlavourTrans]
data FlavourTrans = Llvm | Dwarf | FullyStatic | ThreadSanitiser
data FlavourTrans = Llvm | Dwarf | FullyStatic | ThreadSanitiser | NoSplitSections
data BaseFlavour = Release | Validate | SlowValidate
data BaseFlavour = Release | Validate | SlowValidate deriving Eq
-----------------------------------------------------------------------------
-- Build Configs
......@@ -163,12 +188,17 @@ vanilla = BuildConfig
, withAssertions = False
, withNuma = False
, crossTarget = Nothing
, crossEmulator = Nothing
, crossEmulator = NoEmulator
, configureWrapper = Nothing
, fullyStatic = False
, tablesNextToCode = True
, threadSanitiser = False
, noSplitSections = False
}
splitSectionsBroken :: BuildConfig -> BuildConfig
splitSectionsBroken bc = bc { noSplitSections = True }
nativeInt :: BuildConfig
nativeInt = vanilla { bignumBackend = Native }
......@@ -195,11 +225,13 @@ staticNativeInt :: BuildConfig
staticNativeInt = static { bignumBackend = Native }
crossConfig :: String -- ^ target triple
-> Maybe String -- ^ emulator for testing
-> CrossEmulator -- ^ emulator for testing
-> Maybe String -- ^ Configure wrapper
-> BuildConfig
crossConfig triple emulator =
crossConfig triple emulator configure_wrapper =
vanilla { crossTarget = Just triple
, crossEmulator = emulator
, configureWrapper = configure_wrapper
}
llvm :: BuildConfig
......@@ -217,16 +249,16 @@ noTntc = vanilla { tablesNextToCode = False }
-- | These tags have to match what we call the runners on gitlab
runnerTag :: Arch -> Opsys -> String
runnerTag arch (Linux distro) =
runnerTag arch (Linux _) =
case arch of
Amd64 -> "x86_64-linux"
AArch64 -> "aarch64-linux"
ARMv7 -> "armv7-linux"
I386 -> "x86_64-linux"
runnerTag AArch64 Darwin = "aarch64-darwin"
runnerTag Amd64 Darwin = "x86_64-darwin-m1"
runnerTag Amd64 Windows = "new-x86_64-windows"
runnerTag Amd64 FreeBSD13 = "x86_64-freebsd13"
runnerTag _ _ = error "Invalid arch/opsys"
tags :: Arch -> Opsys -> BuildConfig -> [String]
tags arch opsys _bc = [runnerTag arch opsys] -- Tag for which runners we can use
......@@ -241,6 +273,7 @@ distroName Fedora33 = "fedora33"
distroName Ubuntu2004 = "ubuntu20_04"
distroName Centos7 = "centos7"
distroName Alpine = "alpine3_12"
distroName Rocky8 = "rocky8"
opsysName :: Opsys -> String
opsysName (Linux distro) = "linux-" ++ distroName distro
......@@ -251,7 +284,6 @@ opsysName Windows = "windows"
archName :: Arch -> String
archName Amd64 = "x86_64"
archName AArch64 = "aarch64"
archName ARMv7 = "armv7"
archName I386 = "i386"
binDistName :: Arch -> Opsys -> BuildConfig -> String
......@@ -283,6 +315,7 @@ flavourString (Flavour base trans) = baseString base ++ concatMap (("+" ++) . fl
flavourString Dwarf = "debug_info"
flavourString FullyStatic = "fully_static"
flavourString ThreadSanitiser = "thread_sanitizer"
flavourString NoSplitSections = "no_split_sections"
-- The path to the docker image (just for linux builders)
dockerImage :: Arch -> Opsys -> Maybe String
......@@ -310,18 +343,21 @@ dockerImage _ _ = Nothing
-- The "proper" solution would be to use a dependent monoidal map where each key specifies
-- the combination behaviour of its values, i.e. whether setting it multiple times is an error
-- or they should be combined.
newtype MonoidalMap k v = MonoidalMap (Map k v)
newtype MonoidalMap k v = MonoidalMap { unMonoidalMap :: Map k v }
deriving (Eq, Show, Functor, ToJSON)
instance (Ord k, Semigroup v) => Semigroup (MonoidalMap k v) where
(MonoidalMap a) <> (MonoidalMap b) = MonoidalMap (Map.unionWith (<>) a b)
instance (Ord k, Semigroup v) => Monoid (MonoidalMap k v) where
mempty = MonoidalMap (Map.empty)
mempty = MonoidalMap Map.empty
mminsertWith :: Ord k => (a -> a -> a) -> k -> a -> MonoidalMap k a -> MonoidalMap k a
mminsertWith f k v (MonoidalMap m) = MonoidalMap (Map.insertWith f k v m)
mmlookup :: Ord k => k -> MonoidalMap k a -> Maybe a
mmlookup k (MonoidalMap m) = Map.lookup k m
type Variables = MonoidalMap String [String]
(=:) :: String -> String -> Variables
......@@ -335,21 +371,9 @@ opsysVariables _ FreeBSD13 = mconcat
-- [1] https://www.freebsd.org/doc/en/books/porters-handbook/using-iconv.html)
"CONFIGURE_ARGS" =: "--with-gmp-includes=/usr/local/include --with-gmp-libraries=/usr/local/lib --with-iconv-includes=/usr/local/include --with-iconv-libraries=/usr/local/lib"
, "HADRIAN_ARGS" =: "--docs=no-sphinx"
, "GHC_VERSION" =: "9.2.2"
, "CABAL_INSTALL_VERSION" =: "3.6.2.0"
, "GHC_VERSION" =: "9.4.3"
, "CABAL_INSTALL_VERSION" =: "3.8.1.0"
]
opsysVariables ARMv7 (Linux distro) =
distroVariables distro <>
mconcat [ "CONFIGURE_ARGS" =: "--host=armv7-linux-gnueabihf --build=armv7-linux-gnueabihf --target=armv7-linux-gnueabihf"
-- N.B. We disable ld.lld explicitly here because it appears to fail
-- non-deterministically on ARMv7. See #18280.
, "LD" =: "ld.gold"
, "GccUseLdOpt" =: "-fuse-ld=gold"
-- Awkwardly, this appears to be necessary to work around a
-- live-lock exhibited by the CPython (at least in 3.9 and 3.8)
-- interpreter on ARMv7
, "HADRIAN_ARGS" =: "--test-verbose=3"
]
opsysVariables _ (Linux distro) = distroVariables distro
opsysVariables AArch64 (Darwin {}) =
mconcat [ "NIX_SYSTEM" =: "aarch64-darwin"
......@@ -373,11 +397,11 @@ opsysVariables Amd64 (Darwin {}) =
]
opsysVariables _ (Windows {}) =
mconcat [ "MSYSTEM" =: "MINGW64"
mconcat [ "MSYSTEM" =: "CLANG64"
, "HADRIAN_ARGS" =: "--docs=no-sphinx"
, "LANG" =: "en_US.UTF-8"
, "CABAL_INSTALL_VERSION" =: "3.2.0.0"
, "GHC_VERSION" =: "9.2.2" ]
, "CABAL_INSTALL_VERSION" =: "3.8.1.0"
, "GHC_VERSION" =: "9.4.3" ]
opsysVariables _ _ = mempty
......@@ -396,6 +420,9 @@ distroVariables Alpine = mconcat
distroVariables Centos7 = mconcat [
"HADRIAN_ARGS" =: "--docs=no-sphinx"
]
distroVariables Rocky8 = mconcat [
"HADRIAN_ARGS" =: "--docs=no-sphinx"
]
distroVariables Fedora33 = mconcat
-- LLC/OPT do not work for some reason in our fedora images
-- These tests fail with this error: T11649 T5681 T7571 T8131b
......@@ -496,13 +523,13 @@ instance ToJSON ManualFlag where
toJSON OnSuccess = "on_success"
instance ToJSON OnOffRules where
toJSON rules = toJSON [(object ([
toJSON rules = toJSON [object ([
"if" A..= and_all (map one_rule (enumRules rules))
, "when" A..= toJSON (when rules)]
-- Necessary to stop manual jobs stopping pipeline progress
-- https://docs.gitlab.com/ee/ci/yaml/#rulesallow_failure
++
["allow_failure" A..= True | when rules == Manual ]))]
["allow_failure" A..= True | when rules == Manual ])]
where
one_rule (OnOffRule onoff r) = ruleString onoff r
......@@ -516,7 +543,6 @@ data Rule = FastCI -- ^ Run this job when the fast-ci label is set
| Nightly -- ^ Only run this job in the nightly pipeline
| LLVMBackend -- ^ Only run this job when the "LLVM backend" label is present
| FreeBSDLabel -- ^ Only run this job when the "FreeBSD" label is set.
| ARMLabel -- ^ Only run this job when the "ARM" label is set.
| Disable -- ^ Don't run this job.
deriving (Bounded, Enum, Ord, Eq)
......@@ -537,8 +563,6 @@ ruleString On LLVMBackend = "$CI_MERGE_REQUEST_LABELS =~ /.*LLVM backend.*/"
ruleString Off LLVMBackend = true
ruleString On FreeBSDLabel = "$CI_MERGE_REQUEST_LABELS =~ /.*FreeBSD.*/"
ruleString Off FreeBSDLabel = true
ruleString On ARMLabel = "$CI_MERGE_REQUEST_LABELS =~ /.*ARM.*/"
ruleString Off ARMLabel = true
ruleString On ReleaseOnly = "$RELEASE_JOB == \"yes\""
ruleString Off ReleaseOnly = "$RELEASE_JOB != \"yes\""
ruleString On Nightly = "$NIGHTLY"
......@@ -567,6 +591,7 @@ data Job
, jobArtifacts :: Artifacts
, jobCache :: Cache
, jobRules :: OnOffRules
, jobPlatform :: (Arch, Opsys)
}
instance ToJSON Job where
......@@ -581,7 +606,7 @@ instance ToJSON Job where
, "allow_failure" A..= jobAllowFailure
-- Joining up variables like this may well be the wrong thing to do but
-- at least it doesn't lose information silently by overriding.
, "variables" A..= fmap (intercalate " ") jobVariables
, "variables" A..= fmap unwords jobVariables
, "artifacts" A..= jobArtifacts
, "cache" A..= jobCache
, "after_script" A..= jobAfterScript
......@@ -590,9 +615,11 @@ instance ToJSON Job where
]
-- | Build a job description from the system description and 'BuildConfig'
job :: Arch -> Opsys -> BuildConfig -> (String, Job)
job arch opsys buildConfig = (jobName, Job {..})
job :: Arch -> Opsys -> BuildConfig -> NamedJob Job
job arch opsys buildConfig = NamedJob { name = jobName, jobInfo = Job {..} }
where
jobPlatform = (arch, opsys)
jobRules = emptyRules
jobName = testEnv arch opsys buildConfig
......@@ -609,6 +636,7 @@ job arch opsys buildConfig = (jobName, Job {..})
, "bash .gitlab/ci.sh test_hadrian" ]
| otherwise
= [ "find libraries -name config.sub -exec cp config.sub {} \\;" | Darwin == opsys ] ++
[ "sudo apk del --purge glibc*" | opsys == Linux Alpine, isNothing $ crossTarget buildConfig ] ++
[ "sudo chown ghc:ghc -R ." | Linux {} <- [opsys]] ++
[ ".gitlab/ci.sh setup"
, ".gitlab/ci.sh configure"
......@@ -636,8 +664,14 @@ job arch opsys buildConfig = (jobName, Job {..})
, "BUILD_FLAVOUR" =: flavourString jobFlavour
, "BIGNUM_BACKEND" =: bignumString (bignumBackend buildConfig)
, "CONFIGURE_ARGS" =: configureArgsStr buildConfig
, maybe mempty ("CONFIGURE_WRAPPER" =:) (configureWrapper buildConfig)
, maybe mempty ("CROSS_TARGET" =:) (crossTarget buildConfig)
, maybe mempty ("CROSS_EMULATOR" =:) (crossEmulator buildConfig)
, case crossEmulator buildConfig of
NoEmulator -> case crossTarget buildConfig of
Nothing -> mempty
Just _ -> "CROSS_EMULATOR" =: "NOT_SET" -- we need an emulator but it isn't set. Won't run the testsuite
Emulator s -> "CROSS_EMULATOR" =: s
NoEmulatorNeeded -> mempty
, if withNuma buildConfig then "ENABLE_NUMA" =: "1" else mempty
]
......@@ -672,11 +706,11 @@ job arch opsys buildConfig = (jobName, Job {..})
-- | Modify all jobs in a 'JobGroup'
modifyJobs :: (a -> a) -> JobGroup a -> JobGroup a
modifyJobs f = fmap f
modifyJobs = fmap
-- | Modify just the validate jobs in a 'JobGroup'
modifyValidateJobs :: (a -> a) -> JobGroup a -> JobGroup a
modifyValidateJobs f jg = jg { v = f <$> (v jg) }
modifyValidateJobs f jg = jg { v = f <$> v jg }
-- Generic helpers
......@@ -686,22 +720,29 @@ addJobRule r j = j { jobRules = enableRule r (jobRules j) }
addVariable :: String -> String -> Job -> Job
addVariable k v j = j { jobVariables = mminsertWith (++) k [v] (jobVariables j) }
setVariable :: String -> String -> Job -> Job
setVariable k v j = j { jobVariables = MonoidalMap $ Map.insert k [v] $ unMonoidalMap $ jobVariables j }
delVariable :: String -> Job -> Job
delVariable k j = j { jobVariables = MonoidalMap $ Map.delete k $ unMonoidalMap $ jobVariables j }
-- Building the standard jobs
--
-- | Make a normal validate CI job
validate :: Arch -> Opsys -> BuildConfig -> (String, Job)
validate arch opsys bc =
job arch opsys bc
validate :: Arch -> Opsys -> BuildConfig -> NamedJob Job
validate = job
-- | Make a normal nightly CI job
nightly :: Arch -> Opsys -> BuildConfig -> NamedJob Job
nightly arch opsys bc =
let (n, j) = job arch opsys bc
in ("nightly-" ++ n, addJobRule Nightly . keepArtifacts "8 weeks" . highCompression $ j)
let NamedJob n j = job arch opsys bc
in NamedJob { name = "nightly-" ++ n, jobInfo = addJobRule Nightly . keepArtifacts "8 weeks" . highCompression $ j}
-- | Make a normal release CI job
release :: Arch -> Opsys -> BuildConfig -> NamedJob Job
release arch opsys bc =
let (n, j) = job arch opsys (bc { buildFlavour = Release })
in ("release-" ++ n, addJobRule ReleaseOnly . keepArtifacts "1 year" . ignorePerfFailures . highCompression $ j)
let NamedJob n j = job arch opsys (bc { buildFlavour = Release })
in NamedJob { name = "release-" ++ n, jobInfo = addJobRule ReleaseOnly . keepArtifacts "1 year" . ignorePerfFailures . highCompression $ j}
-- Specific job modification functions
......@@ -744,17 +785,33 @@ addValidateRule t = modifyValidateJobs (addJobRule t)
disableValidate :: JobGroup Job -> JobGroup Job
disableValidate = addValidateRule Disable
data NamedJob a = NamedJob { name :: String, jobInfo :: a } deriving Functor
renameJob :: (String -> String) -> NamedJob a -> NamedJob a
renameJob f (NamedJob n i) = NamedJob (f n) i
instance ToJSON a => ToJSON (NamedJob a) where
toJSON nj = object
[ "name" A..= name nj
, "jobInfo" A..= jobInfo nj ]
-- Jobs are grouped into either triples or pairs depending on whether the
-- job is just validate and nightly, or also release.
data JobGroup a = StandardTriple { v :: (String, a)
, n :: (String, a)
, r :: (String, a) }
| ValidateOnly { v :: (String, a)
, n :: (String, a) } deriving Functor
data JobGroup a = StandardTriple { v :: NamedJob a
, n :: NamedJob a
, r :: NamedJob a }
| ValidateOnly { v :: NamedJob a
, n :: NamedJob a } deriving Functor
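-- An illustrative (hypothetical) value, assuming the usual job-naming scheme:
--
--   StandardTriple { v = NamedJob "x86_64-linux-deb10-validate" <job>
--                  , n = NamedJob "nightly-x86_64-linux-deb10-validate" <job>
--                  , r = NamedJob "release-x86_64-linux-deb10-release" <job> }
--
-- A 'ValidateOnly' group simply omits the release job.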
instance ToJSON a => ToJSON (JobGroup a) where
toJSON jg = object
[ "n" A..= n jg
, "r" A..= r jg
]
rename :: (String -> String) -> JobGroup a -> JobGroup a
rename f (StandardTriple (nv, v) (nn, n) (nr, r)) = StandardTriple (f nv, v) (f nn, n) (f nr, r)
rename f (ValidateOnly (nv, v) (nn, n)) = ValidateOnly (f nv, v) (f nn, n)
rename f (StandardTriple nv nn nr) = StandardTriple (renameJob f nv) (renameJob f nn) (renameJob f nr)
rename f (ValidateOnly nv nn) = ValidateOnly (renameJob f nv) (renameJob f nn)
-- | Construct a 'JobGroup' which consists of a validate, nightly and release build with
-- a specific config.
@@ -775,16 +832,24 @@ validateBuilds :: Arch -> Opsys -> BuildConfig -> JobGroup Job
validateBuilds a op bc = ValidateOnly (validate a op bc) (nightly a op bc)
flattenJobGroup :: JobGroup a -> [(String, a)]
flattenJobGroup (StandardTriple a b c) = [a,b,c]
flattenJobGroup (ValidateOnly a b) = [a, b]
flattenJobGroup (StandardTriple a b c) = map flattenNamedJob [a,b,c]
flattenJobGroup (ValidateOnly a b) = map flattenNamedJob [a, b]
flattenNamedJob :: NamedJob a -> (String, a)
flattenNamedJob (NamedJob n i) = (n, i)
-- | Specification for all the jobs we want to build.
jobs :: Map String Job
jobs = Map.fromList $ concatMap flattenJobGroup $
jobs = Map.fromList $ concatMap (filter is_enabled_job . flattenJobGroup) job_groups
where
is_enabled_job (_, Job {jobRules = OnOffRules {..}}) = not $ Disable `S.member` rule_set
job_groups :: [JobGroup Job]
job_groups =
[ disableValidate (standardBuilds Amd64 (Linux Debian10))
, (standardBuildsWithConfig Amd64 (Linux Debian10) dwarf)
, (validateBuilds Amd64 (Linux Debian10) nativeInt)
, standardBuildsWithConfig Amd64 (Linux Debian10) dwarf
, validateBuilds Amd64 (Linux Debian10) nativeInt
, fastCI (validateBuilds Amd64 (Linux Debian10) unreg)
, fastCI (validateBuilds Amd64 (Linux Debian10) debug)
, modifyValidateJobs manual tsan_jobs
@@ -794,31 +859,38 @@ jobs = Map.fromList $ concatMap flattenJobGroup $
, disableValidate (standardBuilds Amd64 (Linux Debian11))
-- We still build Deb9 bindists for now because Ubuntu 18 and Linux Mint 19
-- do not reach EOL until April 2023 and still need tinfo5.
, disableValidate (standardBuilds Amd64 (Linux Debian9))
, disableValidate (standardBuildsWithConfig Amd64 (Linux Debian9) (splitSectionsBroken vanilla))
, disableValidate (standardBuilds Amd64 (Linux Ubuntu2004))
, disableValidate (standardBuilds Amd64 (Linux Centos7))
, disableValidate (standardBuilds Amd64 (Linux Rocky8))
, disableValidate (standardBuildsWithConfig Amd64 (Linux Centos7) (splitSectionsBroken vanilla))
-- The Fedora33 job is always built with perf, so there is one job in the normal
-- validate pipeline which is built with perf.
, (standardBuildsWithConfig Amd64 (Linux Fedora33) releaseConfig)
, standardBuildsWithConfig Amd64 (Linux Fedora33) releaseConfig
-- This job is only for generating head.hackage docs
, hackage_doc_job (disableValidate (standardBuildsWithConfig Amd64 (Linux Fedora33) releaseConfig))
, disableValidate (standardBuildsWithConfig Amd64 (Linux Fedora33) dwarf)
, fastCI (standardBuilds Amd64 Windows)
, disableValidate (standardBuildsWithConfig Amd64 Windows nativeInt)
, fastCI (standardBuildsWithConfig Amd64 Windows (splitSectionsBroken vanilla))
, disableValidate (standardBuildsWithConfig Amd64 Windows (splitSectionsBroken nativeInt))
, standardBuilds Amd64 Darwin
, allowFailureGroup (addValidateRule FreeBSDLabel (standardBuilds Amd64 FreeBSD13))
, standardBuilds AArch64 Darwin
, standardBuilds AArch64 (Linux Debian10)
, allowFailureGroup (addValidateRule ARMLabel (standardBuilds ARMv7 (Linux Debian10)))
, standardBuilds I386 (Linux Debian9)
, allowFailureGroup (standardBuildsWithConfig Amd64 (Linux Alpine) static)
, standardBuildsWithConfig AArch64 (Linux Debian10) (splitSectionsBroken vanilla)
, disableValidate (validateBuilds AArch64 (Linux Debian10) llvm)
, standardBuildsWithConfig I386 (Linux Debian9) (splitSectionsBroken vanilla)
, standardBuildsWithConfig Amd64 (Linux Alpine) (splitSectionsBroken static)
, disableValidate (allowFailureGroup (standardBuildsWithConfig Amd64 (Linux Alpine) staticNativeInt))
, validateBuilds Amd64 (Linux Debian11) (crossConfig "aarch64-linux-gnu" (Just "qemu-aarch64 -L /usr/aarch64-linux-gnu"))
, validateBuilds Amd64 (Linux Debian11) (crossConfig "aarch64-linux-gnu" (Emulator "qemu-aarch64 -L /usr/aarch64-linux-gnu") Nothing)
, validateBuilds Amd64 (Linux Debian11) (crossConfig "js-unknown-ghcjs" NoEmulatorNeeded (Just "emconfigure")
)
{ bignumBackend = Native
}
, make_wasm_jobs wasm_build_config
, disableValidate $ make_wasm_jobs wasm_build_config { bignumBackend = Native }
, disableValidate $ make_wasm_jobs wasm_build_config { unregisterised = True }
]
where
hackage_doc_job = rename (<> "-hackage") . modifyJobs (addVariable "HADRIAN_ARGS" "--haddock-base-url")
tsan_jobs =
modifyJobs
( addVariable "TSAN_OPTIONS" "suppressions=$CI_PROJECT_DIR/rts/.tsan-suppressions"
@@ -827,10 +899,74 @@ jobs = Map.fromList $ concatMap flattenJobGroup $
. addVariable "HADRIAN_ARGS" "--docs=none") $
validateBuilds Amd64 (Linux Debian10) tsan
make_wasm_jobs cfg =
modifyJobs
( delVariable "BROKEN_TESTS"
. setVariable "HADRIAN_ARGS" "--docs=none"
. delVariable "INSTALL_CONFIGURE_ARGS"
)
$ validateBuilds Amd64 (Linux Alpine) cfg
wasm_build_config =
(crossConfig "wasm32-wasi" NoEmulatorNeeded Nothing)
{
fullyStatic = True
, buildFlavour = Release -- TODO: This needs to be validate but wasm backend doesn't pass yet
}
mkPlatform :: Arch -> Opsys -> String
mkPlatform arch opsys = archName arch <> "-" <> opsysName opsys
-- | This map tells us, for a specific arch/opsys combo, what the job name for
-- nightly/release pipelines is. This is used by the ghcup metadata generation so that
-- things like bindist names etc. are kept in sync.
--
-- For cases where there is just one job group for a platform, that group is used.
--
-- Otherwise:
-- * Prefer jobs which have a corresponding release pipeline
-- * Explicitly require tie-breaking for other cases.
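--
-- For example, several job groups below target x86_64-linux-deb10 (vanilla,
-- dwarf, nativeInt, ...), so "x86_64-linux-deb10-validate" is whitelisted in
-- 'combine' to break the tie; a platform with a single job group needs no
-- tie-breaking.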
platform_mapping :: Map String (JobGroup BindistInfo)
platform_mapping = Map.map go $
Map.fromListWith combine [ (uncurry mkPlatform (jobPlatform (jobInfo $ v j)), j) | j <- job_groups ]
where
whitelist = [ "x86_64-linux-alpine3_12-int_native-validate+fully_static"
, "x86_64-linux-deb10-validate"
, "x86_64-linux-fedora33-release"
, "x86_64-windows-validate"
]
combine a b
| name (v a) `elem` whitelist = a -- Explicitly selected
| name (v b) `elem` whitelist = b
| hasReleaseBuild a, not (hasReleaseBuild b) = a -- Has release build, but other doesn't
| hasReleaseBuild b, not (hasReleaseBuild a) = b
| otherwise = error (show (name (v a)) ++ show (name (v b)))
go = fmap (BindistInfo . unwords . fromJust . mmlookup "BIN_DIST_NAME" . jobVariables)
hasReleaseBuild (StandardTriple{}) = True
hasReleaseBuild (ValidateOnly{}) = False
data BindistInfo = BindistInfo { bindistName :: String }
instance ToJSON BindistInfo where
toJSON (BindistInfo n) = object [ "bindistName" A..= n ]
main :: IO ()
main = do
as <- getArgs
ass <- getArgs
case ass of
-- See Note [Generation Modes]
("gitlab":as) -> write_result as jobs
("metadata":as) -> write_result as platform_mapping
_ -> error "gen_ci.hs <gitlab|metadata> [file.json]"
write_result as obj =
(case as of
[] -> B.putStrLn
(fp:_) -> B.writeFile fp)
(A.encode jobs)
(A.encode obj)
#! /usr/bin/env nix-shell
#!nix-shell -i bash -p cabal-install "haskell.packages.ghc924.ghcWithPackages (pkgs: with pkgs; [aeson])" git jq
cd "$(dirname "${BASH_SOURCE[0]}")"
cabal run gen_ci -- metadata jobs-metadata.json
#! /usr/bin/env nix-shell
#! nix-shell -i bash -p cabal-install ghc jq
#!/usr/bin/env nix-shell
#!nix-shell -i bash -p cabal-install "haskell.packages.ghc924.ghcWithPackages (pkgs: with pkgs; [aeson])" git jq
# shellcheck shell=bash
set -euo pipefail
cd "$(dirname "${BASH_SOURCE[0]}")"
tmp=$(mktemp)
./gen_ci.hs $tmp
cabal run gen_ci -- gitlab $tmp
rm -f jobs.yaml
echo "### THIS IS A GENERATED FILE, DO NOT MODIFY DIRECTLY" > jobs.yaml
cat $tmp | jq | tee -a jobs.yaml
{-# OPTIONS_GHC -Wno-missing-fields #-}
{-# OPTIONS_GHC -Wall -Wno-missing-fields #-}
import GHC hiding (parseModule)
import GHC.Data.StringBuffer
@@ -9,6 +9,7 @@ import GHC.Platform
import GHC.Plugins
import GHC.Settings
import GHC.Settings.Config
import System.Mem.Weak
fakeSettings :: Settings
fakeSettings =
@@ -41,5 +42,6 @@ parse dflags src = do
main :: IO ()
main = do
_ <- mkWeak runGhc runGhc Nothing
m <- parse fakeDynFlags "main = putStrLn \"hello world\""
putStrLn $ showSDoc fakeDynFlags $ ppr m
cradle:
cabal:
let sources = import ./nix/sources.nix; in
{ nixpkgs ? (import sources.nixpkgs {}) }:
with nixpkgs;
let
fetch-gitlab-artifacts = nixpkgs.callPackage ./fetch-gitlab-artifacts {};
mk-ghcup-metadata = nixpkgs.callPackage ./mk-ghcup-metadata { fetch-gitlab=fetch-gitlab-artifacts;};
bindistPrepEnv = pkgs.buildFHSUserEnv {
name = "enter-fhs";
targetPkgs = pkgs: with pkgs; [
# all
gcc binutils gnumake gmp ncurses5 git elfutils
# source-release.sh
xorg.lndir curl python3 which automake autoconf m4 file
haskell.compiler.ghc8107 haskellPackages.happy haskellPackages.alex
];
runScript = "$SHELL -x";
};
scripts = stdenv.mkDerivation {
name = "rel-eng-scripts";
nativeBuildInputs = [ makeWrapper ];
preferLocalBuild = true;
buildCommand = ''
mkdir -p $out/bin
makeWrapper ${./upload.sh} $out/bin/upload.sh \
--prefix PATH : ${moreutils}/bin \
--prefix PATH : ${lftp}/bin \
--prefix PATH : ${lzip}/bin \
--prefix PATH : ${zip}/bin \
--prefix PATH : ${s3cmd}/bin \
--prefix PATH : ${gnupg}/bin \
--prefix PATH : ${pinentry}/bin \
--prefix PATH : ${parallel}/bin \
--prefix PATH : ${python3}/bin \
--set ENTER_FHS_ENV ${bindistPrepEnv}/bin/enter-fhs \
--set BASH ${bash}/bin/bash
makeWrapper ${./upload_ghc_libs.py} $out/bin/upload-ghc-libs
'';
};
in
symlinkJoin {
name = "ghc-rel-eng";
preferLocalBuild = true;
paths = [
scripts
fetch-gitlab-artifacts
mk-ghcup-metadata
];
}
result
fetch-gitlab
out
# fetch-gitlab-artifacts
This script is used to fetch and rename GHC binary distributions from GitLab
Pipelines for upload to `downloads.haskell.org`.
## Workflow
1. Configure a `python-gitlab` profile for <https://gitlab.haskell.org/> (a quick sanity check for the profile is sketched after this list):
```
$ cat > $HOME/.python-gitlab.cfg <<EOF
[haskell]
url = https://gitlab.haskell.org/
private_token = $PRIVATE_GITLAB_TOKEN
ssl_verify = true
api_version = 4
EOF
```
1. Push a release tag to ghc/ghc>
1. Wait until the CI pipeline completes
1. Run `fetch-gitlab -p $PIPELINE_ID -r $RELEASE_NAME` where `$PIPELINE_ID` is
the ID of the GitLab release pipeline and `$RELEASE_NAME` is the name of the
GHC release (e.g. `8.8.1` or `8.8.1-alpha1`)
1. The binary distributions will be placed in the `out` directory.
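As a quick sanity check that the profile from step 1 works, the same `python-gitlab`
calls that `fetch-gitlab` itself makes can be exercised from a Python prompt. This is
only a sketch; `12345` is a placeholder pipeline id:
```
import gitlab

# Uses the [haskell] profile from ~/.python-gitlab.cfg configured above.
gl = gitlab.Gitlab.from_config('haskell')
proj = gl.projects.get('ghc/ghc')
pipeline = proj.pipelines.get(12345)   # placeholder pipeline id
for job in pipeline.jobs.list(all=True):
    print(job.name, job.status)
```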
{ nix-gitignore, python3Packages, unzip }:
let
fetch-gitlab = { buildPythonPackage, python-gitlab, unzip }:
buildPythonPackage {
pname = "fetch-gitlab";
version = "0.0.1";
src = nix-gitignore.gitignoreSource [] ./.;
propagatedBuildInputs = [ python3Packages.python-gitlab unzip ];
preferLocalBuild = true;
};
in
python3Packages.callPackage fetch-gitlab { inherit unzip; }
import logging
from pathlib import Path
import subprocess
import gitlab
import json
logging.basicConfig(level=logging.INFO)
def strip_prefix(s, prefix):
if s.startswith(prefix):
return s[len(prefix):]
else:
return None
def job_triple(job_name):
bindists = {
'release-x86_64-windows-release': 'x86_64-unknown-mingw32',
'release-x86_64-windows-int_native-release': 'x86_64-unknown-mingw32-int_native',
'release-x86_64-ubuntu20_04-release': 'x86_64-ubuntu20_04-linux',
'release-x86_64-linux-fedora33-release+debug_info': 'x86_64-fedora33-linux-dwarf',
'release-x86_64-linux-fedora33-release': 'x86_64-fedora33-linux',
'release-x86_64-linux-fedora27-release': 'x86_64-fedora27-linux',
'release-x86_64-linux-deb11-release': 'x86_64-deb11-linux',
'release-x86_64-linux-deb10-release+debug_info': 'x86_64-deb10-linux-dwarf',
'release-x86_64-linux-deb10-release': 'x86_64-deb10-linux',
'release-x86_64-linux-deb9-release': 'x86_64-deb9-linux',
'release-x86_64-linux-centos7-release': 'x86_64-centos7-linux',
'release-x86_64-linux-alpine3_12-release+fully_static': 'x86_64-alpine3_12-linux-static',
'release-x86_64-linux-alpine3_12-int_native-release+fully_static': 'x86_64-alpine3_12-linux-static-int_native',
'release-x86_64-darwin-release': 'x86_64-apple-darwin',
'release-i386-linux-deb9-release': 'i386-deb9-linux',
'release-armv7-linux-deb10-release': 'armv7-deb10-linux',
'release-aarch64-linux-deb10-release': 'aarch64-deb10-linux',
'release-aarch64-darwin-release': 'aarch64-apple-darwin',
'source-tarball': 'src',
'package-hadrian-bootstrap-sources': 'hadrian-bootstrap-sources',
'doc-tarball': 'docs',
'hackage-doc-tarball': 'hackage_docs',
}
# Some bindists use the +no_split_sections transformer due to upstream
# toolchain bugs.
bindists.update({
f'{k}+no_split_sections': v
for k,v in bindists.items()
})
if job_name in bindists:
return bindists[job_name]
else:
#return strip_prefix(job.name, 'validate-')
return None
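# For example (entries taken from the table above; the 'validate-...' name is
# hypothetical and simply falls through to None):
#   job_triple('release-x86_64-linux-deb10-release')              -> 'x86_64-deb10-linux'
#   job_triple('release-x86_64-darwin-release+no_split_sections') -> 'x86_64-apple-darwin'
#   job_triple('validate-x86_64-linux-deb10')                     -> None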
def fetch_artifacts(release: str, pipeline_id: int,
dest_dir: Path, gl: gitlab.Gitlab):
dest_dir.mkdir(exist_ok=True)
# Write the pipeline id into output directory
with open(f"{dest_dir}/metadata.json", 'w') as out: json.dump({ "pipeline_id": pipeline_id }, out)
proj = gl.projects.get('ghc/ghc')
pipeline = proj.pipelines.get(pipeline_id)
tmpdir = Path("fetch-gitlab")
tmpdir.mkdir(exist_ok=True)
for pipeline_job in pipeline.jobs.list(all=True):
if len(pipeline_job.artifacts) == 0:
logging.info(f'job {pipeline_job.name} ({pipeline_job.id}) has no artifacts')
continue
job = proj.jobs.get(pipeline_job.id)
triple = job_triple(job.name)
if triple is None:
logging.info(f'ignoring {job.name}')
continue
#artifactZips = [ artifact
# for artifact in job.artifacts
# if artifact['filename'] == 'artifacts.zip' ]
try:
destdir = tmpdir / job.name
zip_name = Path(f"{tmpdir}/{job.name}.zip")
if not zip_name.exists() or zip_name.stat().st_size == 0:
logging.info(f'downloading archive {zip_name} for job {job.name} (job {job.id})...')
with open(zip_name, 'wb') as f:
job.artifacts(streamed=True, action=f.write)
if zip_name.stat().st_size == 0:
logging.info(f'artifact archive for job {job.name} (job {job.id}) is empty')
continue
subprocess.run(['unzip', '-bo', zip_name, '-d', destdir])
bindist_files = list(destdir.glob('ghc*.tar.xz'))
if job.name == 'source-tarball':
for f in bindist_files:
dest = dest_dir / f.name
logging.info(f'extracted {job.name} to {dest}')
f.replace(dest)
elif job.name == 'package-hadrian-bootstrap-sources':
all_bootstrap_sources = destdir / 'hadrian-bootstrap-sources-all.tar.gz'
dest = dest_dir / 'hadrian-bootstrap-sources'
dest.mkdir()
subprocess.run(['tar', '-xf', all_bootstrap_sources, '-C', dest])
logging.info(f'extracted {job.name}/{all_bootstrap_sources} to {dest}')
elif job.name == 'doc-tarball':
dest = dest_dir / 'docs'
dest.mkdir()
doc_files = list(destdir.glob('*.tar.xz'))
for f in doc_files:
subprocess.run(['tar', '-xf', f, '-C', dest])
logging.info(f'extracted docs {f} to {dest}')
index_path = destdir / 'index.html'
index_path.replace(dest / 'index.html')
elif job.name == 'hackage-doc-tarball':
dest = dest_dir / 'hackage_docs'
logging.info(f'moved hackage_docs to {dest}')
(destdir / 'hackage_docs').replace(dest)
else:
dest = dest_dir / f'ghc-{release}-{triple}.tar.xz'
if dest.exists():
logging.info(f'bindist {dest} already exists')
continue
if len(bindist_files) == 0:
logging.warn(f'Bindist does not exist')
continue
bindist = bindist_files[0]
logging.info(f'extracted {job.name} to {dest}')
bindist.replace(dest)
except Exception as e:
logging.error(f'Error fetching job {job.name}: {e}')
pass
def main():
import argparse
parser = argparse.ArgumentParser()
parser.add_argument('--pipeline', '-p', required=True, type=int, help="pipeline id")
parser.add_argument('--release', '-r', required=True, type=str, help="release name")
parser.add_argument('--output', '-o', type=Path, default=Path.cwd(), help="output directory")
parser.add_argument('--profile', '-P', default='haskell',
help='python-gitlab.cfg profile name')
args = parser.parse_args()
gl = gitlab.Gitlab.from_config(args.profile)
fetch_artifacts(args.release, args.pipeline,
dest_dir=args.output, gl=gl)
#!/usr/bin/env python
from distutils.core import setup
setup(name='fetch-gitlab',
author='Ben Gamari',
author_email='ben@smart-cactus.org',
py_modules=['fetch_gitlab'],
entry_points={
'console_scripts': [
'fetch-gitlab=fetch_gitlab:main',
]
}
)
result
fetch-gitlab
out
# mk-ghcup-metadata
This script is used to automatically generate metadata suitable for consumption by
GHCUp.
# Usage
```
nix run -f .gitlab/rel_eng/ -c ghcup-metadata
```
```
options:
-h, --help show this help message and exit
--metadata METADATA Path to GHCUp metadata
--pipeline-id PIPELINE_ID
Which pipeline to generate metadata for
--release-mode Generate metadata which points to downloads folder
--fragment Output the generated fragment rather than whole modified file
--version VERSION Version of the GHC compiler
```
The script also requires the `.gitlab/jobs-metadata.yaml` file, which can be generated
by running the `.gitlab/generate_jobs_metadata` script if you want to run it locally.
## CI Pipelines
The metadata is generated by the nightly and release pipelines.
* Nightly pipelines generate metadata where the bindist URLs point immediately to
nightly artifacts.
* Release jobs can pass the `--release-mode` flag, which downloads the artifacts from
the pipeline, but the final download URLs for users point into the downloads folder.
The mapping from platform to bindist is not clever; it is just what the GHCUp developers
tell us to use.
## Testing Pipelines
The metadata is tested by the `ghcup-ci` repo which is triggered by the
`ghcup-metadata-testing-nightly` job.
This job sets the following variables which are then used by the downstream job
to collect the metadata from the correct place:
* `UPSTREAM_PIPELINE_ID` - The pipeline ID which the generated metadata lives in
* `UPSTREAM_PROJECT_ID` - The project ID for the upstream project (almost always `1` (for ghc/ghc))
* `UPSTREAM_JOB_NAME` - The job which the metadata belongs to (i.e. `ghcup-metadata-nightly`)
* `UPSTREAM_PROJECT_PATH` - The path of the upstream project (almost always ghc/ghc)
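For illustration only, a consumer of these variables could locate and download the
metadata artifact roughly as follows; this is a hedged sketch reusing the same
`python-gitlab` calls as `fetch-gitlab`, not necessarily what the `ghcup-ci` job
actually runs:
```
import os
import gitlab

gl = gitlab.Gitlab('https://gitlab.haskell.org/')
proj = gl.projects.get(os.environ['UPSTREAM_PROJECT_PATH'])        # e.g. ghc/ghc
pipeline = proj.pipelines.get(int(os.environ['UPSTREAM_PIPELINE_ID']))
wanted = os.environ['UPSTREAM_JOB_NAME']                           # e.g. ghcup-metadata-nightly
job_id = next(j.id for j in pipeline.jobs.list(all=True) if j.name == wanted)
with open('artifacts.zip', 'wb') as f:
    proj.jobs.get(job_id).artifacts(streamed=True, action=f.write)
```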
Nightly pipelines are tested automatically, but release pipelines are triggered manually,
as the testing requires the bindists to be uploaded into the final release folder.
{ nix-gitignore, python3Packages, fetch-gitlab }:
let
ghcup-metadata = { buildPythonPackage, python-gitlab, pyyaml }:
buildPythonPackage {
pname = "ghcup-metadata";
version = "0.0.1";
src = nix-gitignore.gitignoreSource [] ./.;
propagatedBuildInputs = [fetch-gitlab python-gitlab pyyaml ];
preferLocalBuild = true;
};
in
python3Packages.callPackage ghcup-metadata { }