Compare revisions

Changes are shown as if the source revision was being merged into the target revision.

Commits on Source (5296), showing with 7675 additions and 1106 deletions.
# Configure the environment
MSYSTEM=MINGW64
THREADS=9
SKIP_PERF_TESTS=YES
BUILD_FLAVOUR=
source /etc/profile || true # a terrible, terrible workaround for msys2 brokenness
# Don't set -e until after /etc/profile is sourced
set -ex
cd $APPVEYOR_BUILD_FOLDER
case "$1" in
"prepare")
# Prepare the tree
git config remote.origin.url git://github.com/ghc/ghc.git
git config --global url."git://github.com/ghc/packages-".insteadOf git://github.com/ghc/packages/
git submodule init
git submodule --quiet update --recursive
;;
"build")
# Build the compiler
./boot
cat <<EOF >> mk/build.mk
BuildFlavour=$BUILD_FLAVOUR
ifneq "\$(BuildFlavour)" ""
include mk/flavours/\$(BuildFlavour).mk
endif
EOF
./configure --enable-tarballs-autodownload
make -j$THREADS
;;
"test")
make binary-dist
curl https://ghc-artifacts.s3.amazonaws.com/tools/ghc-artifact-collector-x86_64-windows --output ghc-artifact-collector
./ghc-artifact-collector *.tar.xz
make test THREADS=$THREADS
;;
*)
echo "$0: unknown mode $1"
exit 1
;;
esac
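The `case "$1"` dispatch above is the whole control flow of the AppVeyor driver: one positional mode argument selects a phase, and unknown modes fail loudly. A minimal standalone sketch of the same pattern (the modes `greet` and `version` are made-up placeholders, not real CI phases):

```shell
# Sketch of the mode-dispatch pattern used by the CI script above.
# The modes here are illustrative placeholders, not the real phases.
run_mode() {
    case "$1" in
        "greet")
            echo "hello from mode greet"
            ;;
        "version")
            echo "driver version 0.0-example"
            ;;
        *)
            echo "unknown mode: $1" >&2
            return 1
            ;;
    esac
}

run_mode greet      # prints "hello from mode greet"
run_mode version    # prints "driver version 0.0-example"
```

Returning nonzero from the catch-all arm is what lets the caller (here, AppVeyor) mark the job failed when it is invoked with a typo'd phase name.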
# http://editorconfig.org
root = true
[*.hs]
indent_style = space
indent_size = 2
trim_trailing_whitespace = true
insert_final_newline = true
charset = utf-8
end_of_line = lf
[Makefile]
indent_style = tab
[*.c]
indent_style = space
indent_size = 2
# convert CRLF into LF on checkin
# don't convert anything on checkout
* text=auto eol=lf
mk/win32-tarballs.md5sum text=auto eol=LF
@@ -17,6 +17,8 @@ Thumbs.db
 *.hi
 *.hi-boot
+*.hie
+*.hie-boot
 *.o-boot
 *.p_o
 *.t_o
@@ -42,6 +44,8 @@ autom4te.cache
 config.log
 config.status
 configure
+# GHC's own aclocal.m4 is generated by aclocal
+/aclocal.m4
 # Temporarily generated configure files
 confdefs.h
@@ -50,11 +54,13 @@ confdefs.h
 stage0
 stage1
 stage2
-_build
+# Ignore _build, _validatebuild and any other custom build directories headed by _
+_*
 */generated/
 */ghc-stage1
 .shake.*
 .hadrian_ghci
+.hie-bios
 # -----------------------------------------------------------------------------
 # Ignore any overlapped darcs repos and back up files
@@ -73,13 +79,12 @@ _darcs/
 /driver/ghc/dist/
 /driver/haddock/dist/
 /driver/ghci/dist/
-/includes/dist/
-/includes/dist-*/
 /libffi/dist-install/
 /libraries/*/dist-boot/
 /libraries/*/dist-install/
+/libraries/*/dist-newstyle/
 /libraries/dist-haddock/
-/rts/dist/
+/linters/*/dist-install/
 /utils/*/dist*/
 /compiler/stage1/
 /compiler/stage2/
@@ -101,12 +106,17 @@ _darcs/
 /ch01.html
 /ch02.html
 /compiler/dist/
+/compiler/Bytecodes.h
+/compiler/ClosureTypes.h
+/compiler/FunTypes.h
+/compiler/MachRegs.h
+/compiler/ghc-llvm-version.h
 /compiler/ghc.cabal
 /compiler/ghc.cabal.old
 /distrib/configure.ac
 /distrib/ghc.iss
-/docs/man
 /docs/index.html
+/docs/man
 /docs/users_guide/.log
 /docs/users_guide/users_guide
 /docs/users_guide/ghc.1
@@ -123,13 +133,19 @@ _darcs/
 /docs/users_guide/utils.pyc
 /driver/ghci/ghc-pkg-inplace
 /driver/ghci/ghci-inplace
+/driver/ghci/ghci-wrapper.cabal
 /driver/ghci/ghci.res
+/driver/ghci/cwrapper.c
+/driver/ghci/cwrapper.h
+/driver/ghci/getLocation.c
+/driver/ghci/getLocation.h
+/driver/ghci/isMinTTY.c
+/driver/ghci/isMinTTY.h
 /driver/package.conf
 /driver/package.conf.inplace.old
 /settings
 /ghc.spec
 /ghc/ghc-bin.cabal
-/includes/dist/
 /index.html
 /inplace/
 /libffi/build/
@@ -170,15 +186,11 @@ _darcs/
 /mk/config.h.in
 /mk/config.mk
 /mk/config.mk.old
+/mk/system-cxx-std-lib-1.0.conf
 /mk/install.mk
 /mk/project.mk
 /mk/project.mk.old
 /mk/validate.mk
-/rts/rts.cabal
-/rts/package.conf.inplace
-/rts/package.conf.inplace.raw
-/rts/package.conf.install
-/rts/package.conf.install.raw
 /stage3.package.conf
 /testsuite_summary*.txt
 /testsuite*.xml
@@ -189,9 +201,9 @@ _darcs/
 /utils/mkUserGuidePart/mkUserGuidePart.cabal
 /utils/runghc/runghc.cabal
 /utils/gen-dll/gen-dll.cabal
+/utils/ghc-pkg/ghc-pkg.cabal
 utils/lndir/fs.*
 utils/unlit/fs.*
-rts/fs.*
 libraries/base/include/fs.h
 libraries/base/cbits/fs.c
 missing-win32-tarballs
@@ -214,11 +226,6 @@ GIT_COMMIT_ID
 # Should be equal to testdir_suffix from testsuite/driver/testlib.py.
 *.run
-# -----------------------------------------------------------------------------
-# Output of ghc-in-ghci
-/.ghci-objects/
 # -----------------------------------------------------------------------------
 # ghc.nix
 ghc.nix/
@@ -227,8 +234,19 @@ ghc.nix/
 .gdb_history
 .gdbinit
-# Tooling - direnv
+# -----------------------------------------------------------------------------
+# Tooling
+
+# direnv
 .envrc
+.direnv
-# Tooling - vscode
+# Visual Studio Code
 .vscode
-# Tooling - ghcide
 *.hiedb
+
+# clangd
+.clangd
+
+dist-newstyle/
dist-newstyle/
This diff is collapsed.
 #!/usr/bin/env bash
 # shellcheck disable=SC2230
+# shellcheck disable=SC1090

 # This is the primary driver of the GitLab CI infrastructure.
+# Run `ci.sh usage` for usage information.

-set -e -o pipefail
+set -Eeuo pipefail

 # Configuration:
-hackage_index_state="@1579718451"
+HACKAGE_INDEX_STATE="2020-12-21T14:48:20Z"
+MIN_HAPPY_VERSION="1.20"
+MIN_ALEX_VERSION="3.2.6"

-# Colors
-BLACK="0;30"
-GRAY="1;30"
-RED="0;31"
-LT_RED="1;31"
-BROWN="0;33"
-LT_BROWN="1;33"
-GREEN="0;32"
-LT_GREEN="1;32"
-BLUE="0;34"
-LT_BLUE="1;34"
-PURPLE="0;35"
-LT_PURPLE="1;35"
-CYAN="0;36"
-LT_CYAN="1;36"
-WHITE="1;37"
-LT_GRAY="0;37"
-
-export LANG=C.UTF-8
-export LC_ALL=C.UTF-8
-
-# GitLab Pipelines log section delimiters
-# https://gitlab.com/gitlab-org/gitlab-foss/issues/14664
-start_section() {
-  name="$1"
-  echo -e "section_start:$(date +%s):$name\015\033[0K"
-}
-
-end_section() {
-  name="$1"
-  echo -e "section_end:$(date +%s):$name\015\033[0K"
-}
+TOP="$(pwd)"
+if [ ! -d "$TOP/.gitlab" ]; then
+  echo "This script expects to be run from the root of a ghc checkout"
+fi
+
+CABAL_CACHE="$TOP/${CABAL_CACHE:-cabal-cache}"
+
+source "$TOP/.gitlab/common.sh"
+
+function time_it() {
+  local name="$1"
+  shift
+  local start=$(date +%s)
+  local res=0
+  set +e
+  ( set -e ; $@ )
+  res=$?
+  set -e
+  local end=$(date +%s)
+  local delta=$(expr $end - $start)
+  echo "$name took $delta seconds"
+  printf "%15s | $delta" > ci-timings
+  return $res
+}
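The new `time_it` helper wraps a build phase, reports its wall-clock duration, and passes the wrapped command's exit status back to the caller. A simplified sketch of the same idea (without the `set +e`/`set -e` juggling the real script needs because it runs under `set -Eeuo pipefail`; the function name `time_step` is illustrative):

```shell
# Simplified sketch of the time_it pattern: time a command and
# propagate its exit status to the caller.
time_step() {
    name="$1"
    shift
    start=$(date +%s)
    "$@"
    res=$?
    end=$(date +%s)
    echo "$name took $((end - start)) seconds"
    return $res
}

time_step "sleep-step" sleep 1
```

Capturing `$?` immediately after running the command, and only returning it after the bookkeeping, is what keeps the timing report from clobbering the phase's success or failure.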
-echo_color() {
-  local color="$1"
-  local msg="$2"
-  echo -e "\033[${color}m${msg}\033[0m"
-}
-
-error() { echo_color "${RED}" "$1"; }
-warn() { echo_color "${LT_BROWN}" "$1"; }
-info() { echo_color "${LT_BLUE}" "$1"; }
-
-fail() { error "error: $1"; exit 1; }
-
-function run() {
-  info "Running $*..."
-  "$@" || ( error "$* failed"; return 1; )
-}
+function usage() {
+  cat <<EOF
+$0 - GHC continuous integration driver
+
+Common Modes:
+
+  usage         Show this usage message.
+  setup         Prepare environment for a build.
+  configure     Run ./configure.
+  clean         Clean the tree
+  shell         Run an interactive shell with a configured build environment.
+  save_cache    Preserve the cabal cache
+
+Hadrian build system:
+
+  build_hadrian Build GHC via the Hadrian build system
+  test_hadrian  Test GHC via the Hadrian build system
+
+Environment variables affecting both build systems:
+
+  CROSS_TARGET      Triple of cross-compilation target.
+  VERBOSE           Set to non-empty for verbose build output
+  RUNTEST_ARGS      Arguments passed to runtest.py
+  MSYSTEM           (Windows-only) Which platform to build from (CLANG64).
+  IGNORE_PERF_FAILURES
+                    Whether to ignore perf failures (one of "increases",
+                    "decreases", or "all")
+  HERMETIC          Take measures to avoid looking at anything in \$HOME
+  CONFIGURE_ARGS    Arguments passed to configure script.
+  CONFIGURE_WRAPPER Wrapper for the configure script (e.g. Emscripten's emconfigure).
+  ENABLE_NUMA       Whether to enable numa support for the build (disabled by default)
+  INSTALL_CONFIGURE_ARGS
+                    Arguments passed to the binary distribution configure script
+                    during installation of test toolchain.
+  NIX_SYSTEM        On Darwin, the target platform of the desired toolchain
+                    (either "x86_64-darwin" or "aarch64-darwin")
+  NO_BOOT           Whether to run ./boot or not, used when testing the source dist
+
+Environment variables determining build configuration of Make system:
+
+  BUILD_FLAVOUR     Which flavour to build.
+  BUILD_SPHINX_HTML Whether to build Sphinx HTML documentation.
+  BUILD_SPHINX_PDF  Whether to build Sphinx PDF documentation.
+  INTEGER_LIBRARY   Which integer library to use (integer-simple or integer-gmp).
+  HADDOCK_HYPERLINKED_SOURCES
+                    Whether to build hyperlinked Haddock sources.
+  TEST_TYPE         Which test rule to run.
+
+Environment variables determining build configuration of Hadrian system:
+
+  BUILD_FLAVOUR     Which flavour to build.
+  REINSTALL_GHC     Build and test a reinstalled "stage3" ghc built using cabal-install.
+                    This tests the "reinstall" configuration.
+  CROSS_EMULATOR    The emulator to use for testing of cross-compilers.
+
+Environment variables determining bootstrap toolchain (Linux):
+
+  GHC           Path of GHC executable to use for bootstrapping.
+  CABAL         Path of cabal-install executable to use for bootstrapping.
+  ALEX          Path of alex executable to use for bootstrapping.
+  HAPPY         Path of happy executable to use for bootstrapping.
+
+Environment variables determining bootstrap toolchain (non-Linux):
+
+  GHC_VERSION   Which GHC version to fetch for bootstrapping.
+  CABAL_INSTALL_VERSION
+                Cabal-install version to fetch for bootstrapping.
+EOF
+}
+
+function setup_locale() {
+  # Musl doesn't provide locale support at all...
+  if ! which locale > /dev/null; then
+    info "No locale executable. Skipping locale setup..."
+    return
+  fi
+
+  # BSD grep terminates early with -q, consequently locale -a will get a
+  # SIGPIPE and the pipeline will fail with pipefail.
+  shopt -o -u pipefail
+  if locale -a | grep -q C.UTF-8; then
+    # Debian
+    export LANG=C.UTF-8
+  elif locale -a | grep -q C.utf8; then
+    # Fedora calls it this
+    export LANG=C.utf8
+  elif locale -a | grep -q en_US.UTF-8; then
+    # Centos doesn't have C.UTF-8
+    export LANG=en_US.UTF-8
+  elif locale -a | grep -q en_US.utf8; then
+    # Centos doesn't have C.UTF-8
+    export LANG=en_US.utf8
+  else
+    error "Failed to find usable locale"
+    info "Available locales:"
+    locale -a
+    fail "No usable locale, aborting..."
+  fi
+  info "Using locale $LANG..."
+  export LC_ALL=$LANG
+  shopt -o -s pipefail
+}
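The chain of `elif locale -a | grep -q …` probes in `setup_locale` is a first-match search over the known spellings of a UTF-8 locale. The same logic can be factored into a testable function over a stubbed list of available locales (the `pick_locale` name and its argument are illustrative; the real function shells out to `locale -a` and exports `LANG`):

```shell
# First-match search over candidate locale names, mirroring the probe
# order in setup_locale. Takes the "available locales" list as an
# argument instead of running `locale -a`.
pick_locale() {
    available="$1"
    for cand in C.UTF-8 C.utf8 en_US.UTF-8 en_US.utf8; do
        if printf '%s\n' "$available" | grep -qx "$cand"; then
            echo "$cand"
            return 0
        fi
    done
    return 1
}

pick_locale "POSIX
en_US.utf8"    # prints "en_US.utf8"
```

Matching is case-sensitive, which is the whole point: Debian spells it `C.UTF-8`, Fedora `C.utf8`, and CentOS only ships the `en_US` variants.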
-TOP="$(pwd)"
 function mingw_init() {
   case "$MSYSTEM" in
-    MINGW32)
-      triple="i386-unknown-mingw32"
-      boot_triple="i386-unknown-mingw32" # triple of bootstrap GHC
-      ;;
-    MINGW64)
-      triple="x86_64-unknown-mingw32"
+    CLANG64)
+      target_triple="x86_64-unknown-mingw32"
       boot_triple="x86_64-unknown-mingw32" # triple of bootstrap GHC
       ;;
     *)
@@ -92,7 +167,7 @@ toolchain="$TOP/toolchain"
 mkdir -p "$toolchain/bin"
 PATH="$toolchain/bin:$PATH"

-export METRICS_FILE="$CI_PROJECT_DIR/performance-metrics.tsv"
+export METRICS_FILE="$TOP/performance-metrics.tsv"

 cores="$(mk/detect-cpu-count.sh)"
@@ -102,17 +177,6 @@ mkdir -p "$TOP/tmp"
 export TMP="$TOP/tmp"
 export TEMP="$TOP/tmp"

-function darwin_setup() {
-  # It looks like we already have python2 here and just installing python3
-  # does not work.
-  brew upgrade python
-  brew install ghc cabal-install ncurses gmp
-  pip3 install sphinx
-  # PDF documentation disabled as MacTeX apparently doesn't include xelatex.
-  #brew cask install mactex
-}
-
 function show_tool() {
   local tool="$1"
   info "$tool = ${!tool}"
@@ -120,46 +184,83 @@ function show_tool() {
 }

 function set_toolchain_paths() {
-  needs_toolchain=1
-  case "$(uname)" in
-    Linux) needs_toolchain="" ;;
-    *) ;;
+  case "$(uname -m)-$(uname)" in
+    # Linux toolchains are included in the Docker image
+    *-Linux) toolchain_source="env" ;;
+    # Darwin toolchains are provided via .gitlab/darwin/toolchain.nix
+    *-Darwin) toolchain_source="nix" ;;
+    *) toolchain_source="extracted" ;;
   esac

-  if [[ -n "$needs_toolchain" ]]; then
+  case "$toolchain_source" in
+    extracted)
       # These are populated by setup_toolchain
       GHC="$toolchain/bin/ghc$exe"
       CABAL="$toolchain/bin/cabal$exe"
       HAPPY="$toolchain/bin/happy$exe"
       ALEX="$toolchain/bin/alex$exe"
-  else
-    GHC="$(which ghc)"
-    CABAL="/usr/local/bin/cabal"
-    HAPPY="$HOME/.cabal/bin/happy"
-    ALEX="$HOME/.cabal/bin/alex"
-  fi
+      if [ "$(uname)" = "FreeBSD" ]; then
+        GHC=/usr/local/bin/ghc
+      fi
+      ;;
+    nix)
+      if [[ ! -f toolchain.sh ]]; then
+        case "$NIX_SYSTEM" in
+          x86_64-darwin|aarch64-darwin) ;;
+          *) fail "unknown NIX_SYSTEM" ;;
+        esac
+        info "Building toolchain for $NIX_SYSTEM"
+        nix-build --quiet .gitlab/darwin/toolchain.nix --argstr system "$NIX_SYSTEM" -o toolchain.sh
+        cat toolchain.sh
+      fi
+      source toolchain.sh
+      ;;
+    env)
+      # These are generally set by the Docker image but
+      # we provide these handy fallbacks in case the
+      # script isn't run from within a GHC CI docker image.
+      if [ -z "$GHC" ]; then GHC="$(which ghc)"; fi
+      if [ -z "$CABAL" ]; then CABAL="$(which cabal)"; fi
+      if [ -z "$HAPPY" ]; then HAPPY="$(which happy)"; fi
+      if [ -z "$ALEX" ]; then ALEX="$(which alex)"; fi
+      ;;
+    *) fail "bad toolchain_source"
+  esac

   export GHC
   export CABAL
   export HAPPY
   export ALEX
+
+  if [[ "${CROSS_TARGET:-}" == *"wasm"* ]]; then
+    source "/home/ghc/.ghc-wasm/env"
+  fi
 }
+function cabal_update() {
+  # In principle -w shouldn't be necessary here but with
+  # cabal-install 3.8.1.0 it is, due to cabal#8447.
+  run "$CABAL" update -w "$GHC" --index="$HACKAGE_INDEX_STATE"
+}

 # Extract GHC toolchain
 function setup() {
+  echo "=== TIMINGS ===" > ci-timings
-  if [ -d "$TOP/cabal-cache" ]; then
-    info "Extracting cabal cache..."
-    mkdir -p "$cabal_dir"
-    cp -Rf cabal-cache/* "$cabal_dir"
-  fi
+  if [ -d "$CABAL_CACHE" ]; then
+    info "Extracting cabal cache from $CABAL_CACHE to $CABAL_DIR..."
+    mkdir -p "$CABAL_DIR"
+    cp -Rf "$CABAL_CACHE"/* "$CABAL_DIR"
+  fi

-  if [[ -n "$needs_toolchain" ]]; then
-    setup_toolchain
-  fi
-
-  case "$(uname)" in
-    Darwin) darwin_setup ;;
+  case $toolchain_source in
+    extracted) time_it "setup" setup_toolchain ;;
     *) ;;
   esac
+
+  cabal_update || fail "cabal update failed"

   # Make sure that git works
   git config user.email "ghc-ci@gitlab-haskell.org"
   git config user.name "GHC GitLab CI"
@@ -171,26 +272,31 @@ function setup() {
   show_tool CABAL
   show_tool HAPPY
   show_tool ALEX
+
+  info "====================================================="
+  info "ghc --info"
+  info "====================================================="
+  $GHC --info
 }

 function fetch_ghc() {
-  local v="$GHC_VERSION"
-  if [[ -z "$v" ]]; then
-    fail "GHC_VERSION is not set"
-  fi
-
   if [ ! -e "$GHC" ]; then
+    local v="$GHC_VERSION"
+    if [[ -z "$v" ]]; then
+      fail "neither GHC nor GHC_VERSION is set"
+    fi
     start_section "fetch GHC"
     url="https://downloads.haskell.org/~ghc/${GHC_VERSION}/ghc-${GHC_VERSION}-${boot_triple}.tar.xz"
     info "Fetching GHC binary distribution from $url..."
     curl "$url" > ghc.tar.xz || fail "failed to fetch GHC binary distribution"
-    tar -xJf ghc.tar.xz || fail "failed to extract GHC binary distribution"
+    $TAR -xJf ghc.tar.xz || fail "failed to extract GHC binary distribution"
     case "$(uname)" in
       MSYS_*|MINGW*)
-        cp -r "ghc-${GHC_VERSION}"/* "$toolchain"
+        cp -r ghc-${GHC_VERSION}*/* "$toolchain"
         ;;
       *)
-        pushd "ghc-${GHC_VERSION}"
+        pushd ghc-${GHC_VERSION}*
         ./configure --prefix="$toolchain"
         "$MAKE" install
         popd
...@@ -203,22 +309,21 @@ function fetch_ghc() { ...@@ -203,22 +309,21 @@ function fetch_ghc() {
} }
function fetch_cabal() { function fetch_cabal() {
local v="$CABAL_INSTALL_VERSION"
if [[ -z "$v" ]]; then
fail "CABAL_INSTALL_VERSION is not set"
fi
if [ ! -e "$CABAL" ]; then if [ ! -e "$CABAL" ]; then
local v="$CABAL_INSTALL_VERSION"
if [[ -z "$v" ]]; then
fail "neither CABAL nor CABAL_INSTALL_VERSION are not set"
fi
start_section "fetch GHC" start_section "fetch GHC"
case "$(uname)" in case "$(uname)" in
# N.B. Windows uses zip whereas all others use .tar.xz # N.B. Windows uses zip whereas all others use .tar.xz
MSYS_*|MINGW*) MSYS_*|MINGW*)
case "$MSYSTEM" in case "$MSYSTEM" in
MINGW32) cabal_arch="i386" ;; CLANG64) cabal_arch="x86_64" ;;
MINGW64) cabal_arch="x86_64" ;;
*) fail "unknown MSYSTEM $MSYSTEM" ;; *) fail "unknown MSYSTEM $MSYSTEM" ;;
esac esac
url="https://downloads.haskell.org/~cabal/cabal-install-$v/cabal-install-$v-$cabal_arch-unknown-mingw32.zip" url="https://downloads.haskell.org/~cabal/cabal-install-$v/cabal-install-$v-$cabal_arch-windows.zip"
info "Fetching cabal binary distribution from $url..." info "Fetching cabal binary distribution from $url..."
curl "$url" > "$TMP/cabal.zip" curl "$url" > "$TMP/cabal.zip"
unzip "$TMP/cabal.zip" unzip "$TMP/cabal.zip"
...@@ -228,14 +333,12 @@ function fetch_cabal() { ...@@ -228,14 +333,12 @@ function fetch_cabal() {
local base_url="https://downloads.haskell.org/~cabal/cabal-install-$v/" local base_url="https://downloads.haskell.org/~cabal/cabal-install-$v/"
case "$(uname)" in case "$(uname)" in
Darwin) cabal_url="$base_url/cabal-install-$v-x86_64-apple-darwin17.7.0.tar.xz" ;; Darwin) cabal_url="$base_url/cabal-install-$v-x86_64-apple-darwin17.7.0.tar.xz" ;;
FreeBSD) FreeBSD) cabal_url="$base_url/cabal-install-$v-x86_64-freebsd13.tar.xz" ;;
#cabal_url="$base_url/cabal-install-$v-x86_64-portbld-freebsd.tar.xz" ;;
cabal_url="http://home.smart-cactus.org/~ben/ghc/cabal-install-3.0.0.0-x86_64-portbld-freebsd.tar.xz" ;;
*) fail "don't know where to fetch cabal-install for $(uname)" *) fail "don't know where to fetch cabal-install for $(uname)"
esac esac
echo "Fetching cabal-install from $cabal_url" echo "Fetching cabal-install from $cabal_url"
curl "$cabal_url" > cabal.tar.xz curl "$cabal_url" > cabal.tar.xz
tar -xJf cabal.tar.xz $TAR -xJf cabal.tar.xz
mv cabal "$toolchain/bin" mv cabal "$toolchain/bin"
;; ;;
esac esac
...@@ -249,35 +352,41 @@ function fetch_cabal() { ...@@ -249,35 +352,41 @@ function fetch_cabal() {
function setup_toolchain() {
fetch_ghc
fetch_cabal
cabal_update

local cabal_install="$CABAL v2-install \
  --with-compiler=$GHC \
  --index-state=$HACKAGE_INDEX_STATE \
  --installdir=$toolchain/bin \
  --ignore-project \
  --overwrite-policy=always"

# Avoid symlinks on Windows
case "$(uname)" in
MSYS_*|MINGW*) cabal_install="$cabal_install --install-method=copy" ;;
*) ;;
esac

info "Building happy..."
$cabal_install happy --constraint="happy>=$MIN_HAPPY_VERSION"

info "Building alex..."
$cabal_install alex --constraint="alex>=$MIN_ALEX_VERSION"
}
function cleanup_submodules() {
start_section "clean submodules"
if [ -d .git ]; then
info "Cleaning submodules..."
# On Windows submodules can inexplicably get into funky states where git
# believes that the submodule is initialized yet its associated repository
# is not valid. Avoid failing in this case with the following insanity.
git submodule sync || git submodule deinit --force --all
git submodule update --init
git submodule foreach git clean -xdf
else
info "Not cleaning submodules, not in a git repo"
fi
end_section "clean submodules"
}
@@ -285,54 +394,50 @@ function prepare_build_mk() {
if [[ -z "$BUILD_FLAVOUR" ]]; then fail "BUILD_FLAVOUR is not set"; fi
if [[ -z ${BUILD_SPHINX_HTML:-} ]]; then BUILD_SPHINX_HTML=YES; fi
if [[ -z ${BUILD_SPHINX_PDF:-} ]]; then BUILD_SPHINX_PDF=YES; fi

cat > mk/build.mk <<EOF
BIGNUM_BACKEND=${BIGNUM_BACKEND}
include mk/flavours/${BUILD_FLAVOUR}.mk
GhcLibHcOpts+=-haddock
EOF

if [ -n "${HADDOCK_HYPERLINKED_SOURCES:-}" ]; then
echo "EXTRA_HADDOCK_OPTS += --hyperlinked-source --quickjump" >> mk/build.mk
fi

info "build.mk is:"
cat mk/build.mk
}
function configure() {
case "${CONFIGURE_WRAPPER:-}" in
emconfigure) source "$EMSDK/emsdk_env.sh" ;;
*) ;;
esac

if [[ -z "${NO_BOOT:-}" ]]; then
start_section "booting"
run python3 boot
end_section "booting"
fi

read -r -a args <<< "${CONFIGURE_ARGS:-}"
if [[ -n "${target_triple:-}" ]]; then
args+=("--target=$target_triple")
fi
if [[ -n "${ENABLE_NUMA:-}" ]]; then
args+=("--enable-numa")
else
args+=("--disable-numa")
fi

start_section "configuring"
# See https://stackoverflow.com/questions/7577052 for a rationale for the
# args[@] symbol-soup below.
run ${CONFIGURE_WRAPPER:-} ./configure \
--enable-tarballs-autodownload \
"${args[@]+"${args[@]}"}" \
GHC="$GHC" \
HAPPY="$HAPPY" \
ALEX="$ALEX" \
@@ -340,125 +445,498 @@ function configure() {
end_section "configuring"
}
function fetch_perf_notes() {
info "Fetching perf notes..."
"$TOP/.gitlab/test-metrics.sh" pull
}

function push_perf_notes() {
if [[ -z "${TEST_ENV:-}" ]]; then
return
fi

if [ -n "${CROSS_TARGET:-}" ] && [ "${CROSS_EMULATOR:-}" != "js-emulator" ]; then
info "Can't test cross-compiled build."
return
fi

info "Pushing perf notes..."
"$TOP/.gitlab/test-metrics.sh" push
}

# Figure out which commit should be used by the testsuite driver as a
# performance baseline. See Note [The CI Story].
function determine_metric_baseline() {
if [ -n "${CI_MERGE_REQUEST_DIFF_BASE_SHA:-}" ]; then
PERF_BASELINE_COMMIT="$CI_MERGE_REQUEST_DIFF_BASE_SHA"
export PERF_BASELINE_COMMIT
info "Using $PERF_BASELINE_COMMIT for performance metric baseline..."
fi
}
function check_msys2_deps() {
# Ensure that GHC on Windows doesn't have any dynamic dependencies on msys2
case "$(uname)" in
MSYS_*|MINGW*)
sysroot="$(cygpath "$SYSTEMROOT")"
PATH="$sysroot/System32:$sysroot;$sysroot/Wbem" $@ \
|| fail "'$@' failed; there may be unwanted dynamic dependencies."
;;
esac
}
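`check_msys2_deps` above relies on a standard shell feature: a `VAR=value` prefix before a command overrides that variable for that one invocation only, which is what lets it run GHC with a `PATH` stripped down to the Windows system directories without disturbing the rest of the script. A minimal sketch (the paths are illustrative):

```shell
#!/usr/bin/env bash

saved="$PATH"
# A `VAR=value cmd` prefix scopes the assignment to that single command:
# here `ls` is looked up (and not found) under the restricted PATH only.
if ! PATH="/nonexistent" ls / >/dev/null 2>&1; then
  echo "ls not found under restricted PATH"
fi
# The surrounding shell's PATH is untouched.
[ "$PATH" = "$saved" ] && echo "PATH unchanged afterwards"
```

If the probed command only needs system DLLs, it still runs under the cut-down `PATH`; if it silently depends on msys2-provided libraries, the invocation fails and the helper reports it.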
# If RELEASE_JOB = yes then we skip builds with a validate flavour.
# This has the effect of
# (1) Skipping validate jobs when trying to do release builds
# (2) Ensuring we don't accidentally build release builds with the validate flavour.
#
# We should never try to build a validate build in a release pipeline so this is
# very defensive in case we have made a mistake somewhere.
function check_release_build() {
if [ "${RELEASE_JOB:-}" == "yes" ] && [[ "${BUILD_FLAVOUR:-}" == *"validate"* ]]
then
info "Exiting build because this is a validate build in a release job"
exit 0;
fi
} }
function build_hadrian() {
if [ -z "${BIN_DIST_NAME:-}" ]; then
fail "BIN_DIST_NAME not set"
fi
if [ -n "${BIN_DIST_PREP_TAR_COMP:-}" ]; then
fail "BIN_DIST_PREP_TAR_COMP must not be set for hadrian (you mean BIN_DIST_NAME)"
fi

check_release_build

# We can safely enable parallel compression for x64. By the time
# hadrian calls tar/xz to produce bindist, there's no other build
# work taking place.
if [[ "${CI_JOB_NAME:-}" != *"i386"* ]]; then
export XZ_OPT="${XZ_OPT:-} -T$cores"
fi

if [[ -n "${REINSTALL_GHC:-}" ]]; then
run_hadrian build-cabal -V
else
run_hadrian test:all_deps binary-dist -V
mv _build/bindist/ghc*.tar.xz "$BIN_DIST_NAME.tar.xz"
fi
}
# Runs `make DESTDIR=$1 install` and then
# merges the file tree to the actual destination $2,
# ensuring that `DESTDIR` is properly honoured by the
# build system
function make_install_destdir() {
local destdir=$1
local instdir=$2
mkdir -p "$destdir"
mkdir -p "$instdir"
run "$MAKE" DESTDIR="$destdir" install || fail "make install failed"
# check for empty dir portably
# https://superuser.com/a/667100
if find "$instdir" -mindepth 1 -maxdepth 1 | read; then
fail "$instdir is not empty!"
fi
info "merging file tree from $destdir to $instdir"
cp -a "$destdir/$instdir"/* "$instdir"/
"$instdir"/bin/${cross_prefix}ghc-pkg recache
}
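The emptiness check in `make_install_destdir` uses a portable idiom: `find -mindepth 1 -maxdepth 1` prints one line per directory entry (including dotfiles, which a `*` glob would miss), and `read` succeeds only if it reads at least one line. A standalone sketch of the same trick (the `dir_nonempty` name is made up for illustration):

```shell
#!/usr/bin/env bash

# Succeeds iff directory $1 contains at least one entry, dotfiles included.
# The pipeline's exit status is that of `read`, which fails on empty input.
# See https://superuser.com/a/667100.
dir_nonempty() {
  find "$1" -mindepth 1 -maxdepth 1 | read -r _
}

d=$(mktemp -d)
dir_nonempty "$d" && echo "non-empty" || echo "empty"    # prints "empty"
touch "$d/some-file"
dir_nonempty "$d" && echo "non-empty" || echo "empty"    # prints "non-empty"
rm -r "$d"
```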
# install the binary distribution in directory $1 to $2.
function install_bindist() {
case "${CONFIGURE_WRAPPER:-}" in
emconfigure) source "$EMSDK/emsdk_env.sh" ;;
*) ;;
esac
local bindist="$1"
local instdir="$2"
pushd "$bindist"
case "$(uname)" in
MSYS_*|MINGW*)
mkdir -p "$instdir"
cp -a * "$instdir"
;;
*)
read -r -a args <<< "${INSTALL_CONFIGURE_ARGS:-}"
# FIXME: The bindist configure script shouldn't need to be reminded of
# the target platform. See #21970.
if [ -n "${CROSS_TARGET:-}" ]; then
args+=( "--target=$CROSS_TARGET" "--host=$CROSS_TARGET" )
fi
run ${CONFIGURE_WRAPPER:-} ./configure \
--prefix="$instdir" \
"${args[@]+"${args[@]}"}"
make_install_destdir "$TOP"/destdir "$instdir"
;;
esac
popd
} }
function test_hadrian() {
check_msys2_deps _build/stage1/bin/ghc --version
check_release_build

# Ensure that statically-linked builds are actually static
if [[ "${BUILD_FLAVOUR}" = *static* ]]; then
bad_execs=""
for binary in _build/stage1/bin/*; do
if ldd "${binary}" &> /dev/null; then
warn "${binary} is not static!"
ldd "${binary}"
echo
bad_execs="$bad_execs $binary"
fi
done
if [ -n "$bad_execs" ]; then
fail "the following executables contain dynamic-object references: $bad_execs"
fi
fi
if [[ "${CROSS_EMULATOR:-}" == "NOT_SET" ]]; then
info "Cannot test cross-compiled build without CROSS_EMULATOR being set."
return
# special case for JS backend
elif [ -n "${CROSS_TARGET:-}" ] && [ "${CROSS_EMULATOR:-}" == "js-emulator" ]; then
# The JS backend doesn't support CROSS_EMULATOR logic yet
unset CROSS_EMULATOR
# run "hadrian test" directly, not using the bindist, even though it did get installed.
# This is a temporary solution, See !9515 for the status of hadrian support.
run_hadrian \
test \
--summary-junit=./junit.xml \
--test-have-intree-files \
--docs=none \
"runtest.opts+=${RUNTEST_ARGS:-}" || fail "cross-compiled hadrian main testsuite"
elif [[ -n "${CROSS_TARGET:-}" ]] && [[ "${CROSS_TARGET:-}" == *"wasm"* ]]; then
run_hadrian \
test \
--summary-junit=./junit.xml \
"runtest.opts+=${RUNTEST_ARGS:-}" || fail "hadrian main testsuite targeting $CROSS_TARGET"
elif [ -n "${CROSS_TARGET:-}" ]; then
local instdir="$TOP/_build/install"
local test_compiler="$instdir/bin/${cross_prefix}ghc$exe"
install_bindist _build/bindist/ghc-*/ "$instdir"
echo 'main = putStrLn "hello world"' > expected
run "$test_compiler" -package ghc "$TOP/.gitlab/hello.hs" -o hello
${CROSS_EMULATOR:-} ./hello > actual
run diff expected actual
elif [[ -n "${REINSTALL_GHC:-}" ]]; then
run_hadrian \
test \
--test-root-dirs=testsuite/tests/stage1 \
--test-compiler=stage-cabal \
--test-root-dirs=testsuite/tests/perf \
--test-root-dirs=testsuite/tests/typecheck \
"runtest.opts+=${RUNTEST_ARGS:-}" || fail "hadrian cabal-install test"
else
local instdir="$TOP/_build/install"
local test_compiler="$instdir/bin/${cross_prefix}ghc$exe"
install_bindist _build/bindist/ghc-*/ "$instdir"
if [[ "${WINDOWS_HOST}" == "no" ]] && [ -z "${CROSS_TARGET:-}" ]
then
run_hadrian \
test \
--test-root-dirs=testsuite/tests/stage1 \
--test-compiler=stage1 \
"runtest.opts+=${RUNTEST_ARGS:-}" || fail "hadrian stage1 test"
info "STAGE1_TEST=$?"
fi
# Ensure the resulting compiler has the correct bignum-flavour,
# except for cross-compilers as they may not support the interpreter
if [ -z "${CROSS_TARGET:-}" ]
then
test_compiler_backend=$(${test_compiler} -e "GHC.Num.Backend.backendName")
if [ "$test_compiler_backend" != "\"$BIGNUM_BACKEND\"" ]; then
fail "Test compiler has a different BIGNUM_BACKEND ($test_compiler_backend) than requested ($BIGNUM_BACKEND)"
fi
fi
# If we are doing a release job, check the compiler can build a profiled executable
if [ "${RELEASE_JOB:-}" == "yes" ]; then
echo "main = print ()" > proftest.hs
run ${test_compiler} -prof proftest.hs || fail "hadrian profiled libs test"
rm proftest.hs
fi
run_hadrian \
test \
--summary-junit=./junit.xml \
--test-have-intree-files \
--test-compiler="${test_compiler}" \
"runtest.opts+=${RUNTEST_ARGS:-}" || fail "hadrian main testsuite"
info "STAGE2_TEST=$?"
fi
}
function summarise_hi_files() {
for iface in $(find . -type f -name "*.hi" | sort); do echo "$iface $($HC --show-iface $iface | grep " ABI hash:")"; done | tee $OUT/abis
for iface in $(find . -type f -name "*.hi" | sort); do echo "$iface $($HC --show-iface $iface | grep " interface hash:")"; done | tee $OUT/interfaces
for iface in $(find . -type f -name "*.hi" | sort); do
fname="$OUT/$(dirname $iface)"
mkdir -p $fname
$HC --show-iface $iface > "$OUT/$iface"
done
}
function cabal_abi_test() {
if [ -z "$OUT" ]; then
fail "OUT not set"
fi
cp -r libraries/Cabal $DIR
pushd $DIR
echo $PWD
start_section "Cabal test: $OUT"
mkdir -p "$OUT"
run "$HC" \
-hidir tmp -odir tmp -fforce-recomp -haddock \
-iCabal/Cabal/src -XNoPolyKinds Distribution.Simple -j"$cores" \
"$@" 2>&1 | tee $OUT/log
summarise_hi_files
popd
end_section "Cabal test: $OUT"
}
function cabal_test() {
if [ -z "$OUT" ]; then
fail "OUT not set"
fi
start_section "Cabal test: $OUT"
mkdir -p "$OUT"
run "$HC" \
-hidir tmp -odir tmp -fforce-recomp \
-dumpdir "$OUT/dumps" -ddump-timings \
+RTS --machine-readable "-t$OUT/rts.log" -RTS \
-ilibraries/Cabal/Cabal/src -XNoPolyKinds Distribution.Simple \
"$@" 2>&1 | tee $OUT/log
rm -Rf tmp
end_section "Cabal test: $OUT"
}
function run_perf_test() {
if [ -z "$HC" ]; then
fail "HC not set"
fi
mkdir -p out
git -C libraries/Cabal/ rev-parse HEAD > out/cabal_commit
$HC --print-project-git-commit-id > out/ghc_commit
OUT=out/Cabal-O0 cabal_test -O0
OUT=out/Cabal-O1 cabal_test -O1
OUT=out/Cabal-O2 cabal_test -O2
}
function check_interfaces(){
difference=$(diff "$1/$3" "$2/$3") || warn "diff failed"
if [ -z "$difference" ]
then
info "$1 and $2 $3 match"
else
echo "$difference"
for line in $(echo "$difference" | tr ' ' '\n' | grep ".hi" | sort | uniq); do
diff "$1/$line" "$2/$line"
done
fail "$3"
fi
}
function abi_test() {
for i in {1..20}; do info "iteration $i"; run_abi_test; done
}
function run_abi_test() {
if [ -z "$HC" ]; then
fail "HC not set"
fi
mkdir -p out
OUT="$PWD/out/run1" DIR=$(mktemp -d XXXX-looooooooong) cabal_abi_test -O0
OUT="$PWD/out/run2" DIR=$(mktemp -d XXXX-short) cabal_abi_test -O0
check_interfaces out/run1 out/run2 abis "Mismatched ABI hash"
check_interfaces out/run1 out/run2 interfaces "Mismatched interface hashes"
}
function save_cache () {
info "Storing cabal cache from $CABAL_DIR to $CABAL_CACHE..."
rm -Rf "$CABAL_CACHE"
cp -Rf "$CABAL_DIR" "$CABAL_CACHE"
} }
function clean() {
rm -R tmp
run rm -Rf _build
}
function run_hadrian() {
if [ -z "${BUILD_FLAVOUR:-}" ]; then
fail "BUILD_FLAVOUR not set"
fi
read -r -a args <<< "${HADRIAN_ARGS:-}"
if [ -n "${VERBOSE:-}" ]; then args+=("-V"); fi
# Before running the compiler, unset GitLab env vars as these
# can destabilise the performance test (see #20341)
(unset $(compgen -v | grep CI_*);
run "${HADRIAN_PATH:-hadrian/build-cabal}" \
--flavour="$BUILD_FLAVOUR" \
-j"$cores" \
--broken-test="${BROKEN_TESTS:-}" \
--bignum=$BIGNUM_BACKEND \
"${args[@]+"${args[@]}"}" \
"$@")
}
# A convenience function to allow debugging in the CI environment.
function shell() {
local cmd="${@: 1}"
if [ -z "$cmd" ]; then
cmd="bash -i"
fi
# N.B. word-splitting of $cmd is intentional here so that the default
# "bash -i" runs as a command with an argument rather than a single word.
run $cmd
}
function lint_author(){
base=$1
head=$2
for email in $(git log --format='%ae' $base..$head); do
if [ $email == "ghc-ci@gitlab-haskell.org" ];
then
fail "Commit has GHC CI author, please amend the author information."
fi
done
}
function abi_of(){
DIR=$(realpath $1)
mkdir -p "$OUT"
pushd $DIR
summarise_hi_files
popd
}
# Checks that the interfaces in folder $1 match the interfaces in folder $2
function compare_interfaces_of(){
OUT=$PWD/out/run1 abi_of $1
OUT=$PWD/out/run2 abi_of $2
check_interfaces out/run1 out/run2 abis "Mismatched ABI hash"
check_interfaces out/run1 out/run2 interfaces "Mismatched interface hashes"
} }
# Determine Cabal data directory
setup_locale
# Platform-specific environment initialization
if [ -n "${HERMETIC:-}" ]; then
export CABAL_DIR="$TOP/cabal"
# We previously set HOME=/nonexistent but apparently nix wants $HOME to exist
# so sadly we must settle for someplace writable.
export HOME="$TOP/tmp-home"
else
BIN_DIST_NAME="${BIN_DIST_NAME:-}"
case "$(uname)" in
MSYS_*|MINGW*) CABAL_DIR="$APPDATA/cabal" ;;
*) CABAL_DIR="$HOME/.cabal" ;;
esac
fi
case "$(uname)" in
MSYS_*|MINGW*)
exe=".exe"
# N.B. cabal-install expects CABAL_DIR to be a Windows path
CABAL_DIR="$(cygpath -w "$CABAL_DIR")"
WINDOWS_HOST="yes"
;;
*)
exe=""
WINDOWS_HOST="no"
;;
esac
# Platform-specific environment initialization
MAKE="make"
TAR="tar"
case "$(uname)" in
MSYS_*|MINGW*) mingw_init ;;
Darwin) boot_triple="x86_64-apple-darwin" ;;
FreeBSD)
boot_triple="x86_64-portbld-freebsd"
MAKE="gmake"
TAR="gtar"
;;
Linux) ;;
*) fail "uname $(uname) is not supported" ;;
esac
if [ -n "${CROSS_TARGET:-}" ]; then
info "Cross-compiling for $CROSS_TARGET..."
target_triple="$CROSS_TARGET"
cross_prefix="$target_triple-"
else
cross_prefix=""
fi
echo "Branch name ${CI_MERGE_REQUEST_SOURCE_BRANCH_NAME:-}"
# Ignore performance improvements in @marge-bot batches.
# See #19562.
if [ "${CI_MERGE_REQUEST_SOURCE_BRANCH_NAME:-}" == "wip/marge_bot_batch_merge_job" ]; then
if [ -z "${IGNORE_PERF_FAILURES:-}" ]; then
IGNORE_PERF_FAILURES="decreases"
echo "Ignoring perf failures"
fi
fi
echo "CI_COMMIT_BRANCH: ${CI_COMMIT_BRANCH:-}"
echo "CI_PROJECT_PATH: ${CI_PROJECT_PATH:-}"
if [ "${CI_COMMIT_BRANCH:-}" == "master" ] && [ "${CI_PROJECT_PATH:-}" == "ghc/ghc" ]; then
if [ -z "${IGNORE_PERF_FAILURES:-}" ]; then
IGNORE_PERF_FAILURES="decreases"
echo "Ignoring perf failures"
fi
fi
if [ -n "${IGNORE_PERF_FAILURES:-}" ]; then
RUNTEST_ARGS="--ignore-perf-failures=$IGNORE_PERF_FAILURES"
fi
if [[ -z ${BIGNUM_BACKEND:-} ]]; then BIGNUM_BACKEND=gmp; fi
determine_metric_baseline
set_toolchain_paths

case $1 in
usage) usage ;;
setup) setup && cleanup_submodules ;;
configure) time_it "configure" configure ;;
build_hadrian) time_it "build" build_hadrian ;;
# N.B. Always push notes, even if the build fails. This is okay to do as the
# testsuite driver doesn't record notes for tests that fail due to
# correctness.
test_hadrian)
fetch_perf_notes
res=0
time_it "test" test_hadrian || res=$?
push_perf_notes
exit $res ;;
run_hadrian) shift; run_hadrian "$@" ;;
perf_test) run_perf_test ;;
abi_test) abi_test ;;
cabal_test) cabal_test ;;
lint_author) shift; lint_author "$@" ;;
compare_interfaces_of) shift; compare_interfaces_of "$@" ;;
clean) clean ;;
save_cache) save_cache ;;
shell) shift; shell "$@" ;;
*) fail "unknown mode $1" ;;
esac
# Common bash utilities
# ----------------------
# Colors
BLACK="0;30"
GRAY="1;30"
RED="0;31"
LT_RED="1;31"
BROWN="0;33"
LT_BROWN="1;33"
GREEN="0;32"
LT_GREEN="1;32"
BLUE="0;34"
LT_BLUE="1;34"
PURPLE="0;35"
LT_PURPLE="1;35"
CYAN="0;36"
LT_CYAN="1;36"
WHITE="1;37"
LT_GRAY="0;37"
# GitLab Pipelines log section delimiters
# https://gitlab.com/gitlab-org/gitlab-foss/issues/14664
start_section() {
name="$1"
echo -e "section_start:$(date +%s):$name\015\033[0K"
}
end_section() {
name="$1"
echo -e "section_end:$(date +%s):$name\015\033[0K"
}
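These delimiters work outside the GHC scripts too: GitLab's log renderer folds everything between a `section_start` and a `section_end` marker that carry the same name. A small usage sketch, with the two helpers redefined so it runs standalone and a hypothetical `with_section` wrapper:

```shell
#!/usr/bin/env bash
# Redefined here so the sketch runs standalone; identical to the helpers above.
start_section() { echo -e "section_start:$(date +%s):$1\015\033[0K"; }
end_section()   { echo -e "section_end:$(date +%s):$1\015\033[0K"; }

# Hypothetical wrapper: run a command inside a collapsible log section.
with_section() {
  local name="$1"; shift
  start_section "$name"
  "$@"
  end_section "$name"
}

with_section "toolchain-versions" echo "output that GitLab will fold away"
```

The carriage return (`\015`) plus `\033[0K` erases the marker line itself in the rendered log, so only the folded section header remains visible.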
echo_color() {
local color="$1"
local msg="$2"
echo -e "\033[${color}m${msg}\033[0m"
}
error() { echo_color "${RED}" "$1"; }
warn() { echo_color "${LT_BROWN}" "$1"; }
info() { echo_color "${LT_BLUE}" "$1"; }
fail() { error "error: $1"; exit 1; }
function run() {
info "Running $*..."
"$@" || ( error "$* failed"; return 1; )
}
{
"niv": {
"branch": "master",
"description": "Easy dependency management for Nix projects",
"homepage": "https://github.com/nmattia/niv",
"owner": "nmattia",
"repo": "niv",
"rev": "e0ca65c81a2d7a4d82a189f1e23a48d59ad42070",
"sha256": "1pq9nh1d8nn3xvbdny8fafzw87mj7gsmp6pxkdl65w2g18rmcmzx",
"type": "tarball",
"url": "https://github.com/nmattia/niv/archive/e0ca65c81a2d7a4d82a189f1e23a48d59ad42070.tar.gz",
"url_template": "https://github.com/<owner>/<repo>/archive/<rev>.tar.gz"
},
"nixpkgs": {
"branch": "master",
"description": "Nix Packages collection",
"homepage": "",
"owner": "nixos",
"repo": "nixpkgs",
"rev": "ce1aa29621356706746c53e2d480da7c68f6c972",
"sha256": "sha256:1sbs3gi1nf4rcbmnw69fw0fpvb3qvlsa84hqimv78vkpd6xb0bgg",
"type": "tarball",
"url": "https://github.com/nixos/nixpkgs/archive/ce1aa29621356706746c53e2d480da7c68f6c972.tar.gz",
"url_template": "https://github.com/<owner>/<repo>/archive/<rev>.tar.gz"
}
}
# This file has been generated by Niv.
let
#
# The fetchers. fetch_<type> fetches specs of type <type>.
#
fetch_file = pkgs: name: spec:
let
name' = sanitizeName name + "-src";
in
if spec.builtin or true then
builtins_fetchurl { inherit (spec) url sha256; name = name'; }
else
pkgs.fetchurl { inherit (spec) url sha256; name = name'; };
fetch_tarball = pkgs: name: spec:
let
name' = sanitizeName name + "-src";
in
if spec.builtin or true then
builtins_fetchTarball { name = name'; inherit (spec) url sha256; }
else
pkgs.fetchzip { name = name'; inherit (spec) url sha256; };
fetch_git = name: spec:
let
ref =
if spec ? ref then spec.ref else
if spec ? branch then "refs/heads/${spec.branch}" else
if spec ? tag then "refs/tags/${spec.tag}" else
abort "In git source '${name}': Please specify `ref`, `tag` or `branch`!";
in
builtins.fetchGit { url = spec.repo; inherit (spec) rev; inherit ref; };
fetch_local = spec: spec.path;
fetch_builtin-tarball = name: throw
''[${name}] The niv type "builtin-tarball" is deprecated. You should instead use `builtin = true`.
$ niv modify ${name} -a type=tarball -a builtin=true'';
fetch_builtin-url = name: throw
''[${name}] The niv type "builtin-url" will soon be deprecated. You should instead use `builtin = true`.
$ niv modify ${name} -a type=file -a builtin=true'';
#
# Various helpers
#
# https://github.com/NixOS/nixpkgs/pull/83241/files#diff-c6f540a4f3bfa4b0e8b6bafd4cd54e8bR695
sanitizeName = name:
(
concatMapStrings (s: if builtins.isList s then "-" else s)
(
builtins.split "[^[:alnum:]+._?=-]+"
((x: builtins.elemAt (builtins.match "\\.*(.*)" x) 0) name)
)
);
# The set of packages used when specs are fetched using non-builtins.
mkPkgs = sources: system:
let
sourcesNixpkgs =
import (builtins_fetchTarball { inherit (sources.nixpkgs) url sha256; }) { inherit system; };
hasNixpkgsPath = builtins.any (x: x.prefix == "nixpkgs") builtins.nixPath;
hasThisAsNixpkgsPath = <nixpkgs> == ./.;
in
if builtins.hasAttr "nixpkgs" sources
then sourcesNixpkgs
else if hasNixpkgsPath && ! hasThisAsNixpkgsPath then
import <nixpkgs> {}
else
abort
''
Please specify either <nixpkgs> (through -I or NIX_PATH=nixpkgs=...) or
add a package called "nixpkgs" to your sources.json.
'';
# The actual fetching function.
fetch = pkgs: name: spec:
if ! builtins.hasAttr "type" spec then
abort "ERROR: niv spec ${name} does not have a 'type' attribute"
else if spec.type == "file" then fetch_file pkgs name spec
else if spec.type == "tarball" then fetch_tarball pkgs name spec
else if spec.type == "git" then fetch_git name spec
else if spec.type == "local" then fetch_local spec
else if spec.type == "builtin-tarball" then fetch_builtin-tarball name
else if spec.type == "builtin-url" then fetch_builtin-url name
else
abort "ERROR: niv spec ${name} has unknown type ${builtins.toJSON spec.type}";
# If the environment variable NIV_OVERRIDE_${name} is set, then use
# the path directly as opposed to the fetched source.
replace = name: drv:
let
saneName = stringAsChars (c: if isNull (builtins.match "[a-zA-Z0-9]" c) then "_" else c) name;
ersatz = builtins.getEnv "NIV_OVERRIDE_${saneName}";
in
if ersatz == "" then drv else
# this turns the string into an actual Nix path (for both absolute and
# relative paths)
if builtins.substring 0 1 ersatz == "/" then /. + ersatz else /. + builtins.getEnv "PWD" + "/${ersatz}";
# Ports of functions for older nix versions
# a Nix version of mapAttrs if the built-in doesn't exist
mapAttrs = builtins.mapAttrs or (
f: set: with builtins;
listToAttrs (map (attr: { name = attr; value = f attr set.${attr}; }) (attrNames set))
);
# https://github.com/NixOS/nixpkgs/blob/0258808f5744ca980b9a1f24fe0b1e6f0fecee9c/lib/lists.nix#L295
range = first: last: if first > last then [] else builtins.genList (n: first + n) (last - first + 1);
# https://github.com/NixOS/nixpkgs/blob/0258808f5744ca980b9a1f24fe0b1e6f0fecee9c/lib/strings.nix#L257
stringToCharacters = s: map (p: builtins.substring p 1 s) (range 0 (builtins.stringLength s - 1));
# https://github.com/NixOS/nixpkgs/blob/0258808f5744ca980b9a1f24fe0b1e6f0fecee9c/lib/strings.nix#L269
stringAsChars = f: s: concatStrings (map f (stringToCharacters s));
concatMapStrings = f: list: concatStrings (map f list);
concatStrings = builtins.concatStringsSep "";
# https://github.com/NixOS/nixpkgs/blob/8a9f58a375c401b96da862d969f66429def1d118/lib/attrsets.nix#L331
optionalAttrs = cond: as: if cond then as else {};
# fetchTarball version that is compatible between all the versions of Nix
builtins_fetchTarball = { url, name ? null, sha256 }@attrs:
let
inherit (builtins) lessThan nixVersion fetchTarball;
in
if lessThan nixVersion "1.12" then
fetchTarball ({ inherit url; } // (optionalAttrs (!isNull name) { inherit name; }))
else
fetchTarball attrs;
# fetchurl version that is compatible between all the versions of Nix
builtins_fetchurl = { url, name ? null, sha256 }@attrs:
let
inherit (builtins) lessThan nixVersion fetchurl;
in
if lessThan nixVersion "1.12" then
fetchurl ({ inherit url; } // (optionalAttrs (!isNull name) { inherit name; }))
else
fetchurl attrs;
# Create the final "sources" from the config
mkSources = config:
mapAttrs (
name: spec:
if builtins.hasAttr "outPath" spec
then abort
"The values in sources.json should not have an 'outPath' attribute"
else
spec // { outPath = replace name (fetch config.pkgs name spec); }
) config.sources;
# The "config" used by the fetchers
mkConfig =
{ sourcesFile ? if builtins.pathExists ./sources.json then ./sources.json else null
, sources ? if isNull sourcesFile then {} else builtins.fromJSON (builtins.readFile sourcesFile)
, system ? builtins.currentSystem
, pkgs ? mkPkgs sources system
}: rec {
# The sources, i.e. the attribute set of spec name to spec
inherit sources;
# The "pkgs" (evaluated nixpkgs) to use for e.g. non-builtin fetchers
inherit pkgs;
};
in
mkSources (mkConfig {}) // { __functor = _: settings: mkSources (mkConfig settings); }
{ system }:
let
sources = import ./nix/sources.nix;
nixpkgsSrc = sources.nixpkgs;
pkgs = import nixpkgsSrc { inherit system; };
in
let
hsPkgs = pkgs.haskellPackages;
alex = hsPkgs.alex;
happy = hsPkgs.happy;
targetTriple = pkgs.stdenv.targetPlatform.config;
ghcBindists = let version = ghc.version; in {
aarch64-darwin = pkgs.fetchurl {
url = "https://downloads.haskell.org/ghc/${version}/ghc-${version}-aarch64-apple-darwin.tar.xz";
sha256 = "sha256-tQUHsingxBizLktswGAoi6lJf92RKWLjsHB9CisANlg=";
};
x86_64-darwin = pkgs.fetchurl {
url = "https://downloads.haskell.org/ghc/${version}/ghc-${version}-x86_64-apple-darwin.tar.xz";
sha256 = "sha256-OjXjVe+ZODDCc/hqtihqqz6CX25TKI0ZgORzkR5O3pQ=";
};
};
ghc = pkgs.stdenv.mkDerivation rec {
version = "9.4.4";
name = "ghc";
src = ghcBindists.${pkgs.stdenv.hostPlatform.system};
configureFlags = [
"CC=/usr/bin/clang"
"CLANG=/usr/bin/clang"
"LLC=${llvm}/bin/llc"
"OPT=${llvm}/bin/opt"
"CONF_CC_OPTS_STAGE2=--target=${targetTriple}"
"CONF_CXX_OPTS_STAGE2=--target=${targetTriple}"
"CONF_GCC_LINKER_OPTS_STAGE2=--target=${targetTriple}"
];
buildPhase = "true";
# This is a horrible hack because the configure script invokes /usr/bin/clang
# without a `--target` flag. Then depending on whether the `nix` binary itself is
# a native x86 or arm64 binary means that /usr/bin/clang thinks it needs to run in
# x86 or arm64 mode.
# The correct answer for the check in question is the first one we try, so by replacing
# the condition to true; we select the right C++ standard library still.
preConfigure = ''
sed "s/\"\$CC\" -o actest actest.o \''${1} 2>\/dev\/null/true/i" configure > configure.new
mv configure.new configure
chmod +x configure
cat configure
'';
# N.B. Work around #20253.
nativeBuildInputs = [ pkgs.gnused ];
postInstallPhase = ''
settings="$out/lib/ghc-${version}/settings"
sed -i -e "s%\"llc\"%\"${llvm}/bin/llc\"%" $settings
sed -i -e "s%\"opt\"%\"${llvm}/bin/opt\"%" $settings
sed -i -e "s%\"clang\"%\"/usr/bin/clang\"%" $settings
sed -i -e 's%("C compiler command", "")%("C compiler command", "/usr/bin/clang")%' $settings
sed -i -e 's%("C compiler flags", "")%("C compiler flags", "--target=${targetTriple}")%' $settings
sed -i -e 's%("C++ compiler flags", "")%("C++ compiler flags", "--target=${targetTriple}")%' $settings
sed -i -e 's%("C compiler link flags", "")%("C compiler link flags", "--target=${targetTriple}")%' $settings
'';
# Sanity check: verify that we can compile hello world.
doInstallCheck = true;
installCheckPhase = ''
unset DYLD_LIBRARY_PATH
$out/bin/ghc --info
cd $TMP
mkdir test-ghc; cd test-ghc
cat > main.hs << EOF
{-# LANGUAGE TemplateHaskell #-}
module Main where
main = putStrLn \$([|"yes"|])
EOF
$out/bin/ghc --make -v3 main.hs || exit 1
echo compilation ok
[ $(./main) == "yes" ]
'';
};
ourtexlive = with pkgs;
texlive.combine {
inherit (texlive)
scheme-medium collection-xetex fncychap titlesec tabulary varwidth
framed capt-of wrapfig needspace dejavu-otf helvetic upquote;
};
fonts = with pkgs; makeFontsConf { fontDirectories = [ dejavu_fonts ]; };
llvm = pkgs.llvm_11;
in
pkgs.writeTextFile {
name = "toolchain";
text = ''
export PATH
PATH="${pkgs.autoconf}/bin:$PATH"
PATH="${pkgs.automake}/bin:$PATH"
export FONTCONFIG_FILE=${fonts}
export XELATEX="${ourtexlive}/bin/xelatex"
export MAKEINDEX="${ourtexlive}/bin/makeindex"
export HAPPY="${happy}/bin/happy"
export ALEX="${alex}/bin/alex"
export GHC="${ghc}/bin/ghc"
export LLC="${llvm}/bin/llc"
export OPT="${llvm}/bin/opt"
export SPHINXBUILD="${pkgs.python3Packages.sphinx}/bin/sphinx-build"
export CABAL_INSTALL="${pkgs.cabal-install}/bin/cabal"
export CABAL="$CABAL_INSTALL"
sdk_path="$(xcrun --sdk macosx --show-sdk-path)"
export CONFIGURE_ARGS="$CONFIGURE_ARGS --with-ffi-libraries=$sdk_path/usr/lib --with-ffi-includes=$sdk_path/usr/include/ffi --build=${targetTriple}"
'';
}
cabal-version: 3.0
name: gen-ci
version: 0.1.0.0
build-type: Simple
common warnings
ghc-options: -Wall
executable gen_ci
import: warnings
main-is: gen_ci.hs
build-depends:
, aeson >=1.8.1
, base
, bytestring
, containers
default-language: Haskell2010
#!/usr/bin/env cabal
{-# LANGUAGE RecordWildCards #-}
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE DeriveFunctor #-}
{-# LANGUAGE GeneralizedNewtypeDeriving #-}
{- cabal:
build-depends: base, aeson >= 1.8.1, containers, bytestring
-}
import Data.Aeson as A
import qualified Data.Map as Map
import Data.Map (Map)
import Data.Maybe
import qualified Data.ByteString.Lazy as B
import qualified Data.ByteString.Lazy.Char8 as B
import Data.List (intercalate)
import Data.Set (Set)
import qualified Data.Set as S
import System.Environment
{-
Note [Generating the CI pipeline]
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
This script is responsible for generating the majority of jobs in the CI pipeline.
In particular, it generates all the standard build configurations which perform a
full build and test of the compiler.
There are broadly three categories of job:
* validate - jobs run on every MR; these are typically validate builds.
* nightly - jobs run once per day on the master branch.
* release - jobs for producing release artifacts; these are perf builds.
Basically, for each suitable combination of architecture and operating system, one job
of each of these three kinds is generated.
In reality things are a bit more complicated:
* validate - we run some additional validation jobs which have no corresponding release artifacts.
* nightly - some builds are run only nightly, rather than on every validate pipeline, to
            relieve pressure on CI.
* release - not all jobs are run in release pipelines, only those which we
            produce release artifacts for.
The job specification can be seen at the bottom of this file in the 'jobs' variable.
The generated jobs assume certain things about the configuration file they are included
into. For example
* The DOCKER_REV variable must be set (which specifies the versions of the docker images)
Things will very quickly go wrong if you don't have the right variables set; the
testing logic in `ci.sh` contains more dependencies on these global variables.
Generating the CI configuration
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
In order to regenerate the CI configuration you need to run the ./generate_jobs
script which adds a module header and also formats the output JSON with jq.
Other CI jobs
~~~~~~~~~~~~~
Not all the jobs in the CI pipeline are generated by this script. There are quite a
few ad-hoc jobs which don't fit into the build/test-with-hadrian model. For example
* linters
* hadrian/ghci
* One test which builds with the make build system (until we remove it #17527)
All these definitions are found directly in the .gitlab-ci.yml file.
Note [Consumers of artifacts]
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
The generated names of the jobs are important as there are a few downstream consumers
of the job artifacts. Therefore, if changing the generated job names, some care should
be taken to update these other places as well.
1. Fedora33 jobs are required by head.hackage
2. The fetch-gitlab release utility pulls release artifacts from the pipeline.
3. The ghc-head-from script downloads release artifacts based on a pipeline change.
4. Some subsequent CI jobs have explicit dependencies (for example docs-tarball, perf, perf-nofib)
Note [Generation Modes]
~~~~~~~~~~~~~~~~~~~~~~~
There are two different modes this script can operate in:
* `gitlab`: Generates a job.yaml which defines all the pipelines for the platforms
* `metadata`: Generates a file which maps a platform to the "default" validate and
              nightly pipelines. This file is intended to be used when generating
              ghcup metadata.
-}
-----------------------------------------------------------------------------
-- Definition of a BuildConfig (options which affect the binaries which are in the bindist)
-----------------------------------------------------------------------------
-- | Operating system
data Opsys
= Linux LinuxDistro
| Darwin
| FreeBSD13
| Windows deriving (Eq)
data LinuxDistro
= Debian11 | Debian10 | Debian9
| Fedora33
| Ubuntu2004
| Ubuntu1804
| Centos7
| Alpine
| AlpineWasm
| Rocky8
deriving (Eq)
data Arch = Amd64 | AArch64 | I386
data BignumBackend = Native | Gmp deriving Eq
bignumString :: BignumBackend -> String
bignumString Gmp = "gmp"
bignumString Native = "native"
data CrossEmulator
= NoEmulator
| NoEmulatorNeeded
| Emulator String
-- | A BuildConfig records all the options which can be modified to affect the
-- bindists produced by the compiler.
data BuildConfig
= BuildConfig { withDwarf :: Bool
, unregisterised :: Bool
, buildFlavour :: BaseFlavour
, bignumBackend :: BignumBackend
, llvmBootstrap :: Bool
, withAssertions :: Bool
, withNuma :: Bool
, crossTarget :: Maybe String
, crossEmulator :: CrossEmulator
, configureWrapper :: Maybe String
, fullyStatic :: Bool
, tablesNextToCode :: Bool
, threadSanitiser :: Bool
, noSplitSections :: Bool
, validateNonmovingGc :: Bool
}
-- Extra arguments to pass to ./configure due to the BuildConfig
configureArgsStr :: BuildConfig -> String
configureArgsStr bc = unwords $
["--enable-unregisterised"| unregisterised bc ]
++ ["--disable-tables-next-to-code" | not (tablesNextToCode bc) ]
++ ["--with-intree-gmp" | Just _ <- pure (crossTarget bc) ]
++ ["--with-system-libffi" | crossTarget bc == Just "wasm32-wasi" ]
-- Compute the hadrian flavour from the BuildConfig
mkJobFlavour :: BuildConfig -> Flavour
mkJobFlavour BuildConfig{..} = Flavour buildFlavour opts
where
opts = [Llvm | llvmBootstrap] ++
[Dwarf | withDwarf] ++
[FullyStatic | fullyStatic] ++
[ThreadSanitiser | threadSanitiser] ++
[NoSplitSections | noSplitSections, buildFlavour == Release ] ++
[BootNonmovingGc | validateNonmovingGc ]
data Flavour = Flavour BaseFlavour [FlavourTrans]
data FlavourTrans
= Llvm | Dwarf | FullyStatic | ThreadSanitiser | NoSplitSections
| BootNonmovingGc
data BaseFlavour = Release | Validate | SlowValidate deriving Eq
-----------------------------------------------------------------------------
-- Build Configs
-----------------------------------------------------------------------------
-- | A standard build config
vanilla :: BuildConfig
vanilla = BuildConfig
{ withDwarf = False
, unregisterised = False
, buildFlavour = Validate
, bignumBackend = Gmp
, llvmBootstrap = False
, withAssertions = False
, withNuma = False
, crossTarget = Nothing
, crossEmulator = NoEmulator
, configureWrapper = Nothing
, fullyStatic = False
, tablesNextToCode = True
, threadSanitiser = False
, noSplitSections = False
, validateNonmovingGc = False
}
splitSectionsBroken :: BuildConfig -> BuildConfig
splitSectionsBroken bc = bc { noSplitSections = True }
nativeInt :: BuildConfig
nativeInt = vanilla { bignumBackend = Native }
dwarf :: BuildConfig
dwarf = vanilla { withDwarf = True }
unreg :: BuildConfig
unreg = vanilla { unregisterised = True }
releaseConfig :: BuildConfig
releaseConfig = vanilla { buildFlavour = Release }
debug :: BuildConfig
debug = vanilla { buildFlavour = SlowValidate
, withAssertions = True
-- WithNuma so at least one job tests Numa
, withNuma = True
}
static :: BuildConfig
static = vanilla { fullyStatic = True }
staticNativeInt :: BuildConfig
staticNativeInt = static { bignumBackend = Native }
crossConfig :: String -- ^ target triple
-> CrossEmulator -- ^ emulator for testing
-> Maybe String -- ^ Configure wrapper
-> BuildConfig
crossConfig triple emulator configure_wrapper =
vanilla { crossTarget = Just triple
, crossEmulator = emulator
, configureWrapper = configure_wrapper
}
llvm :: BuildConfig
llvm = vanilla { llvmBootstrap = True }
tsan :: BuildConfig
tsan = vanilla { threadSanitiser = True }
noTntc :: BuildConfig
noTntc = vanilla { tablesNextToCode = False }
-----------------------------------------------------------------------------
-- Platform specific variables
-----------------------------------------------------------------------------
-- | These tags have to match what we call the runners on gitlab
runnerTag :: Arch -> Opsys -> String
runnerTag arch (Linux _) =
case arch of
Amd64 -> "x86_64-linux"
AArch64 -> "aarch64-linux"
I386 -> "x86_64-linux"
runnerTag AArch64 Darwin = "aarch64-darwin"
runnerTag Amd64 Darwin = "x86_64-darwin-m1"
runnerTag Amd64 Windows = "new-x86_64-windows"
runnerTag Amd64 FreeBSD13 = "x86_64-freebsd13"
runnerTag _ _ = error "Invalid arch/opsys"
tags :: Arch -> Opsys -> BuildConfig -> [String]
tags arch opsys _bc = [runnerTag arch opsys] -- Tag for which runners we can use
-- These names are used to find the docker image so they have to match what is
-- in the docker registry.
distroName :: LinuxDistro -> String
distroName Debian11 = "deb11"
distroName Debian10 = "deb10"
distroName Debian9 = "deb9"
distroName Fedora33 = "fedora33"
distroName Ubuntu1804 = "ubuntu18_04"
distroName Ubuntu2004 = "ubuntu20_04"
distroName Centos7 = "centos7"
distroName Alpine = "alpine3_12"
distroName AlpineWasm = "alpine3_17-wasm"
distroName Rocky8 = "rocky8"
opsysName :: Opsys -> String
opsysName (Linux distro) = "linux-" ++ distroName distro
opsysName Darwin = "darwin"
opsysName FreeBSD13 = "freebsd13"
opsysName Windows = "windows"
archName :: Arch -> String
archName Amd64 = "x86_64"
archName AArch64 = "aarch64"
archName I386 = "i386"
binDistName :: Arch -> Opsys -> BuildConfig -> String
binDistName arch opsys bc = "ghc-" ++ testEnv arch opsys bc
-- | The test environment should be a string which changes whenever the 'BuildConfig'
-- changes. Either the change is reflected in the flavour string, or directly here (as
-- is the case for settings which affect environment variables).
testEnv :: Arch -> Opsys -> BuildConfig -> String
testEnv arch opsys bc = intercalate "-" $
[ archName arch
, opsysName opsys ]
++ ["int_" ++ bignumString (bignumBackend bc) | bignumBackend bc /= Gmp]
++ ["unreg" | unregisterised bc ]
++ ["numa" | withNuma bc ]
++ ["no_tntc" | not (tablesNextToCode bc) ]
++ ["cross_"++triple | Just triple <- pure $ crossTarget bc ]
++ [flavourString (mkJobFlavour bc)]
-- | The hadrian flavour string we are going to use for this build
flavourString :: Flavour -> String
flavourString (Flavour base trans) = baseString base ++ concatMap (("+" ++) . flavourString) trans
where
baseString Release = "release"
baseString Validate = "validate"
baseString SlowValidate = "slow-validate"
flavourString Llvm = "llvm"
flavourString Dwarf = "debug_info"
flavourString FullyStatic = "fully_static"
flavourString ThreadSanitiser = "thread_sanitizer"
flavourString NoSplitSections = "no_split_sections"
flavourString BootNonmovingGc = "boot_nonmoving_gc"
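A doctest-style sketch (not executed) of how flavour strings are rendered from the definitions above — the base flavour followed by a `+`-separated transformer suffix:

```haskell
-- >>> flavourString (Flavour Validate [])
-- "validate"
-- >>> flavourString (Flavour Release [Llvm, Dwarf])
-- "release+llvm+debug_info"
```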
-- The path to the docker image (just for linux builders)
dockerImage :: Arch -> Opsys -> Maybe String
dockerImage arch (Linux distro) =
Just image
where
image = mconcat
[ "registry.gitlab.haskell.org/ghc/ci-images/"
, archName arch
, "-linux-"
, distroName distro
, ":$DOCKER_REV"
]
dockerImage _ _ = Nothing
-----------------------------------------------------------------------------
-- Platform specific variables
-----------------------------------------------------------------------------
-- The variables map is a monoidal map so that we never accidentally lose variable
-- settings by silently overwriting them when merging. At the end, variables which are
-- set multiple times have their values combined with spaces. This may produce nonsense
-- but it's easier to debug than silent overwriting.
--
-- The "proper" solution would be to use a dependent monoidal map where each key specifies
-- the combination behaviour of its values, i.e. whether setting it multiple times is an
-- error or whether the values should be combined.
newtype MonoidalMap k v = MonoidalMap { unMonoidalMap :: Map k v }
deriving (Eq, Show, Functor, ToJSON)
instance (Ord k, Semigroup v) => Semigroup (MonoidalMap k v) where
(MonoidalMap a) <> (MonoidalMap b) = MonoidalMap (Map.unionWith (<>) a b)
instance (Ord k, Semigroup v) => Monoid (MonoidalMap k v) where
mempty = MonoidalMap Map.empty
mminsertWith :: Ord k => (a -> a -> a) -> k -> a -> MonoidalMap k a -> MonoidalMap k a
mminsertWith f k v (MonoidalMap m) = MonoidalMap (Map.insertWith f k v m)
mmlookup :: Ord k => k -> MonoidalMap k a -> Maybe a
mmlookup k (MonoidalMap m) = Map.lookup k m
type Variables = MonoidalMap String [String]
(=:) :: String -> String -> Variables
a =: b = MonoidalMap (Map.singleton a [b])
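A doctest-style sketch (not executed) of how duplicate variable settings merge: the Semigroup instance appends the value lists instead of overwriting, and the ToJSON instance for 'Job' later joins them with spaces via `fmap unwords`:

```haskell
-- >>> unMonoidalMap (("CONFIGURE_ARGS" =: "--a") <> ("CONFIGURE_ARGS" =: "--b"))
-- fromList [("CONFIGURE_ARGS",["--a","--b"])]
```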
opsysVariables :: Arch -> Opsys -> Variables
opsysVariables _ FreeBSD13 = mconcat
[ -- N.B. we use iconv from ports as I see linker errors when we attempt
-- to use the "native" iconv embedded in libc as suggested by the
-- porting guide [1].
-- [1] https://www.freebsd.org/doc/en/books/porters-handbook/using-iconv.html
"CONFIGURE_ARGS" =: "--with-gmp-includes=/usr/local/include --with-gmp-libraries=/usr/local/lib --with-iconv-includes=/usr/local/include --with-iconv-libraries=/usr/local/lib"
, "HADRIAN_ARGS" =: "--docs=no-sphinx"
, "GHC_VERSION" =: "9.4.3"
, "CABAL_INSTALL_VERSION" =: "3.8.1.0"
]
opsysVariables _ (Linux distro) = distroVariables distro
opsysVariables AArch64 (Darwin {}) =
mconcat [ "NIX_SYSTEM" =: "aarch64-darwin"
, "MACOSX_DEPLOYMENT_TARGET" =: "11.0"
, "LANG" =: "en_US.UTF-8"
, "CONFIGURE_ARGS" =: "--with-intree-gmp --with-system-libffi"
-- Fonts can't be installed on darwin
, "HADRIAN_ARGS" =: "--docs=no-sphinx"
]
opsysVariables Amd64 (Darwin {}) =
mconcat [ "NIX_SYSTEM" =: "x86_64-darwin"
, "MACOSX_DEPLOYMENT_TARGET" =: "10.10"
-- "# Only Sierra and onwards supports clock_gettime. See #12858"
, "ac_cv_func_clock_gettime" =: "no"
-- # Only newer OS Xs support utimensat. See #17895
, "ac_cv_func_utimensat" =: "no"
, "LANG" =: "en_US.UTF-8"
, "CONFIGURE_ARGS" =: "--with-intree-gmp --with-system-libffi"
-- Fonts can't be installed on darwin
, "HADRIAN_ARGS" =: "--docs=no-sphinx"
]
opsysVariables _ (Windows {}) =
mconcat [ "MSYSTEM" =: "CLANG64"
, "HADRIAN_ARGS" =: "--docs=no-sphinx"
, "LANG" =: "en_US.UTF-8"
, "CABAL_INSTALL_VERSION" =: "3.8.1.0"
, "GHC_VERSION" =: "9.4.3" ]
opsysVariables _ _ = mempty
distroVariables :: LinuxDistro -> Variables
distroVariables Alpine = mconcat
[ -- Due to #20266
"CONFIGURE_ARGS" =: "--disable-ld-override"
, "INSTALL_CONFIGURE_ARGS" =: "--disable-ld-override"
, "HADRIAN_ARGS" =: "--docs=no-sphinx"
-- encoding004: due to lack of locale support
-- T10458, ghcilink002: due to #17869
-- linker_unload_native: due to musl not supporting any means of probing dynlib dependencies
-- (see Note [Object unloading]).
, "BROKEN_TESTS" =: "encoding004 T10458 linker_unload_native"
]
distroVariables Centos7 = mconcat [
"HADRIAN_ARGS" =: "--docs=no-sphinx"
]
distroVariables Rocky8 = mconcat [
"HADRIAN_ARGS" =: "--docs=no-sphinx"
]
distroVariables Fedora33 = mconcat
-- LLC/OPT do not work for some reason in our fedora images
-- These tests fail with this error: T11649 T5681 T7571 T8131b
-- +/opt/llvm/bin/opt: /lib64/libtinfo.so.5: no version information available (required by /opt/llvm/bin/opt)
-- +/opt/llvm/bin/llc: /lib64/libtinfo.so.5: no version information available (required by /opt/llvm/bin/llc)
[ "LLC" =: "/bin/false"
, "OPT" =: "/bin/false"
]
distroVariables _ = mempty
-----------------------------------------------------------------------------
-- Cache settings, what to cache and when can we share the cache
-----------------------------------------------------------------------------
data Cache
= Cache { cacheKey :: String
, cachePaths :: [String]
}
-- The cache doesn't depend on the BuildConfig because we only cache the cabal store.
mkCacheKey :: Arch -> Opsys -> String
mkCacheKey arch opsys = archName arch <> "-" <> opsysName opsys <> "-$CACHE_REV"
instance ToJSON Cache where
toJSON Cache {..} = object
[ "key" A..= cacheKey
, "paths" A..= cachePaths
]
-----------------------------------------------------------------------------
-- Artifacts, what to store and how long for
-----------------------------------------------------------------------------
data Artifacts
= Artifacts { artifactPaths :: [String]
, junitReport :: String
, expireIn :: String
, artifactsWhen :: ArtifactsWhen
}
instance ToJSON Artifacts where
toJSON Artifacts{..} = object
[ "reports" A..= object
[ "junit" A..= junitReport
]
, "expire_in" A..= expireIn
, "paths" A..= artifactPaths
, "when" A..= artifactsWhen
]
data ArtifactsWhen = ArtifactsOnSuccess | ArtifactsOnFailure | ArtifactsAlways
instance ToJSON ArtifactsWhen where
toJSON ArtifactsOnSuccess = "on_success"
toJSON ArtifactsOnFailure = "on_failure"
toJSON ArtifactsAlways = "always"
-----------------------------------------------------------------------------
-- Rules, when do we run a job
-----------------------------------------------------------------------------
-- Data structure which records the condition when a job is run.
data OnOffRules = OnOffRules { rule_set :: Set Rule -- ^ The set of enabled rules
, when :: ManualFlag -- ^ The additional condition about when to run this job.
}
-- The initial set of rules where all rules are disabled and the job is always run.
emptyRules :: OnOffRules
emptyRules = OnOffRules S.empty OnSuccess
-- When to run the job
data ManualFlag = Manual -- ^ Only run the job when explicitly triggered by a user
| OnSuccess -- ^ Always run it, if the rules pass (the default)
deriving Eq
enableRule :: Rule -> OnOffRules -> OnOffRules
enableRule r (OnOffRules o m) = OnOffRules (S.insert r o) m
manualRule :: OnOffRules -> OnOffRules
manualRule rules = rules { when = Manual }
-- Given 'OnOffRules', returns a list of ALL rules with their toggled status.
-- For example, even if you don't explicitly disable a rule it will end up in the
-- rule list with the OFF state.
enumRules :: OnOffRules -> [OnOffRule]
enumRules o = map lkup rules
where
enabled_rules = rule_set o
lkup r = OnOffRule (if S.member r enabled_rules then On else Off) r
data OnOffRule = OnOffRule OnOff Rule
data OnOff = On | Off
instance ToJSON ManualFlag where
toJSON Manual = "manual"
toJSON OnSuccess = "on_success"
instance ToJSON OnOffRules where
toJSON rules = toJSON [object ([
"if" A..= and_all (map one_rule (enumRules rules))
, "when" A..= toJSON (when rules)]
-- Necessary to stop manual jobs from blocking pipeline progress
-- https://docs.gitlab.com/ee/ci/yaml/#rulesallow_failure
++
["allow_failure" A..= True | when rules == Manual ])]
where
one_rule (OnOffRule onoff r) = ruleString onoff r
parens s = "(" ++ s ++ ")"
and_all rs = intercalate " && " (map parens rs)
-- | A Rule corresponds to some condition which must be satisfied in order to
-- run the job.
data Rule = FastCI -- ^ Run this job when the fast-ci label is set
| ReleaseOnly -- ^ Only run this job in a release pipeline
| Nightly -- ^ Only run this job in the nightly pipeline
| LLVMBackend -- ^ Only run this job when the "LLVM backend" label is present
| FreeBSDLabel -- ^ Only run this job when the "FreeBSD" label is set.
| NonmovingGc -- ^ Only run this job when the "non-moving GC" label is set.
| Disable -- ^ Don't run this job.
deriving (Bounded, Enum, Ord, Eq)
-- A constant evaluating to True because gitlab doesn't support "true" in the
-- expression language.
true :: String
true = "\"true\" == \"true\""
-- A constant evaluating to False because gitlab doesn't support "false" in the
-- expression language.
false :: String
false = "\"disabled\" != \"disabled\""
-- Convert the state of the rule into a string that gitlab understands.
ruleString :: OnOff -> Rule -> String
ruleString On FastCI = true
ruleString Off FastCI = "$CI_MERGE_REQUEST_LABELS !~ /.*fast-ci.*/"
ruleString On LLVMBackend = "$CI_MERGE_REQUEST_LABELS =~ /.*LLVM backend.*/"
ruleString Off LLVMBackend = true
ruleString On FreeBSDLabel = "$CI_MERGE_REQUEST_LABELS =~ /.*FreeBSD.*/"
ruleString Off FreeBSDLabel = true
ruleString On NonmovingGc = "$CI_MERGE_REQUEST_LABELS =~ /.*non-moving GC.*/"
ruleString Off NonmovingGc = true
ruleString On ReleaseOnly = "$RELEASE_JOB == \"yes\""
ruleString Off ReleaseOnly = "$RELEASE_JOB != \"yes\""
ruleString On Nightly = "$NIGHTLY"
ruleString Off Nightly = "$NIGHTLY == null"
ruleString On Disable = false
ruleString Off Disable = true
-- Enumeration of all the rules
rules :: [Rule]
rules = [minBound .. maxBound]
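A doctest-style sketch (not executed) of the fragments that end up in a job's generated `if:` condition: every rule contributes a conjunct, in either its On or its Off form. For example:

```haskell
-- >>> ruleString On Nightly
-- "$NIGHTLY"
-- >>> ruleString Off FastCI
-- "$CI_MERGE_REQUEST_LABELS !~ /.*fast-ci.*/"
```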
-- | A 'Job' is the description of a single job in a gitlab pipeline. The
-- job contains all the information about how to do the build but can be further
-- modified with information about when to run jobs, which variables to set for
-- certain platforms and so on.
data Job
= Job { jobStage :: String
, jobNeeds :: [String]
, jobTags :: [String]
, jobAllowFailure :: Bool
, jobScript :: [String]
, jobAfterScript :: [String]
, jobDockerImage :: Maybe String
, jobVariables :: Variables
, jobDependencies :: [String]
, jobArtifacts :: Artifacts
, jobCache :: Cache
, jobRules :: OnOffRules
, jobPlatform :: (Arch, Opsys)
}
instance ToJSON Job where
toJSON Job{..} = object
[ "stage" A..= jobStage
-- Convoluted to avoid downloading artifacts from the ghci job
-- https://docs.gitlab.com/ee/ci/yaml/#needsartifacts
, "needs" A..= map (\j -> object [ "job" A..= j, "artifacts" A..= False ]) jobNeeds
, "dependencies" A..= jobDependencies
, "image" A..= jobDockerImage
, "tags" A..= jobTags
, "allow_failure" A..= jobAllowFailure
-- Joining up variables like this may well be the wrong thing to do but
-- at least it doesn't lose information silently by overriding.
, "variables" A..= fmap unwords jobVariables
, "artifacts" A..= jobArtifacts
, "cache" A..= jobCache
, "after_script" A..= jobAfterScript
, "script" A..= jobScript
, "rules" A..= jobRules
]
-- | Build a job description from the system description and 'BuildConfig'
job :: Arch -> Opsys -> BuildConfig -> NamedJob Job
job arch opsys buildConfig = NamedJob { name = jobName, jobInfo = Job {..} }
where
jobPlatform = (arch, opsys)
jobRules = emptyRules
jobName = testEnv arch opsys buildConfig
jobTags = tags arch opsys buildConfig
jobDockerImage = dockerImage arch opsys
jobScript
| Windows <- opsys
= [ "bash .gitlab/ci.sh setup"
, "bash .gitlab/ci.sh configure"
, "bash .gitlab/ci.sh build_hadrian"
, "bash .gitlab/ci.sh test_hadrian" ]
| otherwise
= [ "find libraries -name config.sub -exec cp config.sub {} \\;" | Darwin == opsys ] ++
[ "sudo chown ghc:ghc -R ." | Linux {} <- [opsys]] ++
[ ".gitlab/ci.sh setup"
, ".gitlab/ci.sh configure"
, ".gitlab/ci.sh build_hadrian"
, ".gitlab/ci.sh test_hadrian"
]
jobAfterScript
| Windows <- opsys =
[ "bash .gitlab/ci.sh save_cache"
, "bash .gitlab/ci.sh clean"
]
| otherwise =
[ ".gitlab/ci.sh save_cache"
, ".gitlab/ci.sh clean"
, "cat ci_timings" ]
jobFlavour = mkJobFlavour buildConfig
jobDependencies = []
jobVariables = mconcat
[ opsysVariables arch opsys
, "TEST_ENV" =: testEnv arch opsys buildConfig
, "BIN_DIST_NAME" =: binDistName arch opsys buildConfig
, "BUILD_FLAVOUR" =: flavourString jobFlavour
, "BIGNUM_BACKEND" =: bignumString (bignumBackend buildConfig)
, "CONFIGURE_ARGS" =: configureArgsStr buildConfig
, maybe mempty ("CONFIGURE_WRAPPER" =:) (configureWrapper buildConfig)
, maybe mempty ("CROSS_TARGET" =:) (crossTarget buildConfig)
, case crossEmulator buildConfig of
NoEmulator -> case crossTarget buildConfig of
Nothing -> mempty
Just _ -> "CROSS_EMULATOR" =: "NOT_SET" -- we need an emulator but it isn't set. Won't run the testsuite
Emulator s -> "CROSS_EMULATOR" =: s
NoEmulatorNeeded -> mempty
, if withNuma buildConfig then "ENABLE_NUMA" =: "1" else mempty
, if validateNonmovingGc buildConfig
then "RUNTEST_ARGS" =: "--way=nonmoving --way=nonmoving_thr --way=nonmoving_thr_sanity"
else mempty
]
jobArtifacts = Artifacts
{ junitReport = "junit.xml"
, expireIn = "2 weeks"
, artifactPaths = [binDistName arch opsys buildConfig ++ ".tar.xz"
,"junit.xml"]
, artifactsWhen = ArtifactsAlways
}
jobCache
-- N.B. We have temporarily disabled cabal-install store caching on
-- Windows due to #21347.
| Windows <- opsys =
Cache { cachePaths = [], cacheKey = "no-caching" }
| otherwise = Cache
{ cachePaths = [ "cabal-cache", "toolchain" ]
, cacheKey = mkCacheKey arch opsys
}
jobAllowFailure = False
jobStage = "full-build"
jobNeeds = ["hadrian-ghc-in-ghci"]
---------------------------------------------------------------------------
-- Job Modifiers
---------------------------------------------------------------------------
-- Generic modification functions
-- | Modify all jobs in a 'JobGroup'
modifyJobs :: (a -> a) -> JobGroup a -> JobGroup a
modifyJobs = fmap
-- | Modify just the validate jobs in a 'JobGroup'
modifyValidateJobs :: (a -> a) -> JobGroup a -> JobGroup a
modifyValidateJobs f jg = jg { v = f <$> v jg }
-- | Modify just the nightly jobs in a 'JobGroup'
modifyNightlyJobs :: (a -> a) -> JobGroup a -> JobGroup a
modifyNightlyJobs f jg = jg { n = f <$> n jg }
-- Generic helpers
addJobRule :: Rule -> Job -> Job
addJobRule r j = j { jobRules = enableRule r (jobRules j) }
addVariable :: String -> String -> Job -> Job
addVariable k v j = j { jobVariables = mminsertWith (++) k [v] (jobVariables j) }
setVariable :: String -> String -> Job -> Job
setVariable k v j = j { jobVariables = MonoidalMap $ Map.insert k [v] $ unMonoidalMap $ jobVariables j }
delVariable :: String -> Job -> Job
delVariable k j = j { jobVariables = MonoidalMap $ Map.delete k $ unMonoidalMap $ jobVariables j }
-- Building the standard jobs
--
-- | Make a normal validate CI job
validate :: Arch -> Opsys -> BuildConfig -> NamedJob Job
validate = job
-- | Make a normal nightly CI job
nightly :: Arch -> Opsys -> BuildConfig -> NamedJob Job
nightly arch opsys bc =
let NamedJob n j = job arch opsys bc
in NamedJob { name = "nightly-" ++ n, jobInfo = addJobRule Nightly . keepArtifacts "8 weeks" . highCompression $ j}
-- | Make a normal release CI job
release :: Arch -> Opsys -> BuildConfig -> NamedJob Job
release arch opsys bc =
let NamedJob n j = job arch opsys (bc { buildFlavour = Release })
in NamedJob { name = "release-" ++ n, jobInfo = addJobRule ReleaseOnly . keepArtifacts "1 year" . ignorePerfFailures . useHashUnitIds . highCompression $ j}
-- Specific job modification functions
-- | Mark a job as requiring a manual trigger.
manual :: Job -> Job
manual j = j { jobRules = manualRule (jobRules j) }
-- | Mark a job as allowed to fail
allowFailure :: Job -> Job
allowFailure j = j { jobAllowFailure = True }
-- | Modify the time the job keeps its artifacts for
keepArtifacts :: String -> Job -> Job
keepArtifacts l j = j { jobArtifacts = (jobArtifacts j) { expireIn = l } }
-- | Ignore performance test failures for this job
ignorePerfFailures :: Job -> Job
ignorePerfFailures = addVariable "IGNORE_PERF_FAILURES" "all"
-- | Use a higher compression level to produce the job bindists (slower but produces
-- smaller results)
highCompression :: Job -> Job
highCompression = addVariable "XZ_OPT" "-9"
useHashUnitIds :: Job -> Job
useHashUnitIds = addVariable "HADRIAN_ARGS" "--hash-unit-ids"
-- | Mark the validate job to run in fast-ci mode
fastCI :: JobGroup Job -> JobGroup Job
fastCI = modifyValidateJobs (addJobRule FastCI)
-- | Mark a group of jobs as allowed to fail.
allowFailureGroup :: JobGroup Job -> JobGroup Job
allowFailureGroup = modifyJobs allowFailure
-- | Add a 'Rule' to just the validate job, for example, only run a job if a certain
-- label is set.
addValidateRule :: Rule -> JobGroup Job -> JobGroup Job
addValidateRule t = modifyValidateJobs (addJobRule t)
-- | Don't run the validate job; normally used to alleviate CI load by skipping
-- jobs which are unlikely to fail (e.g. different Linux distros)
disableValidate :: JobGroup Job -> JobGroup Job
disableValidate = addValidateRule Disable
data NamedJob a = NamedJob { name :: String, jobInfo :: a } deriving Functor
renameJob :: (String -> String) -> NamedJob a -> NamedJob a
renameJob f (NamedJob n i) = NamedJob (f n) i
instance ToJSON a => ToJSON (NamedJob a) where
toJSON nj = object
[ "name" A..= name nj
, "jobInfo" A..= jobInfo nj ]
-- Jobs are grouped into either triples or pairs depending on whether the
-- job is just validate and nightly, or also release.
data JobGroup a = StandardTriple { v :: NamedJob a
, n :: NamedJob a
, r :: NamedJob a }
| ValidateOnly { v :: NamedJob a
, n :: NamedJob a } deriving Functor
instance ToJSON a => ToJSON (JobGroup a) where
toJSON jg = object
[ "n" A..= n jg
, "r" A..= r jg
]
rename :: (String -> String) -> JobGroup a -> JobGroup a
rename f (StandardTriple nv nn nr) = StandardTriple (renameJob f nv) (renameJob f nn) (renameJob f nr)
rename f (ValidateOnly nv nn) = ValidateOnly (renameJob f nv) (renameJob f nn)
-- | Construct a 'JobGroup' which consists of a validate, nightly and release build with
-- a specific config.
standardBuildsWithConfig :: Arch -> Opsys -> BuildConfig -> JobGroup Job
standardBuildsWithConfig a op bc =
StandardTriple (validate a op bc)
(nightly a op bc)
(release a op bc)
-- | Construct a 'JobGroup' which consists of validate, nightly and release builds with
-- the 'vanilla' config.
standardBuilds :: Arch -> Opsys -> JobGroup Job
standardBuilds a op = standardBuildsWithConfig a op vanilla
-- | Construct a 'JobGroup' which just consists of a validate and nightly build. We don't
-- produce releases for these jobs.
validateBuilds :: Arch -> Opsys -> BuildConfig -> JobGroup Job
validateBuilds a op bc = ValidateOnly (validate a op bc) (nightly a op bc)
flattenJobGroup :: JobGroup a -> [(String, a)]
flattenJobGroup (StandardTriple a b c) = map flattenNamedJob [a,b,c]
flattenJobGroup (ValidateOnly a b) = map flattenNamedJob [a, b]
flattenNamedJob :: NamedJob a -> (String, a)
flattenNamedJob (NamedJob n i) = (n, i)
-- | Specification for all the jobs we want to build.
jobs :: Map String Job
jobs = Map.fromList $ concatMap (filter is_enabled_job . flattenJobGroup) job_groups
where
is_enabled_job (_, Job {jobRules = OnOffRules {..}}) = not $ Disable `S.member` rule_set
job_groups :: [JobGroup Job]
job_groups =
[ disableValidate (standardBuilds Amd64 (Linux Debian10))
, standardBuildsWithConfig Amd64 (Linux Debian10) dwarf
, validateBuilds Amd64 (Linux Debian10) nativeInt
, fastCI (validateBuilds Amd64 (Linux Debian10) unreg)
, fastCI (validateBuilds Amd64 (Linux Debian10) debug)
, -- Nightly allowed to fail: #22520
modifyNightlyJobs allowFailure
(modifyValidateJobs manual tsan_jobs)
, -- Nightly allowed to fail: #22343
modifyNightlyJobs allowFailure
(modifyValidateJobs manual (validateBuilds Amd64 (Linux Debian10) noTntc))
, addValidateRule LLVMBackend (validateBuilds Amd64 (Linux Debian10) llvm)
, disableValidate (standardBuilds Amd64 (Linux Debian11))
-- We still build Deb9 bindists for now because Ubuntu 18 and Linux Mint 19
-- are not EOL until April 2023 and they still need tinfo5.
, disableValidate (standardBuildsWithConfig Amd64 (Linux Debian9) (splitSectionsBroken vanilla))
, disableValidate (standardBuilds Amd64 (Linux Ubuntu1804))
, disableValidate (standardBuilds Amd64 (Linux Ubuntu2004))
, disableValidate (standardBuilds Amd64 (Linux Rocky8))
, disableValidate (standardBuildsWithConfig Amd64 (Linux Centos7) (splitSectionsBroken vanilla))
-- Fedora33 job is always built with perf so there's one job in the normal
-- validate pipeline which is built with perf.
, standardBuildsWithConfig Amd64 (Linux Fedora33) releaseConfig
-- This job is only for generating head.hackage docs
, hackage_doc_job (disableValidate (standardBuildsWithConfig Amd64 (Linux Fedora33) releaseConfig))
, disableValidate (standardBuildsWithConfig Amd64 (Linux Fedora33) dwarf)
, fastCI (standardBuildsWithConfig Amd64 Windows (splitSectionsBroken vanilla))
, disableValidate (standardBuildsWithConfig Amd64 Windows (splitSectionsBroken nativeInt))
, standardBuilds Amd64 Darwin
, allowFailureGroup (addValidateRule FreeBSDLabel (validateBuilds Amd64 FreeBSD13 vanilla))
, standardBuilds AArch64 Darwin
, standardBuildsWithConfig AArch64 (Linux Debian10) (splitSectionsBroken vanilla)
, disableValidate (validateBuilds AArch64 (Linux Debian10) llvm)
, standardBuildsWithConfig I386 (Linux Debian9) (splitSectionsBroken vanilla)
-- Fully static build, in theory usable on any linux distribution.
, fullyStaticBrokenTests (standardBuildsWithConfig Amd64 (Linux Alpine) (splitSectionsBroken static))
-- Dynamically linked build, suitable for building your own static executables on alpine
, disableValidate (standardBuildsWithConfig Amd64 (Linux Alpine) (splitSectionsBroken vanilla))
, fullyStaticBrokenTests (disableValidate (allowFailureGroup (standardBuildsWithConfig Amd64 (Linux Alpine) staticNativeInt)))
, validateBuilds Amd64 (Linux Debian11) (crossConfig "aarch64-linux-gnu" (Emulator "qemu-aarch64 -L /usr/aarch64-linux-gnu") Nothing)
, validateBuilds Amd64 (Linux Debian11) (crossConfig "javascript-unknown-ghcjs" (Emulator "js-emulator") (Just "emconfigure")) { bignumBackend = Native }
, make_wasm_jobs wasm_build_config
, modifyValidateJobs manual $
make_wasm_jobs wasm_build_config {bignumBackend = Native}
, modifyValidateJobs manual $
make_wasm_jobs wasm_build_config {unregisterised = True}
, addValidateRule NonmovingGc (standardBuildsWithConfig Amd64 (Linux Debian11) vanilla {validateNonmovingGc = True})
]
where
-- ghcilink002 broken due to #17869
fullyStaticBrokenTests = modifyJobs (addVariable "BROKEN_TESTS" "ghcilink002 ")
hackage_doc_job = rename (<> "-hackage") . modifyJobs (addVariable "HADRIAN_ARGS" "--haddock-base-url")
tsan_jobs =
modifyJobs
( addVariable "TSAN_OPTIONS" "suppressions=$CI_PROJECT_DIR/rts/.tsan-suppressions"
-- Haddock is large enough to make TSAN choke without massive quantities of
-- memory.
. addVariable "HADRIAN_ARGS" "--docs=none") $
validateBuilds Amd64 (Linux Debian10) tsan
make_wasm_jobs cfg =
modifyJobs
( delVariable "BROKEN_TESTS"
. setVariable "HADRIAN_ARGS" "--docs=none"
. delVariable "INSTALL_CONFIGURE_ARGS"
)
$ validateBuilds Amd64 (Linux AlpineWasm) cfg
wasm_build_config =
(crossConfig "wasm32-wasi" NoEmulatorNeeded Nothing)
{
fullyStatic = True
, buildFlavour = Release -- TODO: This needs to be a validate flavour, but the wasm backend doesn't pass yet
}
mkPlatform :: Arch -> Opsys -> String
mkPlatform arch opsys = archName arch <> "-" <> opsysName opsys
-- | This map tells us, for a specific arch/opsys combination, the job name used
-- in nightly/release pipelines. It is consumed by the ghcup metadata generation so
-- that things like bindist names are kept in sync.
--
-- For cases where there is just one job for a platform, that job is used.
--
-- Otherwise:
-- * Prefer jobs which have a corresponding release pipeline
-- * Explicitly require tie-breaking for other cases.
platform_mapping :: Map String (JobGroup BindistInfo)
platform_mapping = Map.map go $
Map.fromListWith combine [ (uncurry mkPlatform (jobPlatform (jobInfo $ v j)), j) | j <- filter hasReleaseBuild job_groups ]
where
whitelist = [ "x86_64-linux-alpine3_12-int_native-validate+fully_static"
, "x86_64-linux-deb10-validate"
, "x86_64-linux-deb11-validate"
, "x86_64-linux-fedora33-release"
, "x86_64-windows-validate"
]
combine a b
| name (v a) `elem` whitelist = a -- Explicitly selected
| name (v b) `elem` whitelist = b
| otherwise = error (show (name (v a)) ++ show (name (v b)))
go = fmap (BindistInfo . unwords . fromJust . mmlookup "BIN_DIST_NAME" . jobVariables)
hasReleaseBuild (StandardTriple{}) = True
hasReleaseBuild (ValidateOnly{}) = False
data BindistInfo = BindistInfo { bindistName :: String }
instance ToJSON BindistInfo where
toJSON (BindistInfo n) = object [ "bindistName" A..= n ]
main :: IO ()
main = do
args <- getArgs
case args of
-- See Note [Generation Modes]
("gitlab":as) -> write_result as jobs
("metadata":as) -> write_result as platform_mapping
_ -> error "gen_ci.hs <gitlab|metadata> [file.json]"
write_result as obj =
(case as of
[] -> B.putStrLn
(fp:_) -> B.writeFile fp)
(A.encode obj)
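The `write_result` helper above prints the encoded JSON to stdout when no file argument is given and writes it to the named file otherwise. The same dispatch can be sketched in shell (the `emit` helper, its payload, and `out.json` are made up for illustration):

```shell
# Hypothetical shell mirror of write_result: no argument means stdout,
# otherwise write the payload to the named file.
emit() {
  payload='{"jobs":[]}'
  if [ "$#" -eq 0 ]; then
    printf '%s\n' "$payload"
  else
    printf '%s\n' "$payload" > "$1"
  fi
}

emit              # prints to stdout
emit out.json     # writes to the file instead
cat out.json
```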
#!/usr/bin/env nix-shell
#!nix-shell -i bash -p cabal-install "haskell.packages.ghc92.ghcWithPackages (pkgs: with pkgs; [aeson])" git jq
cd "$(dirname "${BASH_SOURCE[0]}")"
cabal run gen_ci -- metadata jobs-metadata.json
#!/usr/bin/env nix-shell
#!nix-shell -i bash -p cabal-install "haskell.packages.ghc92.ghcWithPackages (pkgs: with pkgs; [aeson])" git jq
# shellcheck shell=bash
set -euo pipefail
cd "$(dirname "${BASH_SOURCE[0]}")"
tmp=$(mktemp)
cabal run gen_ci -- gitlab "$tmp"
rm -f jobs.yaml
echo "### THIS IS A GENERATED FILE, DO NOT MODIFY DIRECTLY" > jobs.yaml
jq . "$tmp" | tee -a jobs.yaml
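The script above regenerates jobs.yaml by writing a warning header and then appending the pretty-printed JSON. The write-header-then-append pattern in a minimal, self-contained form (the payload is illustrative, and a plain `printf` stands in for the generator/jq pipeline):

```shell
out=$(mktemp)
echo "### THIS IS A GENERATED FILE, DO NOT MODIFY DIRECTLY" > "$out"
# Appending with >> (or `tee -a`) preserves the header written above,
# whereas a plain > would clobber it.
printf '%s\n' '{"job":"example"}' >> "$out"
head -n 1 "$out"
```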
{-# OPTIONS_GHC -Wall -Wno-missing-fields #-}
import GHC hiding (parseModule)
import GHC.Data.StringBuffer
import GHC.Driver.Config.Parser
import GHC.Parser
import GHC.Parser.Lexer
import GHC.Platform
import GHC.Plugins
import GHC.Settings
import GHC.Settings.Config
import System.Mem.Weak
fakeSettings :: Settings
fakeSettings =
Settings
{ sGhcNameVersion =
GhcNameVersion
{ ghcNameVersion_programName =
"ghc",
ghcNameVersion_projectVersion =
cProjectVersion
},
sFileSettings =
FileSettings {},
sToolSettings = ToolSettings {},
sTargetPlatform =
genericPlatform,
sPlatformMisc = PlatformMisc {}
}
fakeDynFlags :: DynFlags
fakeDynFlags = defaultDynFlags fakeSettings
parse :: DynFlags -> String -> IO (Located (HsModule GhcPs))
parse dflags src = do
let buf = stringToStringBuffer src
let loc = mkRealSrcLoc (mkFastString "Main.hs") 1 1
case unP parseModule (initParserState (initParserOpts dflags) buf loc) of
PFailed _ -> fail "parseModule failed"
POk _ rdr_module -> pure rdr_module
main :: IO ()
main = do
_ <- mkWeak runGhc runGhc Nothing
m <- parse fakeDynFlags "main = putStrLn \"hello world\""
putStrLn $ showSDoc fakeDynFlags $ ppr m
cradle:
cabal:
<!--
READ THIS FIRST: If the feature you are proposing changes the language that GHC accepts
or adds any warnings to `-Wall`, it should be written as a [GHC Proposal](https://github.com/ghc-proposals/ghc-proposals/).
Other features, appropriate for a GitLab feature request, include GHC API/plugin
innovations, new low-impact compiler flags, or other similar additions to GHC.
-->
## Motivation

Briefly describe the problem your proposal solves and why this problem should
#!/usr/bin/env bash
set -e
COLOR_RED="\e[31m"
COLOR_GREEN="\e[32m"
COLOR_NONE="\e[0m"
# Note: `exit` inside a `( ... )` subshell only exits the subshell, so the
# old `grep ... && ( ...; exit 1 ) || ( ...; exit 0 )` form always exited 0.
if grep TBA libraries/*/changelog.md; then
  echo -e "${COLOR_RED}Error: Found \"TBA\"s in changelogs.${COLOR_NONE}"
  exit 1
else
  echo -e "${COLOR_GREEN}changelogs look okay.${COLOR_NONE}"
fi
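The changelog check above hinges on grep's exit status: 0 when the pattern is found (a lint failure here, since "TBA" is a placeholder) and 1 when it is not. A minimal self-contained illustration of that convention, with made-up input strings:

```shell
# grep exits 0 on a match and 1 on no match; && runs on success, || on failure.
echo "release notes TBA" | grep -q TBA && echo "found"
echo "release notes done" | grep -q TBA || echo "clean"
```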
#!/usr/bin/env bash
set -e
grep -E -q '\[[0-9]+\.[0-9]+\.[0-9]+\]' configure.ac ||
( echo "error: configure.ac: GHC version number must have three components."; exit 1 )
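The pattern above requires a bracketed three-component version somewhere in configure.ac. A quick illustration against a made-up `AC_INIT` line (the version and addresses shown are examples, not claims about any release):

```shell
# Matches a three-component [9.6.1] but not a two-component [9.6].
line='AC_INIT([GHC], [9.6.1], [bugs@example.com])'
echo "$line" | grep -E -o '\[[0-9]+\.[0-9]+\.[0-9]+\]'
echo 'AC_INIT([GHC], [9.6], [bugs@example.com])' \
  | grep -E -q '\[[0-9]+\.[0-9]+\.[0-9]+\]' || echo 'no match'
```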