#!/usr/bin/env bash
set -e
out_dir="$(git rev-parse --show-toplevel)/.gitlab"
# Update job metadata for ghcup
generate-ci metadata "$out_dir/jobs-metadata.json"
echo "Updated $out_dir/jobs-metadata.json"
#!/usr/bin/env bash
set -e
out_dir="$(git rev-parse --show-toplevel)/.gitlab"
tmp="$(mktemp)"
generate-ci gitlab "$tmp"
rm -f "$out_dir/jobs.yaml"
echo "### THIS IS A GENERATED FILE, DO NOT MODIFY DIRECTLY" > "$out_dir/jobs.yaml"
jq . < "$tmp" >> "$out_dir/jobs.yaml"
rm "$tmp"
echo "Updated $out_dir/jobs.yaml"
cradle:
  cabal:
{-# OPTIONS_GHC -Wall -Wno-missing-fields #-}
import GHC.Unit.Types (stringToUnitId)
import GHC hiding (parseModule)
import GHC.Data.StringBuffer
import GHC.Driver.Config.Parser
import GHC.Parser
import GHC.Parser.Lexer
import GHC.Platform
import GHC.Plugins
import GHC.Settings
import GHC.Settings.Config
import System.Mem.Weak
fakeSettings :: Settings
fakeSettings =
  Settings
    { sGhcNameVersion =
        GhcNameVersion
          { ghcNameVersion_programName = "ghc",
            ghcNameVersion_projectVersion = cProjectVersion
          },
      sFileSettings = FileSettings {},
      sToolSettings = ToolSettings {},
      sTargetPlatform = genericPlatform,
      sPlatformMisc = PlatformMisc {},
      sUnitSettings = UnitSettings { unitSettings_baseUnitId = stringToUnitId "base" }
    }
fakeDynFlags :: DynFlags
fakeDynFlags = defaultDynFlags fakeSettings
parse :: DynFlags -> String -> IO (Located (HsModule GhcPs))
parse dflags src = do
  let buf = stringToStringBuffer src
  let loc = mkRealSrcLoc (mkFastString "Main.hs") 1 1
  case unP parseModule (initParserState (initParserOpts dflags) buf loc) of
    PFailed _ -> fail "parseModule failed"
    POk _ rdr_module -> pure rdr_module

main :: IO ()
main = do
  _ <- mkWeak runGhc runGhc Nothing
  m <- parse fakeDynFlags "main = putStrLn \"hello world\""
  putStrLn $ showSDoc fakeDynFlags $ ppr m
## Summary
Please read the guidance at <https://gitlab.haskell.org/ghc/ghc/-/wikis/report-a-bug> and write a brief description of the issue.
## Steps to reproduce
Please provide a set of concrete steps to reproduce the issue.
## Expected behavior
What do you expect the reproducer described above to do?
## Environment
* GHC version used:
Optional:
* Operating System:
* System Architecture:
/label ~bug
/label ~"needs triage"
## Summary
Location of documentation issue: (e.g. the Haddocks of `base`, the GHC user's guide)
Write a brief description of the issue.
## Proposed improvements or changes
Feel free to propose changes, either concrete or abstract.
## Environment
* GHC version used (if appropriate):
/label ~bug
/label ~documentation
/label ~"needs triage"
<!--
READ THIS FIRST: If the feature you are proposing changes the language that GHC accepts
or adds any warnings to `-Wall`, it should be written as a [GHC Proposal](https://github.com/ghc-proposals/ghc-proposals/).
Other features, appropriate for a GitLab feature request, include GHC API/plugin
innovations, new low-impact compiler flags, or other similar additions to GHC.
-->
## Motivation
Briefly describe the problem your proposal solves and why this problem should
be solved.
## Proposal
Describe your proposed feature here.
/label ~"feature request"
/label ~"needs triage"
Thank you for your contribution to GHC!
**Please read the checklist below to make sure your contribution fulfills these
expectations. Also please answer the following question in your MR description:**
**Where is the key part of this patch? That is, what should reviewers look at first?**
Please take a few moments to address the following points:
* [ ] if your MR touches `base` (or touches parts of `ghc-internal` used
or re-exported by `base`) more substantially than just amending comments
or documentation, you likely need to raise a
[CLC proposal](https://github.com/haskell/core-libraries-committee#base-package)
before merging it.
* [ ] if your MR may break existing programs (e.g. causes the
compiler to reject programs), please describe the expected breakage and add
the ~"user-facing" label. This will run ghc/head.hackage> to characterise
the effect of your change on Hackage.
* [ ] ensure that your commits are either individually buildable or squashed
* [ ] ensure that your commit messages describe *what they do*
(referring to tickets using `#NNNN` syntax when appropriate)
* [ ] have added source comments describing your change. For larger changes you
likely should add a [Note][notes] and cross-reference it from the relevant
places.
* [ ] add a [testcase to the testsuite][adding test].
* [ ] update the users guide if applicable
* [ ] mention new features in the release notes for the next release
If you have any questions, don't hesitate to open your merge request and inquire
in a comment. If your patch isn't quite done yet, please prefix your MR
title with `WIP:`.
By default, a minimal validation pipeline is run on each merge request; the ~full-ci
label can be applied to perform additional validation checks if your MR affects a more
unusual configuration.
Once your change is ready please remove the `WIP:` tag and wait for review. If
no one has offered a review in a few days then please leave a comment mentioning
@triagers and apply the ~"Blocked on Review" label.
[notes]: https://gitlab.haskell.org/ghc/ghc/wikis/commentary/coding-style#comments-in-the-source-code
[adding test]: https://gitlab.haskell.org/ghc/ghc/wikis/building/running-tests/adding
Thank you for your contribution to Haddock!
* [ ] This MR relates to \<ticket number>
* [ ] I have read the [CONTRIBUTING](./utils/haddock/CONTRIBUTING.md) guide
* [ ] I have inserted a CHANGELOG entry if this warrants it
* [ ] I have squashed my commits
* [ ] I have added tests if necessary
* [ ] I have updated the documentation
If you have any questions, don't hesitate to open your merge request and inquire
in a comment. If your patch isn't quite done yet, please prefix your MR
title with `Draft:`.
Once your change is ready please remove the `Draft:` tag and wait for review. If
no one has offered a review in a few days then please leave a comment mentioning
@triagers and apply the ~"Blocked on Review" label.
let sources = import ./nix/sources.nix; in
{ nixpkgs ? (import sources.nixpkgs {}) }:

with nixpkgs;

let
  fetch-gitlab-artifacts = nixpkgs.callPackage ./fetch-gitlab-artifacts {};
  mk-ghcup-metadata = nixpkgs.callPackage ./mk-ghcup-metadata { fetch-gitlab = fetch-gitlab-artifacts; };

  bindistPrepEnv = pkgs.buildFHSUserEnv {
    name = "enter-fhs";
    targetPkgs = pkgs: with pkgs; [
      # all
      gcc binutils gnumake gmp ncurses5 git elfutils
      # source-release.sh
      xorg.lndir curl python3 which automake autoconf m4 file
      haskell.compiler.ghc8107 haskellPackages.happy haskellPackages.alex
    ];
    runScript = "$SHELL -x";
  };

  scripts = stdenv.mkDerivation {
    name = "rel-eng-scripts";
    nativeBuildInputs = [ makeWrapper ];
    preferLocalBuild = true;
    buildCommand = ''
      mkdir -p $out/bin

      makeWrapper ${./recompress-all} $out/bin/recompress-all \
        --prefix PATH : ${gnumake}/bin \
        --prefix PATH : ${gnutar}/bin \
        --prefix PATH : ${lzip}/bin \
        --prefix PATH : ${bzip2}/bin \
        --prefix PATH : ${gzip}/bin \
        --prefix PATH : ${xz}/bin \
        --prefix PATH : ${zip}/bin

      makeWrapper ${./upload.sh} $out/bin/upload.sh \
        --prefix PATH : ${moreutils}/bin \
        --prefix PATH : ${lftp}/bin \
        --prefix PATH : ${lzip}/bin \
        --prefix PATH : ${zip}/bin \
        --prefix PATH : ${s3cmd}/bin \
        --prefix PATH : ${gnupg}/bin \
        --prefix PATH : ${pinentry}/bin \
        --prefix PATH : ${python3}/bin \
        --prefix PATH : $out/bin \
        --set ENTER_FHS_ENV ${bindistPrepEnv}/bin/enter-fhs \
        --set BASH ${bash}/bin/bash

      makeWrapper ${./upload_ghc_libs.py} $out/bin/upload-ghc-libs
    '';
  };
in
symlinkJoin {
  name = "ghc-rel-eng";
  preferLocalBuild = true;
  paths = [
    scripts
    fetch-gitlab-artifacts
    mk-ghcup-metadata
  ];
}
result
fetch-gitlab
out
# fetch-gitlab-artifacts
This script is used to fetch and rename GHC binary distributions from GitLab
Pipelines for upload to `downloads.haskell.org`.
## Workflow
1. Configure a `python-gitlab` profile for <https://gitlab.haskell.org/>:
```
$ cat > $HOME/.python-gitlab.cfg <<EOF
[haskell]
url = https://gitlab.haskell.org/
private_token = $PRIVATE_GITLAB_TOKEN
ssl_verify = true
api_version = 4
EOF
```
1. Push a release tag to ghc/ghc>
1. Wait until the CI pipeline completes
1. Run `fetch-gitlab -p $PIPELINE_ID -r $RELEASE_NAME` where `$PIPELINE_ID` is
the ID of the GitLab release pipeline and `$RELEASE_NAME` is the name of the
GHC release (e.g. `8.8.1` or `8.8.1-alpha1`)
1. The binary distributions will be placed in the `out` directory.
{ nix-gitignore, python3Packages, unzip }:

let
  fetch-gitlab = { buildPythonPackage, python-gitlab, unzip }:
    buildPythonPackage {
      pname = "fetch-gitlab";
      version = "0.0.1";
      src = nix-gitignore.gitignoreSource [] ./.;
      propagatedBuildInputs = [ python3Packages.python-gitlab unzip ];
      preferLocalBuild = true;
    };
in
python3Packages.callPackage fetch-gitlab { inherit unzip; }
import logging
from pathlib import Path
import subprocess
import gitlab
import json
logging.basicConfig(level=logging.INFO)
def strip_prefix(s, prefix):
    if s.startswith(prefix):
        return s[len(prefix):]
    else:
        return None
do_not_distribute = set(["release-x86_64-linux-fedora33-release-hackage"])
def job_triple(job_name):
    bindists = {
        'release-x86_64-windows-release': 'x86_64-unknown-mingw32',
        'release-x86_64-windows-int_native-release': 'x86_64-unknown-mingw32-int_native',
        'release-x86_64-linux-rocky8-release': 'x86_64-rocky8-linux',
        'release-x86_64-linux-ubuntu22_04-release': 'x86_64-ubuntu22_04-linux',
        'release-x86_64-linux-ubuntu20_04-release': 'x86_64-ubuntu20_04-linux',
        'release-x86_64-linux-ubuntu18_04-release': 'x86_64-ubuntu18_04-linux',
        'release-x86_64-linux-fedora38-release': 'x86_64-fedora38-linux',
        'release-x86_64-linux-fedora33-release+debug_info': 'x86_64-fedora33-linux-dwarf',
        'release-x86_64-linux-fedora33-release': 'x86_64-fedora33-linux',
        'release-x86_64-linux-fedora27-release': 'x86_64-fedora27-linux',
        'release-x86_64-linux-deb12-release': 'x86_64-deb12-linux',
        'release-x86_64-linux-deb11-release': 'x86_64-deb11-linux',
        'release-x86_64-linux-deb10-release+debug_info': 'x86_64-deb10-linux-dwarf',
        'release-x86_64-linux-deb10-release': 'x86_64-deb10-linux',
        'release-x86_64-linux-deb9-release': 'x86_64-deb9-linux',
        'release-x86_64-linux-alpine3_12-release+fully_static': 'x86_64-alpine3_12-linux-static',
        'release-x86_64-linux-alpine3_12-release': 'x86_64-alpine3_12-linux',
        'release-x86_64-linux-alpine3_12-int_native-release+fully_static': 'x86_64-alpine3_12-linux-static-int_native',
        'release-x86_64-linux-alpine3_20-release': 'x86_64-alpine3_20-linux',
        'release-x86_64-darwin-release': 'x86_64-apple-darwin',
        'release-i386-linux-deb12-release': 'i386-deb12-linux',
        'release-i386-linux-deb10-release': 'i386-deb10-linux',
        'release-i386-linux-deb9-release': 'i386-deb9-linux',
        'release-armv7-linux-deb10-release': 'armv7-deb10-linux',
        'release-aarch64-linux-deb10-release': 'aarch64-deb10-linux',
        'release-aarch64-linux-deb11-release': 'aarch64-deb11-linux',
        'release-aarch64-linux-deb12-release': 'aarch64-deb12-linux',
        'release-aarch64-linux-alpine3_18-release+no_split_sections': 'aarch64-alpine3_18-linux',
        'release-aarch64-darwin-release': 'aarch64-apple-darwin',
        'source-tarball': 'src',
        'package-hadrian-bootstrap-sources': 'hadrian-bootstrap-sources',
        'doc-tarball': 'docs',
        'hackage-doc-tarball': 'hackage_docs',
    }

    # Some bindists use the +no_split_sections transformer due to upstream
    # toolchain bugs.
    bindists.update({
        f'{k}+no_split_sections': v
        for k, v in bindists.items()
    })

    if job_name in bindists:
        return bindists[job_name]
    else:
        #return strip_prefix(job.name, 'validate-')
        return None
class UnhandledJobException(Exception):
    # Raised when there is a release job in the pipeline but we don't explicitly handle it.
    def __init__(self, name):
        self.message = f"{name} is a release job but not downloaded"
        super().__init__(self.message)
def fetch_artifacts(release: str, pipeline_id: int,
                    dest_dir: Path, gl: gitlab.Gitlab):
    dest_dir.mkdir(exist_ok=True)
    # Write the pipeline id into the output directory
    with open(f"{dest_dir}/metadata.json", 'w') as out:
        json.dump({"pipeline_id": pipeline_id}, out)

    proj = gl.projects.get('ghc/ghc')
    pipeline = proj.pipelines.get(pipeline_id)
    tmpdir = Path("fetch-gitlab")
    tmpdir.mkdir(exist_ok=True)
    for pipeline_job in pipeline.jobs.list(all=True):
        if len(pipeline_job.artifacts) == 0:
            logging.info(f'job {pipeline_job.name} ({pipeline_job.id}) has no artifacts')
            continue

        job = proj.jobs.get(pipeline_job.id)
        triple = job_triple(job.name)
        if triple is None:
            if job.name.startswith("release") and job.name not in do_not_distribute:
                raise UnhandledJobException(job.name)
            logging.info(f'ignoring {job.name}')
            continue

        #artifactZips = [ artifact
        #                 for artifact in job.artifacts
        #                 if artifact['filename'] == 'artifacts.zip' ]
        try:
            destdir = tmpdir / job.name
            zip_name = Path(f"{tmpdir}/{job.name}.zip")
            if not zip_name.exists() or zip_name.stat().st_size == 0:
                logging.info(f'downloading archive {zip_name} for job {job.name} (job {job.id})...')
                with open(zip_name, 'wb') as f:
                    job.artifacts(streamed=True, action=f.write)

            if zip_name.stat().st_size == 0:
                logging.info(f'artifact archive for job {job.name} (job {job.id}) is empty')
                continue

            subprocess.run(['unzip', '-bo', zip_name, '-d', destdir])
            bindist_files = list(destdir.glob('ghc*.tar.xz'))

            if job.name == 'source-tarball':
                for f in bindist_files:
                    dest = dest_dir / f.name
                    logging.info(f'extracted {job.name} to {dest}')
                    f.replace(dest)
            elif job.name == 'package-hadrian-bootstrap-sources':
                all_bootstrap_sources = destdir / 'hadrian-bootstrap-sources-all.tar.gz'
                dest = dest_dir / 'hadrian-bootstrap-sources'
                dest.mkdir()
                subprocess.run(['tar', '-xf', all_bootstrap_sources, '-C', dest])
                logging.info(f'extracted {job.name}/{all_bootstrap_sources} to {dest}')
            elif job.name == 'doc-tarball':
                dest = dest_dir / 'docs'
                dest.mkdir()
                doc_files = list(destdir.glob('*.tar.xz'))
                for f in doc_files:
                    subprocess.run(['tar', '-xf', f, '-C', dest])
                    logging.info(f'extracted docs {f} to {dest}')
                index_path = destdir / 'docs' / 'index.html'
                index_path.replace(dest / 'index.html')
            elif job.name == 'hackage-doc-tarball':
                dest = dest_dir / 'hackage_docs'
                logging.info(f'moved hackage_docs to {dest}')
                (destdir / 'hackage_docs').replace(dest)
            else:
                dest = dest_dir / f'ghc-{release}-{triple}.tar.xz'
                if dest.exists():
                    logging.info(f'bindist {dest} already exists')
                    continue
                if len(bindist_files) == 0:
                    logging.warning('bindist does not exist')
                    continue
                bindist = bindist_files[0]
                logging.info(f'extracted {job.name} to {dest}')
                bindist.replace(dest)
        except Exception as e:
            logging.error(f'Error fetching job {job.name}: {e}')
def main():
    import argparse
    parser = argparse.ArgumentParser()
    parser.add_argument('--pipeline', '-p', required=True, type=int, help="pipeline id")
    parser.add_argument('--release', '-r', required=True, type=str, help="release name")
    parser.add_argument('--output', '-o', type=Path, default=Path.cwd(), help="output directory")
    parser.add_argument('--profile', '-P', default='haskell',
                        help='python-gitlab.cfg profile name')
    args = parser.parse_args()

    gl = gitlab.Gitlab.from_config(args.profile)
    fetch_artifacts(args.release, args.pipeline,
                    dest_dir=args.output, gl=gl)
#!/usr/bin/env python

# setuptools (rather than distutils) is needed for entry_points support.
from setuptools import setup

setup(name='fetch-gitlab',
      author='Ben Gamari',
      author_email='ben@smart-cactus.org',
      py_modules=['fetch_gitlab'],
      entry_points={
          'console_scripts': [
              'fetch-gitlab=fetch_gitlab:main',
          ]
      })
result
fetch-gitlab
out
# mk-ghcup-metadata
This script is used to automatically generate metadata suitable for consumption by
GHCUp.
## Usage
```
nix run -f .gitlab/rel_eng/ -c ghcup-metadata
```
```
options:
-h, --help show this help message and exit
--metadata METADATA Path to GHCUp metadata
--pipeline-id PIPELINE_ID
Which pipeline to generate metadata for
--release-mode Generate metadata which points to downloads folder
--fragment Output the generated fragment rather than whole modified file
--version VERSION Version of the GHC compiler
--date DATE Date of the compiler release
```
The script also requires the `.gitlab/jobs-metadata.yaml` file, which can be generated
by running the `.gitlab/generate-ci/generate_jobs_metadata` script if you want to run it locally.
## CI Pipelines
The metadata is generated by the nightly and release pipelines.
* Nightly pipelines generate metadata where the bindist URLs point immediately to
nightly artifacts.
* Release jobs can pass the `--release-mode` flag which downloads the artifacts from
the pipeline but the final download URLs for users point into the downloads folder.
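A release-mode invocation might look like this (a sketch only: the metadata path, pipeline id, version, and date below are hypothetical; the flags are those listed in the help text above):

```
nix run -f .gitlab/rel_eng/ -c ghcup-metadata --release-mode \
    --metadata ghcup-0.0.8.yaml --pipeline-id 123456 \
    --version 9.6.4 --date 2024-01-09
```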
The mapping from platform to bindist is not clever; it is just what the GHCUp developers
tell us to use.
## Testing Pipelines
The metadata is tested by the `ghcup-ci` repo which is triggered by the
`ghcup-metadata-testing-nightly` job.
This job sets the following variables which are then used by the downstream job
to collect the metadata from the correct place:
* `UPSTREAM_PIPELINE_ID` - The pipeline ID which the generated metadata lives in
* `UPSTREAM_PROJECT_ID` - The project ID for the upstream project (almost always `1` (for ghc/ghc))
* `UPSTREAM_JOB_NAME` - The job which the metadata belongs to (i.e. `ghcup-metadata-nightly`)
* `UPSTREAM_PROJECT_PATH` - The path of the upstream project (almost always ghc/ghc)
Nightly pipelines are tested automatically, but release pipelines are triggered manually
as the testing requires the bindists to be uploaded into the final release folder.
{ nix-gitignore, python3Packages, fetch-gitlab }:

let
  ghcup-metadata = { buildPythonPackage, python-gitlab, pyyaml }:
    buildPythonPackage {
      pname = "ghcup-metadata";
      version = "0.0.1";
      src = nix-gitignore.gitignoreSource [] ./.;
      propagatedBuildInputs = [ fetch-gitlab python-gitlab pyyaml ];
      preferLocalBuild = true;
    };
in
python3Packages.callPackage ghcup-metadata { }
#! /usr/bin/env nix-shell
#! nix-shell -i python3 -p curl "python3.withPackages (ps:[ps.pyyaml ps.python-gitlab ])"
"""
A tool for generating metadata suitable for GHCUp
There are two ways to prepare metadata:
* From a nightly pipeline.
* From a release pipeline.
In either case, the script takes the same arguments:
* --metadata: The path to existing GHCup metadata to which we want to add the new entry.
* --version: GHC version of the pipeline
* --pipeline-id: The pipeline to generate metadata for
* --release-mode: Download from a release pipeline but generate URLs to point to downloads folder.
* --fragment: Only print out the updated fragment rather than the modified file
The script will then download the relevant bindists to compute the hashes. The
generated metadata is printed to stdout.
The metadata can then be used by passing the `--url-source` flag to ghcup.
"""
from subprocess import run, check_call
from getpass import getpass
import shutil
from pathlib import Path
from typing import NamedTuple, Callable, List, Dict, Optional
import tempfile
import re
import pickle
import os
import yaml
import gitlab
from urllib.request import urlopen
from urllib.parse import urlparse
import hashlib
import sys
import json
import urllib.parse
import fetch_gitlab
def eprint(*args, **kwargs):
    print(*args, file=sys.stderr, **kwargs)
gl = gitlab.Gitlab('https://gitlab.haskell.org', per_page=100)
# TODO: Take this file as an argument
metadata_file = ".gitlab/jobs-metadata.json"
release_base = "https://downloads.haskell.org/~ghc/{version}/ghc-{version}-{bindistName}"
eprint(f"Reading job metadata from {metadata_file}.")
with open(metadata_file, 'r') as f:
    job_mapping = json.load(f)
eprint(f"Supported platforms: {job_mapping.keys()}")
# An Artifact precisely specifies a job and what its downloaded bindist is called.
class Artifact(NamedTuple):
    job_name: str
    download_name: str
    output_name: str
    subdir: str
    anchor_name: str
# A PlatformSpec provides a specification which is agnostic to the particular job.
# PlatformSpecs are converted into Artifacts by looking in the jobs-metadata.json file.
class PlatformSpec(NamedTuple):
    name: str
    subdir: str
source_artifact = Artifact('source-tarball'
                          , 'ghc-{version}-src.tar.xz'
                          , 'ghc-{version}-src.tar.xz'
                          , 'ghc-{version}'
                          , 'ghc{version}-src')

test_artifact = Artifact('source-tarball'
                        , 'ghc-{version}-testsuite.tar.xz'
                        , 'ghc-{version}-testsuite.tar.xz'
                        , 'ghc-{version}/testsuite'
                        , 'ghc{version}-testsuite')
def debian(n, arch='x86_64'):
return linux_platform(arch, "{arch}-linux-deb{n}".format(arch=arch, n=n))
def darwin(arch):
return PlatformSpec ( '{arch}-darwin'.format(arch=arch)
, 'ghc-{version}-{arch}-apple-darwin'.format(arch=arch, version="{version}") )
windowsArtifact = PlatformSpec ( 'x86_64-windows'
, 'ghc-{version}-x86_64-unknown-mingw32' )
def fedora(n, arch='x86_64'):
return linux_platform(arch, "{arch}-linux-fedora{n}".format(n=n,arch=arch))
def alpine(n, arch='x86_64'):
return linux_platform(arch, "{arch}-linux-alpine{n}".format(n=n,arch=arch))
def rocky(n, arch='x86_64'):
return linux_platform(arch, "{arch}-linux-rocky{n}".format(n=n,arch=arch))
def ubuntu(n, arch='x86_64'):
return linux_platform(arch, "{arch}-linux-ubuntu{n}".format(n=n,arch=arch))
def linux_platform(arch, opsys):
return PlatformSpec( opsys, 'ghc-{version}-{arch}-unknown-linux'.format(version="{version}", arch=arch) )
base_url = 'https://gitlab.haskell.org/api/v4/projects/1/jobs/{job_id}/artifacts/{artifact_name}'
hash_cache = {} # type: Dict[str, str]
# Download a URL and return the SHA-256 hash of its contents.
def download_and_hash(url):
    if url in hash_cache: return hash_cache[url]
    eprint("Opening {}".format(url))
    response = urlopen(url)
    sz = response.headers['content-length']
    hasher = hashlib.sha256()
    CHUNK = 2**22
    # response.read() returns bytes, so the iteration sentinel must be b'' rather than ''
    for n, chunk in enumerate(iter(lambda: response.read(CHUNK), b'')):
        eprint("{:.2f}% {} / {} of {}".format((n + 1) * CHUNK / int(sz) * 100, (n + 1) * CHUNK, sz, url))
        hasher.update(chunk)
    digest = hasher.hexdigest()
    hash_cache[url] = digest
    return digest
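# For reference, the chunked hashing above yields exactly the same digest as
# hashing the payload in one go; e.g. for the standard test vector b"abc":
#   hashlib.sha256(b"abc").hexdigest()
#     == "ba7816bf8f01cfea414140de5dae2223b00361a396177a9cb410ff61f20015ad"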
uri_to_anchor_cache = {}  # type: Dict[str, str]
# Make the metadata for one platform.
def mk_one_metadata(release_mode, version, job_map, artifact):
job_id = job_map[artifact.job_name].id
url = base_url.format(job_id=job_id, artifact_name=urllib.parse.quote_plus(artifact.download_name.format(version=version)))
    # In --release-mode the URL in the metadata needs to point into the downloads
    # folder rather than at the pipeline.
if release_mode:
        # The test artifact is bundled with the source artifact, so it doesn't have
        # its own job name; we must set the name of the bindist location manually.
if artifact == test_artifact:
bindist_name = "testsuite"
else:
bindist_name = fetch_gitlab.job_triple(artifact.job_name)
final_url = release_base.format( version=version
, bindistName=urllib.parse.quote_plus(f"{bindist_name}.tar.xz"))
else:
final_url = url
eprint(f"Making metadata for: {artifact}")
eprint(f"Bindist URL: {url}")
eprint(f"Download URL: {final_url}")
    # Download and hash from the release pipeline; the contents must not change during upload anyway.
h = download_and_hash(url)
res = { "dlUri": final_url
, "dlSubdir": artifact.subdir.format(version=version)
, "dlHash" : h }
# Only add dlOutput if it is inconsistent with the filename inferred from the URL
output = artifact.output_name.format(version=version)
if Path(urlparse(final_url).path).name != output:
res["dlOutput"] = output
eprint(res)
# add the uri to the anchor name cache so we can lookup an anchor for this uri
uri_to_anchor_cache[final_url] = artifact.anchor_name
return res
# Turn a PlatformSpec into an Artifact, respecting pipeline_type.
# Looks up the right job to use in the .gitlab/jobs-metadata.json file.
def mk_from_platform(pipeline_type, platform):
info = job_mapping[platform.name][pipeline_type]
eprint(f"From {platform.name} / {pipeline_type} selecting {info['name']}")
return Artifact(info['name']
, f"{info['jobInfo']['bindistName']}.tar.xz"
, "ghc-{version}-{pn}.tar.xz".format(version="{version}", pn=platform.name)
, platform.subdir
, f"ghc{{version}}-{platform.name}")
# Generate the new metadata for a specific GHC version, release date and pipeline type.
def mk_new_yaml(release_mode, version, date, pipeline_type, job_map):
def mk(platform):
eprint("\n=== " + platform.name + " " + ('=' * (75 - len(platform.name))))
return mk_one_metadata(release_mode, version, job_map, mk_from_platform(pipeline_type, platform))
# Here are all the bindists we can distribute
ubuntu1804 = mk(ubuntu("18_04"))
ubuntu2004 = mk(ubuntu("20_04"))
ubuntu2204 = mk(ubuntu("22_04"))
rocky8 = mk(rocky("8"))
fedora33 = mk(fedora(33))
darwin_x86 = mk(darwin("x86_64"))
darwin_arm64 = mk(darwin("aarch64"))
windows = mk(windowsArtifact)
alpine3_12 = mk(alpine("3_12"))
alpine3_20 = mk(alpine("3_20"))
alpine3_18_arm64 = mk(alpine("3_18", arch='aarch64'))
deb9 = mk(debian(9, "x86_64"))
deb10 = mk(debian(10, "x86_64"))
deb11 = mk(debian(11, "x86_64"))
deb12 = mk(debian(12, "x86_64"))
deb10_arm64 = mk(debian(10, "aarch64"))
deb12_arm64 = mk(debian(12, "aarch64"))
deb10_i386 = mk(debian(10, "i386"))
deb12_i386 = mk(debian(12, "i386"))
source = mk_one_metadata(release_mode, version, job_map, source_artifact)
test = mk_one_metadata(release_mode, version, job_map, test_artifact)
# The actual metadata, this is not a precise science, but just what the ghcup
# developers want.
a64 = { "Linux_Debian": { "< 10": deb9
, "( >= 10 && < 11 )": deb10
, "( >= 11 && < 12 )": deb11
, ">= 12": deb12
, "unknown_versioning": deb11 }
, "Linux_Ubuntu" : { "unknown_versioning": ubuntu2004
, "( >= 16 && < 18 )": deb9
, "( >= 18 && < 19 )": ubuntu1804
, "( >= 19 && < 21 )": ubuntu2004
, "( >= 21 )": ubuntu2204
}
, "Linux_Mint" : { "< 20": ubuntu1804
, ">= 20": ubuntu2004
, "unknown_versioning": ubuntu2004 }
, "Linux_CentOS" : { "( >= 8 && < 9 )" : rocky8
, "unknown_versioning" : rocky8 }
, "Linux_Fedora" : { ">= 33": fedora33
, "unknown_versioning": rocky8 }
, "Linux_RedHat" : { "< 9": rocky8
, ">= 9": fedora33
, "unknown_versioning": fedora33 }
, "Linux_UnknownLinux" : { "unknown_versioning": rocky8 }
, "Darwin" : { "unknown_versioning" : darwin_x86 }
, "Windows" : { "unknown_versioning" : windows }
, "Linux_Alpine" : { "( >= 3.12 && < 3.20 )": alpine3_12
, ">= 3.20": alpine3_20
, "unknown_versioning": alpine3_12 }
}
a32 = { "Linux_Debian": { "( >= 10 && < 12 )": deb10_i386
, ">= 12": deb12_i386
, "unknown_versioning": deb10_i386 }
, "Linux_Ubuntu": { "unknown_versioning": deb10_i386 }
, "Linux_Mint" : { "unknown_versioning": deb10_i386 }
, "Linux_UnknownLinux" : { "unknown_versioning": deb10_i386 }
}
arm64 = { "Linux_UnknownLinux": { "unknown_versioning": deb10_arm64 }
, "Linux_Alpine" : { "unknown_versioning": alpine3_18_arm64 }
, "Linux_Debian": { "( >= 10 && < 12 )": deb10_arm64
, "( >= 12 )": deb12_arm64
, "unknown_versioning": deb10_arm64
}
, "Darwin": { "unknown_versioning": darwin_arm64 }
}
    if release_mode:
        # A four-component version (e.g. 9.6.0.20230111) is a prerelease of the
        # next point release, so link to the release notes of that version.
        version_parts = version.split('.')
        if len(version_parts) == 3:
            final_version = version
        elif len(version_parts) == 4:
            final_version = '.'.join(version_parts[:2] + [str(int(version_parts[2]) + 1)])
        else:
            raise RuntimeError(f"Unexpected version format: {version}")
        change_log = f"https://downloads.haskell.org/~ghc/{version}/docs/users_guide/{final_version}-notes.html"
    else:
        change_log = "https://gitlab.haskell.org"
if release_mode:
tags = ["Latest", "TODO_base_version"]
else:
tags = ["LatestNightly"]
return { "viTags": tags
, "viReleaseDay": date
# Check that this link exists
, "viChangeLog": change_log
, "viSourceDL": source
, "viTestDL": test
, "viArch": { "A_64": a64
, "A_32": a32
, "A_ARM64": arm64
}
}
def setNightlyTags(ghcup_metadata):
    # Demote any existing "LatestNightly" release to plain "Nightly" so that only
    # the newly added entry carries the "LatestNightly" tag.
    for version in ghcup_metadata['ghcupDownloads']['GHC']:
        tags = ghcup_metadata['ghcupDownloads']['GHC'][version]["viTags"]
        if "LatestNightly" in tags:
            tags.remove("LatestNightly")
            tags.append("Nightly")
def mk_dumper(version):
    # A yaml.Dumper whose anchors are named after the bindist they refer to
    # (e.g. "ghc961-x86_64-linux-deb10") instead of the default numbered anchors.
    class CustomAliasDumper(yaml.Dumper):
        def generate_anchor(self, node):
            if isinstance(node, yaml.MappingNode):
                node_dict = { k.value : v.value for (k,v) in node.value }
                if 'dlUri' in node_dict:
                    return uri_to_anchor_cache[node_dict['dlUri']].format(version=version.replace('.',''))
            return super().generate_anchor(node)
    return CustomAliasDumper
def main() -> None:
import argparse
parser = argparse.ArgumentParser(description=__doc__)
parser.add_argument('--metadata', required=True, type=Path, help='Path to GHCUp metadata')
parser.add_argument('--pipeline-id', required=True, type=int, help='Which pipeline to generate metadata for')
parser.add_argument('--release-mode', action='store_true', help='Generate metadata which points to downloads folder')
parser.add_argument('--fragment', action='store_true', help='Output the generated fragment rather than whole modified file')
# TODO: We could work out the --version from the project-version CI job.
parser.add_argument('--version', required=True, type=str, help='Version of the GHC compiler')
parser.add_argument('--date', required=True, type=str, help='Date of the compiler release')
args = parser.parse_args()
project = gl.projects.get(1, lazy=True)
pipeline = project.pipelines.get(args.pipeline_id)
    # Fetch every page; a pipeline can have more jobs than a single page holds.
    jobs = pipeline.jobs.list(all=True)
job_map = { job.name: job for job in jobs }
    # A bit of a hacky way to determine which pipeline we are dealing with, but
    # the aarch64-darwin job should stay stable for a long time.
if 'nightly-aarch64-darwin-validate' in job_map:
pipeline_type = 'n'
if args.release_mode:
raise Exception("Incompatible arguments: nightly pipeline but using --release-mode")
elif 'release-aarch64-darwin-release' in job_map:
pipeline_type = 'r'
else:
        raise Exception("Neither a nightly nor a release pipeline")
eprint(f"Pipeline Type: {pipeline_type}")
new_yaml = mk_new_yaml(args.release_mode, args.version, args.date, pipeline_type, job_map)
if args.fragment:
print(yaml.dump({ args.version : new_yaml }, Dumper=mk_dumper(args.version)))
else:
with open(args.metadata, 'r') as file:
ghcup_metadata = yaml.safe_load(file)
if args.version in ghcup_metadata['ghcupDownloads']['GHC']:
                raise RuntimeError("Refusing to overwrite existing version in metadata")
setNightlyTags(ghcup_metadata)
ghcup_metadata['ghcupDownloads']['GHC'][args.version] = new_yaml
print(yaml.dump(ghcup_metadata))
if __name__ == '__main__':
main()