On all of my platforms, ghcup built from master completely fails on network access:
$ ~/.cabal/bin/ghcup -v debug-info
[ Debug ] Receiving download info from: GHCupURL
[ Debug ] Unable to get/parse Last-Modified header
ghcup: SSL_connect: resource vanished (Connection reset by peer)
$
I thought we already talked about the [un]reasonableness of the need to reach over the Internet to answer simple questions, like "what is my config?", "what kind of system am I running on?", and "what is my version?"
Since there is no more debugging output, the above is all I can tell you.
The binary provided via $ ghcup upgrade seems functional; the binary built from the current master is not.
I thought we already talked about the [un]reasonableness of the need to reach over the Internet to answer simple questions, like "what is my config?", "what kind of system am I running on?", and "what is my version?"
debug-info is currently still a command (unlike --version and --numeric-version). Before any command runs, ghcup checks for a new version. I find this reasonable.
I'm in the process of improving that side, as in (see the sketch after this list):
if it cannot fetch the download info, emit a warning, then:
try to fall back to the cached version in ~/.ghcup/cache/foo.json
allow running commands that don't need it
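For concreteness, the planned fallback might look roughly like the sketch below. This is only an illustration pieced together from this thread, not ghcup's actual code; the helper names and the exact cache path are assumptions.

-- Sketch only: try the network, warn on failure, fall back to the cache,
-- and let callers that don't need the download info keep going.
import Control.Exception (SomeException, try)
import System.Directory (doesFileExist, getHomeDirectory)
import System.FilePath ((</>))

-- stand-in for whatever ghcup actually uses to fetch the download info
fetchDownloadInfo :: IO String
fetchDownloadInfo = ioError (userError "SSL_connect: resource vanished")

getDownloadInfo :: IO (Maybe String)
getDownloadInfo = do
  r <- try fetchDownloadInfo :: IO (Either SomeException String)
  case r of
    Right json -> pure (Just json)
    Left err -> do
      putStrLn ("[ Warn ] Could not get download info: " <> show err)
      home <- getHomeDirectory
      let cached = home </> ".ghcup" </> "cache" </> "ghcup-0.0.2.json"
      haveCache <- doesFileExist cached
      if haveCache
        then Just <$> readFile cached
        else pure Nothing  -- commands that don't need it can still run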
If you want that to be improved quicker, provide a pull request. I'm the only maintainer of this project. And all these things are already on my TODO.
On all of my platforms, ghcup built from master completely fails on network access:
I didn't mention this in the last issue, but there are two ways to build ghcup:
with curl, which is what all shipped binaries do (run: cabal build -fcurl)
with the internal downloader, which is very custom and links to OpenSSL. This one has had other problems, which is why the shipped binaries don't use it right now.
debug-info is currently still a command (unlike --version and --numeric-version). Before any command runs, ghcup checks for a new version. I find this reasonable.
I don't, given the result. Recommendation:
do that check only for some of the commands;
make that check part of the upgrade command only;
make debug-info like --version
I'm in the process of improving that side...
Yes, that would be much better. BTW, for this plan to work, I expect ~/.ghcup/cache/foo.json to be included in the distribution, to avoid the chicken-and-egg problem (in order to "fall back", there has to have been at least one successful run that fetched the file and cached it).
I didn't mention this in the last issue, but there are two ways to build ghcup:
with curl, which is what all shipped binaries do (run: cabal build -fcurl)
with the internal downloader, which is very custom and links to OpenSSL. This one has had other problems, which is why the shipped binaries don't use it right now.
Try with -fcurl and report back.
Let me begin by expressing my strong dislike for (2), because of how it (doesn't) handle web proxies.
With -fcurl it builds, installs, and fails on an attempt to fetch ghcup-0.0.2.json, which apparently doesn't exist? Trying to cut-and-paste the URL in question into "direct" curl:
No. We want that for list and everything else as well. Users should be annoyed for not upgrading.
make debug-info like --version
That's possible with some adjustment. Please provide a PR.
Yes, that would be much better. BTW, for this plan to work, I expect ~/.ghcup/cache/foo.json to be included in the distribution, to avoid the chicken-and-egg problem (in order to "fall back", there has to have been at least one successful run that fetched the file and cached it).
That should already be the case if you ran the bootstrap script and it did its job successfully. If the bootstrap script fails to download anything, then your configuration is likely broken.
With -fcurl it builds, installs, and fails on an attempt to fetch ghcup-0.0.2.json, which apparently doesn't exist? Trying to cut-and-paste the URL in question into "direct" curl:
We differ on this. Given my experience with the recent upgrades, I think my position is understandable.
Please provide a PR...
If I knew Haskell well enough to do that, rest assured it would've been done a while ago.
I expect ~/.ghcup/cache/foo.json to be included in the distribution
That should already be the case if you ran the bootstrap script and it did its job successfully
OK, that takes care of it.
With -fcurl it builds, installs...
... use the local version ... --url-source ...
The --url-source works. It would be nice if this flag (and --cache - BTW, what exactly does it do?) were included in the --help output.
I'd like to ask to make this config the default. I.e., have the default build with curl, and build with the internal downloader (which does not work) only if something like -fno-curl is specified.
P.S. When do you think ghcup-0.0.2.json would end up on the main web site, obviating the need to specify --url-source ... flag?
We differ on this. Given my experience with the recent upgrades, I think my position is understandable.
I cannot follow.
The --url-source works. It would be nice if this flag (and --cache - BTW, what exactly does it do?) were included in the --help output.
--cache is already in the help output:
-c,--cache Cache downloads in ~/.ghcup/cache
--url-source is undocumented, because it's a developer feature.
I'd like to ask to make this config the default. I.e., have the default build with curl, and build with the internal downloader (which does not work) only if something like -fno-curl is specified.
I thought about it. I also thought about making a switch --downloader=<internal|wget|curl>, but I'm not sure yet.
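Purely for illustration, such a switch could be parsed along these lines (a sketch using optparse-applicative; this is not ghcup's actual option parser, and the names are made up):

import Options.Applicative

data Downloader = Internal | Wget | Curl
  deriving (Eq, Show)

-- hypothetical parser for a --downloader=<internal|wget|curl> option,
-- defaulting to curl
downloaderP :: Parser Downloader
downloaderP =
  option (eitherReader readDownloader)
    (  long "downloader"
    <> metavar "<internal|wget|curl>"
    <> value Curl
    <> help "Which downloader to use"
    )
  where
    readDownloader "internal" = Right Internal
    readDownloader "wget"     = Right Wget
    readDownloader "curl"     = Right Curl
    readDownloader other      = Left ("unknown downloader: " <> other)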
P.S. When do you think ghcup-0.0.2.json would end up on the main web site, obviating the need to specify --url-source ... flag?
I can upload it, but it isn't meant to be uploaded before the next release. I don't get paid to do ghcup and depending on tomorrow, I might be gone for a couple of weeks. So I can't say when the next release will be. The cross branch is almost done. But testing it is cumbersome, because of the GHC compile times.
If you care, check out the cross branch of this repo, it also contains a lot of bugfixes.
--url-source is undocumented, because it's a developer feature.
I'm not a developer (of this package at least), yet it turned out that I had to use it. I think it's worth documenting.
I thought about it. I also thought about making a switch --downloader=<internal|wget|curl>, but I'm not sure yet.
I like this. Then, regardless of which default the binary was built with, the user could alias the invocation to, e.g., ghcup --downloader=wget, and things would work automatically from that point on.
Unless it's much simpler/quicker for you to just change the default build for the binary? Not as nice architecturally, but maybe easier for you in the short term.
I can upload ghcup-0.0.2.json, but it isn't meant to be uploaded before the next release...
Understood. Still, could you please upload it?
If you care, check out the cross branch of this repo, it also contains a lot of bugfixes.
If that's your recommendation... I can't be considered a Haskell developer because of lack of experience and expertise - but I've enough computing resources to run compiles.
Re. Tomorrow - I hope things will turn out the way you'd like them to.
Unless it's much simpler/quicker for you to just change the default build for the binary? Not as nice architecturally, but maybe easier for you in the short term.
It'll still be ugly, because I need to CPP-ifdef a lot. The internal downloader must be optional at compile time for a simple reason: it causes linking to OpenSSL. All Linux binaries are statically linked. OpenSSL also isn't particularly ABI-compatible across, say, FreeBSD releases. So it's just not that portable.
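A rough sketch of what that compile-time split looks like in practice (hypothetical module and names, not ghcup's real layout): only one branch is ever compiled, so a curl-only build never pulls the OpenSSL-linked code into the binary.

{-# LANGUAGE CPP #-}
module Download (download) where

import System.Process (callProcess)

download :: String -> FilePath -> IO ()
#ifdef INTERNAL_DOWNLOADER
-- this branch is what would drag OpenSSL into the link
download url dest = internalDownload url dest
  where internalDownload _ _ = error "stand-in for the OpenSSL-based downloader"
#else
-- default: shell out to curl, keeping the binary free of OpenSSL
download url dest = callProcess "curl" ["-fL", "-o", dest, url]
#endif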
I still think the cabal way might be the sanest: if curl exists, use that; otherwise, try wget. I can't think of a case where a user has both curl and wget installed, but only wget succeeds with the download?
Understood. Still, could you please upload it?
Yes, I uploaded it
If that's your recommendation... I can't be considered a Haskell developer because of lack of experience and expertise - but I've enough computing resources to run compiles.
I just need to manage to compile GHC once for cross-compiling to test some stuff, then I am almost done.
Re. Tomorrow - I hope things will turn out the way you'd like them to.
I still think the cabal way might be the sanest: if curl exists, use that; otherwise, try wget...
If you replace "exists" with "succeeds" - I'd agree. With this approach you can probably cycle through all three if you wish.
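Something along these lines, perhaps (a sketch with made-up names and flag choices, not a proposal of ghcup's real code): try each external downloader in turn and stop at the first one that actually succeeds.

import Control.Exception (SomeException, try)
import System.Process (callProcess)

-- cycle through the external downloaders until one of them succeeds
tryDownloaders :: String -> FilePath -> IO Bool
tryDownloaders url dest = go candidates
  where
    candidates =
      [ ("curl", ["-fL", "-o", dest, url])
      , ("wget", ["-O", dest, url])
      ]
    go [] = pure False   -- nothing worked
    go ((cmd, args) : rest) = do
      -- callProcess throws if the command is missing or exits non-zero
      r <- try (callProcess cmd args) :: IO (Either SomeException ())
      either (const (go rest)) (const (pure True)) r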
I can't think of a case where a user has both curl and wget installed, but only wget succeeds with the download?
I can. One of my office machines is like that. Also, it could be a matter of passing the right (for the user's specific environment) flags to the program you're invoking. Like, some curl installations would require -k, but that's not something desirable in general...
I can. One of my office machines is like that. Also, it could be a matter of passing the right (for the user's specific environment) flags to the program you're invoking. Like, some curl installations would require -k, but that's not something desirable in general...
My distro solves this by allowing an environment variable EXTRA_CURL or something, same for wget. So you don't control the entire invocation, but you can pass additional arguments. If you pass bad arguments, it's on you... I'll think about it.
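As a sketch of that idea (EXTRA_CURL is just the placeholder name from the previous paragraph, not an existing ghcup variable): read the variable, split it into words, and append them to the fixed part of the invocation.

import System.Environment (lookupEnv)
import System.Process (callProcess)

-- the user controls only the extra arguments, not the whole invocation
curlDownload :: String -> FilePath -> IO ()
curlDownload url dest = do
  extra <- maybe [] words <$> lookupEnv "EXTRA_CURL"
  -- e.g. EXTRA_CURL="-k" for setups that insist on skipping cert checks
  callProcess "curl" (["-fL", "-o", dest, url] ++ extra)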
$ ghcup --version
The GHCup Haskell installer, version 0.1.4
$ type ghcup
ghcup is hashed (/home/ur20980/.cabal/bin/ghcup)
So, this is unquestionably the binary that I built from source via Cabal, from the master branch. Why it doesn't say something like version 0.1.5_alpha, I don't know.
And as I said - master branch seems to function fine on CentOS 8.
If you built from source, this should be something like The GHCup Haskell installer, version v0.1.4-36-ge54a216. So if the git tag is missing, this might not be the version built from git.
I'm currently changing the default to build for curl and will add wget support.
$ ghcup list
[ Warn ] Could not get download info, trying cached version (this may not be recent!)
[ Error ] Error fetching download info: FileDoesNotExistError "/Users/ur20980/.ghcup/cache/ghcup-0.0.2.json"
Yes, this is actually as expected, it falls back to the offline version now!
P.S. Wouldn't it be better to always include all three download options, and change only the default on invocation? For example,
-finternal-downloader causes linking to OpenSSL, which causes portability issues (e.g. the ghcup binary wouldn't work on both FreeBSD 11 and 12). So it is off by default.
-finternal-downloader causes linking to OpenSSL, which causes portability issues (e.g. the ghcup binary wouldn't work on both FreeBSD 11 and 12). So it is off by default.
Ah, I understand... Yes, what you did should be perfectly OK for the distributed binaries.
I'm still trying to find a way that would allow a user to build ghcup from source with all three downloaders, but keeping curl as the default... What do you say/think?
Because, e.g., when I build a binary on my system, I don't mind linking it with my OpenSSL (understanding that it's not likely to be portable) - but I'd prefer to have a reasonable default, which on my systems is curl...