Fetching from urls: Error Message (#16434)

* Fetching from urls: Error Message

Fix the error message when fetching from consecutive `urls` of a
package version. Each failure should report the URL that is currently
being tried, not the first URL in the list.

Example of a multi-problem run that occurred in real life:
```
==> 5821: Installing util-macros
curl: (28) Connection timed out after 10000 milliseconds
curl: (16) Error in the HTTP2 framing layer
curl: (22) The requested URL returned error: 403 Forbidden
==> Fetching https://www.x.org/archive/individual/util/util-macros-1.19.1.tar.bz2
==> Failed to fetch file from URL: https://www.x.org/archive/individual/util/util-macros-1.19.1.tar.bz2
    Curl failed with error 28
==> Fetching https://mirrors.ircam.fr/pub/x.org/individual/util/util-macros-1.19.1.tar.bz2
==> Failed to fetch file from URL: https://www.x.org/archive/individual/util/util-macros-1.19.1.tar.bz2
    Curl failed with error 16
==> Fetching http://xorg.mirrors.pair.com/individual/util/util-macros-1.19.1.tar.bz2
==> Failed to fetch file from URL: https://www.x.org/archive/individual/util/util-macros-1.19.1.tar.bz2
    URL https://www.x.org/archive/individual/util/util-macros-1.19.1.tar.bz2 was not found!
==> Fetching from https://www.x.org/archive/individual/util/util-macros-1.19.1.tar.bz2 failed.
==> Error: FetchError: All fetchers failed for spack-stage-util-macros-1.19.1-se2a2e74oyusj2r4esgcb7pr3qhh45ef
```

- `urls[0]`: timeout (curl error 28)
- `urls[1]`: HTTP2 framing layer error (curl error 16)
- `urls[2]`: missing file on mirror (curl error 22, HTTP 403)

Note that every "Failed to fetch file from URL" line above names the first
URL, even when a later mirror is the one actually being tried; a sketch of
the intended behavior follows.
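
A minimal sketch of the intended behavior (illustrative only, not the actual
Spack implementation; `candidate_urls` and `FailedDownloadError` mirror the
names in the diff below, while `fetch_one` stands in for the real per-URL
download helper): each attempt reports the URL it actually tried, and the
final error names the last URL rather than the first.

```python
class FailedDownloadError(Exception):
    """Raised when a download from a particular URL fails (sketch)."""

    def __init__(self, url, msg=""):
        super().__init__("Failed to fetch file from URL: %s\n    %s" % (url, msg))
        self.url = url


def fetch(candidate_urls, fetch_one):
    """Try each candidate URL in turn, reporting the URL that is failing."""
    url = None
    for url in candidate_urls:
        try:
            return fetch_one(url)   # stop at the first mirror that works
        except FailedDownloadError as e:
            print(e)                # message names the URL that just failed
    # Every mirror failed: reference the last URL tried, not candidate_urls[0].
    raise FailedDownloadError(url, "all candidate URLs failed")
```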

* x.org: two more mirrors

x.org mirrors are a bit tricky, since many are out-of-sync or off.
A good package to test with is `util-macros`, which had a "recent"
release.
commit 2fd3ab3c9c (parent 1f85d6eceb)
Axel Huebl, 2020-06-18 01:37:32 -07:00, committed by GitHub
2 changed files with 10 additions and 4 deletions

```diff
@@ -14,9 +14,14 @@ class XorgPackage(spack.package.PackageBase):
     xorg_mirror_path = None
 
     #: List of x.org mirrors used by Spack
+    # Note: x.org mirrors are a bit tricky, since many are out-of-sync or off.
+    # A good package to test with is `util-macros`, which had a "recent"
+    # release.
     base_mirrors = [
         'https://www.x.org/archive/individual/',
         'https://mirrors.ircam.fr/pub/x.org/individual/',
+        'https://mirror.transip.net/xorg/individual/',
+        'ftp://ftp.freedesktop.org/pub/xorg/individual/',
         'http://xorg.mirrors.pair.com/individual/'
     ]
```
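
For context (not part of this diff), this is roughly how the candidate URL
list is assumed to be derived: a package supplies a relative
`xorg_mirror_path`, and each base mirror is prefixed to it, so `util-macros`
ends up with one URL per mirror, as in the log above.

```python
# Illustrative only; the real expansion lives in Spack's XorgPackage helper.
base_mirrors = [
    'https://www.x.org/archive/individual/',
    'https://mirrors.ircam.fr/pub/x.org/individual/',
    'https://mirror.transip.net/xorg/individual/',
    'ftp://ftp.freedesktop.org/pub/xorg/individual/',
    'http://xorg.mirrors.pair.com/individual/',
]
xorg_mirror_path = 'util/util-macros-1.19.1.tar.bz2'

candidate_urls = [mirror + xorg_mirror_path for mirror in base_mirrors]
# -> ['https://www.x.org/archive/individual/util/util-macros-1.19.1.tar.bz2', ...]
```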

```diff
@@ -292,6 +292,7 @@ def fetch(self):
             tty.msg("Already downloaded %s" % self.archive_file)
             return
 
+        url = None
         for url in self.candidate_urls:
             try:
                 partial_file, save_file = self._fetch_from_url(url)
@@ -303,7 +304,7 @@ def fetch(self):
                 pass
 
         if not self.archive_file:
-            raise FailedDownloadError(self.url)
+            raise FailedDownloadError(url)
 
     def _fetch_from_url(self, url):
         save_file = None
@@ -369,12 +370,12 @@ def _fetch_from_url(self, url):
             if curl.returncode == 22:
                 # This is a 404. Curl will print the error.
                 raise FailedDownloadError(
-                    self.url, "URL %s was not found!" % self.url)
+                    url, "URL %s was not found!" % url)
 
             elif curl.returncode == 60:
                 # This is a certificate error. Suggest spack -k
                 raise FailedDownloadError(
-                    self.url,
+                    url,
                     "Curl was unable to fetch due to invalid certificate. "
                     "This is either an attack, or your cluster's SSL "
                     "configuration is bad. If you believe your SSL "
@@ -386,7 +387,7 @@ def _fetch_from_url(self, url):
                 # This is some other curl error. Curl will print the
                 # error, but print a spack message too
                 raise FailedDownloadError(
-                    self.url,
+                    url,
                     "Curl failed with error %d" % curl.returncode)
 
         # Check if we somehow got an HTML file rather than the archive we
```
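
One detail behind the added `url = None`: in Python the loop variable keeps
its last bound value after the `for` completes, so the final
`raise FailedDownloadError(url)` now names the last mirror tried; the `None`
initialization only guards the degenerate case of an empty candidate list.
A small illustration:

```python
candidate_urls = [
    'https://www.x.org/archive/individual/util/util-macros-1.19.1.tar.bz2',
    'https://mirrors.ircam.fr/pub/x.org/individual/util/util-macros-1.19.1.tar.bz2',
    'http://xorg.mirrors.pair.com/individual/util/util-macros-1.19.1.tar.bz2',
]

url = None
for url in candidate_urls:
    pass  # pretend every attempt failed

# After the loop, `url` is the last URL tried, not the first:
print(url)  # http://xorg.mirrors.pair.com/individual/util/util-macros-1.19.1.tar.bz2
```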