Use the same compression level as `gzip` (6) instead of Python's default (9).
The LLVM tarball takes 4m instead of 12m to create, and is <1% larger.
That extra compression is not worth the wait...
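For reference, a minimal sketch of the change in terms of Python's `tarfile` (archive name and path are illustrative):
```python
import tarfile

# Python's gzip support defaults to compresslevel=9; passing 6 matches the
# gzip CLI default and is much faster on large inputs, for <1% size gain.
with tarfile.open("llvm.tar.gz", mode="w:gz", compresslevel=6) as tar:
    tar.add("/path/to/llvm-install")  # illustrative path
```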
* udunits: Update download URL
* udunits: Deprecate older versions
Unidata now only provides the latest version of each X.Y branch. Older 2.2 versions have been deprecated accordingly but are still available in the build cache.
Co-authored-by: RemiLacroix-IDRIS <RemiLacroix-IDRIS@users.noreply.github.com>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
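A minimal sketch of how the deprecation looks in a Spack package.py, assuming 2.2.28 is the latest 2.2 release; the checksums are placeholders, not real values:
```python
from spack.package import *


class Udunits(AutotoolsPackage):
    """C library for units of physical quantities."""

    version("2.2.28", sha256="<checksum>")
    # Superseded 2.2.x releases stay listed, but deprecated, so binaries
    # already in the build cache remain installable:
    version("2.2.24", sha256="<checksum>", deprecated=True)
```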
* Update wgrib2 from JCSDA/NOAA-EMC fork
* var/spack/repos/builtin/packages/wgrib2/package.py: fix typo in comment; add a conflict for the netcdf3 and netcdf4 variants (see the sketch after this list)
* wget hdf5/netcdf4 internal dependencies for wgrib2
* Black-format var/spack/repos/builtin/packages/wgrib2/package.py
* More format changes in var/spack/repos/builtin/packages/wgrib2/package.py
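A sketch of that variant conflict, assuming variants named netcdf3 and netcdf4 as in the commit subject (the base class is illustrative, not a claim about the real package.py):
```python
from spack.package import *


class Wgrib2(MakefilePackage):
    variant("netcdf3", default=True, description="Compile with NetCDF3 support")
    variant("netcdf4", default=False, description="Compile with NetCDF4 support")

    # The two NetCDF code paths cannot be enabled at the same time:
    conflicts("+netcdf3", when="+netcdf4",
              msg="netcdf3 and netcdf4 are mutually exclusive")
```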
#32137 added an option to update() a BinaryCacheIndex with a cooldown:
repeated attempts within the cooldown period would not actually retry.
However, the cooldown was not tracked for failures, which are common
when the mirror does not store any binaries and therefore has no
index.json. This commit ensures that update(..., with_cooldown=True)
also skips the update when a failure occurred within the cooldown
period.
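An illustrative sketch of the fix, not Spack's exact code: the timestamp of the last attempt is recorded whether or not the fetch succeeds, so the cooldown applies to failures too:
```python
import time


class BinaryCacheIndex:
    def __init__(self, cooldown_seconds=300):
        self._cooldown = cooldown_seconds
        self._last_attempt = 0.0  # last fetch attempt, successful or not

    def update(self, with_cooldown=False):
        now = time.time()
        if with_cooldown and now - self._last_attempt < self._cooldown:
            return  # within the cooldown: skip, even after a failure
        self._last_attempt = now  # record before fetching, so failures count
        self._fetch_index()  # may raise; the cooldown window has already begun

    def _fetch_index(self):
        raise NotImplementedError("placeholder for the real index fetch")
```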
Due to reuse concretization, we may generate DAGs with two occurrences
of the same package corresponding to distinct specs. This happens when
build dependencies are reused, since their dependencies are ignored in
concretization.
This caused a regression: for example, `spec['openssl']` could return the
`openssl` of the build dependency `cmake` instead of the direct `openssl`
dependency, simply because the edge to `cmake` was traversed first in the
depth-first traversal.
One solution that was discussed is to limit `spec[name]` to direct deps
only, or to direct deps plus transitive link deps, but that would be too
breaking a change. Instead, this PR simply prioritizes transitive link
deps and direct build/run/test deps, and then falls back to a full DAG
traversal; it only changes the order of iteration.
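A simplified sketch of the new lookup order (the traversal methods follow Spack's Spec API, but the real `Spec.__getitem__` differs in detail):
```python
import itertools


def getitem(spec, name):
    # Phase 1: the spec itself and its transitive link deps, plus its
    # *direct* build/run/test deps -- what callers almost always mean.
    phase1 = itertools.chain(
        spec.traverse(deptype="link"),                        # includes spec
        spec.dependencies(deptype=("build", "run", "test")),  # direct only
    )
    # Phase 2: fall back to a full traversal over all dependency types.
    phase2 = spec.traverse(deptype="all")
    for candidate in itertools.chain(phase1, phase2):
        if candidate.name == name:
            return candidate
    raise KeyError(name)
```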
* Update py-sphinxcontrib-mermaid
* Add py-myst-parser
* Fix py-mdit-py-plugins and py-myst-parser dependencies
* Add py-exhale package
* Update var/spack/repos/builtin/packages/py-mdit-py-plugins/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-myst-parser/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-myst-parser/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-myst-parser/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update py-exhale and py-myst-parser dependencies
* Add @svenevs as py-exhale maintainer
* Update var/spack/repos/builtin/packages/py-mdit-py-plugins/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Scan the text files for relocatable prefixes *before* creating a tarball,
to reduce the amount of work to be done during install from a binary
cache.
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
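A rough sketch of the scanning idea; the helper name is hypothetical and how the list is stored in the tarball metadata is omitted. Record which files mention the install prefix while the tree is still on disk, so the install side only has to rewrite those files:
```python
import os


def files_needing_relocation(root, prefix):
    """Return paths under root whose contents mention the given prefix."""
    needle = prefix.encode("utf-8")
    hits = []
    for dirpath, _, filenames in os.walk(root):
        for fname in filenames:
            path = os.path.join(dirpath, fname)
            with open(path, "rb") as f:
                if needle in f.read():
                    hits.append(os.path.relpath(path, root))
    return hits
```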
* py-drep: new package
* fixed file extension
* added darwin conflict
* py-checkm-genome and py-pysam: bumped version and updated deps (#10)
added checkm and pysam deps
* added dep documentation and fixed style
* changed checkm and pysam back to dev version for upstreaming
* added url and perl run dep
* fixed style
Instead of showing
```
==> Error: Timed out waiting for a write lock.
```
show
```
==> Error: Timed out waiting for a write lock after 1.200ms and 4 attempts on file: /some/file
```
so that we can actually see where acquiring a lock failed, even when not
running in debug mode.
Also use pretty time units everywhere, so we get 1.450ns instead of
1.45e-9 seconds.
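A minimal sketch of such a formatter (Spack's actual helper differs in detail):
```python
def pretty_seconds(seconds):
    """Format a duration with a sensible SI unit, e.g. 1.45e-9 -> '1.450ns'."""
    for suffix, size in (("s", 1), ("ms", 1e-3), ("us", 1e-6), ("ns", 1e-9)):
        if seconds >= size:
            return f"{seconds / size:.3f}{suffix}"
    return f"{seconds / 1e-9:.3f}ns"  # sub-nanosecond: clamp to ns


# pretty_seconds(1.45e-9) -> '1.450ns'; pretty_seconds(0.0012) -> '1.200ms'
```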
* backtraces without --debug
Currently `--debug` is too verbose, and running without `--debug` gives too
little context about where exceptions come from.
So instead it'd be nice to have `spack --backtrace` and
`SPACK_BACKTRACE=1` as ways to get something in between: no verbose
debug messages, but always a full backtrace.
This is useful for CI, where we don't want to drown in debug messages
when installing dependencies, but we do want the details when something
goes wrong.
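A hedged sketch of that middle ground; the flag and environment variable come from this change, while the surrounding error handler is illustrative:
```python
import os
import sys
import traceback


def report_error(exc, backtrace_flag=False):
    # Full backtrace iff --backtrace was passed or SPACK_BACKTRACE is set,
    # without enabling the rest of the --debug output.
    if backtrace_flag or os.environ.get("SPACK_BACKTRACE"):
        traceback.print_exception(type(exc), exc, exc.__traceback__)
    else:
        print(f"==> Error: {exc}", file=sys.stderr)
```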
* completion
* acts: new versions
In the 20.x release line, these are the changes: https://github.com/acts-project/acts/compare/v20.0.0...v20.2.0
- `option(ACTS_SETUP_ACTSVG "Build ActSVG display plugin" OFF)` introduced in v20.1.0
- `option(ACTS_USE_SYSTEM_ACTSVG "Use the ActSVG system library" OFF)` introduced in v20.1.0
- `option(ACTS_BUILD_PLUGIN_ACTSVG "Build SVG display plugin" OFF)` introduced in v20.1.0
- `option(ACTS_USE_EXAMPLES_TBB "Use Threading Building Blocks library in examples" ON)` introduced in v20.1.0
- `option(ACTS_EXATRKX_ENABLE_ONNX "Build the Onnx backend for the exatrkx plugin" OFF)` introduced in v20.2.0
- `option(ACTS_EXATRKX_ENABLE_TORCH "Build the torchscript backend for the exatrkx plugin" ON)` introduced in v20.2.0
In the 19.x release line, these are the changes: https://github.com/acts-project/acts/compare/v19.7.0...v19.9.0
- `option(ACTS_USE_EXAMPLES_TBB "Use Threading Building Blocks library in examples" ON)` introduced in v19.8.0
The new build options are not implemented in this commit; they are added in the commits below (a sketch of the wiring follows the commit list).
* acts: new variant svg
* actsvg: new package
* actsvg: style fixes
* acts: new versions 20.3.0 and 19.10.0
* actsvg: depends_on boost, googletest
* actsvg: new version 0.4.26 (and style fix)
Includes a fix for a build issue when +examples: https://github.com/acts-project/actsvg/pull/23
* acts: new variant tbb when +examples @19.8:19 @20.1:
* acts: set ACTS_USE_EXAMPLES_TBB
* acts: no need for ACTS_SETUP_ACTSVG
* acts: move tbb variant to examples block
* acts: ACTS_USE_SYSTEM_ACTSDD4HEP removed in 20.3
* acts: use new ACTS_USE_SYSTEM_LIBS
* acts-dd4hep: new version 1.0.1, maintainer handle fixed
* acts: simplify variant tbb condition
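A condensed sketch of how the new variants might map onto the CMake options listed above (the directives and `define_from_variant` are Spack's; the exact conditions mirror the commit subjects but are illustrative):
```python
from spack.package import *


class Acts(CMakePackage):
    variant("svg", default=False, description="Build the ActSVG display plugin")
    variant("tbb", default=True, description="Build the examples with TBB",
            when="+examples @19.8:19,20.1:")

    depends_on("actsvg", when="+svg")

    def cmake_args(self):
        args = [self.define_from_variant("ACTS_BUILD_PLUGIN_ACTSVG", "svg")]
        if "tbb" in self.spec.variants:  # only set when the variant applies
            args.append(self.define_from_variant("ACTS_USE_EXAMPLES_TBB", "tbb"))
        return args
```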
* updated python version requirements
* updated sha256
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-checkm-genome and py-pysam: bumped version and updated deps
* updated setuptools dep type
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
When we lose a running pod (possibly due to the loss of a spot instance)
or encounter some other infrastructure-related failure of this job, we
need to retry it. This change retries the job the maximum number of
times in those cases.
Currently `relocate_text` and `relocate_text_bin` are unsafe in the
sense that they run in parallel and race when modifying different
paths that point to the same inode (hardlinks).
This leads to the issue observed in #33453.
This PR:
1. Renames those functions to `unsafe_*` so people are aware
2. Adds logic to deal with hardlinks in current binary packages
3. Adds logic to deal with hardlinks when creating new binary tarballs,
so the install side doesn't have to de-dupe hardlinks.
4. Adds a test for 3
The assumption is that all our relocation logic preserves inodes. That
is, we should never copy a file, modify it, and then move it back. I
quickly verified, and it seems this is true for (binary) text
relocation, as well as for rpath patching in patchelf (even when the
file grows) and for the mach-o fixes.
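An illustrative sketch of point 2, not Spack's exact code: group paths by inode so each hardlinked file is rewritten exactly once, and always in place so the inode (and thus every alias) is preserved:
```python
import os


def relocate_once_per_inode(paths, rewrite):
    """Call rewrite(path) once per inode; rewrite must modify in place."""
    done = set()  # (st_dev, st_ino) pairs already rewritten
    for path in paths:
        stat = os.lstat(path)
        key = (stat.st_dev, stat.st_ino)
        if key in done:
            continue  # a hardlink alias of a file we already patched
        done.add(key)
        rewrite(path)  # must never copy-modify-replace, or aliases diverge
```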