* For py-torch: also update dependencies; many version constraints
with an upper bound of `@1.9` are now open (e.g. `@1.8.0:1.9` is
converted to `@1.8.0:`); see the sketch after this list.
* For py-torchvision: also add 0.11.0 and update the `^pil` constraint
to avoid building with 8.3.0
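A minimal sketch of what such a constraint change looks like in a `package.py` recipe (hypothetical excerpt for illustration, not the actual diff):
```python
from spack import *  # Spack packaging DSL (PythonPackage, depends_on, ...)


class PyExample(PythonPackage):
    """Hypothetical recipe illustrating the relaxed upper bound."""

    # Before: the upper bound pinned the dependency to the 1.9 series.
    # depends_on("py-torch@1.8.0:1.9", type=("build", "run"))

    # After: the range is left open at the top.
    depends_on("py-torch@1.8.0:", type=("build", "run"))
```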
This PR makes it possible to specify the `url` and `ref` of the Spack instance used in a container recipe, simply by expanding the YAML schema as outlined in #20442:
```yaml
container:
  images:
    os: amazonlinux:2
    spack:
      ref: develop
      resolve_sha: true
```
The `resolve_sha` option, if true, verifies the `ref` by cloning the Spack repository into a temporary directory and transforming any tag or branch name into a commit sha. When this new ability is leveraged, an additional "bootstrap" stage is added, which builds an image with Spack set up and ready to install software. The Spack repository to be used can be customized with the `url` keyword under `spack`.
Modifications:
- [x] Permit pinning the version of Spack, by branch, tag, or sha
- [x] Add a few new OSes (centos:8, amazonlinux:2, ubuntu:20.04, alpine:3, cuda:11.2.1)
- [x] Permit printing the bootstrap image as a standalone
- [x] Add documentation on the new part of the schema
- [x] Add unit tests for different use cases
* [pkg][new version] Provide eospac@6.5.0 and mark it as default.
* Merge in changes found in #21629
* Mark all alpha/beta versions as deprecated.
- Addresses @sethrj's recommendation
- Also add a note indicating why these versions are marked this way.
1. Currently it prints not just the spec name, but the dependencies +
their variants + their compilers + their architectures + ...
2. It's clear from the context which spec the message applies to, so
let's not print the spec at all.
These three rules in `concretize.lp` are overly complex:
```prolog
:- not provider(Package, Virtual),
provides_virtual(Package, Virtual),
virtual_node(Virtual).
```
```prolog
:- provides_virtual(Package, V1), provides_virtual(Package, V2), V1 != V2,
provider(Package, V1), not provider(Package, V2),
virtual_node(V1), virtual_node(V2).
```
```prolog
provider(Package, Virtual) :- root(Package), provides_virtual(Package, Virtual).
```
and they can be simplified to just:
```prolog
provider(Package, Virtual) :- node(Package), provides_virtual(Package, Virtual).
```
- [x] simplify virtual rules to just one implication
- [x] rename `provides_virtual` to `virtual_condition_holds`
fixes #26866
This semantics fits with the way Spack currently treats providers of
virtual dependencies. It needs to be revisited when #15569 is reworked
with a new syntax.
* gcc: support runtime ability to not install spack rpaths
Fixes #26582.
* gcc: Fix malformed specs file and add docs
The updated docs point out that the spack-modified GCC does *not*
follow the usual behavior of LD_RUN_PATH!
* gcc: fix bad rpath on macOS
This bug has been around since the beginning of the GCC package file:
the rpath command it generates for macOS writes a single (invalid)
rpath entry.
* gcc: only write rpaths for directories with shared libraries
The original lib64+lib pair was just a hack for "in case either has one",
but it's easy to tell whether either actually contains libraries that can
be directly referenced.
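A minimal sketch of that kind of check, using a hypothetical helper rather than the package's actual code:
```python
import glob
import os


def dirs_with_shared_libs(prefix):
    """Return only the library directories under `prefix` that actually
    contain shared libraries, so rpath entries are written just for those."""
    result = []
    for subdir in ("lib", "lib64"):
        path = os.path.join(prefix, subdir)
        # Match shared libraries on Linux (.so, possibly versioned) and macOS (.dylib).
        if glob.glob(os.path.join(path, "*.so*")) or glob.glob(os.path.join(path, "*.dylib")):
            result.append(path)
    return result
```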
* py-vermin: add latest version 1.3.1
* Exclude a line from Vermin analysis since the Python version is already being checked for
Vermin 1.3.1 finds that the `encoding` kwarg of the builtin `open()` requires Python 3+.
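For instance, a line like the following can be excluded from Vermin's analysis with a `# novermin` (or `# novm`) comment, assuming the surrounding code already guards on the interpreter version (illustrative snippet, not the actual change):
```python
# The `encoding` kwarg of open() is Python 3-only, which Vermin 1.3.1 flags;
# the trailing exclusion comment tells Vermin to skip this line.
with open("config.yaml", encoding="utf-8") as f:  # novermin
    contents = f.read()
```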
* Update py-aiohttp to 3.7.4 and py-chardet to 4.0
* Changes from review
* Update package.py
* Update var/spack/repos/builtin/packages/py-chardet/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
The OS only interprets shebangs if a file is executable, so there
should be no need to modify files that have no execute bit set.
This solves issues encountered, for example, while packaging software such as
COVISE (https://github.com/hlrs-vis/covise), which includes example data
in Tecplot format. The sbang post-install hook is applied to every installed
file that starts with the two characters `#!`, but this fails on the binary Tecplot
files, as they happen to start with `#!TDV`. Decoding them with UTF-8 fails
and an exception is thrown during post_install.
Co-authored-by: Martin Aumüller <aumuell@reserv.at>
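A minimal sketch of the executable-bit check described above (hypothetical helper, not the actual hook code):
```python
import os
import stat


def might_need_sbang(path):
    """Only regular files with at least one execute bit set are candidates
    for shebang rewriting; non-executable data files (e.g. binary Tecplot
    inputs starting with '#!TDV') are skipped entirely."""
    mode = os.lstat(path).st_mode
    if not stat.S_ISREG(mode):
        return False
    return bool(mode & (stat.S_IXUSR | stat.S_IXGRP | stat.S_IXOTH))
```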
This commit contains changes to support Google Cloud Storage
buckets as mirrors, meant for hosting Spack build-caches. This
feature is beneficial for folks who run infrastructure on
Google Cloud Platform. On public cloud systems, resources are
ephemeral and, in many cases, installing compilers, MPI flavors,
and user packages from scratch takes considerable time.
Giving users the ability to host a Spack mirror that can store build
caches in GCS buckets offers a clean solution for reducing
application rebuilds for Google Cloud infrastructure.
Co-authored-by: Joe Schoonover <joe@fluidnumerics.com>
* scr/veloc: component releases
Update the ECP-VeloC component packages in preparation for an
upcoming scr@3.0rc2 release.
All
- Add new release versions
- Add new `shared` variant for all components
- Add zlib link dependency to packages that were missing it
- Add maintainers
- Use `self.define` and `self.define_from_variant` to clean up `cmake_args` (see the sketch after this list)
axl
- Add independent vendor async support variants
rankstr
- Update older version sha that fails checksum on install
* Fix scr build error
Lock dependencies for scr@3.0rc1 to the versions released at the same
time.
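A hedged sketch of the `self.define`/`self.define_from_variant` cleanup mentioned above (the class name, variant, and CMake options are illustrative only, not the actual component packages):
```python
from spack import *  # Spack packaging DSL (CMakePackage, variant, depends_on, ...)


class ExampleComponent(CMakePackage):
    """Illustrative excerpt only; versions, URLs, etc. are omitted."""

    variant("shared", default=True, description="Build shared libraries")

    depends_on("zlib")

    def cmake_args(self):
        return [
            # Expands to -DBUILD_SHARED_LIBS:BOOL=ON/OFF based on the variant value.
            self.define_from_variant("BUILD_SHARED_LIBS", "shared"),
            # Plain -D<name>=<value> definition.
            self.define("ZLIB_ROOT", self.spec["zlib"].prefix),
        ]
```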
* py-neurokit2: add 0.1.4.1
* Update var/spack/repos/builtin/packages/py-neurokit2/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>