* add cray detection taken from upcxx
* add CUDA/ROCm support
* add numerous pass-through options to the Chapel build,
like gpu_mem_strategy, comm_substrate, etc.; all variants are
translated to analogous CHPL_* environment variables. As a side
effect, this defines a number of environment variables that are
not actually used by Chapel.
* Define LD_LIBRARY_PATH, LIBRARY_PATH, and PKG_CONFIG_PATH to
help programs built with Chapel properly locate needed runtime
dependencies
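A rough sketch of the pattern, assuming hypothetical variant values and a simplified package class (the real chapel recipe differs):
```python
# Sketch only: the variant name follows the list above; the value list and
# package base class are assumptions.
from spack.package import *


class Chapel(AutotoolsPackage):
    variant("comm_substrate", default="none", values=("none", "udp", "ibv"),
            description="Communication substrate (hypothetical value list)")

    def setup_build_environment(self, env):
        # Pass-through: each such variant becomes an analogous CHPL_* variable.
        env.set("CHPL_COMM_SUBSTRATE", self.spec.variants["comm_substrate"].value)

    def setup_run_environment(self, env):
        # Help programs built with Chapel locate runtime dependencies.
        env.prepend_path("LD_LIBRARY_PATH", self.prefix.lib)
        env.prepend_path("LIBRARY_PATH", self.prefix.lib)
        env.prepend_path("PKG_CONFIG_PATH", self.prefix.lib.pkgconfig)
```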
---------
Co-authored-by: bonachea <dobonachea@lbl.gov>
The "use_store" context manager is used to swap the value
of a global variable (spack.store.STORE), while keeping
another global variable consistent (spack.config.CONFIG).
When doing so, it tries to evaluate the previous value
of the store, if that has not been done already. This is wrong,
since the configuration might be in an "intermediate" state
that was never meant to trigger side effects.
Remove that operation, and add a unit test to
prevent regressions.
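A minimal sketch of the intended behavior (not Spack's actual code): swap the global for the duration of the context and restore it afterwards, without evaluating whatever value was there before:
```python
# Illustrative sketch; the real use_store also keeps spack.config.CONFIG
# consistent, which is omitted here.
import contextlib

import spack.store


@contextlib.contextmanager
def use_store(new_store):
    previous = spack.store.STORE  # keep a reference, do not inspect/evaluate it
    spack.store.STORE = new_store
    try:
        yield new_store
    finally:
        spack.store.STORE = previous
```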
`Spec.__getitem__` queries dependent edges, which almost always point to
nodes outside the sub-DAG under consideration. It should only ever look at edges
being traversed.
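A rough sketch of the corrected lookup; method names such as `edges_to_dependencies` are assumptions about the Spec API:
```python
# Sketch: resolve a dependency by name by walking only outgoing dependency
# edges of the sub-DAG, never the incoming dependent edges.
def find_dep(spec, name):
    seen, stack = set(), [spec]
    while stack:
        node = stack.pop()
        if node.name == name:
            return node
        for edge in node.edges_to_dependencies():  # edges being traversed
            child = edge.spec
            if id(child) not in seen:
                seen.add(id(child))
                stack.append(child)
    raise KeyError(name)
```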
This modifies the heuristic to decay to the clingo default
over time. The hope is that this helps with specs
that have an optimal solution with a high penalty.
Let the target and compiler heuristics decay too, and do not
guess the compiler.
If netlib-lapack is built with ~external-blas, it internally links
liblapack.so with libblas.so, meaning that whenever netlib-lapack is
used as a lapack provider, the package must also be a blas provider.
Conversely, using netlib-lapack as a blas provider does not imply that it
must also provide lapack, but nothing is lost by disallowing that.
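A hedged sketch of the idea using the `provides()` directive (the real netlib-lapack recipe may express it differently); listing both virtuals in a single directive means they are provided together:
```python
# Sketch only; details of the real recipe are simplified.
from spack.package import *


class NetlibLapack(CMakePackage):
    # blas and lapack are co-provided: selecting netlib-lapack as the lapack
    # provider also makes it the blas provider, and vice versa.
    provides("blas", "lapack")
```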
When we changed how to deal with errors in November,
we didn't realize that for an unconstrained choice
rule it is more important for the heuristic to guess
what is NOT in the answer set, since that will be the
majority of options.
Previously this followed automatically from what
was in the answer set, via `1 { ... } 1` cardinality
constraints.
Here we improve the heuristic and the solve time for specs.
#40773 introduced python-venv, which improves build isolation and avoids issues with,
e.g., `ubuntu`'s system python modifying `sysconfig` to include a (very unwanted)
`local` directory within the default install layout.
This addresses a few cases where #40773 removed functionality, without harming the
default cases where we use `python-venv`.
Traditionally, *every* view with `python` in it was essentially a virtual environment,
because we would copy the `python` interpreter and `os.py` into every view when linking.
We now rely on `python-venv` to do that, but only when it's used (i.e. new builds) and
only for packages that have an `extends("python")` directive.
This again makes every view with `python` in it a virtual environment, but only
if we're not already using a package like `python-venv`. This uses a different
mechanism from before -- instead of using the `virtualenv` trick of copying `python`
into the prefix, we instead create a `pyvenv.cfg` like `venv` (the more modern way
to do it).
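A minimal sketch of that mechanism (not the exact Spack code): a `pyvenv.cfg` at the view root, with `home` pointing at the directory of the real interpreter, is enough for CPython to treat the view as a virtual environment:
```python
# Hypothetical helper; name and arguments are illustrative.
import os


def write_pyvenv_cfg(view_root, python_prefix, version):
    with open(os.path.join(view_root, "pyvenv.cfg"), "w") as f:
        f.write(f"home = {os.path.join(python_prefix, 'bin')}\n")
        f.write("include-system-site-packages = false\n")
        f.write(f"version = {version}\n")
```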
This fixes two things:
1. If you had a working environment before Spack `v0.22`, it would stop
working in `v0.22` without a reconcretization and rebuild, because we no longer
copy the python interpreter on link. Adding `pyvenv.cfg` fixes this in a more
modern way, so old views will keep working.
2. If you have an env that only includes python packages that use `depends_on("python")`
instead of `extends("python")`, those packages will now be importable as before,
though they won't have the same level of build isolation you'd get with `extends`
and `python-venv`.
* views: avoid making client code deal with link functions
Users of views and ViewDescriptors shouldn't have to deal with link functions -- they
should just say what type of linking they want.
- [x] views take a link_type, not a link function
- [x] views work out the link function from the link type (see the sketch below)
- [x] view descriptors and commands now just tell the view what they want.
* python: simplify logic for avoiding pyvenv.cfg in copy views
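An illustrative sketch of the link-type idea from the first item above; names here are assumptions, not Spack's exact API:
```python
# Clients name a link type; the view resolves the callable internally.
import os
import shutil

_LINK_FUNCTIONS = {
    "symlink": os.symlink,
    "hardlink": os.link,
    "copy": shutil.copy2,
}


def function_for_link_type(link_type):
    try:
        return _LINK_FUNCTIONS[link_type]
    except KeyError:
        raise ValueError(f"invalid link type: {link_type!r}")
```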
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Add support for Gitlab CI on Windows
This PR adds the config changes required to configure and execute
Gitlab pipelines running Windows builds on Windows runners using
the existing Gitlab CI infrastructure (and newly added Windows
infrastructure).
* Adds support for generating child pipelines dispatched to Windows runners
* Refactors the relevant pre-scripts, scripts, and post scripts to be compatible with Windows
* Adds Windows config section describing Windows jobs
* Adds VTK as Windows build stack (to be expanded later)
* Modifies proj to build on Windows
* Refactors Windows rpath symlinking to avoid system libs and externals
---------
Co-authored-by: Ryan Krattiger <ryan.krattiger@kitware.com>
Co-authored-by: Mike VanDenburgh <michael.vandenburgh@kitware.com>
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
Co-authored-by: Scott Wittenburg <scott.wittenburg@kitware.com>
* archive: relative links only
Ensure that no symlinks written into tarfiles generated from Spack prefixes point outside the prefix
* binary_distribution: limit extraction to prefix
Ensure files extracted from spackballs are not links pointing outside of the prefix
* Ensure rpaths are properly set on Windows
* hard error on extraction of absolute links
* refactor for non link-modifying approach
* Restore tarball extraction to original impl
* use custom readlink
* cleanup symlink module
* make lstrip
Symlinks on Windows can use longpath prefixes (\\?\); these are fine
in the context of win32 API interactions but break numerous facets of
Spack behavior that rely on string parsing/matching (archiving,
binary distributions, tarball extraction, view regen, etc).
Spack's internal readlink method (llnl.util.symlink.readlink)
gracefully handles this by removing the prefix and otherwise behaving
exactly as os.readlink does, so we should prefer that in all cases.
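A hedged sketch of that behavior (the real `llnl.util.symlink.readlink` handles more cases): act like `os.readlink`, then drop the long-path prefix:
```python
import os

_LONGPATH_PREFIX = "\\\\?\\"  # the literal \\?\ prefix


def readlink(path):
    target = os.readlink(path)
    if target.startswith(_LONGPATH_PREFIX):
        target = target[len(_LONGPATH_PREFIX):]
    return target
```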
Use correct path separator in get_all_package_diffs for all platforms.
Ensures correct package change computation on Windows when pruning unchanged specs in Gitlab CI
Before this PR, if Spack could see a possibility to reuse a spec that
doesn't match a strong preference, it would do so. After the PR, a
strong preference would take precedence.
avoid calling `spec.target` when None.
When an external compiler package has an `os` set but no `target` set, Spack
currently falls into a codepath that calls `spec.target` (which itself calls
`spec.architecture.target.Microarchitecture`) when `spec.architecture.target`
is None, throwing an error.
e.g.
```
packages:
gcc:
externals:
- spec: gcc@12.3.1 os=rhel7
prefix: /usr
```
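A minimal sketch of the kind of guard that avoids the crash, not necessarily the exact fix:
```python
# Only touch spec.target when an architecture target actually exists.
def effective_target(spec):
    arch = spec.architecture
    if arch is None or arch.target is None:
        return None
    return spec.target
```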
---------
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
* glew: rework dependency on gl
This simplifies the package and ensures a single gl implementation is
pulled in. Before, we were adding direct dependencies, and those are
not unified through the virtual (see the sketch after this list).
* mesa-demos: rework dependency on gl
This simplifies the package and ensures a single gl implementation is
pulled in. Before we were adding direct dependencies, and those are
not unified through the virtual.
* mesa-glu: rework dependency on gl
This simplifies the package and ensures a single gl implementation is
pulled in. Before we were adding direct dependencies, and those are
not unified through the virtual.
* paraview: fix dependency on glew
* mesa: group dependency on when("+glx")
* Add missing dependency on libxml2
* paraview: remove the "osmesa" and "egl" variant
Instead, enforce consistency using the "gl" virtual that allows
only one provider.
* visit: remove osmesa variant
* Disable paraview in the aws-isc stacks
* data-vis-sdk: rework constraints to enforce front-ends
* e4s-power: remove redundant paraview
* Pipelines: update osmesa variants
* trilinos-catalyst-ioss-adapter: make gl a run dependency
* Remove mesa18 and libosmesa
mesa18 was introduced in #19528 as a way to maintain the old
autotools build of mesa separate from the new meson build.
We could add a second build system to mesa, but since mesa18 has
been deprecated for a long time, we'll just remove it.
libosmesa was used to multiplex the gl provider between mesa18
and mesa, and is thus unnecessary. Remove it to reduce complexity
in the graphical stack.
* Remove references to mesa18 and libosmesa
* vtk: rework dependency on gl and osmesa
* memsurfer: rework dependency on vtk
* visit: minimal fix to avoid having both osmesa and glx
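A hedged sketch of the "rework dependency on gl" pattern used in the first items of this list (the commented-out "before" lines are illustrative, not the exact previous recipes):
```python
from spack.package import *


class Glew(CMakePackage):
    # Before: direct dependencies on concrete implementations, which are not
    # unified through the virtual, e.g. depends_on("mesa") / depends_on("opengl").
    # After: a single dependency on the virtual, so one gl provider is chosen.
    depends_on("gl")
```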
`glibc` and `musl` provide a basic implementation of `iconv` (`iconv`,
`iconv_open`, `iconv_close`), but in practice the installation may be
missing the character encodings needed to make them usable. On Fedora
for example, users need to
```yum install glibc-gconv-extra```
to get the character encodings that `gettext` requires during configure,
namely EUC-JP. Users may not have permissions to install the missing
parts of glibc.
Since Spack can install `libiconv`, it is simpler to use that by
default.
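A hedged sketch of a consumer of the `iconv` virtual (gettext's real recipe differs); depending on the virtual lets Spack satisfy it with `libiconv` by default instead of relying on a possibly incomplete glibc:
```python
from spack.package import *


class Gettext(AutotoolsPackage):
    depends_on("iconv")

    def configure_args(self):
        # Point configure at whichever provider the solver selected.
        return [f"--with-libiconv-prefix={self.spec['iconv'].prefix}"]
```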
This fixes a bug occurring when two root specs need to select
old versions, and these versions have the same penalty in the
optimization. This sometimes caused an older version to be
preferred to a more recent one.
The issue was the omission of `PackageNode` in the optimization
tuple.
This fixes an issue where ghcr, gitlab, and possibly other container registries paginate tags by default, which violates the OCI spec v1.0 but is common practice (the spec itself was broken). After this commit, you can create build cache indices of more than 100 specs on ghcr.
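A rough sketch of following that pagination when listing tags, assuming anonymous access and no error handling (the real Spack OCI client differs):
```python
# Registries return a `Link: <...>; rel="next"` header when the tag list is
# truncated; keep following it until there is no next page.
import json
import re
import urllib.request


def list_all_tags(registry, repository):
    url = f"https://{registry}/v2/{repository}/tags/list?n=100"
    tags = []
    while url:
        with urllib.request.urlopen(url) as response:
            tags.extend(json.load(response).get("tags") or [])
            link = response.headers.get("Link", "")
        match = re.search(r'<([^>]+)>;\s*rel="next"', link)
        url = f"https://{registry}{match.group(1)}" if match else None
    return tags
```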
Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
* py-matplotlib: qualify when to do a post install
Older versions of py-matplotlib don't seem to have some of the
files that the post-install step tries to install. The files
appear to have first shipped in 3.6.0.
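A hedged sketch of the version guard (the real py-matplotlib recipe may differ):
```python
from spack.package import *


class PyMatplotlib(PythonPackage):
    @run_after("install")
    def install_extra_files(self):
        if not self.spec.satisfies("@3.6.0:"):
            return  # older releases do not ship these files
        # ... copy the extra files added in 3.6.0 and later ...
```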
Signed-off-by: Howard Pritchard <hppritcha@gmail.com>
* Change install paths for older matplotlib
---------
Signed-off-by: Howard Pritchard <hppritcha@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>