Environment YAML files should not have default values written to them.
To accomplish this, we change the validator so that it does not add default values to the YAML. We rely on the code to set defaults for all values (using defaulting getters like `dict.get(key, default)`).
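A minimal sketch of the pattern, assuming PyYAML and illustrative key names (not Spack's exact schema): defaults live in the code and are applied when reading, so nothing is ever written back into the user's file.

```python
import yaml  # assumes PyYAML is available

DEFAULT_VIEW = True  # the default lives in code, not in the user's file

with open("spack.yaml") as f:
    data = yaml.safe_load(f) or {}

env = data.get("spack", {})
# Defaulting getter: a missing key falls back to the in-code default,
# and the YAML file itself stays untouched.
view = env.get("view", DEFAULT_VIEW)
```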
Includes a regression test.
This creates a set of packages which all use the same script to install
components of Intel oneAPI. This includes:
* An inheritable `IntelOneApiPackage` which knows how to invoke the
installation script based on which components are requested
* For components which include headers/libraries, an inheritable
`IntelOneApiLibraryPackage` is provided to locate them (a sketch of a
component package follows this list)
* Individual packages for DAL, DNN, TBB, etc.
* A package for the Intel oneAPI compilers (`icx`/`ifx`). This also includes
`icc`/`ifort`, but these are not currently detected in this PR
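As a rough illustration of the scheme, a component package might look like the following hypothetical sketch; the `component_name` hook is invented for the example and may not match the PR's real API:

```python
# Hypothetical sketch of a oneAPI component package. The component_name
# hook is illustrative; the real base class may expose different hooks.
from spack import *


class IntelOneapiTbb(IntelOneApiLibraryPackage):
    """Intel oneAPI Threading Building Blocks (illustrative sketch)."""

    version("2021.1.1")

    @property
    def component_name(self):
        # Tells the shared installation script which component to install.
        return "tbb"
```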
We have to repeat all the spec attributes in a number of places in
`concretize.lp`, and Spack has a fair number of spec attributes. If we
instead add some rules up front that establish equivalencies like this:
```
node(Package) :- attr("node", Package).
attr("node", Package) :- node(Package).
version(Package, Version) :- attr("version", Package, Version).
attr("version", Package, Version) :- version(Package, Version).
```
We can rewrite most of the repetitive conditions with `attr` and repeat
only for each arity (there are only 3 arities for spec attributes so far)
as opposed to each spec attribute. This makes the logic easier to read
and the rules easier to follow.
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Continuing to convert everything in `asp.py` into facts, this change makes
the generation of ground rules for conditional dependencies use facts, and
moves the semantics into `concretize.lp`.
This is probably the most complex logic in Spack, as dependencies can be
conditional on anything, and we need conditional ASP rules to accumulate
and map all the dependency conditions to spec attributes.
The logic looks complicated, but essentially it accumulates any
constraints associated with particular conditions into a fact associated
with the condition by id. Then, if *any* condition id's fact is True, we
trigger the dependency.
This simplifies the way `declared_dependency()` works -- the dependency
is now declared regardless of whether it is conditional, and the
conditions are handled by `dependency_condition()` facts.
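A schematic, self-contained sketch of that flow; the fact names follow the text above, but the `emit` helper, the `condition_requirement` name, and the clause strings are illustrative stand-ins, not Spack's actual API:

```python
def emit(fact):
    # Stand-in for Spack's fact generation; just prints ASP facts.
    print(fact + ".")

def dependency_facts(pkg, dep, conditions):
    # The dependency is declared unconditionally...
    emit('declared_dependency("{0}", "{1}")'.format(pkg, dep))
    # ...and each condition gets an id; its constraints become facts tied
    # to that id. If any condition id holds, the solver triggers the dep.
    for cid, clauses in enumerate(conditions):
        emit('dependency_condition("{0}", "{1}", {2})'.format(pkg, dep, cid))
        for clause in clauses:
            emit("condition_requirement({0}, {1})".format(cid, clause))

dependency_facts("mpileaks", "mpi", [['variant_value("mpileaks", "mpi", "True")']])
```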
There are currently no places where we do not want to traverse
dependencies in `spec_clauses()`, so simplify the logic by consolidating
`spec_traverse_clauses()` with `spec_clauses()`.
`version_satisfies/2` and `node_compiler_version_satisfies/3` are
generated but need `#defined` directives to avoid "info: atom does not
occur in any rule head" warnings.
This PR addresses a number of issues related to compiler bootstrapping.
Specifically:
1. Collect compilers to be bootstrapped while queueing in the installer
Compiler tasks currently have an incomplete list in their `task.dependents`,
making those packages fail to install because they think that not all of
their dependencies are installed. This PR collects the dependents and sets
them on compiler tasks.
2. Allow bootstrapped compilers to back off the target
Bootstrapped compilers may be built with a compiler that doesn't support
the target used by the rest of the spec. Allow them to build with less
aggressive target optimization settings.
3. Support for target ranges
Backing off the target necessitates computing target ranges, so make Spack
handle those properly. Notably, this adds an intersection method for target
ranges (sketched below) and fixes the way ranges are satisfied and
constrained on `Spec` objects.
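As a hypothetical illustration of range intersection, assuming targets within one family can be totally ordered by generation; the target list and function names below are invented for the example, not the PR's actual implementation:

```python
ORDER = ["x86_64", "nehalem", "sandybridge", "haswell", "skylake"]

def intersect(lo1, hi1, lo2, hi2):
    """Intersect [lo1:hi1] with [lo2:hi2]; None means an open bound."""
    def index(target, default):
        return ORDER.index(target) if target is not None else default
    lo = max(index(lo1, 0), index(lo2, 0))
    hi = min(index(hi1, len(ORDER) - 1), index(hi2, len(ORDER) - 1))
    if lo > hi:
        return None  # empty intersection: the ranges are incompatible
    return (ORDER[lo], ORDER[hi])

# e.g. intersect("nehalem", None, None, "haswell") == ("nehalem", "haswell")
```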
This PR also:
- adds testing
- improves concretizer handling of target ranges
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
Co-authored-by: Gregory Becker <becker33@llnl.gov>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Currently, version range constraints, compiler version range constraints,
and target range constraints are implemented by generating ground rules
from `asp.py`, via `one_of_iff()`. The rules look like this:
```
version_satisfies("python", "2.6:") :- 1 { version("python", "2.4"); ... } 1.
1 { version("python", "2.4"); ... } 1 :- version_satisfies("python", "2.6:").
```
So, `version_satisfies(Package, Constraint)` is true if and only if the
package is assigned a version that satisfies the constraint. We
precompute the set of known versions that satisfy the constraint, and
generate the rule in `SpackSolverSetup`.
We shouldn't need to generate already-ground rules for this. Rather, we
should leave it to the grounder to do the grounding, and generate facts
so that the constraint semantics can be defined in `concretize.lp`.
We can replace rules like the ones above with facts like this:
```
version_satisfies("python", "2.6:", "2.4")
```
And ground them in `concretize.lp` with rules like this:
```
1 { version(Package, Version) : version_satisfies(Package, Constraint, Version) } 1
:- version_satisfies(Package, Constraint).
version_satisfies(Package, Constraint)
:- version(Package, Version), version_satisfies(Package, Constraint, Version).
```
The top rule is the same as before. It makes conditional dependencies and
other places where version constraints are used work properly. Note that
we do not need the cardinality constraint for the second rule -- we
already have rules saying there can be only one version assigned to a
package, so we can just infer `version_satisfies/3` from `version/2`.
This form is also safe for grounding -- if we used the original form we'd
have unsafe variables like `Constraint` and `Package` -- the original
form only really worked when specified as ground to begin with.
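On the `asp.py` side, the fact generation reduces to a loop like this schematic, self-contained sketch; in Spack this happens in `SpackSolverSetup`, and the `emit_fact` and `satisfies` helpers here are toy stand-ins, not Spack's actual API:

```python
def emit_fact(name, *args):
    print("{0}({1}).".format(name, ", ".join('"{0}"'.format(a) for a in args)))

def satisfies(version, constraint):
    # Toy version-range check: "2.6:" means "2.6 or higher".
    low = constraint.rstrip(":")
    return tuple(map(int, version.split("."))) >= tuple(map(int, low.split(".")))

for v in ["2.4", "2.6", "2.7", "3.8"]:
    if satisfies(v, "2.6:"):
        emit_fact("version_satisfies", "python", "2.6:", v)
# -> version_satisfies("python", "2.6:", "2.6"). etc.
```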
- [x] use facts instead of generating rules for package version constraints
- [x] use facts instead of generating rules for compiler version constraints
- [x] use facts instead of generating rules for target range constraints
- [x] remove `one_of_iff()` and `iff()` as they're no longer needed
I was keeping the old `clingo` driver code around in case we had to run
using the command line tool instead of through the Python interface.
So far, the command line is faster than running through Python, but I'm
working on fixing that. I found that if I do this:
```python
import clingo

control = clingo.Control()
control.load("concretize.lp")
control.load("hdf5.lp")  # code from `spack solve --show asp hdf5`
control.load("display.lp")
control.ground([("base", [])])
control.solve(...)
```
It's just as fast as the command line tool. So we can always generate the
code and load it manually if we need to -- we don't need two drivers for
clingo. Given that the python interface is also the only way to get unsat
cores, I think we pretty much have to use it.
So, I'm removing the old command line driver and other unused code. We
can dig it up again from the history if it is needed.
Track all the variant values mentioned when emitting constraints, validate
them, and emit a fact that allows them as possible values.
This modification ensures that open-ended variants (variants accepting any
string or any integer) are projected onto the finite set of values that are
relevant for this concretization.
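Schematically, the projection works like this self-contained sketch; the fact and function names are illustrative, not Spack's exact internals:

```python
from collections import defaultdict

mentioned = defaultdict(set)

def record(pkg, variant, value):
    # Every variant value mentioned in any constraint is recorded...
    mentioned[(pkg, variant)].add(value)

def emit_possible_values():
    # ...and later emitted as an allowed value, so open-ended variants
    # become a finite set for the solver.
    for (pkg, variant), values in sorted(mentioned.items()):
        for value in sorted(values):
            print('variant_possible_value("{0}", "{1}", "{2}").'.format(
                pkg, variant, value))

record("hdf5", "api", "v110")  # e.g. from a spec literal on the CLI
record("hdf5", "api", "v112")  # e.g. from a directive in a package
emit_possible_values()
```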
Other parts of the concretizer code build up lists of things we can't
know without traversing all specs and packages, and they output these
lists at the very end.
The code that does this for variant values from spec literals was
intertwined with the code for traversing the input specs, so it only
covered the input specs and missed variant values that might come from
directives in packages.
- [x] move ad-hoc value handling code into spec_clauses so we do it in
one place for CLI and packages
- [x] move handling of `variant_possible_value`, etc. into
`concretize.lp`, where we can automatically infer variant existence
more concisely.
- [x] simplify/clarify some of the code for variants in `spec_clauses()`
fixes #20055
Compilers with custom versions like `gcc@foo` are not currently
matched to the appropriate targets. This is because the
version of the spec doesn't match the "real" version of the
compiler.
This PR replicates the strategy used in the original
concretizer to deal with that and tries to detect the real
version of compilers if the version in the spec returns no
results.
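A rough sketch of that fallback, with hypothetical names; the real logic lives in Spack's concretizer and compiler classes:

```python
import subprocess

def real_version(compiler_path):
    # e.g. gcc prints its version with -dumpversion
    out = subprocess.run([compiler_path, "-dumpversion"],
                         capture_output=True, text=True)
    return out.stdout.strip() or None

def version_for_matching(spec_version, known_versions, compiler_path):
    # Use the spec's version when it matches something; otherwise ask
    # the compiler itself (e.g. "foo" -> "10.2.0") and retry.
    if spec_version in known_versions:
        return spec_version
    detected = real_version(compiler_path)
    return detected if detected in known_versions else None
```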
fixes #20040
Matching compilers among nodes has been prioritized
in #20020. Selection of default variants has been
tuned in #20182. With this setup there is no need
for an ad-hoc rule for external packages. On the
contrary, it should be removed, so that default
variant values are preferred over adding more
external nodes to the DAG.
refers #20040
Before this PR, optimization rules would have selected default
providers at a higher priority than default variants. Here we
swap this priority and consider variants that are forced by
any means (root spec or spec in a `depends_on` clause) the same
as if they had a default value.
This prevents the solver from avoiding expected configurations
just because they contain directives like:
depends_on('pkg+foo')
and `+foo` is not the default variant value for `pkg`.
fixes #19981
This commit adds support for target ranges in directives,
for instance:
conflicts('+foo', when='target=x86_64:,aarch64:')
If any target in a spec body is not a known target, the
following clause will be emitted when traversing the spec:
node_target_satisfies(Package, TargetConstraint)
A definition of the clause will then be printed at the end,
similarly to what is done for package and compiler versions.
fixes #20019
Before this modification, having a newer version of a node came
at a higher priority in the optimization than having matching
compilers. This could result in unexpected configurations for
packages with conflict directives on compilers of the type:
conflicts('%gcc@X.Y:', when='@:A.B')
where changing the compiler for just that node was preferred to
lowering the node version below 'A.B'. Now the priority has been
switched, so the solver will try to lower the version of the
nodes in question before changing their compiler.
refers #20079
Added docstrings to `concretize` and `concretized` to
document the format for tests.
Added tests for the activation of test dependencies.
refers #20040
This modification emits rules like:
provides_virtual("netlib-lapack","blas") :- variant_value("netlib-lapack","external-blas","False").
for packages that provide virtual dependencies conditionally, instead
of a fact that doesn't account for the condition.
This adds a new `mark` command that can be used to mark packages as either
explicitly or implicitly installed. Apart from fixing the package
database after installing a dependency manually, it can be used to
implement upgrade workflows as outlined in #13385.
The following commands demonstrate how the `mark` and `gc` commands can be
used to only keep the current version of a package installed:
```console
$ spack install pkgA
$ spack install pkgB
$ git pull # Imagine new versions for pkgA and/or pkgB are introduced
$ spack mark -i -a
$ spack install pkgA
$ spack install pkgB
$ spack gc
```
If there is no new version for a package, `install` will simply mark it as
explicitly installed and `gc` will not remove it.
Co-authored-by: Greg Becker <becker33@llnl.gov>
Users can add `test()` methods to their packages to run smoke tests on
installations with the new `spack test` command (the old `spack test` is
now `spack unit-test`). `spack test` is environment-aware, so you can
`spack install` an environment and then run `spack test run` to run smoke
tests on all of its packages. Historical test logs can be perused with
`spack test results`. Generic smoke tests are included for MPI
implementations and for C, C++, and Fortran compilers, as well as
specific smoke tests for 18 packages.
Inside the test method, individual tests can be run separately (and
continue to run best-effort after a test failure) using the `run_test`
method. The `run_test` method encapsulates finding test executables,
running and checking return codes, checking output, and error handling.
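As a hypothetical example of the shape this takes in a package; the exact `run_test` keyword arguments here may differ from the actual API:

```python
from spack import *  # as in any package.py


class Mypackage(Package):
    # ... build/install logic omitted ...

    def test(self):
        # Find an installed executable, run it, and check the output;
        # failures are recorded and remaining checks still run.
        self.run_test(
            "mybinary",                # executable to find and run
            options=["--version"],     # arguments to pass
            expected=["1.2.3"],        # strings expected in the output
            purpose="check that mybinary reports its version",
        )
```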
This handles the following trickier aspects of testing with direct
support in Spack's package API:
- [x] Caching source or intermediate build files at build time for
use at test time.
- [x] Test dependencies.
- [x] Packages that require a compiler for testing (such as library-only
packages).
See the packaging guide for more details on using Spack testing support.
Included is support for `package.py` files for virtual packages. This does
not change the Spack interface, but is a major change in internals.
Co-authored-by: Tamara Dahlgren <dahlgren1@llnl.gov>
Co-authored-by: wspear <wjspear@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
The `deprecatedProperties` custom validator can now accept a function
to compute a better error message.
Improves the error/warning messages for deprecated properties.
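For instance, the callback might have a shape like this hypothetical sketch; the wiring into the validator and all names here are illustrative:

```python
# Hypothetical shape of the message callback; not Spack's actual schema code.
def deprecated_properties_message(instance_name, deprecated_names):
    return "{0} is using deprecated properties: {1}".format(
        instance_name, ", ".join(sorted(deprecated_names))
    )
```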
As of #18260, `spack load` and `spack env activate` now use
`prefix_inspections` from the modules configuration to decide
how to modify environment variables.
This updates the modules configuration documentation to describe
how to update environment variables with the `prefix_inspections`
section. This also updates the `spack load` and environments
documentation to refer to the new `prefix_inspections` documentation.
`spack load` and `spack env activate` now use the prefix inspections
defined in `modules.yaml`. This allows users to customize/override
environment variable modifications if desired.
If no `prefix_inspections` configuration is present, Spack uses the
values in the default configuration.
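For reference, a `prefix_inspections` section maps prefix-relative paths to the environment variables they should extend; a minimal illustrative example (the paths and variables shown are examples, not the complete defaults):

```yaml
modules:
  prefix_inspections:
    bin:
      - PATH
    lib:
      - LD_LIBRARY_PATH
    man:
      - MANPATH
```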