Commit graph

4609 commits

Author SHA1 Message Date
Tamara Dahlgren
b7d9e269ef
docs: add single node concurrent build example (#20416) 2020-12-17 17:23:55 +01:00
Massimiliano Culpo
76f1548fe2
unit-tests: ensure that installed packages can be reused (#20307)
refers #20292

Added a unit test that ensures we can reuse installed
packages even if variants have been removed from or
added to the repository.
2020-12-17 00:31:59 -08:00
Adam J. Stewart
826cd07cf7
PythonPackage: add import module smoke tests (#20023) 2020-12-16 15:15:03 -08:00
Greg Becker
3840c0ac45
docs: fix spack install debug arg order (#20428) 2020-12-16 13:57:08 -08:00
Adam J. Stewart
20752db103
Docs: add more Command Reference links to spack test (#20413) 2020-12-16 12:08:32 -08:00
Tamara Dahlgren
cb01981628
docs: fix spack command for unit-test pytest help (#20415) 2020-12-16 10:13:22 -08:00
Greg Becker
352dc0624c
Fix comparisons for abstract specs (#20341)
bug only relevant for Python 3
2020-12-15 14:44:58 -08:00
Todd Gamblin
b6089ac691
concretizer: don't use one_of_iff for range constraints (#20383)
Currently, version range constraints, compiler version range constraints,
and target range constraints are implemented by generating ground rules
from `asp.py`, via `one_of_iff()`.  The rules look like this:

```
version_satisfies("python", "2.6:") :- 1 { version("python", "2.4"); ... } 1.
1 { version("python", "2.4"); ... } 1 :- version_satisfies("python", "2.6:").
```

So, `version_satisfies(Package, Constraint)` is true if and only if the
package is assigned a version that satisfies the constraint. We
precompute the set of known versions that satisfy the constraint, and
generate the rule in `SpackSolverSetup`.

We shouldn't need to generate already-ground rules for this. Rather, we
should leave it to the grounder to do the grounding, and generate facts
so that the constraint semantics can be defined in `concretize.lp`.

We can replace rules like the ones above with facts like this:

```
version_satisfies("python", "2.6:", "2.4")
```

And ground them in `concretize.lp` with rules like this:

```
1 { version(Package, Version) : version_satisfies(Package, Constraint, Version) } 1
  :- version_satisfies(Package, Constraint).
version_satisfies(Package, Constraint)
  :- version(Package, Version), version_satisfies(Package, Constraint, Version).
```

The top rule is the same as before. It makes conditional dependencies and
other places where version constraints are used work properly. Note that
we do not need the cardinality constraint for the second rule -- we
already have rules saying there can be only one version assigned to a
package, so we can just infer `version_satisfies/3` from `version/2`.
This form is also safe for grounding -- if we used the original form we'd
have unsafe variables like `Constraint` and `Package` -- the original
form only really worked when it was specified as ground to begin with.

- [x] use facts instead of generating rules for package version constraints
- [x] use facts instead of generating rules for compiler version constraints
- [x] use facts instead of generating rules for target range constraints
- [x] remove `one_of_iff()` and `iff()` as they're no longer needed
2020-12-15 11:58:58 -08:00
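
As a rough illustration of the fact-based encoding described above, here is a minimal Python sketch; the helper name and call are hypothetical and do not mirror Spack's actual `asp.py` code:

```python
# Hypothetical helper, not Spack's real SpackSolverSetup code: emit one
# version_satisfies/3 fact per known version that satisfies the range, and
# leave the cardinality logic to concretize.lp.
def version_satisfies_facts(pkg, constraint, matching_versions):
    return [
        'version_satisfies("{0}", "{1}", "{2}").'.format(pkg, constraint, v)
        for v in matching_versions
    ]

print("\n".join(version_satisfies_facts("python", "2.6:", ["2.6.8", "2.7.18", "3.8.6"])))
```
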
Tamara Dahlgren
21f30e3074
Bugfix/docs: correct and expand smoke test documentation (#20278) 2020-12-15 08:38:00 -08:00
Tamara Dahlgren
d67ca265a3
outputs: restore default output of fetch/build/total times (#20394) 2020-12-15 01:46:30 -08:00
Massimiliano Culpo
e7f4c2b49e
package sanity: ensure all variant defaults are allowed values (#20373) 2020-12-15 10:22:15 +01:00
Todd Gamblin
495e8cfb8e
concretizer: remove clingo command-line driver (#20362)
I was keeping the old `clingo` driver code around in case we had to run
using the command line tool instead of through the Python interface.

So far, the command line is faster than running through Python, but I'm
working on fixing that.  I found that if I do this:

```python
control = clingo.Control()
control.load("concretize.lp")
control.load("hdf5.lp")       # code from spack solve --show asp hdf5
control.load("display.lp")

control.ground([("base", [])])
control.solve(...)
```

It's just as fast as the command line tool. So we can always generate the
code and load it manually if we need to -- we don't need two drivers for
clingo. Given that the python interface is also the only way to get unsat
cores, I think we pretty much have to use it.

So, I'm removing the old command line driver and other unused code. We
can dig it up again from the history if it is needed.
2020-12-14 09:35:53 +01:00
Ben Cowan
cf37e9276d
Debugging support: fix compiler wrapper log on Mac OS (#20333)
This fixes a logging error observed on macOS 11.0.1 (Big Sur).
When performing a Spack install in debugging mode (e.g.
`spack -d install py-scipy`) Spack is supposed to write a log of
compiler wrapper command line invocations to the current working
directory.

Due to a regression introduced by #18205, these files were
no longer generated, and Spack was printing errors such as
"No such file or directory: None/." This is because the log file
directory gets set from `spack.main.spack_working_dir`, but that
variable is not set in the spawned process.

This PR ensures that the working directory (at the time of the
"spack install" invocation) is persisted to the subprocess.
2020-12-11 15:54:11 -08:00
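
A minimal, self-contained sketch of the general pattern behind such a fix (capture the working directory in the parent and hand it to the child explicitly); it does not reproduce Spack's installer code:

```python
# Minimal sketch, assuming a spawn-style child process: module globals set in
# the parent (like a saved working directory) are not inherited, so pass the
# value explicitly and restore it in the child before logging anything.
import multiprocessing
import os

def build(working_dir):
    os.chdir(working_dir)  # child re-establishes where logs should go
    print("logging wrapper calls under", os.getcwd())

if __name__ == "__main__":
    p = multiprocessing.Process(target=build, args=(os.getcwd(),))
    p.start()
    p.join()
```
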
Tamara Dahlgren
59628cd9e8
Tests: enable re-use of post-install tests in smoke tests (#20298) 2020-12-10 10:35:27 -08:00
Adam J. Stewart
3f5f80956e
Command Reference: add link to spack test docs (#20054) 2020-12-08 09:26:03 -08:00
Andrew W Elble
a90026fb89
concretizer: try hard to obtain all needed variant_possible_value()'s (#20102)
Track all the variant values mentioned when emitting constraints, validate them
and emit a fact that allows them as possible values.

This modification ensures that open-ended variants (variants accepting any string 
or any integer) are projected to the finite set of values that are relevant for this 
concretization.
2020-12-08 15:46:52 +01:00
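
A rough sketch of the "record every mentioned value, then emit it as an allowed value" idea; the names below are illustrative, not Spack's solver-setup internals:

```python
# Illustrative only: collect every variant value mentioned while emitting
# constraints, then project it back out as a variant_possible_value fact.
possible_values = {}

def record(pkg, variant, value):
    possible_values.setdefault((pkg, variant), set()).add(value)

record("openblas", "cpu_target", "haswell")   # e.g. a value seen in a spec literal
for (pkg, variant), values in sorted(possible_values.items()):
    for value in sorted(values):
        print('variant_possible_value("{0}", "{1}", "{2}").'.format(pkg, variant, value))
```
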
Harmen Stoppels
50f8332d95
Compiler wrapper linting (#20249)
* Fix duplicate entries in case
* make sure the arg is not interpreted as two items in a list
* use -n over ! -z
2020-12-07 18:58:19 -08:00
Massimiliano Culpo
98c2627132 bugfix: work around issue handling packages not in any repo 2020-12-07 17:18:33 -08:00
Todd Gamblin
1343a815c0 concretizer: refactor handling of special variants dev_build and patches
Other parts of the concretizer code build up lists of things we can't
know without traversing all specs and packages, and they output these
lists at the very end.

The code for this for variant values from spec literals was intertwined
with the code for traversing the input specs. This only covers the input
specs and misses variant values that might come from directives in
packages.

- [x] move ad-hoc value handling code into spec_clauses so we do it in
  one place for CLI and packages

- [x] move handling of `variant_possible_value`, etc. into
  `concretize.lp`, where we can automatically infer variant existence
  more concisely.

- [x] simplify/clarify some of the code for variants in `spec_clauses()`
2020-12-07 17:18:33 -08:00
vvolkl
ed258ca9e9
Add "spack versions --new" flag to only show new versions (#20030)
* [cmd versions] add spack versions --new flag to only fetch new versions

format

[cmd versions] rename --latest to --newest and add --remote-only

[cmd versions] add tests for --remote-only and --new

format

[cmd versions] update shell tab completion

[cmd versions] remove test for --remote-only --new which gives empty output

[cmd versions] final rename

format

* add brillig mock package

* add test for spack versions --new

* [brillig] format

* [versions] increase test coverage

* Update lib/spack/spack/cmd/versions.py

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>

* Update lib/spack/spack/cmd/versions.py

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2020-12-07 09:29:10 -06:00
Massimiliano Culpo
3843f43e69
concretizer: each external version is allowed by definition (#20247)
Registering external versions among the lists of allowed ones
generates the correct rules for `version_satisfies`
2020-12-06 10:29:05 +01:00
Harmen Stoppels
984ae7e695
Also allow --rpath as rpath linker flags (#18473) 2020-12-04 12:29:55 -05:00
Massimiliano Culpo
8b74b50cff
concretizer: restrict maximizing variant values to MV variants (#20194) 2020-12-04 16:27:03 +01:00
eugeneswalker
badf3368ad
allow install of build-deps from cache via --include-build-deps switch (#19955)
* allow install of build-deps from cache via --include-build-deps switch

* make clear that --include-build-deps is useful for CI pipeline troubleshooting
2020-12-03 15:27:01 -08:00
Matthias Wolf
794b60f7e7
environment installs: fix reporting. (#20004)
PR #15702 changed the invocation of the report context when installing
specs, do the same when building environments.
2020-12-03 15:04:13 -08:00
Greg Becker
d6765fe95d
avoid circular import (#20236) 2020-12-03 13:54:09 -08:00
Andrew W Elble
09aae616c7
concretizer: call inject_patches_variants() on the roots of the specs (#20203)
As was done in the old concretizer. Fixes an issue where conditionally
patched dependencies did not show up in the spec (gdal+jasper).
2020-12-03 16:28:34 +01:00
Danny Taller
e22e037e30
Add CARE package, fixes for ROCmPackage and subclasses (#20070)
* use develop version of blt with fixes for rocm

* package updates for care+rocm

* fixes for plain cpu build

* add camp dependency on raja
2020-12-02 17:07:56 -08:00
Massimiliano Culpo
05848c87c5
concretizer: try hard to infer the real version of compilers (#20099)
fixes #20055

Compilers with custom versions like gcc@foo are not currently
matched to the appropriate targets. This is because the
version of the spec doesn't match the "real" version of the
compiler.

This PR replicates the strategy used in the original
concretizer to deal with that and tries to detect the real
version of compilers if the version in the spec returns no
results.
2020-12-02 20:30:28 +01:00
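
A generic sketch of the fallback strategy described above, assuming a GCC-style `-dumpversion` query; this is not Spack's implementation:

```python
# Generic sketch: if the version in the compiler spec (e.g. "foo" in gcc@foo)
# is not a numeric version, ask the compiler binary for its real version.
import re
import subprocess

def real_version(compiler_path, declared_version):
    if re.match(r"^[0-9]", declared_version):
        return declared_version
    result = subprocess.run([compiler_path, "-dumpversion"],
                            capture_output=True, text=True)
    return result.stdout.strip() or declared_version
```
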
Harmen Stoppels
b0baf42988
Fix hipcc once more (#20095) 2020-12-02 15:58:58 +01:00
Andrew W Elble
0c326e87a9
concretizer: don't optimize emitting version_satisfies() (#20128)
When all versions were allowed, a version_satisfies rule was not emitted,
and this caused conditional directives to fail.
2020-12-02 09:53:53 +01:00
Massimiliano Culpo
7c01ba8fea
spec: return early from concretization if a spec is already concrete (#20196) 2020-12-01 18:09:14 +01:00
AMD Toolchain Support
ec1eddb6e7
AOCC-2.3.0 is now added to spack (#20089)
* AOCC-2.3.0 is now added to spack

Change-Id: I18fd9606e6fd9a288cc7dc6c6ead11ea17839a7c

* Added flag and version tests for AOCC-2.3.0

* Addressed review comments

Co-authored-by: vkallesh <Vijay-teekinavar.Kallesh@amd.com>
2020-12-01 08:07:58 -06:00
Massimiliano Culpo
e2033566bf
concretizer: remove ad-hoc rule for external packages (#20193)
fixes #20040

Matching compilers among nodes has been prioritized
in #20020. Selection of default variants has been
tuned in #20182. With this setup there is no need
to have an ad-hoc rule for external packages. On
the contrary it should be removed to prefer having
default variant values over more external nodes in
the DAG.
2020-12-01 10:11:40 +01:00
Massimiliano Culpo
7fd777c3d9
concretizer: swap priority of selecting provider and default variant (#20182)
refers #20040

Before this PR, optimization rules would have selected default
providers at a higher priority than default variants. Here we
swap this priority and consider variants that are forced by
any means (root spec or spec in a depends_on clause) the same as
if they were set to a default value.

This prevents the solver from avoiding expected configurations
just because they contain directives like:

depends_on('pkg+foo')

and `+foo` is not the default variant value for pkg.
2020-12-01 07:45:48 +01:00
George Hartzell
bb9f5d613c
Typos: add missing closing parens (#20174) 2020-11-30 10:28:07 -06:00
Adam J. Stewart
868dbb24c1
libQGLViewer: add new package (#20164) 2020-11-30 10:25:40 -05:00
Massimiliano Culpo
8dd3797d32
concretizer: treat target ranges in directives correctly (#19988)
fixes #19981

This commit adds support for target ranges in directives,
for instance:

conflicts('+foo', when='target=x86_64:,aarch64:')

If any target in a spec body is not a known target, the
following clause will be emitted:

node_target_satisfies(Package, TargetConstraint)

when traversing the spec and a definition of
the clause will then be printed at the end similarly
to what is done for package and compiler versions.
2020-11-27 20:53:39 +01:00
Massimiliano Culpo
44665cb4e6
archspec: added support for aocc (#20124) 2020-11-26 16:18:40 +01:00
Massimiliano Culpo
03ff89fee6
concretizer: prioritize matching compilers over newer versions (#20020)
fixes #20019

Before this modification having a newer version of a node came
at higher priority in the optimization than having matching
compilers. This could result in unexpected configurations for
packages with conflict directives on compilers of the type:

conflicts('%gcc@X.Y:', when='@:A.B')

where changing the compiler for just that node was preferred over
lowering the node version to less than 'A.B'. Now the priority has
been switched so the solver will try to lower the version of the
nodes in question before changing their compiler.
2020-11-26 13:10:48 +01:00
Massimiliano Culpo
8991cc4632
concretizer: allow a bool to be passed as argument for tests dependencies (#20082)
refers #20079

Added docstrings to 'concretize' and 'concretized' to
document the format for tests.

Added tests for the activation of test dependencies.
2020-11-26 08:55:17 +01:00
Massimiliano Culpo
983fb11dee
concretizer: treat conditional providers correctly (#20086)
refers #20040

This modification emits rules like:

provides_virtual("netlib-lapack","blas") :- variant_value("netlib-lapack","external-blas","False").

for packages that provide virtual dependencies conditionally instead
of a fact that doesn't account for the condition.
2020-11-25 22:03:42 +01:00
Harmen Stoppels
f40492b7d4
concretizer: remove debug statement (#20085) 2020-11-25 14:09:52 +01:00
Adam J. Stewart
fb2ac2077d
Docs: remove duplication in Command Reference (#20021) 2020-11-23 12:38:34 +01:00
Martin Aumüller
b490d65f28
recognize macOS 11.1 as big sur (#20038)
Big Sur versions go 11.0, 11.0.1, 11.1 (vs. prior versions that
only used the minor component)

Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2020-11-23 08:37:40 +01:00
Adam J. Stewart
14a9359395
spack debug report: print concretizer (#19983) 2020-11-19 11:12:28 +01:00
Tomoki, Karatsu
8f3594564c
fujitsu compiler: added / fixed support for compiler flags (#19967)
Added flags for:
- Debug symbols
- C++17 standard

Fixed the list of flags for generic optimizations
2020-11-19 11:09:34 +01:00
Michael Kuhn
1b7a5e53a6
clang/llvm: fix version detection (#19978)
This PR fixes two problems with clang/llvm's version detection. clang's
version output looks like this:

```
clang version 11.0.0
Target: x86_64-unknown-linux-gnu
```

This caused clang's version to be misdetected as:

```
clang@11.0.0
Target:
```

This resulted in errors when trying to actually use it as a compiler.

When using `spack external find`, we couldn't determine the compiler
version, resulting in errors like this:

```
==> Warning: "llvm@11.0.0+clang+lld+lldb" has been detected on the system but will not be added to packages.yaml [reason=c compiler not found for llvm@11.0.0+clang+lld+lldb]
```

Changing the regex to only match until the end of the line fixes these
problems.

Fixes: #19473
2020-11-19 11:06:45 +01:00
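
A self-contained sketch of the kind of fix described, anchoring the match at the end of the line; the regex is illustrative and not necessarily the one Spack uses:

```python
# Anchor the match at the end of the line so the following "Target: ..." line
# cannot bleed into the captured version (illustrative regex).
import re

output = "clang version 11.0.0\nTarget: x86_64-unknown-linux-gnu\n"
match = re.search(r"clang version ([0-9.]+)$", output, re.MULTILINE)
print(match.group(1))  # 11.0.0
```
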
Greg Becker
10f784338b
fix error handling for spack test results command (#19987) 2020-11-18 16:16:34 -08:00
Danny Taller
3b9155239b
hip support for umpire, chai, raja, camp (#19715)
* create HipPackage base class and do some refactoring

* comments and added conflict to raja for openmp with hip
2020-11-18 11:52:21 -08:00
Todd Gamblin
82383093ee bump version number to 0.16.0 2020-11-18 04:22:09 -08:00
Michael Kuhn
20367e472d
cmd: add spack mark command (#16662)
This adds a new `mark` command that can be used to mark packages as either
explicitly or implicitly installed. Apart from fixing the package
database after installing a dependency manually, it can be used to
implement upgrade workflows as outlined in #13385.

The following commands demonstrate how the `mark` and `gc` commands can be
used to only keep the current version of a package installed:
```console
$ spack install pkgA
$ spack install pkgB
$ git pull # Imagine new versions for pkgA and/or pkgB are introduced
$ spack mark -i -a
$ spack install pkgA
$ spack install pkgB
$ spack gc
```

If there is no new version for a package, `install` will simply mark it as
explicitly installed and `gc` will not remove it.

Co-authored-by: Greg Becker <becker33@llnl.gov>
2020-11-18 03:20:56 -08:00
Greg Becker
77b2e578ec
spack test (#15702)
Users can add test() methods to their packages to run smoke tests on
installations with the new `spack test` command (the old `spack test` is
now `spack unit-test`). spack test is environment-aware, so you can
`spack install` an environment and then run `spack test run` to run smoke
tests on all of its packages. Historical test logs can be perused with
`spack test results`. Included are generic smoke tests for MPI
implementations and for C, C++, and Fortran compilers, as well as specific
smoke tests for 18 packages.

Inside the test method, individual tests can be run separately (and
continue to run best-effort after a test failure) using the `run_test`
method. The `run_test` method encapsulates finding test executables,
running and checking return codes, checking output, and error handling.

This handles the following trickier aspects of testing with direct
support in Spack's package API:

- [x] Caching source or intermediate build files at build time for
      use at test time.
- [x] Test dependencies,
- [x] packages that require a compiler for testing (such as library only
      packages).

See the packaging guide for more details on using Spack testing support.
Included is support for package.py files for virtual packages. This does
not change the Spack interface, but is a major change in internals.

Co-authored-by: Tamara Dahlgren <dahlgren1@llnl.gov>
Co-authored-by: wspear <wjspear@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-11-18 02:39:02 -08:00
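
A generic, self-contained sketch of the pattern a `run_test`-style helper wraps (find an executable, run it, check the return code and output); it intentionally avoids Spack's package API:

```python
# Generic smoke-test pattern, not Spack's run_test implementation: locate an
# executable, run it, and check both the exit status and the output.
import shutil
import subprocess

def run_smoke_test(exe, args, expected_substring, expected_status=0):
    path = shutil.which(exe)
    assert path is not None, "executable not found: " + exe
    proc = subprocess.run([path] + args, capture_output=True, text=True)
    assert proc.returncode == expected_status
    assert expected_substring in proc.stdout

run_smoke_test("python3", ["--version"], "Python")
```
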
Massimiliano Culpo
89181f253b Improve warning message for deprecated attributes in "packages.yaml"
The deprecatedProperties custom validator can now accept a function
to compute a better error message.

Improve error/warning message for deprecated properties
2020-11-17 17:34:27 -08:00
Peter Scheibel
c9ad2affcc
Documentation: spack load/environments prefix inspections (#19961)
As of #18260, `spack load` and `spack env activate` now use
`prefix_inspections` from the modules configuration to decide
how to modify environment variables.

This updates the modules configuration documentation to describe
how to update environment variables with the `prefix_inspections`
section. This also updates the `spack load` and environments
documentation to refer to the new `prefix_inspections` documentation.
2020-11-17 15:24:00 -08:00
Dr. Christian Tacke
d65f078f66
spack load/environments: allow customization of prefix inspections (#18260)
`spack load` and `spack env activate` now use the prefix inspections
defined in `modules.yaml`. This allows users to customize/override
environment variable modifications if desired.

If no `prefix_inspections` configuration is present, Spack uses the
values in the default configuration.
2020-11-17 14:04:13 -08:00
Massimiliano Culpo
5f636fc317
spack containerize: allow users to customize the base image (#15028)
This PR reworks a few attributes in the container subsection of
spack.yaml to permit the injection of custom base images when
generating containers with Spack. In more detail, users can still
specify the base operating system and Spack version they want to use:

  spack:
    container:
      images:
        os: ubuntu:18.04
        spack: develop

in which case the generated recipe will use one of the Spack images
built on Docker Hub for the build stage and the base OS image in the
final stage. Alternatively, they can specify explicitly the two
base images:

  spack:
    container:
      images:
        build: spack/ubuntu-bionic:latest
        final: ubuntu:18.04

and it will be up to them to ensure their consistency.

Additional changes:

* This commit adds documentation on the two approaches.
* Users can now specify OS packages to install (e.g. with apt or yum)
  prior to the build (previously this was only available for the
  finalized image).
* Handles to avoid an update of the available system packages have been
  added to the configuration to facilitate the generation of recipes
  permitting deterministic builds.
2020-11-17 11:25:13 -08:00
Massimiliano Culpo
7ffad278d3 concretizer: modified weights for providers and matching for externals
This commit addresses the case of concretizing a root spec with a
transitive conditional dependency on a virtual package, provided
by an external. Before these modifications default variant values
for the dependency bringing in the virtual package were not
respected, and the external package providing the virtual was added
to the DAG.

The issue stems from two facts:
- Selecting a provider has higher precedence than selecting default variants
- To ensure that an external is preferred, we used a negative weight

To solve it we shift all the provider weights so that:
- External providers have a weight of 0
- Non-external providers have a weight of 10 or more

Using a weight of zero for external providers means that having
an external provider, if present, or not having a provider at all,
has the same effect on the higher-priority minimization.

Also fixed a few minor bugs in concretize.lp, that were causing
spurious entries in the final answer set.

Cleaned concretize.lp from leftover rules.
2020-11-17 10:04:13 -08:00
Massimiliano Culpo
ca31f52be3 concretizer: maximize the number of default values used for a single variant
If the default of a multi-valued variant is set to
multiple values, either in package.py or in packages.yaml,
we need to ensure that all the values are present in the
concretized spec.

Since each default value has a weight of 0 and the
variant value is set implicitly by the concretizer
we need to add a rule to maximize on the number of
default values that are used.
2020-11-17 10:04:13 -08:00
Massimiliano Culpo
9a03fd2834 concretizer: don't require a provider for virtual deps if spec is external
This commit introduces a new rule:

real_node(Package) :- not external(Package), node(Package).

that permits distinguishing between an external node and a
real node whose dependencies shouldn't be trimmed. It solves the
case of concretizing ninja with an external Python.
2020-11-17 10:04:13 -08:00
Todd Gamblin
44aa94a210 concretizer: spec_clauses() shouldn't emit node_compiler_hard for rule bodies.
`node_compiler_hard()` means that something explicitly asked for a node's
compiler to be set -- i.e., it's not inherited, it's required. We're
generating this in spec_clauses even for specs in rule bodies, which
results in conditions like this for optional dependencies:

In py-torch/package.py:

    depends_on('llvm-openmp', when='%apple-clang +openmp')

In the generated ASP:

    declared_dependency("py-torch","llvm-openmp","build")
      :- node("py-torch"),
         variant_value("py-torch","openmp","True"),
         node_compiler("py-torch","apple-clang"),
         node_compiler_hard("py-torch","apple-clang"),
         node_compiler_version_satisfies("py-torch","apple-clang",":").

The `node_compiler_hard` there means we would have to *explicitly* set
py-torch's compiler to trigger the llvm-openmp dependency, rather than
just letting it be set by preferences. This is wrong; the dependency
should be there regardless of how the compiler was set.

- [x] remove fn.node_compiler_hard() call from spec_clauses when
      generating rule body clauses.
2020-11-17 10:04:13 -08:00
Todd Gamblin
0620d954f5 concretizer: don't generate rules for empty version lists
If the version list passed to one_of_iff is empty, it still generates a
rule like this:

    node_compiler_version_satisfies("fujitsu-mpi", "arm", ":") :- 1 {  } 1.
    1 {  } 1 :- node_compiler_version_satisfies("fujitsu-mpi", "arm", ":").

The cardinality rules on the right and left above are never
satisfiable, and these rules do nothing.

- [x] Skip generating any rules at all for empty version lists.
2020-11-17 10:04:13 -08:00
Massimiliano Culpo
2231dfc898 concretizer: add a rule to avoid cycles in the graph of dependencies 2020-11-17 10:04:13 -08:00
Massimiliano Culpo
522be6cadf External packages have a consistent hash across different concretizers 2020-11-17 10:04:13 -08:00
Massimiliano Culpo
5c5a44988e Don't fail if MV variants have a tuple as default value 2020-11-17 10:04:13 -08:00
Massimiliano Culpo
1b0338befb Fixup for target preferences 2020-11-17 10:04:13 -08:00
Massimiliano Culpo
e0ae60edc4 Added unit tests to for regressions on open concretizer bugs 2020-11-17 10:04:13 -08:00
Massimiliano Culpo
63327d1eea Changed clingo options 2020-11-17 10:04:13 -08:00
Massimiliano Culpo
577676106c Reworked optimization rules 2020-11-17 10:04:13 -08:00
Massimiliano Culpo
6baa8157c7 concretizer: set target preference for inheritance from root 2020-11-17 10:04:13 -08:00
Massimiliano Culpo
ccb537479a install: one less concretization when installing from file 2020-11-17 10:04:13 -08:00
Massimiliano Culpo
27bb970a97 Fixed branch after rebase (port to archspec)
TODO: Investigate the need to remove
memoization on Spec.patches (infinite
recursion when testing `__contains__`)
2020-11-17 10:04:13 -08:00
Massimiliano Culpo
930b05fab4 Add unit tests for dependencies being patched by parent 2020-11-17 10:04:13 -08:00
Massimiliano Culpo
e226523aeb concretizer: handle dependencies conditional on other dependencies 2020-11-17 10:04:13 -08:00
Massimiliano Culpo
e7208b1598 tests: verify to handle dependencies conditional on other dependencies 2020-11-17 10:04:13 -08:00
Massimiliano Culpo
d00e8394f8 concretizer: handle conflicts with compiler ranges correctly
As reported, conflicts with compiler ranges were not treated
correctly. This commit adds tests to verify the expected behavior
for the new concretizer.

The new rules to enforce a correct behavior involve:
- Adding a rule to prefer the compiler selected for
  the root package, if no other preference is set
- Give a strong negative weight to compiler preferences
  expressed in packages.yaml
- Maximize on compiler AND compiler version match
2020-11-17 10:04:13 -08:00
Massimiliano Culpo
0a56b7cfd6 Github actions: add CI for ASP based solver 2020-11-17 10:04:13 -08:00
Massimiliano Culpo
7753d58e7e Make all tests pass
Fixed a couple of tests and marked a few xfails
to solve them later.
2020-11-17 10:04:13 -08:00
Massimiliano Culpo
8a855ddac5 concretizer: added handling for dev_path variant
This variant is currently either set from command line, in
which case it enters the concretization, or attached from
environment after concretization.
2020-11-17 10:04:13 -08:00
Massimiliano Culpo
a0e8ad7a8b concretizer: ensure upfront that variants are valid 2020-11-17 10:04:13 -08:00
Massimiliano Culpo
ae1ef85af5 concretizer: account for test dependencies only when required 2020-11-17 10:04:13 -08:00
Massimiliano Culpo
346beedfd4 Fix installer.py unit tests that check output 2020-11-17 10:04:13 -08:00
Massimiliano Culpo
87c87ff767 Compute the correct package name for hierarchies that change class names 2020-11-17 10:04:13 -08:00
Massimiliano Culpo
116f6b30eb concretizer: handle variants defined through validators
Variants of this kind don't have a list of possible
values encoded in the ASP facts. Since all we have
is a validator, the list of possible values includes
just the default value and possibly the value passed
from packages.yaml or the CLI.
2020-11-17 10:04:13 -08:00
Massimiliano Culpo
ada2fa36a9 concretizer: account for patches variant
This is done after the builder has actually built
the specs, to respect the semantics used with the
old concretizer.

Later we could move this to the solver as
a multivalued variant.
2020-11-17 10:04:13 -08:00
Massimiliano Culpo
a1fe88c95b concretizer: ensure that no deprecated spec is being used
This is done after the builder has actually built
the specs, to respect the semantics used with the
old concretizer.

A better approach is to substitute the spec
directly in concretization.
2020-11-17 10:04:13 -08:00
Massimiliano Culpo
58683b9e56 conftest: hook the new solver in the config fixture 2020-11-17 10:04:13 -08:00
Massimiliano Culpo
3e4fd64169 concretizer: handle "none" value and '*' wildcard
The "none" variant value cannot be combined with
other values.

The '*' wildcard matches anything, including "none".
It's thus relevant in queries, but disregarded in
concretization.
2020-11-17 10:04:13 -08:00
Massimiliano Culpo
8b055ac8d8 Fixed failing unit tests
- The test on concretization of anonymous dependencies
  has been fixed by raising the expected exception.
- The test on compiler bootstrap has been fixed by
  updating the version of GCC used in the test.
  Since gcc@2.0 does not support targets later than
  x86_64, the new concretizer was looking for a
  non-existing spec, i.e. it was correctly trying
  to retrieve 'gcc target=x86_64' instead of
  'gcc target=core2'.
- The test on gitlab CI needed an update of the target
2020-11-17 10:04:13 -08:00
Massimiliano Culpo
c047495981 concretizer: virtual entry in packages.yaml, external modules
This commit adds support for specifying rules in
packages.yaml that refer to virtual packages.

The approach is to normalize in memory each
configuration and turn it into an equivalent
configuration without rules on virtuals. This
is possible if the set of packages to be handled
is considered fixed.
2020-11-17 10:04:13 -08:00
Massimiliano Culpo
4fe527cd3b concretizer: concretize a virtual root
Before this modification the root of a DAG had to be
a real package. This commit adds rules to concretize
virtual roots.
2020-11-17 10:04:13 -08:00
Massimiliano Culpo
1b115e200b concretizer: handle version preferences from packages.yaml 2020-11-17 10:04:13 -08:00
Massimiliano Culpo
9c23ed6484 concretizer: handle target preferences from packages.yaml
The weight of the target used in concretization is, in order:
1. A specific per package weight, if set in packages.yaml
2. Inherited from the parent, if possible
3. The default target weight (always set)
2020-11-17 10:04:13 -08:00
Massimiliano Culpo
67344326c3 concretizer: fixed test on compiler preferences 2020-11-17 10:04:13 -08:00
Massimiliano Culpo
d4b83daa48 concretizer: added logic for preferred variants
If preferred variants are present, they'll
set the default value of a variant. Otherwise
the default value is what is encoded
in package.py
2020-11-17 10:04:13 -08:00
Massimiliano Culpo
81c7cf45e1 concretizer: refine compiler logic
Concrete versions for compilers are respected
verbatim.

Permit using a non-existing compiler if the
appropriate configuration option has been
set.
2020-11-17 10:04:13 -08:00
Massimiliano Culpo
c065245f14 Fixed failing unit tests
- Tests based on TestArchitecture
- Tests on non-buildable external
2020-11-17 10:04:13 -08:00
Massimiliano Culpo
2ea8bd0b19 concretizer: prefer using the same compiler over using newer versions 2020-11-17 10:04:13 -08:00
Massimiliano Culpo
28afdb9530 concretizer: added support for versioned virtual specs 2020-11-17 10:04:13 -08:00
Massimiliano Culpo
48eb50921a concretizer: added rules and code for externals
Generate facts on externals by inspecting
packages.yaml. Added rules in concretize.lp

Added extra logic so that external specs
disregard any conflict encoded in the
package.

In ASP this would be a simple addition to
an integrity constraint:

:- c1, c2, c3, not external(pkg)

Using the Backend API from Python, it
requires some scaffolding to obtain a default
negated statement.
2020-11-17 10:04:13 -08:00
Massimiliano Culpo
b92659c3bf package_sanity: fixed wrong string format 2020-11-17 10:04:13 -08:00
Massimiliano Culpo
1cdee03c4b concretizer: add conflict rules from packages
Conflict rules from packages are added as integrity
constraints in the ASP formulation. Most of the code
to generate them has been reused from PyclingoDriver.rules
2020-11-17 10:04:13 -08:00
Massimiliano Culpo
2595b58503 test_noversion_pkg: generalized the error to be caught
The new concretizer and the old concretizer solve constraints
in a different way. Here we ensure that a SpackError is raised,
instead of a specific error that made sense in the old concretizer
but probably not in the new.
2020-11-17 10:04:13 -08:00
Massimiliano Culpo
aaa75b831f compiler constraints: deduplicate the list of compilers before encoding one_of_iff rules
This fixes 8 bugs in test/concretize.py
2020-11-17 10:04:13 -08:00
Todd Gamblin
e56f90c3ef concretizer: add compiler version constraints
Add rules to account for compiler version
constraints in concretize.lp.
2020-11-17 10:04:13 -08:00
Todd Gamblin
115384afbd concretizer: use cardinality constraints for versions
Instead of python callbacks, use cardinality constraints for package
versions. This is slightly faster and has the advantage that it can be
written to an ASP program to be executed *outside* of Spack. We can use
this in the future to unify the pyclingo driver and the clingo text
driver.

This makes use of add_weight_rule() to implement cardinality constraints.
add_weight_rule() only has a lower bound parameter, but you can implement
a strict "exactly one of" constraint using it. In particular, wee want to
define:

    1 {v1; v2; v3; ...} 1 :- version_satisfies(pkg, constraint).
    version_satisfies(pkg, constraint) :- 1 {v1; v2; v3; ...} 1.

And we do that like this, for every version constraint:

    atleast1(pkg, constr) :- 1 {version(pkg, v1); version(pkg, v2); ...}.
    morethan1(pkg, constr) :- 2 {version(pkg, v1); version(pkg, v2); ...}.
    version_satisfies(pkg, constr) :- atleast1(pkg, constr), not morethan1(pkg, constr).
    :- version_satisfies(pkg, constr), morethan1(pkg, constr).
    :- version_satisfies(pkg, constr), not atleast1(pkg, constr).

v1, v2, v3, etc. are computed on the Python side by comparing every
possible package version with the constraint.

Computing things like this has the added advantage that if v1, v2, v3,
etc. comprise *all* possible versions of a package, we can just omit the
rules for the constraint under consideration. This happens pretty
frequently in the Spack mainline.
2020-11-17 10:04:13 -08:00
Todd Gamblin
0ed019d4ef concretizer: first working version with pyclingo interface
- [x] Solver now uses the Python interface to clingo
- [x] can extract unsatisfiable cores from problems when things go wrong
- [x] use Python callbacks for versions instead of choice rules (this may
      ultimately hurt performance)
2020-11-17 10:04:13 -08:00
Todd Gamblin
14ab63f97c concretizer: add a configuration option to use new or old concretizer
- [x] spec.py can call out to the new concretizer
- [x] config.yaml now has an option to choose a concretizer (original, clingo)
2020-11-17 10:04:13 -08:00
Todd Gamblin
2dd06f14f9 concretizer: use repository names, not specs with is_virtual 2020-11-17 10:04:13 -08:00
Todd Gamblin
ac9405a80e concretizer: refactor to support multiple solver backends
There are now three parts:

- `SpackSolverSetup`
  - Spack-specific logic for generating constraints. Calls methods on
    `AspTextGenerator` to set up the solver with a Spack problem. This
    shouldn't change much from solver backend to solver backend.

- ClingoDriver
  - The solver driver provides methods for SolverSetup to generate an ASP
    program, send it to `clingo` (run as an external tool), and parse the
    output into function tuples suitable for `SpecBuilder`.
  - The interface is generic and should not have to change much for a
    driver for, say, the Clingo Python interface.

- SpecBuilder
  - Builds Spack specs from function tuples parsed by the solver driver.
2020-11-17 10:04:13 -08:00
Todd Gamblin
5bb83be827 concretizer: set spec constraints correctly for body and head 2020-11-17 10:04:13 -08:00
Todd Gamblin
cd55fd4bd3 concretizer: allow non-default OS, inherit OS along dependencies 2020-11-17 10:04:13 -08:00
Todd Gamblin
51cb49743e tests: add framework to mock targets 2020-11-17 10:04:13 -08:00
Todd Gamblin
43e7255e19 concretizer: split platforms, OS, and targets apart in Python and ASP 2020-11-17 10:04:13 -08:00
Todd Gamblin
cb919c2e39 concretizer: targets are inherited like compilers 2020-11-17 10:04:13 -08:00
Todd Gamblin
afa74ea155 concretizer: change single-letter variables to descriptive names
The original implementation was difficult to read, as it only had
single-letter variable names.  This converts all of them to descriptive
names, e.g., P -> Package, V -> Virtual/Version/Variant, etc.
2020-11-17 10:04:13 -08:00
Todd Gamblin
35ae4c0ddd concretizer: handle compiler existence check settings
To handle unknown compilers properly in tests (and elsewhere), we need to
add unknown compilers from the spec to the list of possible compilers.

Rework how the compiler list is generated and include compilers from
specs if the existence check is disabled.
2020-11-17 10:04:13 -08:00
Todd Gamblin
3b648c294e concretizer: add initial package existence check 2020-11-17 10:04:13 -08:00
Todd Gamblin
520b71e89b concretizer: handle virtual spec constraints better
Specs like hdf5 ^mpi were unsatisfiable because we added a requirement
for `node("mpi").`.  This can't be resolved because "mpi" is not a
package.

- [x] Introduce `virtual_node()`, which says *some* provider must be in
      the DAG.
2020-11-17 10:04:13 -08:00
Todd Gamblin
3ef7c06a48 concretizer: solve with compiler flags but preserve order
This adds compiler flags to the ASP solve so that we can have conditions
based on them in the solve.  But, it keeps order out of the solve to
avoid unneeded complexity and combinatorial explosions.

The solver determines which flags are on a spec, but the order is
determined by DAG precedence (children's flags take precedence over
parents' and are added on the right) and by the order in which flags
were specified on the command line.

The solver is responsible for determining when to propagate flags, when
to inherit them from other nodes, when to take them from compiler
preferences, etc.
2020-11-17 10:04:13 -08:00
Todd Gamblin
7a1b5ca65e concretizer: add timers around phases 2020-11-17 10:04:13 -08:00
Todd Gamblin
5185ed1d28 concretizer: optimize microarchitectures, constrained by compiler support
Weight microarchitectures and prefer more recent ones.  Also disallow
nodes where the compiler does not support the selected target.

We should revisit this at some point as it seems like if I play around
with the compiler support for different architectures, the solver runs
very slowly.  See notes in comments -- the bad case was gcc supporting
broadwell and skylake with clang maxing out at haswell.
2020-11-17 10:04:13 -08:00
Todd Gamblin
71726a9b33 concretizer bugfix: require at least one value for multi-value variants
We didn't have a cardinality constraint for multi-valued variants, so the
solver wasn't filling them in.

- [x] add a requirement for at least one value for multi-valued variants
2020-11-17 10:04:13 -08:00
Todd Gamblin
309ae856ab commands: add --json argument to spack solve 2020-11-17 10:04:13 -08:00
Todd Gamblin
cb8ca505ef concretizer: make some rules into facts 2020-11-17 10:04:13 -08:00
Todd Gamblin
810542d4fe concretizer bugfix: all variants need possible values
Variants like `cpu_target` on `openblas` don't have defined values, but
they have a default.  Ensure that the default is always a possible value
for the solver.
2020-11-17 10:04:13 -08:00
Todd Gamblin
9b1f05df00 concretizer bugfix: fix generations of conditionals for dependencies
Spack was generating the same dependency constraints twice in the output ASP:

```
declared_dependency("abinit", "hdf5", "link")
    :- node("abinit"),
       variant_value("abinit", "mpi", "True"),
       variant_value("abinit", "mpi", "True").
```

This was because `AspFunction` was modifying itself when called.

- [x] fix `AspFunction` so that every call returns a new object
2020-11-17 10:04:13 -08:00
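
A small illustrative example of the bug class being fixed here: a callable that returns a new object on every call instead of mutating itself. The names are made up and do not mirror `AspFunction`:

```python
# Illustration only: calling the function object returns a fresh object, so
# two calls cannot accumulate each other's arguments (the bug described above).
class Fn(object):
    def __init__(self, name, args=()):
        self.name, self.args = name, tuple(args)

    def __call__(self, *args):
        return Fn(self.name, self.args + args)  # new object, no in-place append

    def __repr__(self):
        return "{0}({1})".format(self.name, ", ".join(repr(a) for a in self.args))

node = Fn("node")
print(node("abinit"), node("hdf5"))  # each call is independent
```
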
Todd Gamblin
e31be3da56 concretizer bugfix: *at most* one provider for any virtual 2020-11-17 10:04:13 -08:00
Todd Gamblin
04295f6531 concretizer: optimized for preferred virtuals before recent versions 2020-11-17 10:04:13 -08:00
Todd Gamblin
f365373a3d concretizer: handle compiler preferences with optimization
- [x] Add support for packages.yaml and command-line compiler preferences.
- [x] Rework compiler version propagation to use optimization rather than
  hard logic constraints
2020-11-17 10:04:13 -08:00
Todd Gamblin
1859ff31c9 concretizer: deterministic order for asp output for better diffs
Technically the ASP output order does not matter, but it's hard to diff
two different solve formulations unless we order it.

- [x] make sure ASP output is emitted in a deterministic order (by
      sorting all hash keys)
2020-11-17 10:04:13 -08:00
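
A tiny sketch of the idea, assuming the generated facts are collected before being written out:

```python
# Emit facts in sorted order so two generated programs can be diffed cleanly
# (illustrative; the commit above achieves this by sorting hash keys).
facts = {'node("hdf5")', 'node("zlib")', 'version("zlib", "1.2.11")'}
for fact in sorted(facts):
    print(fact)
```
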
Todd Gamblin
36dae9ee05 concretizer: rename --dump to --show 2020-11-17 10:04:13 -08:00
Todd Gamblin
da215b50a3 concretizer: handle package namespaces 2020-11-17 10:04:13 -08:00
Todd Gamblin
4d34363c1d concretizer: handle constraints on dependencies, adjust optimization
This needs more thought, as I am pretty sure the weights are not correct.
Or, at least, I'm not convinced that they do what we want in all cases.
See note in concretize.lp.
2020-11-17 10:04:13 -08:00
Todd Gamblin
db62b00d58 concretizer: handle dependency types 2020-11-17 10:04:13 -08:00
Todd Gamblin
cde10692b0 concretizer: prioritize versions by package pref, newest, preferred, actual
Solver now prefers newer versions like the old concretizer.  Prefer
package preferences from packages.yaml, preferred=True, package
definition, and finally each version itself.
2020-11-17 10:04:13 -08:00
Todd Gamblin
18fba433f6 concretizer: Use "competition" output format to avoid extra parsing
Competition output only prints out one model, so we do not have to
unnecessarily parse all the non-optimal models.  We'll just look at the
best model and bring that in.

In practice, this saves a lot of JSON parsing and spec construction time.
2020-11-17 10:04:13 -08:00
Todd Gamblin
b4e6d9d28e concretizer: handle virtual provider preferences from packages.yaml 2020-11-17 10:04:13 -08:00
Todd Gamblin
36ec66d997 concretizer: use clingo json output instead of text
Clingo actually has an option to output JSON -- use that instead of
parsing the raw output ourselves.

This also allows us to pick the best answer -- modify the parser to
*only* construct a spec for that one rather than building all of them
like we did before.
2020-11-17 10:04:13 -08:00
Todd Gamblin
a332981f2f concretizer: require only one provider for any virtual in the DAG 2020-11-17 10:04:13 -08:00
Todd Gamblin
501cb371c9 concretizer: handle variant defaults with optimization
- Instead of using default logic, handle variant defaults by minimizing
  the number of non-default variants in the solution.

- This actually seems to be pretty fast, and it fixes the long-standing
  issue that writing this:

      spack install hdf5 ^mpich

  will fail if you don't specify hdf5+mpi.  With optimization and
  allowing enums to be enumerated, the solver seems to be able to quickly
  discover that +mpi is the only way hdf5 can depend on mpich, and it
  forces the switch to be thrown.
2020-11-17 10:04:13 -08:00
Todd Gamblin
1cab1b1994 concretizer: support conditional dependencies 2020-11-17 10:04:13 -08:00
Todd Gamblin
51af590e64 variants: allow MultiValuedVariants to be constructed incrementally 2020-11-17 10:04:13 -08:00
Todd Gamblin
be10568a6a concretizer: initial support for virtual dependencies
Add initial support for virtual dependencies.  Solver now knows about all
virtuals and can choose one to resolve a dependency.
2020-11-17 10:04:13 -08:00
Todd Gamblin
3f93553a08 concretizer: print out virtuals 2020-11-17 10:04:13 -08:00
Todd Gamblin
8a6207aa70 concretizer: handle versions with choice construct rather than conflicts
Use '1 { version(x); version(y); version(z) } 1.' instead of declaring
conflicts for non-matching versions.  This keeps the sense of version
clauses positive, which will allow them to be used more easily in
conditionals later.

Also refactor `spec_clauses()` method to return clauses that can be used
in conditions, etc. instead of just printing out facts.
2020-11-17 10:04:13 -08:00
Todd Gamblin
6e31430bec concretizer: add another definition pragma.
- single_value_variant may not be defined by the generated program.  Mark
  it to avoid warnings.
2020-11-17 10:04:13 -08:00
Todd Gamblin
a81258663c concretizer: cleanup 2020-11-17 10:04:13 -08:00
Todd Gamblin
6bbc64555b concretizer: use conditional literals for versions. 2020-11-17 10:04:13 -08:00
Todd Gamblin
4288639770 concretizer: mark depends_on/2 defined for solves without dependencies. 2020-11-17 10:04:13 -08:00
Todd Gamblin
60cf3fdb34 concretizer: add basic semantics for compilers
- This handles setting the compiler and falling back to a default
  compiler, as well as providing default values for compilers/compiler
  versions.

- Versions still aren't quite right -- you can't properly override
  versions on compiler specs.
2020-11-17 10:04:13 -08:00
Todd Gamblin
f7dce19754 concretizer: simplify and move architecture semantics into concretize.lp
- Model architecture default settings and propagation off of variants

- Leverage ASP default logic to set architecture to default if it's not
  set otherwise.

- Move logic out of Python and into concretize.lp as first-order rules.
2020-11-17 10:04:13 -08:00
Todd Gamblin
34ea3d20cf concretizer: break output up into easier-to-understand sections 2020-11-17 10:04:13 -08:00
Todd Gamblin
d94d957536 concretizer: simplify and suppress warnings for variant handling
We are relying on default logic in the variant handling in that we set a
default value if we never see `variant_set(P, V, X)`.

- Move the logic for this into `concretize.lp` instead of generating it
  for every package.

- For programs that don't have explicit variant settings, clingo warns
  that variant_set(P, V, X) doesn't appear in any rule head, because a
  setting is never generated.

  - Specifically suppress this warning.
2020-11-17 10:04:13 -08:00
Todd Gamblin
b171ac5050 concretizer: split long lines in ASP programs 2020-11-17 10:04:13 -08:00
Todd Gamblin
573a2612dc concretizer: split main logic program out into files
- Add `concretize.lp` and `display.lp` as independent files
- Dump them instead of embedded strings
2020-11-17 10:04:13 -08:00
Todd Gamblin
8bc1092f41 concretizer: colorize ASP output 2020-11-17 10:04:13 -08:00
Todd Gamblin
3637b611a7 concretizer: move dump logic into solver.asp
- moving the dump logic into spack.solver.asp.solve() allows us to print
  out useful debug info sooner

- prior approach required a successful solve to print out anything.
2020-11-17 10:04:13 -08:00
Todd Gamblin
81e187e410 concretizer: first rudimentary round-trip with asp-based solver 2020-11-17 10:04:13 -08:00
Todd Gamblin
c7812f7e10 concretizer: add rudimentary variants with defaults to ASP solve 2020-11-17 10:04:13 -08:00
Todd Gamblin
a8a6d943d6 concretizer: beginnings of solve() command
- `spack solve` command outputs a really basic ASP program that handles
  unconditional dependencies, architecture and versions

- doesn't yet handle conflicts, picking latest versions, preferred
  versions, compilers, etc.

- doesn't handle variants
2020-11-17 10:04:13 -08:00
Todd Gamblin
5b725a37bc repo: Add all_package_classes() method.
- We were able to get names and instances previously
- Add a convenience function to get package classes
2020-11-17 10:04:13 -08:00
Robert Underwood
f359664493
include share/pkgconfig in user environments (#19909)
According to the documentation for spack and pkg-config,
$view/share/pkgconfig should also be a valid place to look
for package config files.  This commit ensures that when
`spack env activate $dir` is called, the environment has this
directory in PKG_CONFIG_PATH.
2020-11-17 11:10:28 -06:00
Tamara Dahlgren
6fa6af1070
Support parallel environment builds (#18131)
As of #13100, Spack installs the dependencies of a _single_ spec in parallel.
Environments, when installed, can only get parallelism from each individual
spec, as they're installed in order.  This PR makes entire environments build
in parallel by extending Spack's package installer to accept multiple root
specs.  The install command and Environment class have been updated to use
the new parallel install method.

The specs and kwargs for each *uninstalled* package (when not force-replacing
installations) of an environment are collected, passed to the `PackageInstaller`,
and processed using a single build queue.

This introduces a `BuildRequest` class to track install arguments, and it
significantly cleans up the code used to track package ids during installation.
Package ids in the build queue are now just DAG hashes as you would expect.

Other tasks:

- [x] Finish updating the unit tests based on `PackageInstaller`'s use of
      `BuildRequest` and the associated changes
- [x] Change `environment.py`'s `install_all` to use the `PackageInstaller` directly
- [x] Change the `install` command to leverage the new installation process for multiple specs
- [x] Change install output messages for external packages, e.g.:
       `[+] /usr` -> `[+] /usr (external bzip2-1.0.8-<dag-hash>)`
- [x] Fix incomplete environment install's view setup/update and not confirming all 
       packages are installed (?)
- [x] Ensure externally installed package dependencies are properly accounted for in 
       remaining build tasks
- [x] Add tests for coverage (if insufficient and can identity the appropriate, uncovered non-comment lines)
- [x] Add documentation
- [x] Resolve multi-compiler environment install issues
- [x] Fix issue with environment installation reporting (restore CDash/JUnit reports)
2020-11-17 02:41:07 -08:00
Wouter Deconinck
423e80af23
spack edit: accept readonly packages (#19949) 2020-11-16 17:14:46 -08:00
Scott Wittenburg
ef0a555ca2
pipelines: support testing PRs from forks (#19248)
This change makes improvements to the `spack ci rebuild` command
which supports running gitlab pipelines on PRs from forks.  Much
of this has to do with making sure we can run without the secrets
previously required for running gitlab pipelines (e.g signing key,
aws credentials, etc).  Specific improvements in this PR:

Check if spack has precisely one signing key, and use that information
as an additional constraint on whether or not we should attempt to sign
the binary package we create.

Also, if spack does not have at least one public key, add the install
option "--no-check-signature"

If we are running a pipeline without any profile or environment
variables allowing us to push to S3, the pipeline could still
successfully create a buildcache in the artifacts and move on.  So
just print a message and move on if pushing either the buildcache
entry or cdash id file to the remote mirror fails.

When we attempt to generate a package or gpg key index on an S3
mirror, and there is nothing to index, just print a warning and
exit gracefully rather than throw an exception.

Support the use of PR-specific mirrors for temporary binary pkg
storage.  This will allow quality-of-life improvement for developers,
providing a place to store binaries over the lifetime of a PR, so
that they must only wait for packages to rebuild from source when
they push a new commit that causes it to be necessary.

Replace two-pass install with a single pass and the new option:
 --require-full-hash-match.  Doing this also removes the need to
save a copy of the spack.yaml to be copied over the one spack
rewrites in between the two spack install passes.

Work around a mirror configuration issue caused by using
spack.util.executable to do the package installation.

* Update pipeline trigger jobs for PRs from forks

Moving to PRs from forks relies on an external synchronization script
pushing special branch names.  Also secrets will only live on the
spack mirror project, and must be propagated to the E4S project via
variables on the trigger jobs.

When this change is merged, pipelines will not run until we update
the "Custom CI configuration path" in the Gitlab CI Settings, as the
name of the file has changed to better reflect its purpose.

* Arg to MirrorCollection is used exclusively, so add main remote mirror to it

* Compute full hash less frequently

* Add tests covering index generation error handling code
2020-11-16 15:16:24 -08:00
Adam J. Stewart
3536433c41
macOS: Big Sur reports as either 10.16 or 11.0 (#19900) 2020-11-15 11:54:39 -06:00
Greg Becker
fafff0c6c0
move sbang to unpadded install tree root (#19640)
Since #11598 sbang has been installed within the install_tree. This doesn’t play
nicely with install_tree padding, since sbang can’t do its job if it is installed in a
long path (this is the whole point of sbang).

This PR changes the padding specification.  Instead of $padding inside paths,
we now have a separate `padding:` field in the `install_tree` configuration.

Previously, the `install_tree` looked like this:

```
    /path/to/opt/spack_padding_padding_padding_padding_padding/
        bin/
            sbang
        .spack-db/
            ...
        linux-rhel7-x86_64/
            ...
```

This PR updates things to look like this:

```
    /path/to/opt/
        bin/
            sbang
        spack_padding_padding_padding_padding_padding/
            .spack-db/
                ...
            linux-rhel7-x86_64/
                ...
```

So padding is added at the start of all install prefixes *within* the unpadded
root.  The database and all installations still go under the padded root.

This ensures that `sbang` is in the shortest possible path while also allowing
us to make long paths for relocatable binaries.
2020-11-12 16:08:55 -08:00
Peter Scheibel
32bfe0a001
Testing: ensure that all packages can be pickled (#19890)
As of #18205, all packages must be pickle-able to be installed by
Spack.

This adds a test to check that each package can be pickled. If any
package fails to pickle, the test keeps going and collects the names
of all failed packages; it then takes the first one that failed and
attempts to re-pickle it, generating the full stack trace for the
failed pickle attempt.
2020-11-12 15:55:34 -08:00
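
A minimal sketch of the collect-then-re-pickle pattern described above; the names are illustrative:

```python
# Illustrative pattern: try to pickle everything, remember what failed, then
# re-pickle the first failure so the full stack trace is shown.
import pickle

def check_all_picklable(named_objects):
    failed = []
    for name, obj in named_objects.items():
        try:
            pickle.dumps(obj)
        except Exception:
            failed.append(name)
    if failed:
        pickle.dumps(named_objects[failed[0]])  # raises with a useful traceback

check_all_picklable({"ok": [1, 2, 3]})
```
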
Adam J. Stewart
02281a891d
MavenPackage: allow additional build args (#19676) 2020-11-12 12:57:49 -08:00
Peter Scheibel
bb42470211
macos: update build process to use spawn instead of fork (#18205)
Spack creates a separate process to do package installation. Different
operating systems and Python versions use different methods to create
it but up until Python 3.8 both Linux and Mac OS used "fork" (which
duplicates process memory, file descriptor table, etc.).

Python >= 3.8 on Mac OS prefers creating an entirely new process
(referred to as the "spawn" start method) because "fork" was found to
cause issues (in other words "spawn" is the default start method used
by multiprocessing.Process). Spack was dependent on the particular
behavior of fork to replicate process memory and transmit file
descriptors.

This PR refactors the Spack internals to support starting a child
process with the "spawn" method. To achieve this, it makes the
following changes:

- ensure that the package repository and other global state are
  transmitted to the child process
- ensure that file descriptors are transmitted to the child process in
  a way that works with multiprocessing and spawn
- make all the state needed for the build process and tests picklable
  (package, stage, etc.)
- move a number of locally-defined functions into global scope so that
  they can be pickled
- rework tests where needed to avoid using local functions

This PR also reworks sbang tests to work on macOS, where temporary
directories are deeper than the Linux sbang limit. We make the limit
platform-dependent (macOS supports 512-character shebangs)

See: #14102
2020-11-12 12:26:23 -08:00
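
A self-contained sketch of the fork/spawn difference this PR works around: with the "spawn" start method the child starts from a fresh interpreter, so state must be passed explicitly and must be picklable:

```python
# With "spawn", the child does not inherit parent memory; everything it needs
# must be passed as picklable arguments (sketch, not Spack's installer code).
import multiprocessing as mp

def child(state):
    print("child received:", state)

if __name__ == "__main__":
    ctx = mp.get_context("spawn")
    p = ctx.Process(target=child, args=({"repo": "builtin", "cwd": "/tmp"},))
    p.start()
    p.join()
```
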
Scott Wittenburg
fbbd71d3d7
Pipelines: Compare target family instead of architecture (#19884)
In compiler bootstrapping pipelines, we add an artificial dependency
between jobs for packages to be built with a bootstrapped compiler
and the job building the compiler.  To find the right bootstrapped
compiler for each spec, we compared not only the compiler spec to
that required by the package spec, but also the architectures of
the compiler and package spec.

But this prevented us from finding the bootstrapped compiler for a
spec in cases where the architecture of the compiler wasn't exactly
the same as the spec.  For example, gcc@4.8.5 might have bootstrapped
a compiler with haswell as the architecture, while the spec had
broadwell.  By comparing the families instead of the exact architectures,
we know that we can build zlib for broadwell with the gcc built for
haswell.
2020-11-12 10:46:15 -08:00
Greg Becker
527a81b469
Keep output machine readable using spack find --format in an env (#19698)
Currently, full JSON output is the only machine readable option for `spack find`
in an environment.

`spack find --format` is also designed to be machine readable, but we print extra
headers in environments.

- [x] don't print headers in `spack find` output when in an environment
2020-11-11 22:13:51 -08:00
Satish Balay
33469414a5
fix typo wrt target=graviton (#19865)
* fix typo wrt target=graviton

This fixes spack build on aarch64 box

* update archspec hash
2020-11-11 19:29:13 -06:00
Massimiliano Culpo
e80276cd14
spack env deactivate/spack unload: demote warning message to debug message (#19864) 2020-11-11 12:02:03 -08:00
Adam J. Stewart
d1ca322aef
Restore spack checksum verbosity (#19480) 2020-11-11 11:26:17 -06:00
Peter Scheibel
9d5f4f9c6f
Binary caching: fix buildcache list (multiple invocations) (#19848)
When invoking "buildcache list" multiple times, the command was
reporting no specs in the cache the second time around. The
presence of an up-to-date index was causing the internal
representation to be left uninitialized.
2020-11-10 23:24:18 -08:00
Greg Becker
0183a51c6c
tutorial cmd: fix gpg invocation (#19829) 2020-11-09 21:09:36 -08:00
Adam J. Stewart
228a4d353c
Fix minor typo in function comment (#19804) 2020-11-09 20:25:45 -05:00
Emir İşman
98e709da3b
[docs] getting_started.rst: fix typo (#19815) 2020-11-09 16:31:53 -06:00
Todd Gamblin
4092c90b57
commands: add spack tutorial command (#19808)
Added a command to set up Spack for our tutorial at
https://spack-tutorial.readthedocs.io.

The command does some common operations we need first-time users to do.
Specifically:

- checks out a particular branch of Spack
- deletes spurious configuration in `~/.spack` that might be
  left over from prior parts of the tutorial
- adds a mirror and trusts its public key
2020-11-09 12:47:08 +01:00
Greg Becker
8b96e10ecc
Remove hardcoded version numbers from container logic (#19716)
Previously, we hardcoded a list of Spack versions which could be used by the containerize command.

This PR removes that list. It's a maintenance burden when cutting a release, and prevents older versions of Spack from creating containers to be used by newer versions.
2020-11-05 18:59:44 +01:00
Tamara Dahlgren
619eb6c08b
bug fix: Display error when curl is missing even in non-debug mode (#19695) 2020-11-04 15:47:08 -08:00
Shahzeb Siddiqui
75e73d7fcc
documentation: fix formatting of code-block section (#19693) 2020-11-03 12:15:46 +01:00
Todd Gamblin
ecc3bfd484
Bugfix - hashing: don't recompute full_hash or build_hash (#19672)
There was an error introduced in #19209 where `full_hash()` and
`build_hash()` are called on older specs that we've read in from the DB;
older specs may not be able to compute these hashes (e.g. if they have
removed patches used in computing the full_hash).

When serializing a Spec, we want to generate the full/build hash when
possible, but we need a mechanism to skip it for Specs that have
themselves been read from YAML (and may not support this).

To get around this ambiguity and to fix the issue, we:

- Add an attribute to the spec called `_hashes_final`, that is `True`
  if we can't lazily compute `build_hash` and `full_hash`.
- Set `_hashes_final` to `False` for new specs (i.e., lazily
  computing hashes is ok)
- Set `_hashes_final` to `True` for concrete specs read in via
  `from_node_dict`, as it may be too late to recompute hashes.
- Compute and write out all hashes in `node_dict_with_hashes` *if
  possible*.

Effectively what this means is that we can round-trip specs that are
missing `_build_hash` and `_full_hash` without recomputing them, but for
all new specs, we'll compute them and store them. So Spack should work
fine with old DBs now.
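
In simplified form, the guard looks roughly like this (a sketch only; the real Spec class is more involved):

```python
# Simplified sketch of the behavior described above; not Spack's actual Spec class.
class SpecLike:
    def __init__(self, read_from_yaml=False):
        # True means: never recompute -- whatever is stored (possibly None) is final.
        self._hashes_final = read_from_yaml
        self._full_hash = None

    def full_hash(self):
        if self._full_hash is None and not self._hashes_final:
            self._full_hash = self._compute_full_hash()
        return self._full_hash   # may remain None for old specs read from YAML

    def _compute_full_hash(self):
        return "0123abcd"        # placeholder for the real hash computation
```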
2020-11-02 13:21:11 -08:00
Todd Gamblin
a80d221bfa sbang: fixes for sbang relocation
This fixes sbang relocation when using old binary packages, and updates
code in `relocate.py`.

There are really two places where we would want to handle an `sbang`
relocation:

1. Installing an old package that uses `sbang` with shebang lines like
   `#!/bin/bash $spack_prefix/sbang`
2. Installing a *new* package that uses `sbang` with shebang lines like
   `#!/bin/sh $install_tree/sbang`

The second case is actually handled automatically by our text relocation;
we don't need any special relocation logic for new shebangs, as our
relocation logic already changes references to the build-time
`install_tree` to point to the `install_tree` at install-time.

Case 1 was not properly handled -- we would not take an old binary
package and point its shebangs at the new `sbang` location. This PR fixes
that and updates the code in `relocate.py` with some notes.
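
Roughly, handling case 1 amounts to rewriting the first line of affected scripts (the regex and paths below are illustrative, not the actual `relocate.py` code):

```python
# Illustrative only -- not the actual relocate.py logic.
import re

OLD_SBANG = re.compile(rb"^#![^\n]*/sbang")

def retarget_sbang(first_line, new_install_tree):
    """Point an old sbang shebang at the sbang shipped in the new install tree."""
    replacement = b"#!/bin/sh " + new_install_tree.encode() + b"/bin/sbang"
    return OLD_SBANG.sub(replacement, first_line)

print(retarget_sbang(b"#!/bin/bash /old/spack/bin/sbang", "/new/opt/spack"))
# b'#!/bin/sh /new/opt/spack/bin/sbang'
```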

There is one more case we don't currently handle: if a binary package is
created from an installation in a short prefix that does *not* need
`sbang` and is installed to a long prefix that *does* need `sbang`, we
won't do anything. We should just patch the file as we would for a normal
install. In some upcoming PR we should probably change *all* `sbang`
relocation logic to be idempotent and to apply to any sort of shebang'd
file. Then we'd only have to worry about which files to `sbang`-ify at
install time and wouldn't need to care about these special cases.
2020-11-01 16:23:48 -08:00
Massimiliano Culpo
c4aa5cb5bc
Update documentation on containers (#19631)
fixes #15183

- Moved the container related content from
  workflows.rst into containers.rst
- Deleted the docker_for_developers.rst file,
  since it describes an outdated procedure

Co-authored-by: Axel Huebl <a.huebl@hzdr.de>
Co-authored-by: Omar Padron <omar.padron@kitware.com>
2020-10-30 21:17:15 +01:00
Massimiliano Culpo
33c3c3c700
Config: cache results of get_config (#19605)
`config.get_config` now caches the results and returns the same
configuration if called multiple times with the same arguments
(i.e. the same section and scope).

As a consequence, it is expected that users will always call
update methods provided in the `config` module after changing
the configuration (even if manipulating it as a Python nested
dictionary). The following two examples should cover most
scenarios:

* Most configuration update logic in the core (e.g. relating to
  adding a new compiler) should call `Configuration.update_config`
* Tests that need to change the global configuration should use the
  newly-provided `config.replace_config` function.

(if neither of these methods apply, then the essential requirement
is to use a method marked as `_config_mutator`)

Failure to call such a function after modifying the configuration
will lead to unexpected results (e.g. calling `get_config` after
changing the configuration will not reflect the changes since the
first call to get_config).
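
The pitfall in stripped-down form (a generic caching illustration, not the real `spack.config` code):

```python
# Generic illustration of the caching pitfall described above; not spack.config.
_cache = {}
_backing = {"mirrors": {"spack-public": "https://mirror.example.com"}}

def get_config(section):
    if section not in _cache:
        _cache[section] = dict(_backing[section])
    return _cache[section]

def replace_config(section, new_value):
    _backing[section] = new_value
    _cache.pop(section, None)          # invalidate so the next read is fresh

first = get_config("mirrors")                    # populates the cache
_backing["mirrors"]["foo"] = "https://bar.com"   # change made behind the cache's back
print("foo" in get_config("mirrors"))            # False: the cached copy is stale
replace_config("mirrors", dict(_backing["mirrors"]))
print("foo" in get_config("mirrors"))            # True: the cache was refreshed
```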
2020-10-30 13:10:45 -07:00
Massimiliano Culpo
458d88eaad
Make archspec a vendored dependency (#19600)
- Added archspec to the list of vendored dependencies
- Removed every reference to llnl.util.cpu
- Removed tests from Spack code base
2020-10-30 13:02:14 -07:00
Scott Wittenburg
31f57e56bb
Binary caching: use full hashes (#19209)
* "spack install" now has a "--require-full-hash-match" option, which
  forces Spack to skip an available binary package when the full hash
  doesn't match. Normally only a DAG-hash match is required, which
  ensures equivalent Specs, but does not account for changing logic
  inside the associated package.
* Add a local binary cache index which tracks specs that have a binary
  install available in a remote binary cache. It is updated with
  "spack buildcache list" or for a given spec when a binary package
  is retrieved for that Spec.
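
Conceptually, the check looks something like this (names are invented for illustration; the real logic lives in Spack's binary distribution code):

```python
# Toy illustration of the two matching modes described above (names invented).
def binary_is_usable(local, remote, require_full_hash_match=False):
    if local["dag_hash"] != remote["dag_hash"]:
        return False                  # not an equivalent spec at all
    if require_full_hash_match:
        # also require that the package logic used to build it is unchanged
        return local["full_hash"] == remote["full_hash"]
    return True
```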
2020-10-30 12:53:33 -07:00
Peter Scheibel
3a863020f0
CI: disable vermin check for deprecated hash (#19612)
Spack has a fallback for hash checking with md5sums that may not be
supported in earlier versions of Python 3.x. The comments in the
Spack code acknowledge that this is best effort and may fail, but
recent vermin checks (running as part of our CI) reject this. This
disables vermin checks for that fallback.
2020-10-29 22:23:36 -07:00
Frank Willmore
c954d50998
Oneapi add compiler (#19330)
* enable flatcc to be built with gcc/9.X.X

* add static option for building libyogrt

* cleanup

* Initial working version

* rework new oneapi wrappers

* tested and removed my initials from source

* cleanup

* Update __init__.py

* remove whitespace

* Working now with mods for testing and detection. Detection for oneapi is working, but the entry needs to be modified to add the link path for libimf.so. Cleared cruft for old Intel versions.

* fixed some formatting

* cleanup

* flake8 cleanup

* flake8

* fixed syntax of compiler version detection tests

* fixed syntax of compiler version detection tests

	modified:   detection.py

* fix typo

* fixes for compilers tests

* remove erroneous tests for outdated -std= flags, remove ifx version check (output won't parse)

Co-authored-by: Frank Willmore <willmore@anl.gov>
2020-10-29 16:52:54 -05:00
Todd Gamblin
aebf20ebdc sbang: vendor sbang
`sbang` now lives at https://github.com/spack/sbang, and it has its own
test suite that's more extensive than what's in Spack. We'll leave sbang
tests to sbang from now on, and just vendor `bin/sbang` directly.
Remaining `sbang` tests have to do with patching files, not with
`sbang`'s functionality.

This update also fixes a bug with `sbang` and multiple command line
arguments that was introduced in #19529. See:
  * https://github.com/spack/sbang/pull/1
  * https://github.com/spack/sbang/pull/2

- [x] include latest `sbang` from https://github.com/spack/sbang
- [x] remove old `sbang` tests from Spack
- [x] update `COPYRIGHT` and `cmd/license.py`
2020-10-28 17:43:23 -07:00
Todd Gamblin
ec9456feb8 sbang: convert sbang script to POSIX shell
`sbang` was previously a bash script but did not need to be. This
converts it to a plain old POSIX shell script and adds some options. This
also allows us to simplify sbang shebangs to `#!/bin/sh /path/to/sbang`
instead of `#!/bin/bash /path/to/sbang`.

The new script passes shellcheck (with a few exceptions noted in the file).

- [x] `SBANG_DEBUG` env var enables printing what *would* be executed
- [x] `sbang` checks whether it has been passed an option and fails gracefully
- [x] `sbang` will now fail if it can't find a second shebang line, or if
      the second line happens to be sbang (avoid infinite loops)
- [x] add more rigorous tests for `sbang` behavior using `SBANG_DEBUG`
2020-10-27 13:59:46 -07:00
Toyohisa Kameyama
bb00b1a7c9
sbang: add support for php (#18299)
PHP supports an initial shebang, but its comment syntax can't handle our 2-line
shebangs. So, we need to embed the 2nd-line shebang comment to look like a
PHP comment:

    <?php #!/path/to/php ?>

This adds patching support to the sbang hook and support for
instrumenting php shebangs.

This also patches `phar`, which is a tool used to create php packages.
`phar` itself has to add sbangs to those packages (as phar archives
apparently contain UTF-8, as well as binary blobs), and `phar` sets a
checksum based on the contents of the package.

Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2020-10-26 22:11:43 -07:00
Patrick Gartung
1c2c30a139
sbang: put sbang in the install_tree (#11598)
`sbang` is not always accessible to users of packages, e.g., if Spack
is installed in someone's home directory and they deploy software
for others.  Avoid this by:

1. Always installing the `sbang` script in the `install_tree`
2. Relocating binaries to point to the copy in the `install_tree` 
   and not the one in the Spack installation.

This PR also:
- ensures that `sbang` is reinstalled if it is modified in Spack
- adds tests
- updates the way `gobject-introspection` patches Makefiles
   to support `sbang`

Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2020-10-26 12:37:54 -07:00
Todd Gamblin
fd6c163e02
bugfix: test_push_and_fetch_keys should be skipped w/o gpg (#19511)
- [x] add a `@pytest.mark.skipif` decorator
2020-10-26 07:24:49 -07:00
Todd Gamblin
01953dc513
bugfix: fix config merge order for OrderedDicts (#18482)
The logic in `config.py` merges lists correctly so that list elements
from higher-precedence config files come first, but the way we merge
`dict` elements reverses the precedence.

Since `mirrors.yaml` relies on `OrderedDict` for precedence, this bug
causes mirrors in lower-precedence config scopes to be checked before
higher-precedence scopes.

We should probably convert `mirrors.yaml` to use a list at some point,
but in the meantime here's a fix for `OrderedDict`.

- [x] ensuring that keys are ordered correctly in `OrderedDict` by
      re-inserting keys from the destination `dict` after adding the keys from
      the source `dict`.

- [x] also simplify the logic in `merge_yaml` by always reinserting
      common keys -- this preserves mark information without all the special
      cases, and makes it simpler to preserve insertion order.

Assuming a default spack configuration, if we run this:

```console
$ spack mirror add foo https://bar.com
```

Results before this change:

```console
$ spack config blame mirrors
---                                                          mirrors:
/Users/gamblin2/src/spack/etc/spack/defaults/mirrors.yaml:2    spack-public: https://spack-llnl-mirror.s3-us-west-2.amazonaws.com/
/Users/gamblin2/.spack/mirrors.yaml:2                          foo: https://bar.com
```

Results after:

```console
$ spack config blame mirrors
---                                                          mirrors:
/Users/gamblin2/.spack/mirrors.yaml:2                          foo: https://bar.com
/Users/gamblin2/src/spack/etc/spack/defaults/mirrors.yaml:2    spack-public: https://spack-llnl-mirror.s3-us-west-2.amazonaws.com/
```
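
The ordering goal, stripped of YAML details, is roughly this (a simplified illustration, not the actual `merge_yaml` implementation):

```python
# Simplified illustration of the ordering goal; the real merge_yaml also preserves
# YAML marks and handles nested structures.
from collections import OrderedDict

def merge_keeping_precedence(high, low):
    """Keys from the higher-precedence scope come first; remaining keys follow."""
    merged = OrderedDict(high)
    for key, value in low.items():
        merged.setdefault(key, value)
    return merged

defaults = OrderedDict([("spack-public", "https://mirror.example.com")])
user = OrderedDict([("foo", "https://bar.com")])
print(list(merge_keeping_precedence(user, defaults)))  # ['foo', 'spack-public']
```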
2020-10-24 16:48:04 -07:00
Todd Gamblin
2893c23e7c
docs: update docs on shell support and using packages (#19486)
Shell integration no longer requires setting `SPACK_ROOT`, so we can
simplify the documentation on it. The docs on shell support and using
packages are getting a bit old, and information on `spack load` (which
seems to be everyone's most common way of using packages) is hard to
find.

This PR simplifies the shell documentation to remove SPACK_ROOT, and also
moves some sections around for clearer organization.

- [x] make docs on sourcing setup scripts clearer and simpler

- [x] introduce `spack load` early in the basic usage guide instead of
      burying it in the module docs

- [x] clean up module docs so that spack module tcl loads comes later

- [x] be clear about the different ways to use packages so that the users
      can find the docs better.

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2020-10-23 22:16:01 -07:00
Todd Gamblin
560beb098e
csh: don't require SPACK_ROOT for sourcing setup-env.csh (#18225)
Don't require SPACK_ROOT for sourcing setup-env.csh and make output more consistent
2020-10-23 18:54:34 -07:00
Massimiliano Culpo
4ec404dfc0
Add scratch module roots to test configuration (#19477)
fixes #19476

Module file content is written to file in a
temporary location and read back to be analyzed
by unit tests.

The approach to patch "open" and write to a
StringIO in memory has been abandoned, since
over time other operations that rely on the
filesystem have been added to the module file
generator.
2020-10-22 13:59:39 -07:00
Todd Gamblin
8060cca494
tests: increase tolerance of termios tests (#19456)
Synchronization on GitHub macOS runners seems to be very slow, and
frequently the foreground/background tests fail due to the race this
causes. This increases the tolerance for slowness a bit more, to allow up
to 4 spurious output lines in the tests.

This should hopefully result in no more false negatives on these tests
for macOS on GitHub.
2020-10-21 18:12:48 -07:00
Tamara Dahlgren
e78764caa1
Added _poll_lock exception tests (#19446) 2020-10-21 17:32:04 -07:00
iarspider
399ca3b671
Add qgraf (#19404)
* Add recipe for qgraf

* Revert "Add recipe for qgraf"

This reverts commit 76783f73867a32b4a96e980e31a433ed3c0037fd.

* Add qgraf

* Update package.py

Changes from review

* Changes from MR

* Fix for URLs containing @ symbol

Co-authored-by: Ivan Razumov <ivan.razumov@cern.ch>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-10-21 19:16:11 -05:00
Massimiliano Culpo
c696518efd
Skip malformed spec strings when searching for externals (#19438)
fixes #19266

fzf search method has also been updated

Co-authored-by: Tom Scogland <tom.scogland@gmail.com>
2020-10-21 21:35:02 +02:00
Omri Mor
1147220b9b
CMakePackage: added 'ipo' variant (#18374)
+ipo sets CMAKE_INTERPROCEDURAL_OPTIMIZATION=ON
The option is not supported for CMake < 3.9
2020-10-21 11:09:45 +02:00
Massimiliano Culpo
bed929e7a9
Make url_fetch test independent of locale settings (#19390) 2020-10-21 09:18:50 +02:00
Greg Becker
a5faf7d27a
Change 'any' to wildcard for variants (#19381) 2020-10-21 08:05:29 +02:00
Massimiliano Culpo
a51702e3a2
py-archspec: added new package at v0.1.1 (#19395) 2020-10-20 18:40:31 +02:00
GaneshPrasadMA
0253f0af29
Adding AOCC compiler to SPACK community (#19345)
* Adding AOCC compiler to SPACK community

The AOCC compiler system offers a high level of advanced optimizations, multi-threading and processor support that includes global optimization, vectorization, inter-procedural analyses, loop transformations, and code generation. AMD also provides highly optimized libraries, which extract the optimal performance from each x86 processor core when utilized. The AOCC Compiler Suite simplifies and accelerates development and tuning for x86 applications.

* Added unit tests for detection and flags for AOCC

* Addressed reviewers' comments w.r.t. version checks and URL/checksum-related line lengths

Co-authored-by: Test User <spack@example.com>
2020-10-20 10:50:09 -05:00
Greg Becker
bf47045302
Fix python scripts relying on external python in env (#19241) 2020-10-20 10:09:13 +02:00
Scott Wittenburg
f561d3845b
Fix "buildcache update-index --keys ..." when mirror is S3 (#19141) 2020-10-19 15:24:41 -06:00
Sergey Kosukhin
fd7bfb1a50
filter_compiler_wrappers: a fix for NAG (#17133) 2020-10-18 23:18:18 -05:00
elsagermann
4750d479a0
Add testing option to dev-build command (#17293)
* ADD: testing to dev-build command

* RM: mutually exclusive group for testing in parser

* FIX: test option to subparser and not testing

* ADD: spack-completion.bash

* RM: local devbuildcosmo cmd

* FIX: bad merge --drop-in -b --before options forgotten

* FIX: --test place in spack-completion.bash

* FIX: typo

* FIX: blank line removing

* FIX: trailing white space

Co-authored-by: Elsa Germann <egermann@tsa-ln002.cm.cluster>
2020-10-18 23:17:07 -05:00
Wouter Deconinck
9471e9cb03
[docs] Pkg list: current version, not latest release (#18213)
The package list at https://spack.readthedocs.io/en/latest/package_list.html claims "it is automatically generated based on the packages in the latest Spack release" but it is actually based on the develop branch. This leads to confusion when users find that e.g. herwigpp is included in the list, but it cannot be found when they install the latest release. That latest release has a package list at https://spack.readthedocs.io/en/stable/package_list.html which does indeed not include herwigpp.

Changing the language from "the latest Spack release" to "this Spack version" might make that clearer. Maybe.
2020-10-18 23:11:20 -05:00
Jason Miller
25f817b8ae
Fix for buildcache -o (#19354)
* Fix for buildcache -o

The method has more positional arguments than the caller expects.

* Address length issue.

Fix pylint/flake errors.
2020-10-18 19:40:28 -05:00
Omri Mor
bff0291dac
Spec: fix multiple generator iteration in satisfies_dependencies (#18527) (#18559) 2020-10-17 11:40:31 +02:00
Scott McMillan
a612be1c98
New compiler: nvhpc (NVIDIA HPC SDK) (#19294)
* Add nvhpc compiler definition: "spack compiler add" will now look
  for instances of the NVIDIA HPC SDK compiler executables
  (nvc, nvc++, nvfortran) in supplied paths
* Add the nvhpc package which installs the nvhpc compiler
* Add testing for nvhpc detection and C++-standard/pic flags

Co-authored-by: Scott McMillan <smcmillan@nvidia.com>
2020-10-16 14:04:27 -07:00
iarspider
8c7385096e
spack external find: fix debug output (#19342)
Output was, e.g., `Executables in /bin and /,u,s,r,/,b,i,n are both associated with the same spec xz@5.2.2`; it is now `Executables in /bin and /usr/bin are both associated with the same spec xz@5.2.2`.
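
The comma-separated characters are the classic symptom of joining a string rather than a list of strings; a guess at the shape of the bug (illustrative, not the actual diff):

```python
# Illustrative guess at the root cause, not the actual change in this PR.
path = "/usr/bin"
print(",".join(path))     # /,u,s,r,/,b,i,n  -- the string is joined character by character
print(",".join([path]))   # /usr/bin
```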
2020-10-16 16:46:41 +02:00
Toyohisa Kameyama
a481087695
autotools: recursively patch config.guess and config.sub (#18347)
Previously config.guess and config.sub were patched only
in the root of the source path. 

This modification extend the previous behavior to patch every
config.guess or config.sub file even in subfolders, if need be.

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2020-10-16 11:30:06 +02:00
Greg Becker
7a6268593c
Environments: specify packages for developer builds (#15256)
* allow environments to specify dev-build packages

* spack develop and spack undevelop commands

* never pull dev-build packages from bincache

* reinstall dev_specs when code has changed; reinstall dependents too

* preserve dev info paths and versions in concretization as special variant

* move install overwrite transaction into installer

* move dev-build argument handling to package.do_install

now that specs are dev-aware, package.do_install can add
necessary args (keep_stage=True, use_cache=False) to dev
builds. This simplifies driving logic in cmd and env._install

* allow 'any' as wildcard for variants

* spec: allow anonymous dependencies

raise an error when constraining by or normalizing an anonymous dep
refactor concretize_develop to remove dev_build variant
refactor tests to check for ^dev_path=any instead of +dev_build

* fix variant class hierarchy
2020-10-15 17:23:16 -07:00
Tomoki, Karatsu
72f431b67b
autotools.py: fix the list of objects to be removed from libtool (Fujitsu). (#19303) 2020-10-14 19:01:49 +02:00
Massimiliano Culpo
b84812256d
autotools: add attribute to delete libtool archives .la files (#18850)
* autotools: add attribute to delete libtool archives .la files

According to Autotools Mythbuster (https://autotools.io/libtool/lafiles.html)
libtool archive files are mostly vestigial, but they might create issues
when relocating binary packages as shown in #18694.

For GCC specifically, most distributions remove these files with
explicit commands:

https://git.stg.centos.org/rpms/gcc/blob/master/f/gcc.spec#_1303

Considered all of that, this commit adds an easy way for each
AutotoolsPackage to remove every .la file that has been installed.
The default, for the time being, is to maintain them - to be consistent
with what Spack was doing previously.

* autotools: delete libtool archive files by default

Following review this commit changes the default for
libtool archive files deletion and adds test to verify
the behavior.
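
On disk, the deletion amounts to roughly the following (an illustrative sketch; the real hook is wired into the Autotools build system class and controlled per package as described above):

```python
# Rough sketch of removing libtool archives from an install prefix; illustrative only.
import os

def remove_libtool_archives(prefix):
    for root, _dirs, files in os.walk(prefix):
        for name in files:
            if name.endswith(".la"):
                os.remove(os.path.join(root, name))
```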
2020-10-13 09:15:48 -07:00
Massimiliano Culpo
2399c2e78d
autotools: refactor search paths for aclocal in its own method (#19258)
This commit refactors the computation of the search path
for aclocal in its own method, so that it's easier to reuse
for packages that need to have a custom autoreconf phase.

Co-authored-by: Toyohisa Kameyama <kameyama@riken.jp>
2020-10-12 16:35:52 +02:00
Tomoki, Karatsu
db16f3e0d4
autotools.py: removed some options from libtool only for Fujitsu. (#19217) 2020-10-12 14:42:49 +02:00
Adam J. Stewart
372ac4a073
Add testing for Python 3.9 (#19261) 2020-10-11 21:16:00 -07:00
Adam J. Stewart
759e8ee4c2
CUDA: update maintainers (#19262) 2020-10-11 21:11:17 -07:00
谭九鼎
5e9f4dc982
Use https for links (#19244) 2020-10-09 11:24:09 -05:00
Scott Wittenburg
438f80d19e
Revert binary distribution cache manager (#19158)
This reverts #18359 and follow-on PRs intended to address issues with
#18359 because that PR changes the hash of all specs. A future PR will
reintroduce the changes.

* Revert "Fix location in spec.yaml where we look for full_hash (#19132)"
* Revert "Fix fetch of spec.yaml files from buildcache (#19101)"
* Revert "Merge pull request #18359 from scottwittenburg/add-binary-distribution-cache-manager"
2020-10-05 16:02:37 -07:00
Scott Wittenburg
7cfdf61979
Fix location in spec.yaml where we look for full_hash (#19132)
When we attempt to determine whether a remote spec (in a binary mirror)
is up-to-date or needs to be rebuilt, we compare the full_hash stored in
the remote spec.yaml file against the full_hash computed from the local
concrete spec.  Since the full_hash moved into the spec (and is no longer
at the top level of the spec.yaml), we need to look there for it.  This
oversight from #18359 was causing all specs to get rebuilt when the
full_hash wasn't found at the expected location.
2020-10-02 15:37:47 -06:00
Scott Wittenburg
a44135dccf
Update buildcache key index when we update the package index (#19117)
This changes makes sure that when we run the pipeline job that updates
the buildcache package index on the remote mirror, we also update the
key index.  The public keys corresponding to the signing keys used to
sign the package was pushed to the mirror as a part of creating the
buildcache index, so this is just ensuring those keys are reflected
in the key index.

Also, this change makes sure the "spack buildcache update-index"
job runs even when there may have been pipeline failures, since we
would like the index always to reflect the true state of the mirror.
2020-10-02 11:00:42 -06:00
Scott Wittenburg
d3d98075c5
Fix fetch of spec.yaml files from buildcache (#19101)
Since those files currently exist in buildcaches (in S3 buckets) with
potentially different content types, we should be less restrictive in
what content types we accept when attempting to fetch them.  This PR
removes the content type constraint so any file with the matching
name will be found.
2020-10-01 14:32:43 -06:00
Patrick Gartung
a2795519df
Binary caching: avoid duplicate RPATHs, unnecessary updates (#19061)
* Remove duplication of reconstructed RPATHs caused by multiple
  identical entries in prefixes dictionary
* Don't rewrite RPATHs if relative RPATHs are unchanged because the
  directory layout is unchanged
2020-10-01 13:21:02 -07:00
Patrick Gartung
a380ceb139
Buildcache: Need to check the binary is not a Mach-o binary in a linux package or an ELF binary in a macOS package. (#18670)
* Need to check the binary is not a Mach-o binary in a linux package or an ELF binary in a macOS package.

* use sys.platform

* Darwin -> darwin for sys.platform
2020-10-01 12:39:59 -05:00
Scott Wittenburg
075c3e0d92
Merge pull request #18359 from scottwittenburg/add-binary-distribution-cache-manager
Add binary distribution cache manager
2020-09-30 16:37:35 -06:00
Axel Huebl
8edb31e934
CUDA: added v11.1.0 (#19036)
Compiler conflicts have been updated accordingly
2020-09-29 11:38:56 +02:00
David Beckingsale
04771ad9f8
Fixup conflicts for CUDA 11.0.2 and GCC (#19035)
* Fixup conflicts for CUDA 11.0.2 and GCC
* Updates for ppc64le
* Fix missing "or newer"

Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>
2020-09-28 18:01:20 -07:00
Tomoki, Karatsu
b76189a2e5
autotools: patch 'libtool' recursively in subdirectories (#18620)
Previous version was doing it only in the root build directory.
2020-09-28 10:53:56 +02:00
Adam J. Stewart
c8ac61979b
Fix usage of builtin file as variable name (#19018) 2020-09-28 07:14:39 +02:00
Omar Padron
2d93154119
Streamline key management for build caches (#17792)
* Rework spack.util.web.list_url()

list_url() now accepts an optional recursive argument (default: False)
for controlling whether to only return files within the prefix url or to
return all files whose path starts with the prefix url.  Allows for the
most efficient implementation for the given prefix url scheme.  For
example, only recursive queries are supported for S3 prefixes, so the
returned list is trimmed down if recursive == False, but the native
search is returned as-is when recursive == True.  Suitable
implementations for each case are also used for file system URLs.

* Switch to using an explicit index for public keys

Switches to maintaining a build cache's keys under build_cache/_pgp.
Within this directory is an index.json file listing all the available
keys and a <fingerprint>.pub file for each such key.

 - Adds spack.binary_distribution.generate_key_index()
   - (re)generates a build cache's key index

 - Modifies spack.binary_distribution.build_tarball()
   - if tarball is signed, automatically pushes the key used for signing
     along with the tarball
   - if regenerate_index == True, automatically (re)generates the build
     cache's key index along with the build cache's package index; as in
     spack.binary_distribution.generate_key_index()

 - Modifies spack.binary_distribution.get_keys()
   - a build cache's key index is now used instead of programmatic
     listing

 - Adds spack.binary_distribution.push_keys()
   - publishes keys from Spack's keyring to a given list of mirrors

 - Adds new spack subcommand: spack gpg publish
   - publishes keys from Spack's keyring to a given list of mirrors

 - Modifies spack.util.gpg.Gpg.signing_keys()
   - Accepts optional positional arguments for filtering the set of keys
     returned

 - Adds spack.util.gpg.Gpg.public_keys()
   - As spack.util.gpg.Gpg.signing_keys(), except public keys are
     returned

 - Modifies spack.util.gpg.Gpg.export_keys()
   - Fixes an issue where GnuPG would prompt for user input if trying to
     overwrite an existing file

 - Modifies spack.util.gpg.Gpg.untrust()
   - Fixes an issue where GnuPG would fail for input that were not key
     fingerprints

 - Modifies spack.util.web.url_exists()
   - Fixes an issue where url_exists() would throw instead of returning
     False

* rework gpg module/fix error with very long GNUPGHOME dir

* add a shim for functools.cached_property

* handle permission denied error in gpg util

* fix tests/make gpgconf optional if no socket dir is available
2020-09-25 12:54:24 -04:00
Greg Becker
f616422fd7
refactor install_tree to use projections format (#18341)
* refactor install_tree to use projections format

* Add update method for config.yaml

* add test for config update config
2020-09-25 11:15:49 -05:00
Greg Becker
5565b6494d
typo (#18845) 2020-09-21 11:54:23 -05:00
Massimiliano Culpo
fcb4dfc307
Ensure variant defaults are parsable from CLI. (#18661)
- Add a unit test to check if there are unparsable defaults
- Fix 'rust' and 'nsimd' variants
2020-09-19 07:54:26 +02:00
Greg Becker
7585b37865
do out of source builds in hashed directories (#18574) 2020-09-18 12:21:13 -07:00
Greg Becker
2e4892c111
env view failures: print underlying error message (#18713) 2020-09-18 10:21:14 -07:00
Scott Wittenburg
28ef5b1204 Do not assume we sit in the directory where the env file lives. 2020-09-14 10:37:42 -06:00
Scott Wittenburg
031490f6aa Remove :<name> interpolation, add SPACK_VERSION variables
Also fix issues with documentation to reflect changes
2020-09-14 10:37:42 -06:00
Scott Wittenburg
bf90cdd6c7 Document pipeline keys which can be global but overridden
Update pipelines documentation to describe how 'tags', 'variables',
'image', 'before_script', 'script', and 'after_script' can be
supplied at the top level, to be used by any of the runner mappings,
and also overridden by any of the runner mappings.

Also show an example of capturing the custom spack SHA at pipeline
generation time, so all jobs are sure to run with the same version
of spack, as a means to illustrate the $env:VARIABLE_NAME syntax.
2020-09-14 10:37:42 -06:00
Scott Wittenburg
d9e0718c9d Allow overridable global runner attributes 2020-09-14 10:37:42 -06:00
Scott Wittenburg
e686f1500e Update pipeline documentation to describe user-provided scripts 2020-09-14 10:37:42 -06:00