This patchset refactors our GitHub actions into a single top-level ci workflow that
invokes a series of reusable workflows. The main goal of this is to be able to easily
control which tests run and in what order based on the success or failure of top-level
prechecks. Our previous workflows ran in three sets:
* nix tests: style and verification first, then linux and macos tests if successful
* windows tests: style and verification first, then windows tests if successful
* bootstrap tests
As a result, the bootstrap tests ran even if the style failed, and style and verification
had to run on two different platforms despite running identical checks. I'm relatively
sure that's because of the limitation on dependencies between steps in the jobs.
Reusable workflows allow us to run the style, verification and now audit checks once,
then depending on the results, and the files changed, run the appropriate nix, windows
and bootstrap tests. While it saves only a few minutes by itself, this makes it easier to
refactor checks to subset tests without having to replicate tests or other workflow
components in the future.
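A minimal sketch of the shape this takes (the workflow file names and job names below are illustrative, not necessarily the exact ones in the repo):
```yaml
name: ci
on:
  pull_request:
  push:
    branches: [develop]

jobs:
  # prechecks (style, verification, audits) run once, on one platform
  prechecks:
    uses: ./.github/workflows/valid-style.yml
  # platform tests only run if the prechecks succeeded
  unit-tests:
    needs: [prechecks]
    uses: ./.github/workflows/unit_tests.yaml
  windows:
    needs: [prechecks]
    uses: ./.github/workflows/windows_python.yml
  bootstrap:
    needs: [prechecks]
    uses: ./.github/workflows/bootstrap.yml
```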
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Move the copying of the buildcache to a root job that runs after all the child
pipelines have finished, so that the operation can be coordinated across all
child pipelines to remove the possibility of race conditions during potentially
simultaneous copies. This lets us ensure the .spec.json.sig and .spack files
for any spec in the root mirror always come from the same child pipeline
mirror (though which pipeline is arbitrary). It also lets us avoid copying
duplicates, which we now do.
If you have an environment like
```
$ cat spack.yaml
spack:
  specs: [openmpi@4.1.0+cuda]
```
this PR provides a new command `spack change` that you can use to adjust environment specs from the command line:
```
$ spack change openmpi~cuda
$ cat spack.yaml
spack:
  specs: [openmpi@4.1.0~cuda]
```
in other words, this allows you to tweak the details of environment specs from the command line.
Notes:
* This is only allowed for environments that do not define matrices
* Extending this to matrices is possible, but not anticipated to be needed immediately
* If that were done, it should probably only be done for "named"/not-anonymous specs (i.e. we can change `openmpi+cuda` but not a spec like `+cuda` or `@4.0.1~cuda`)
fixes #31484
Before this change if anything was matching an external
condition, it was considered "external" and thus something
to be "built".
This was happening in particular to external packages
that were re-read from the DB, which then couldn't be
reused, causing the problems shown in #31484.
This PR fixes the issue by excluding specs with a
"hash" from being considered "external"
* Test that users have a way to select a virtual
This ought to be solved by extending the "require"
attribute to virtual packages, so that one can:
```yaml
mpi:
  require: 'multi-provider-mpi'
```
* Prevent conflicts from being enforced on specs that can be reused.
* Rename the "external_only" fact to "buildable_false", to better reflect its origin
* Preliminary support for include URLs in spack.yaml (environment) files
This commit adds support in environments for external configuration files obtained from a URL, with a preference for grabbing raw text from GitHub and GitLab for efficient downloads of the relevant files. The URL can also be a link to a directory that contains multiple configuration files.
Remote configuration files are retrieved and cached for the environment. Configuration files with the same name will not be overwritten once cached.
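As a rough illustration of the intended usage (the URL below is a placeholder, not a real file):
```yaml
spack:
  # remote includes are fetched and cached alongside the environment
  include:
  - https://raw.githubusercontent.com/<org>/<repo>/develop/configs/packages.yaml
  specs: []
```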
Extend the semantics of package requirements to
allow using them also under a virtual package
attribute in packages.yaml.
These requirements are enforced whenever that
virtual spec is present in the DAG.
Allow users to express default requirements in packages.yaml.
These requirements are overridden if more specific requirements
are present for a given package.
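For illustration, a packages.yaml combining both cases might look roughly like this (the particular specs are just examples):
```yaml
packages:
  # default requirement, overridden by any more specific requirement below
  all:
    require: '%gcc'
  # requirement attached to a virtual package, enforced whenever an mpi
  # provider is present in the DAG
  mpi:
    require: 'openmpi'
```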
This support requires adding the '--tests' option to 'spack ci rebuild'.
Packages whose stand-alone tests are broken (in the CI environment) can
be configured in gitlab-ci to be skipped by adding them to
broken-tests-packages.
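For reference, skipping a known-broken package might look roughly like this in the environment's gitlab-ci section (the package name is made up):
```yaml
gitlab-ci:
  broken-tests-packages:
  - my-flaky-package
```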
Highlights include:
- Restructured 'spack ci' help to provide better subcommand summaries;
- Ensured only one InstallError (i.e., installer's) rather than allowing
build_environment to have its own; and
- Refactored CI and CDash reporting to keep CDash-related properties and
behavior in a separate class.
This allows stand-alone tests from `spack ci` to run when the `--tests`
option is used. With `--tests`, stand-alone tests are run **after** a
**successful** (re)build of the package. Test results are collected
and reportable using CDash.
This PR adds the following features:
- Adds `-t` and `--tests` to `spack ci rebuild` to run stand-alone tests;
- Adds `--fail-fast` to stop stand-alone tests after the first failure;
- Ensures a *single* `InstallError` across packages
(i.e., removes second class from build environment);
- Captures skipping tests for externals and uninstalled packages
(for CDash reporting);
- Copies test logs and outputs to the CI artifacts directory to facilitate
debugging;
- Parses stand-alone test results to report outputs from each `run_test` as
separate test parts (CDash reporting);
- Logs a test completion message to allow capture of timing of the last
`run_test` part;
- Adds the runner description to the CDash site to better distinguish entries
in CDash tables;
- Adds `gitlab-ci` `broken-tests-packages` to CI configuration to skip
stand-alone testing for packages with known issues;
- Changes `spack ci --help` so description of each subcommand is a single line;
- Changes `spack ci <subcommand> --help` to provide the full description of
each command (versus no description); and
- Ensures the `junit` test log file name ends in an `.xml` extension (versus the
default, where it does not).
Tasks:
- [x] Include the equivalent of the architecture information, or at least the host target, in the CDash output
- [x] Upload stand-alone test results files as `test` artifacts
- [x] Confirm tests are run in GitLab
- [x] Ensure CDash results are uploaded as artifacts
- [x] Resolve issues with CDash build and test results appearing on the same row of the table
- [x] Add unit tests as needed
- [x] Investigate why some (dependency) packages don't have test results (e.g., related from other pipelines)
- [x] Ensure proper parsing and reporting of skipped tests (as `not run`) after the #28701 merge
- [x] Restore the proper CDash URL and/or mirror once out-of-band testing is completed
* modified list.py and added functionality for --tag
* Removed long and very long, shifted rest of code above return statement
* removed results variable
* added import statement at top
* added the line accidentally deleted
* added line accidentally deleted
* changed p.name to p, added line inside if statement
* line order switched
* [@spackbot] updating style on behalf of sparkyniner
* ran update completion command
* add tests
* Update lib/spack/spack/test/cmd/list.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* [@spackbot] updating style on behalf of sparkyniner
* changed argument to mock_packages and moved code under filter by tag
* removed bad rebase code and added additional test
* [@spackbot] updating style on behalf of sparkyniner
* added line removed earlier
* added line removed earlier
* replaced function
* added more recommended changes
Co-authored-by: sairaj <sairaj@sairajs-MacBook-Pro.local>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
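For reference, the new `--tag` filter for `spack list` is used along these lines (the tag name is just an example of one that packages may carry):
```
$ spack list --tag radiuss
```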
Assertions without messages create a blank error message for users if/when they are hit.
This PR adds error messages to all assertions in asp.py, even
if it seems unlikely they will ever be needed.
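A minimal illustration of the pattern (made-up names, not the actual asp.py code):
```python
# Without a message, a failing assert surfaces as a bare AssertionError;
# with one, the user at least gets a hint about what went wrong.
candidates = ["gcc@12.1.0"]  # stand-in for some internal solver state
assert candidates, "internal error: no compiler candidates were generated"
```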
The argument is very likely a typo, and was meant to
be given to the fixture decorator. Since the value
being passed is the default, let's just remove it.
Ensure that build tools with module-level commands in spack use
the version built as part of their build graph if one exists.
This is now also required for meson, scons, cmake and ctest; out-of-graph
versions of these tools in PATH will not be found unless added as an external.
This bug appeared because a new version of rocprim needs cmake 3.16, while
I have 3.14 in my PATH. I had added an external for cmake 3.20 to the DAG,
but 3.14 was still used to configure rocprim, causing it to fail. As far as
I can tell, all the build tools added in build_environment.py had this
problem, despite the fact that they should have been resolving these tools
by name with a path search and finding the one in the DAG that way. I'm
still investigating why the path searching and Executable logic didn't do
it, but this makes three of the build systems much more explicit, and leaves
only gmake and ninja as dependencies from out in the system, while ensuring
the version in the DAG is used if there is one.
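As a sketch of the recipe-side half of this (a hypothetical package, not the actual change): declaring the build tool as a dependency is what puts the right version in the DAG, and with this patchset that version is also the one exposed to the build.
```python
from spack.package import *


class MyRocmPackage(CMakePackage):
    """Hypothetical package used only to illustrate the point."""

    # Requiring cmake in the build graph means the build now configures with
    # the DAG's cmake (e.g. an external 3.20) rather than an older one on PATH.
    depends_on("cmake@3.16:", type="build")
```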
The additional sqlite version is to perturb the hash of python to
work around a relocation bug which will be fixed in a subsequent
PR.
* filesystem: use lstat in recursive mtime
When a `develop` path contains a dead symlink, the `os.stat` in the recursive `mtime` determination trips up over it.
Closes #32165.
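The idea, roughly (a standalone sketch, not the actual filesystem.py code):
```python
import os


def newest_mtime(root):
    # lstat() reports on the link itself, so a dangling symlink no longer
    # raises the FileNotFoundError that os.stat() hits when following it.
    newest = os.lstat(root).st_mtime
    for dirpath, dirnames, filenames in os.walk(root):
        for name in dirnames + filenames:
            newest = max(newest, os.lstat(os.path.join(dirpath, name)).st_mtime)
    return newest
```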
`requirement_policy/3` is generated and may not be in Spack's inputs to Clingo.
Currently this is causing warnings like:
```
$ spack spec zlib
/global/u2/t/tgamblin/src/spack/lib/spack/spack/solver/concretize.lp:510:3-43: info: atom does not occur in any rule head:
requirement_policy(Package,X,"one_of")
/global/u2/t/tgamblin/src/spack/lib/spack/spack/solver/concretize.lp:517:3-43: info: atom does not occur in any rule head:
requirement_policy(Package,X,"one_of")
/global/u2/t/tgamblin/src/spack/lib/spack/spack/solver/concretize.lp:523:3-43: info: atom does not occur in any rule head:
requirement_policy(Package,X,"any_of")
/global/u2/t/tgamblin/src/spack/lib/spack/spack/solver/concretize.lp:534:3-43: info: atom does not occur in any rule head:
requirement_policy(Package,X,"any_of")
Input spec
--------------------------------
zlib
Concretized
--------------------------------
zlib@1.2.11%gcc@7.5.0+optimize+pic+shared arch=cray-sles15-haswell
```
- [x] Silence warning with `#defined requirement_policy/3`
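The silencing directive is roughly:
```
% declare the atom as possibly undefined so clingo stops warning that it
% "does not occur in any rule head"
#defined requirement_policy/3.
```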
Spack doesn't have an easy way to say something like "If I build
package X, then I *need* version Y":
* If you specify something on the command line, then you ensure
that the constraints are applied, but the package is always built
* Likewise if you `spack add X...` to your environment, the
constraints are guaranteed to hold, but the environment always
builds the package
* You can add preferences to packages.yaml, but these are not
guaranteed to hold (Spack can choose other settings)
This commit adds a 'require' subsection to packages.yaml: the
specs added there are guaranteed to hold. The commit includes
documentation for the feature.
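For illustration, a requirement of this kind might look like the following (the package and spec values are only examples):
```yaml
packages:
  libfabric:
    # if libfabric is built at all, it must satisfy this spec
    require: '@1.13.2'
```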
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
All PRs are failing the docs build on account of an error with
pygments. These errors coincide with a new release of pygments
(2.13.0) and restricting to < 2.13 allows the doc tests to pass,
so this commit enforces that constraint for the docs build.
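The pin amounts to something like this in the docs requirements (the exact file and spelling may differ):
```
pygments <2.13
```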
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
A few attributes in packages are meant to be reserved for
Spack internal use. This audit checks packages to ensure none
of these attributes are overridden.
- [x] add additional audit check
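The new check runs with the rest of the package audits, e.g.:
```
$ spack audit packages
```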
This PR fixes the performance regression reported in #31985 and a few
other issues found while refactoring the spack mirror create command.
Modifications:
* (Primary) Do not require concretization for
`spack mirror create --all`
* Forbid using --versions-per-spec together with --all
* Fixed a few issues when reading specs from input file (specs were
not concretized, command would fail when trying to mirror
dependencies)
* Fix issue with default directory for spack mirror create not being
canonicalized
* Add more unit tests to poke spack mirror create
* Skip externals also when mirroring environments
* Changed the wording slightly for reporting (it was saying
"Successfully created" even in the presence of errors)
* Fix issue with colify (was not called properly during error
reporting)
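For reference, the primary invocation exercised by these changes looks roughly like this (the destination path is illustrative):
```
$ spack mirror create --all -d /path/to/mirror
```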
`LD_LIBRARY_PATH` can break system executables (e.g., when an environment is loaded) and isn't necessary thanks to `RPATH`s. Packages that require `LD_LIBRARY_PATH` can set this in `setup_run_environment`.
- [x] Prefix inspections no longer set `LD_LIBRARY_PATH` by default
- [x] Document changes and workarounds for people who want `LD_LIBRARY_PATH`
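A minimal sketch of the opt-in, as a method on a package whose libraries really do need it (the directory choice is illustrative):
```python
def setup_run_environment(self, env):
    # prefix inspections no longer add this automatically, so a package that
    # truly needs it sets it explicitly here
    env.prepend_path("LD_LIBRARY_PATH", self.prefix.lib)
```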
These changes make many packages build on nixos where nearly nothing
comes from /bin or /usr/bin (the only things in "system locations" are
/bin/sh and /usr/bin/env, all the rest is found through PATH).
Many configuration scripts hardcode /usr/bin/file instead of using the
one from PATH. This patches them to use file from PATH.
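The patches are along these lines (hypothetical recipe; the affected script name varies by package):
```python
from spack.package import *


class SomeAutotoolsPackage(AutotoolsPackage):
    """Hypothetical package used only to illustrate the patch."""

    def patch(self):
        # rewrite the hardcoded path so the configure script finds `file`
        # through PATH instead of /usr/bin/file
        filter_file("/usr/bin/file", "file", "configure")
```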
fixes #31736
Catch errors when concretizing specs and report them as
debug messages. The corresponding spec is skipped.
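Schematically (not the actual code), the behavior is:
```python
import llnl.util.tty as tty


def concretize_or_skip(specs):
    concrete = []
    for spec in specs:
        try:
            concrete.append(spec.concretized())
        except Exception as e:
            # report the failure as a debug message and skip the spec
            tty.debug("Skipping {0}: {1}".format(spec, e))
    return concrete
```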
Co-authored-by: Greg Becker <becker33@llnl.gov>
* Extracted two functions in cmd/install.py
* Extracted a function to perform installation from the active environment
* Rename a few functions, remove `args` from their arguments
* Rework conditional in install_from_active_environment to reduce nesting in the function
* Extract functions to parse specs from the CLI and from files
* Extract functions to get user confirmation for overwrite
* Extract functions to install specs inside and outside environments
* Rename a couple of functions
* Fix outdated comment
* Add missing imports
* Split conditional to dedent one level
* Invert check and exit early to dedent one level when requiring user confirmation