* Add extra version of vc
* Update package.py
* Update package.py
* Update var/spack/repos/builtin/packages/vc/package.py
Co-Authored-By: Adam J. Stewart <ajstewart426@gmail.com>
* Update package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Mark conflicts with binutils on darwin
* Explicitly require binutils bootstrapping and mark conflict with nvptx
* Disable gold variant by default on darwin
* igv: adding package igv
* removing some remaining initial boilerplate
* changing path construction to be more correct
* adding type for the java dependency; also handle the previously forgotten prefix.bin, etc.
* Added IRPF90 package
* PEP8
* SHA256
* Update var/spack/repos/builtin/packages/py-irpf90/package.py
Co-Authored-By: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
libfabric used to install fabtests only when installed using --test. fabtests provides tools that are useful on a running system, so they should always be installed.
* Rewrote the build/install part to always install fabtests alongside libfabric (as sketched below).
* Updated a few fabtests resources.
* Updated the test-related machinery; it now works for most versions.
* Include the tcp and udp fabrics so that the test suite works.
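A minimal sketch of the idea, assuming an `AutotoolsPackage` with the fabtests sources staged as a Spack `resource` (the URL, checksum, and configure flags below are illustrative placeholders, not the actual recipe):

```python
# Illustrative sketch only: always build and install fabtests alongside
# libfabric, rather than gating it behind --test.
from spack import *


class Libfabric(AutotoolsPackage):
    """User-space library implementing the Open Fabrics Interfaces."""

    homepage = "https://libfabric.org"

    # Hypothetical resource: stage the matching fabtests sources next to
    # the libfabric sources.
    resource(name='fabtests',
             url='https://github.com/ofiwg/libfabric/releases/download/v1.9.1/fabtests-1.9.1.tar.bz2',
             sha256='0' * 64,  # placeholder checksum
             placement='fabtests')

    @run_after('install')
    def install_fabtests(self):
        # Build fabtests against the freshly installed libfabric and put
        # its tools into the same prefix, independent of --test.
        with working_dir(join_path(self.stage.source_path, 'fabtests')):
            configure('--prefix={0}'.format(self.prefix),
                      '--with-libfabric={0}'.format(self.prefix))
            make()
            make('install')
```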
Change hpctoolkit's dependency on libunwind from `2018.10.12` to `1.4:`.
In libunwind, 2018.10.12 is going away in favor of 1.4-rc1 (they are
nearly identical commits).
Remove the 'gpu' version. This was a temporary branch that is now
folded into master.
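In Spack's version-range syntax, `1.4:` means 1.4 or newer, so the change amounts to something like the following in hpctoolkit's `package.py` (a sketch of just this directive, not the full dependency list):

```python
# Before: pinned to the temporary snapshot release.
# depends_on('libunwind@2018.10.12')

# After: accept 1.4-rc1 and anything newer ('1.4:' is an open-ended range).
depends_on('libunwind@1.4:')
```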
* py-notebook: make py-setuptools a run dependency
The py-setuptools dependency in py-notebook needs to be a run
dependency. The following error is raised if it is not available in the
run environment:
```
Traceback (most recent call last):
  File "/opt/ssoft/apps/2020.1/linux-centos7-sandybridge/gcc-9.2.0/py-notebook-6.0.1-6usbn4c/bin/jupyter-notebook", line 6, in <module>
    from pkg_resources import load_entry_point
ModuleNotFoundError: No module named 'pkg_resources'
```
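In Spack terms, the fix is roughly the following change in py-notebook's `package.py` (a sketch of the single directive involved):

```python
# Before: setuptools was only a build dependency, so pkg_resources is
# missing when the installed jupyter-notebook entry point runs.
# depends_on('py-setuptools', type='build')

# After: also make it a run dependency, so the entry point's
# 'from pkg_resources import load_entry_point' succeeds at runtime.
depends_on('py-setuptools', type=('build', 'run'))
```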
* Remove extraneous whitespace
* Add extra version of py-jedi
* Update package.py
* Update package.py
Correct dependency types
* Update var/spack/repos/builtin/packages/py-jedi/package.py
Co-Authored-By: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-jedi/package.py
Co-Authored-By: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-jedi/package.py
Co-Authored-By: Adam J. Stewart <ajstewart426@gmail.com>
* Add py-parso package
* Remove boilerplate from py-parso
* Flake-8
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Add extra version of py-isort
* Update package.py
* Update package.py
* Update var/spack/repos/builtin/packages/py-isort/package.py
Co-Authored-By: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-isort/package.py
Co-Authored-By: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-isort/package.py
Co-Authored-By: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-isort/package.py
Co-Authored-By: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Add extra version of py-dask
* Update package.py
* Add extra dependencies for py-dask+distributed
* Update package.py
* Update var/spack/repos/builtin/packages/py-heapdict/package.py
Co-Authored-By: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-heapdict/package.py
Co-Authored-By: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-distributed/package.py
Co-Authored-By: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-distributed/package.py
Co-Authored-By: Adam J. Stewart <ajstewart426@gmail.com>
* Update package.py
* Update package.py
* Update package.py
* Update package.py
* Update var/spack/repos/builtin/packages/py-distributed/package.py
Co-Authored-By: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-distributed/package.py
Co-Authored-By: Adam J. Stewart <ajstewart426@gmail.com>
* Update package.py
* Update var/spack/repos/builtin/packages/py-distributed/package.py
Co-Authored-By: Adam J. Stewart <ajstewart426@gmail.com>
* Update package.py
* Flake-8
* Add patch step for py-distributed
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Fixes #9394. Closes #13217.
## Background
Spack provides the ability to enable/disable parallel builds through two options: the package-level `parallel` setting and the `build_jobs` configuration option. This PR changes the algorithm to allow multiple, simultaneous processes to coordinate the installation of the same spec (and of specs with overlapping dependencies).
The `parallel` (boolean) property sets the default for its package, though the value can be overridden in the `install` method.
Spack's current parallel builds are limited to build tools supporting `jobs` arguments (e.g., Makefiles). The number of jobs actually used is calculated as `min(config:build_jobs, # cores, 16)`, which can be overridden in the package or on the command line (i.e., `spack install -j <# jobs>`).
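Schematically, the effective job count might be computed like this (a toy restatement of the formula above, not the exact code in Spack):

```python
import multiprocessing


def effective_build_jobs(parallel, build_jobs, cli_jobs=None):
    """Toy restatement: how many -j jobs a build actually gets."""
    if not parallel:
        return 1                 # package opted out of parallel builds
    if cli_jobs is not None:
        return cli_jobs          # explicit 'spack install -j <# jobs>'
    return min(build_jobs, multiprocessing.cpu_count(), 16)


print(effective_build_jobs(parallel=True, build_jobs=64))  # capped at 16
```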
This PR adds support for distributed (single- and multi-node) parallel builds. The goals of this work include improving the efficiency of installing packages with many dependencies and reducing the repetition associated with concurrent installations of (dependency) packages.
## Approach
### File System Locks
Coordination between concurrent installs of overlapping packages to a Spack instance is accomplished through bottom-up dependency DAG processing and file system locks. The runs can be a combination of interactive and batch processes affecting the same file system. An exclusive prefix lock is required to install a package, while a shared prefix lock suffices to check whether the package is installed.
Failures are communicated through a separate exclusive prefix failure lock (for concurrent processes) combined with a persistent store (for separate, related build processes). The resulting file contains the failing spec to facilitate manual debugging.
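A minimal sketch of the shared/exclusive prefix-lock idea, using plain `fcntl` byte-range locks on a per-prefix lock file (the real implementation uses Spack's own lock classes plus the failure store described above):

```python
import fcntl
import os


class PrefixLock:
    """Toy shared/exclusive lock keyed on an install prefix."""

    def __init__(self, prefix, lock_dir='/tmp/spack-locks'):
        os.makedirs(lock_dir, exist_ok=True)
        # One lock file per prefix; every process derives the same path.
        name = prefix.strip('/').replace('/', '-')
        self._fd = os.open(os.path.join(lock_dir, name),
                           os.O_RDWR | os.O_CREAT)

    def acquire_write(self):
        # Exclusive: needed to actually install into the prefix.
        fcntl.lockf(self._fd, fcntl.LOCK_EX)

    def acquire_read(self):
        # Shared: enough to check whether the package is installed.
        fcntl.lockf(self._fd, fcntl.LOCK_SH)

    def release(self):
        fcntl.lockf(self._fd, fcntl.LOCK_UN)


# An installing process holds the exclusive lock for the duration of the
# build; concurrent readers block on the shared lock until it is released.
lock = PrefixLock('/opt/spack/linux-centos7/gcc-9.2.0/zlib-1.2.11')
lock.acquire_write()
# ... build and install into the prefix ...
lock.release()
```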
### Priority Queue
Management of dependency builds changed from reliance on recursion to use of a priority queue where the priority of a spec is based on the number of its remaining uninstalled dependencies.
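Conceptually, ready-to-build specs sort ahead of specs that are still waiting on dependencies; a toy version of such a queue (not Spack's actual installer code) looks like this:

```python
import heapq


class BuildQueue:
    """Toy priority queue: fewer uninstalled dependencies = higher priority."""

    def __init__(self):
        self._heap = []
        self._counter = 0  # tie-breaker so equal priorities stay ordered

    def push(self, spec_name, uninstalled_deps):
        heapq.heappush(self._heap,
                       (uninstalled_deps, self._counter, spec_name))
        self._counter += 1

    def pop(self):
        # Return the spec with the fewest remaining uninstalled dependencies.
        _, _, spec_name = heapq.heappop(self._heap)
        return spec_name


q = BuildQueue()
q.push('zlib', 0)    # leaf package: ready to build immediately
q.push('cmake', 1)   # still waits on one dependency
q.push('hdf5', 3)    # still waits on three dependencies
assert q.pop() == 'zlib'
```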
Using a queue required a change to dependency build exception handling, with the most visible issue being that the `install` method *must* install something in the prefix. Consequently, packages can no longer get away with an install method consisting of `pass`, for example.
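For instance, a trivial (test) package now has to drop at least one file into its prefix to satisfy the post-install `sanity_check_prefix` check; a minimal sketch of such an install method inside a `package.py`:

```python
class Trivial(Package):
    """Illustrative package whose install step has nothing real to build."""

    # ... versions, URLs, etc. omitted ...

    def install(self, spec, prefix):
        # A bare 'pass' would now fail post-install sanity checks with
        # 'Install failed .. Nothing was installed!', so create something.
        mkdirp(prefix.bin)
        touch(join_path(prefix.bin, 'placeholder'))
```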
## Caveats
- This still only parallelizes a single-rooted build. Multi-rooted installs (e.g., for environments) are TBD in a future PR.
Tasks:
- [x] Adjust package lock timeout to correspond to value used in the demo
- [x] Adjust database lock timeout to reduce contention on startup of concurrent
`spack install <spec>` calls
- [x] Replace (test) packages' `install: pass` methods with file creation, since post-install
`sanity_check_prefix` will otherwise error out with `Install failed .. Nothing was installed!`
- [x] Resolve remaining existing test failures
- [x] Respond to alalazo's initial feedback
- [x] Remove `bin/demo-locks.py`
- [x] Add new tests to address new coverage issues
- [x] Replace built-in packages' `def install(..): pass` methods so they "install" something
(i.e., only `apple-libunwind`)
- [x] Increase code coverage
* ENH: add catch2 CMake install
* add a variant allowing catch2 to be installed
via CMake, which is useful for generating a .cmake
config file for consumption by other projects
* Catch2: Simplify Package
- CMake install is also single-header for new releases
- testing triggered by Spack's test mechanism
- default to CMake build (better than the simple copy, which is kept only for installing old releases)
* Catch: Remove Variant
With CMake we can keep all installs quick and complete.
Versions prior to 1.7.0 are installed manually, as the
`make install` target is missing in those releases.
Releases 1.7.0-1.9.3 do not expose control over test builds.
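A rough sketch of how that version split might look in the package (the `BUILD_TESTING` option name, the version boundaries in the conditions, and the single-header path are assumptions for illustration, not the verified recipe):

```python
class Catch2(CMakePackage):
    """Illustrative sketch of the version-dependent install logic."""

    def cmake_args(self):
        args = []
        # Hypothetical: only newer releases expose a switch for test builds;
        # 1.7.0-1.9.3 build whatever their CMakeLists.txt decides.
        if self.spec.satisfies('@1.9.4:'):
            args.append('-DBUILD_TESTING=OFF')
        return args

    # Releases before 1.7.0 have no 'make install' target, so skip the
    # CMake phases entirely and copy the single header by hand.
    @when('@:1.6')
    def cmake(self, spec, prefix):
        pass

    @when('@:1.6')
    def build(self, spec, prefix):
        pass

    @when('@:1.6')
    def install(self, spec, prefix):
        mkdirp(prefix.include)
        install(join_path('single_include', 'catch.hpp'), prefix.include)
```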
* openPMD-api: Catch Lost single_header
... variant is gone :)
Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>