Environment.concretize returns newly-concretized specs rather than
printing them; as a result, the _display argument is removed from
Environment.concretize (originally only used to avoid printing specs
during unit testing). Command logic that invokes
Environment.concretize now prints the returned specs explicitly.
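The resulting command-side pattern looks roughly like this (a minimal sketch; the exact return shape of `concretize()` is an assumption here):

```python
# Hypothetical command-side pattern; the (user_spec, concrete_spec) return
# shape is an assumption for illustration.
def concretize_and_display(env):
    newly_concretized = env.concretize()
    for user_spec, concrete_spec in newly_concretized:
        # printing now happens in the command, not inside concretize()
        print(concrete_spec.tree())
```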
This updates the Spack QT package to enable building qt version 4 on
MacOS.
This includes the following changes to the qt package:
* add version 4.8.7
* add option to build with or without shared libs
* add options to disable tools, ssl, sql, and freetype support
* add qt4-tools patch when building qt@4+tools
* add option to build as a framework (only available on MacOS)
* replace qt4-el-capitan patch with qt4-mac patch (which includes the
edits from qt4-el-capitan)
* apply qt4-pcre-include-conflict.patch only for version 4.8.6
(rather than all 4.x versions)
* apply qt4-gcc-and-webkit.patch for 4.x versions before 4.8.7 and
create a separate qt4-gcc-and-webkit-487.patch for version 4.8.7
* update patch function for qt@4 on MacOS to update configure
variables relevant to Spack (e.g. PREFIX)
* add option to build freetype with Spack, as a vendored dependency
of QT, or not at all (default is to build with Spack)
This includes the following edits outside of the qt package:
* Update MacOS version utility function to return all parts of the
Mac version (rather than just the first two)
* gettext package: implement "libs"
* python package: add gettext as a dependency
* Raise an exception and exit with a meaningful message when binary path substitution fails.
* Skip binary text replacement with padding and issue a warning when the new install path is longer than the old install path.
- We don't currently make enough use of the maintainers field on
packages, though we could use it to assign reviews.
- add a command that allows maintainers to be queried
- can ask who is maintaining a package or packages
- can ask what packages users are maintaining
- can list all maintained or unmaintained packages
- add tests for the command
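As a rough illustration of the kinds of queries involved (the helper names and the repository interface below are assumptions, not the command's actual implementation):

```python
# Hypothetical helpers sketching the two query directions.
def maintainers_of(repo, pkg_names):
    """Map each package name to its (possibly empty) maintainers list."""
    return {name: getattr(repo.get_pkg_class(name), 'maintainers', [])
            for name in pkg_names}

def packages_maintained_by(repo, user):
    """List packages whose maintainers field mentions the given user."""
    return [name for name in repo.all_package_names()
            if user in getattr(repo.get_pkg_class(name), 'maintainers', [])]
```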
* Added a unit test reproducing the failure in #12085
* Fixed name clash in the 'from_environment_diff' function
The bug reported in #12085 stemmed from a name clash among variables,
introduced during the refactor in #10753 and not caught by unit tests
and reviews.
- Setting specs from lockfiles was not correctly stringifying concretized
user specs.
- Fix `_set_user_specs_from_lockfile`
- Add some validation code to `SpecList` constructor
Spack has evolved to have three types of hash functions, and it's
becoming hard to tell when each one is called. While we aren't yet ready
to get rid of them, we can refactor them so that the code is clearer and
easier to track.
- Add a `hash_types` module with concise descriptors for hashes.
- Consolidate hashing logic in a private `Spec._spec_hash()` function.
- `dag_hash()`, `build_hash()`, and `full_hash()` all call `_spec_hash()`
- `to_node_dict()`, `to_dict()`, `to_yaml()` and `to_json()` now take a
`hash` parameter consistent with the one that `_spec_hash()` requires.
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
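A condensed sketch of the structure this refactor aims for (descriptor fields and hashing details below are simplified assumptions, not the exact `hash_types` contents):

```python
import hashlib
from collections import namedtuple

# Each descriptor says which dependency types to include and whether the
# package hash participates in the digest.
SpecHashDescriptor = namedtuple('SpecHashDescriptor', ['deptype', 'package_hash'])

dag_hash_desc = SpecHashDescriptor(deptype=('link', 'run'), package_hash=False)
build_hash_desc = SpecHashDescriptor(deptype=('build', 'link', 'run'), package_hash=False)
full_hash_desc = SpecHashDescriptor(deptype=('build', 'link', 'run'), package_hash=True)

class Spec(object):
    def __init__(self, name, deps=None, package_hash='pkg-hash'):
        self.name = name
        self.deps = deps or {}            # deptype -> list of dependency names
        self.package_hash = package_hash

    def to_node_dict(self, hash=dag_hash_desc):
        """Serialize only what the descriptor asks for."""
        node = {'name': self.name,
                'dependencies': {t: self.deps.get(t, []) for t in hash.deptype}}
        if hash.package_hash:
            node['package_hash'] = self.package_hash
        return node

    def _spec_hash(self, hash):
        """Single place where node data is turned into a digest."""
        return hashlib.sha1(repr(self.to_node_dict(hash)).encode()).hexdigest()

    def dag_hash(self):
        return self._spec_hash(dag_hash_desc)

    def build_hash(self):
        return self._spec_hash(build_hash_desc)

    def full_hash(self):
        return self._spec_hash(full_hash_desc)
```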
- ensure that Spec._build_hash attr is defined
- add logic to compute _build_hash when it is not already computed (e.g. for specs prior to this PR)
- add test to ensure that different instances of a build dep are preserved
- test conversion of old env lockfile format to new format
- tests: avoid view creation, DAG display in tests using MockPackage
- add regression test for more-general bug also fixed by this PR
- update lockfile version since the way we are maintaining hashes has changed
- write out backup for version-1 lockfiles and test that
The database and mutable_database fixtures were installing and uninstalling the same specs multiple times to ensure the test database is in the correct state.
This commit optimizes the procedure by caching that state in an external directory and copying it in, instead of going through the installation or uninstallation again.
The database fixture is meant not to be modified by tests. This commit enforces this invariant by making the database read-only before starting the test.
* Added missing db markers to tests
* Added test for uninstall_by_spec
* `database` fixture now returns a read-only database
* Tests that modify the DB now use `mutable_database` fixture
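A simplified pytest sketch of the caching idea; the fixture bodies and the `populate_store` helper below are illustrative assumptions, not the real fixtures:

```python
import os
import shutil
import stat
import pytest

def populate_store(path):
    # stand-in for the expensive part: the real fixtures install mock specs
    path.mkdir(parents=True, exist_ok=True)
    (path / 'index.json').write_text('{}')

@pytest.fixture(scope='session')
def store_cache(tmp_path_factory):
    """Run the installs once per session and cache the resulting store."""
    cache = tmp_path_factory.mktemp('cache') / 'store'
    populate_store(cache)
    return cache

@pytest.fixture
def mutable_database(tmp_path, store_cache):
    """Writable copy of the cached store, for tests that modify it."""
    store = tmp_path / 'store'
    shutil.copytree(str(store_cache), str(store))
    return store

@pytest.fixture
def database(mutable_database):
    """Read-only view: tests using this fixture must not modify the DB."""
    for root, _, files in os.walk(str(mutable_database)):
        for name in files:
            os.chmod(os.path.join(root, name), stat.S_IRUSR)
    return mutable_database
```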
Summary:
- Allow multiple definitions of a compiler in compilers.yaml (use the first instance)
- Still print debug messages when there are duplicates, to assist users in finding this issue.
Merging configs from different scopes can result in multiple compilers being present in the same configuration list. Instead of raising when there are duplicates, take the one with the highest precedence.
Print a debug message instead of raising, so that we can still diagnose this. We don't have a good way of warning the user about inconsistent configuration *in the same file* -- we'd need to dig into YAML file/line info for that.
- [x] Add shell tests to ensure that `spack env activate`, `spack env
deactivate`, and `despacktivate` continue to work.
- [x] Also ensure that activate and deactivate both work with `set -u`
* extends mkdirp with permissions for intermediate folders (see the sketch
after this list)
Does not use the os.makedirs mode parameter because its behavior changed
with Python 3.7 (it ignores the mode for intermediate dirs), and moreover it
was not possible to set different modes for the newly-created intermediate
folders and the leaf folder.
reference:
- https://bugs.python.org/issue19930
- https://docs.python.org/3.7/library/os.html#os.makedirs
* comment the mkdirp step to ease code understanding
* revert mkdir to the default mode for the package metapath
since the metapath is nested in the package folder, there is no need
to specify permissions for intermediate folders because the prefix
already exists.
* comment the package modes in create_install_directory
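A simplified sketch of the extended mkdirp behavior referenced above (argument names are assumptions, not the real signature):

```python
import os

def mkdirp(path, mode=None, intermediate_mode=None):
    """Create `path` and any missing parents, applying `intermediate_mode`
    to newly-created parent directories and `mode` to the leaf directory."""
    missing = []
    current = os.path.abspath(path)
    while current and not os.path.exists(current):
        missing.append(current)
        parent = os.path.dirname(current)
        if parent == current:          # reached the filesystem root
            break
        current = parent

    # create parents top-down with the intermediate mode
    for directory in reversed(missing[1:]):
        os.mkdir(directory)
        if intermediate_mode is not None:
            os.chmod(directory, intermediate_mode)

    # finally the leaf directory with its own mode
    if missing:
        os.mkdir(missing[0])
        if mode is not None:
            os.chmod(missing[0], mode)
```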
The bug relates to the interplay between:
1. random dict orders in Python 3.5
2. a bugfix in the initial implementation of stacks for `_concretize_dependencies`
when `self._dependencies` is empty
3. a bug in the coconcretization algorithm's computation of split specs
The result was a transient hang in coconcretization.
Fixed item 3 (the bug in coconcretization) to resolve the hang.
- remove redundant code in Environment.__init__
- use socket.gethostname() instead of which('hostname')
- refactor updating SpecList references
- refactor 'specs' literals to a single variable for default list name
- use six.string_types for python 2/3 compatibility
* from_sourcing_file: fixed a bug + added a few ignored variables
closes #7536
Credit for this change goes to mgsternberg (original author of #7536)
The new variables being ignored are specific to Modules v4.
* Use Spack Executable in 'EnvironmentModifications.from_sourcing_file'
Using this class avoids duplicating lower level logic to decode
stdout and handle non-zero return codes
* Extracted a function that returns the environment after sourcing files
The logic in `EnvironmentModifications.from_sourcing_file` has been
simplified by extracting a function that returns a dictionary with the
environment one would have after sourcing the files passed as argument.
* Further refactoring of EnvironmentModifications.from_sourcing_file
Extracted a function that sanitizes a dictionary removing keys that are
blacklisted, but keeping those that are whitelisted. Blacklisting and
whitelisting can be done on literals or regex.
Extracted a new factory that creates an instance of
EnvironmentModifications from a diff of two environments.
* Added unit tests
* PS1 is blacklisted + more readable names for some variables
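A compact sketch of the sanitize-and-diff idea; the function names and signatures below are assumptions, not the exact helpers added by this change:

```python
import re

def sanitize(environment, blacklist=(), whitelist=()):
    """Drop keys matching a blacklist entry unless they also match a
    whitelist entry; entries may be literal names or regular expressions."""
    def matches(key, patterns):
        return any(re.match(pattern + '$', key) for pattern in patterns)

    return {key: value for key, value in environment.items()
            if not matches(key, blacklist) or matches(key, whitelist)}

def environment_diff(before, after):
    """Return the variables that were set, unset, or modified by sourcing."""
    new = {k: v for k, v in after.items() if k not in before}
    unset = [k for k in before if k not in after]
    modified = {k: v for k, v in after.items()
                if k in before and before[k] != v}
    return new, unset, modified
```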
All documentation mentions that `build_jobs` is limited by the number of
cores available in the system. This is also enforced when setting it via
`--jobs`. However, when setting it via `config.yaml`, it can exceed the
number of cores available, making builds run out of memory.
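The intended behavior amounts to a clamp along these lines (config plumbing omitted; this is a sketch, not the actual implementation):

```python
import multiprocessing

def effective_build_jobs(configured_jobs):
    # never exceed the number of cores, regardless of where the value came from
    return min(configured_jobs, multiprocessing.cpu_count())
```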
This PR adds the ability to specify the auto-dispatch targets that can
be used by the Intel compilers. The `-ax` flag will be written to the
respective compiler configuration files. This ability is very handy when
wanting to build optimized builds for various architectures. This PR
does not set any optimization flags, however.
Fixes #3690
Fixes #5637
Uninstalling dependents of a spec was relying on a traversal of the
parents done by inspecting spec._dependents. This is in turn a
DependencyMap, which maps a package name to a single DependencySpec object
(an edge in the DAG) and thus cannot model the case where a spec has
multiple configurations of the same parent package installed (for
example if different versions of the same Python library depend on
the same Python installation).
This commit works around this issue by constructing the list of specs to
be uninstalled in an alternative way, and adds tests to verify the
behavior. The core issue with DependencyMap is not resolved here.
The default library search for a package checks the lib/ and lib64/
directories for libraries before the root prefix, in order to save
time when searching for libraries provided by externals (which e.g.
may have '/usr/' as their root).
This moves that logic into the "find_libraries" utility method so
packages implementing their own custom library search logic can
benefit from it.
This also updates packages which appear to be replicating this logic
exactly, replacing it with a single call to "find_libraries".
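A rough sketch of the search order, simplified relative to the real `find_libraries` signature:

```python
import fnmatch
import os

def find_libraries_ordered(names, root, shared=True):
    """Look in root/lib and root/lib64 first; only fall back to a full
    recursive search of the prefix when nothing is found there."""
    suffix = '.so' if shared else '.a'
    wanted = [name + suffix for name in names]

    def search(directory, recursive):
        found = []
        if not os.path.isdir(directory):
            return found
        if recursive:
            for dirpath, _, files in os.walk(directory):
                found += [os.path.join(dirpath, f) for f in files
                          if any(fnmatch.fnmatch(f, w) for w in wanted)]
        else:
            found += [os.path.join(directory, f) for f in os.listdir(directory)
                      if any(fnmatch.fnmatch(f, w) for w in wanted)]
        return found

    for subdir in ('lib', 'lib64'):
        hits = search(os.path.join(root, subdir), recursive=False)
        if hits:
            return hits
    return search(root, recursive=True)   # expensive for roots like /usr
```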
Fixes#11782
Spack was not properly resolving relative paths to absolute paths
when a relative path was passed to "spack compiler add [PATH]".
Now, if provided a relative path, the absolute path is written to
compilers.yaml rather than the relative path.
* Add template creation test
* Added --skip-editor option to "spack create": normally
"spack create" opens an editor for the user after generating a
package file; when the --skip-editor option is used, "spack create"
only generates the package file and does not open an editor
* Added --skip-editor option to bash completion
- Namespaces were shown without dots after the new format strings were
added.
- Add a test for `spack find` to ensure that find -N shows the right
output.
Fixes #11781
* Rename build log to spack-build-log.txt
* Rename environment variables file to spack-build-env.txt
* The name of the log and env files is now the same during the build
and after the build completes
* Update packages which referred to the build log/env files
* For packages installed before this commit using older names for the
build and env files, search for the older names
- Fix a bug introduced by removing parse_anonymous_spec()
- Conflicts' `when=` specs are now *actually* anonymous, and the name of the
package is implicit, so we need to remember to add it back to error
messages.
- `parse_anonymous_spec()` is a vestige of the days when Spack didn't
support nameless specs. We don't need it anymore because now we can
write Spec() for a spec that will match anything, and satisfies()
semantics work properly for anonymous specs.
- Delete `parse_anonymous_spec()` and replace its uses with simple calls
to the Spec() constructor.
- make the handling of when='...' specs in directives more consistent.
- clean up Spec.__contains__()
- refactor directives and tests slightly to accommodate the change.
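An illustrative use of the new idiom (usage only, not the directive code itself):

```python
from spack.spec import Spec

anything = Spec()              # anonymous spec: satisfied by any spec
constraint = Spec('%gcc@7:')   # anonymous constraint, as in a when='...' string

def constraint_applies(pkg_spec, when_spec):
    # satisfies() now handles anonymous specs, so no name fix-up is needed
    return pkg_spec.satisfies(when_spec)
```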
- CNL OS previously used the *Cray PE* version to determine the OS
version. Cray does not synchronize PE and CLE releases; you can run
CLE7 with PrgEnv 6 (and NERSC currently does).
- Fix Spack's OS detection by using the cle-release file to detect the OS
version. This file is updated with every CLE OS release.
- Add some tests for our parsing logic
Add an example of a 'modules:' entry for an external package in
packages.yaml. The 'External Packages' section of 'Build
Customization' mentions 'paths:' and 'modules:' and gives an
example of paths, but not modules.
Fixes #11816
Allow packages to refer to non-expanded downloads (e.g. a single
script) using Stage.archive_file. This addresses a regression from
#11688 and adds a unit test for it.
This change reverts to the previous behavior of only looking for pgcc
and friends, not pgcc-llvm and friends.
The llvm variant doesn't support all the same features as the
traditional variant of the pgi code generator; this change avoids
treating the llvm variant as a default pgi compiler.
This retains the changes in #10704 which accept the "LLVM" suffix of
the version string of the PGI compiler, which allows users to
explicitly add the llvm-pgi compiler if desired.
For resources, it is desirable to use the expanded archive name of
the resource as the name of the directory when adding it to the root
staging area.
#11528 established 'spack-src' as the universal directory where
source files are placed, which also affected the behavior of
resources managed with Stages.
This adds a new property ('srcdir') to Stage to remember the name of
the expanded source directory, and uses this as the default name when
placing a resource directory in the root staging area.
This also:
* Ensures that downloaded sources are archived using the expanded
archive name (otherwise Spack will not be able to determine the
original directory name when using a cached archive).
* Updates the working_dir context manager to guarantee restoration of the
original working directory when an exception occurs (see the sketch after
this list)
* Adds a "temp_cwd" context manager which creates a temporary
directory and sets it as the working directory
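A simplified sketch of the two context managers mentioned above:

```python
import contextlib
import os
import shutil
import tempfile

@contextlib.contextmanager
def working_dir(dirname):
    """Change into dirname and always restore the original cwd,
    even if the body raises."""
    original = os.getcwd()
    os.chdir(dirname)
    try:
        yield
    finally:
        os.chdir(original)

@contextlib.contextmanager
def temp_cwd():
    """Create a temporary directory, make it the cwd, and clean it up."""
    tmp_dir = tempfile.mkdtemp()
    try:
        with working_dir(tmp_dir):
            yield tmp_dir
    finally:
        shutil.rmtree(tmp_dir, ignore_errors=True)
```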
The regression test for #11678 fails on at least some Mac OS systems
because they have a /usr/bin/gcc that is secretly clang.
This PR replaces the dependency on a system gcc executable with a
test-generated script that generates the expected output for the
compiler logic.
Some tests introduced in #11528 temporarily set the user's `config:build_stage`, which affected (or created) a config.yaml file in the user's `$HOME/.spack` directory that could leave entries behind if the tests fail.
This change ensures only temporary configuration files are used/affected by these tests.
The "spack location" command was previously untested. This also adds
a check to ensure that composite Stages can report whether they were
expanded (this property was previously only recorded in Stage but not
in CompositeStage).
DIYStage, used to treat a user-managed directory as a staging area,
should always be considered expanded (i.e. the source has been
decompressed if it was stored in an archive).
This also:
* Adds checks to ensure that the path used to instantiate a
DIYStage refers to an existing directory.
* Adds tests to check the behavior of DIYStage (including behavior
added here, but it was generally untested before).
#11528 updated Stage to always store a Package's source in a fixed
directory accessible via `Stage.source_path`. This left behind a
number of packages which were expecting to access the source code
via `Stage.path`. This updates those packages to use
`Stage.source_path` instead.
This also updates the name of the fixed directory: The original name
of the fixed directory was "src", so if an expanded archive created a
"src" directory, then users inspecting the directory structure could
see paths like "src/src" (which wasn't wrong but could be confusing).
Therefore this also updates the name of the fixed directory to
"spack-src".
Fixes #11678
`spack compiler find` was not searching `PATH` when provided with no
arguments. ea7910a updated the API of the search function, but the
command logic was not updated to match. This also
adds a test to ensure that `spack compiler find` will collect
compilers from `PATH`.
"spack module tcl find -r <spec>" (and equivalents for other module
systems) was failing when a dependency was installed in an upstream
Spack instance. This updates the module index to handle locating
module files for upstream Spack installations (encapsulating the
logic in a new class called UpstreamModuleIndex); the updated index
handles the case where a Spack installation has multiple upstream
instances.
Note that if a module is not available locally but we are using the
local package, then we shouldn't use a module (i.e. if the package is
also installed upstream, and there is a module file for it, Spack
should not use that module). Likewise, if we are instance X using
upstreams Y and Z like X->Y->Z, and if we are using a package from
instance Y, then we should only use a module from instance Y. This
commit includes tests to check that this is handled properly.
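The lookup rule amounts to something like the following sketch (the data structures here are assumptions, not the real UpstreamModuleIndex):

```python
def module_for(spec_hash, local_db, local_modules, upstream_instances):
    """Return a module file only from the Spack instance that owns the
    installation; upstream_instances is ordered nearest-first as
    (installation_db, module_index) pairs."""
    if spec_hash in local_db:
        # installed locally: use the local module or none at all, never
        # an upstream's module file for the same package
        return local_modules.get(spec_hash)
    for installation_db, module_index in upstream_instances:
        if spec_hash in installation_db:
            # only the owning upstream's module counts (the X -> Y -> Z rule)
            return module_index.get(spec_hash)
    return None
```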
Spack currently tries to unify everything in the DAG, but this is too strict for build dependencies, where it is fine to build a dependency with a tool whose version conflicts with the version of that tool used in a dependent's build.
To enable a workaround for conflicts among build dependencies, so that users can install in multiple steps to avoid these conflicts, make the following changes:
* Don't apply package dependency constraints for build deps of installed packages
* Avoid applying constraints for installed packages vs. concrete packages
* Mark all dependencies of installed packages as visited in normalization method
* don't remove dependency links for concrete specs in flat_dependencies
Also add tests:
* Update test to ensure that link dependencies of installed packages have constraints applied
* Add test to check for proper handling of transitive dependencies (which is currently not the case)
- spack.compilers.find_compilers now uses a multiprocessing.pool.ThreadPool to execute
system commands for the detection of compiler versions.
- A few memoized functions have been introduced to avoid poking the filesystem multiple
times for the same results.
- Performance is much improved, and Spack no longer fork-bombs the system when doing a `compiler find`
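A stripped-down sketch of the approach; the memoization decorator and the detection details differ in the real code:

```python
import functools
import os
import subprocess
from multiprocessing.pool import ThreadPool

@functools.lru_cache(maxsize=None)
def executables_in(directory):
    """Memoized directory listing, so repeated searches don't re-poke the filesystem."""
    if not os.path.isdir(directory):
        return ()
    return tuple(sorted(os.listdir(directory)))

def detect_version(exe):
    """Run one '<compiler> --version' style command; cheap to parallelize."""
    out = subprocess.run([exe, '--version'], capture_output=True, text=True)
    return exe, out.stdout.splitlines()[0] if out.stdout else None

def detect_versions(candidates, jobs=8):
    pool = ThreadPool(jobs)
    try:
        return pool.map(detect_version, candidates)
    finally:
        pool.terminate()
```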
- We use `spack list --format=html` now, as it is much faster and doesn't
make the docs build take forever.
- Remove `spack list --format=rst` as it is no longer used.