* SV variants are evaluated correctly in `when=` statements (fixes #4113)
The problem here was tricky:
```python
spec.satisfies(other)
```
already changes the MV variants in `other` into SV variants (where
necessary) if `spec` is concrete. If it is not concrete it does
nothing, because we may be acting at a purely syntactic level.
When evaluating a `when=` keyword, `spec` is certainly not concrete,
since we are in the middle of the concretization process. In this case
we have to trigger the substitution in `other` manually, so we don't
end up comparing an MV variant "foo=bar" to an SV variant "foo=bar"
and getting False as the result, which is wrong.
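A rough sketch of the fix, assuming a helper in `variant.py` that rewrites abstract MV variants into their SV form before the comparison (the helper name and call site are illustrative, not the exact Spack code):

```python
# Illustrative sketch only; names are hypothetical.
def when_condition_holds(spec, when_spec):
    # `spec` is mid-concretization and therefore not concrete, so
    # satisfies() will not rewrite MV variants in `when_spec` for us.
    substitute_single_valued_variants(when_spec)  # hypothetical helper
    return spec.satisfies(when_spec)
```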
* SV variants: improved error message for typos in `when=` statements
Modifications:
- added support for multi-valued variants (see the sketch after this list)
- refactored code related to variants into variant.py
- added new generic features to AutotoolsPackage that leverage multi-valued variants
- modified openmpi to use new features
- added unit tests for the new semantics
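A rough sketch of the new directive as a package might use it (the variant name and values mirror openmpi's scheduler support, but the exact definition here is illustrative):

```python
# Hypothetical package excerpt showing a multi-valued variant.
class Openmpi(AutotoolsPackage):

    # Several schedulers can be enabled at once, e.g.
    #   spack install openmpi schedulers=slurm,sge
    variant(
        'schedulers',
        default='slurm',
        description='List of schedulers for which support is enabled',
        values=('slurm', 'sge', 'tm', 'loadleveler'),
        multi=True
    )

    # `when=` conditions can now match a single value of an MV variant.
    depends_on('slurm', when='schedulers=slurm')
```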
This allows people on systems that don't have all the fetchers installed to
still run Spack's tests. Mark tests that require git, subversion, or mercurial
to be skipped if the corresponding tool is not installed.
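A minimal sketch of the skip pattern, assuming the lookup goes through `spack.util.executable.which` (the exact fixture/marker placement in Spack's test suite may differ):

```python
import pytest
from spack.util.executable import which

# Skip every test in this module when git is not available.
pytestmark = pytest.mark.skipif(
    which('git') is None,
    reason='requires git to be installed'
)
```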
* Filter all system paths introduced by dependencies from PATH
* Make sure path filtering works *even* for trailing slashes (see the sketch below)
* Revert some of the changes to `filter_system_paths`
* Yes, `bin64` is a real thing (sigh)
* add tests: /usr, /usr/, /usr/local/../bin, etc.
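A rough sketch of how the filtering can normalize paths so trailing slashes and `..` segments are still caught (illustrative, not the exact `filter_system_paths` implementation):

```python
import os.path

# Illustrative set; the real list covers more prefixes and subdirectories.
SYSTEM_PREFIXES = ['/', '/usr', '/usr/local']
SYSTEM_DIRS = SYSTEM_PREFIXES + [
    os.path.join(p, d)
    for p in SYSTEM_PREFIXES
    for d in ('bin', 'bin64', 'lib', 'lib64', 'include')
]

def filter_system_paths(paths):
    # normpath makes '/usr/', '/usr' and '/usr/local/../bin' comparable.
    return [p for p in paths if os.path.normpath(p) not in SYSTEM_DIRS]
```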
* Convert from rST to Google-style docstrings
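For reference, the target style looks roughly like this (a hypothetical function; the old docstrings used rST field lists such as `:param spec:` and `:returns:`):

```python
def install(self, spec, prefix):
    """Install the package into its prefix.

    Args:
        spec (Spec): the concretized spec for this package
        prefix (str): path to the installation prefix

    Returns:
        None

    Raises:
        InstallError: if the build or install step fails
    """
```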
The required hash of a submodule might point to a non-HEAD
commit of the submodule's current main branch and would hence
lead to a "no such remote ref" error at checkout in
a shallow submodule.
## Motivation
Python installations are both important and unfortunately inconsistent. Depending on the Python version, OS, and the strength of the Earth's magnetic field when it was installed, the name of the Python executable, directory containing its libraries, library names, and the directory containing its headers can vary drastically.
I originally got into this mess with #3274, where I discovered that Boost could not be built with Python 3 because the executable is called `python3` and we were telling it to use `python`. I got deeper into this mess when I started hacking on #3140, where I discovered just how difficult it is to find the location and name of the Python libraries and headers.
Currently, half of the packages that depend on Python and need to know this information jump through hoops to determine the correct information. The other half are hard-coded to use `python`, `spec['python'].prefix.lib`, and `spec['python'].prefix.include`. Obviously, none of these packages would work for Python 3, and there's no reason to duplicate the effort. The Python package itself should contain all of the information necessary to use it properly. This is in line with the recent work by @alalazo and @davydden with respect to `spec['blas'].libs` and friends.
## Prefix
For most packages in Spack, we assume that the installation directory is `spec['python'].prefix`. This generally works for anything installed with Spack, but gets complicated when we include external packages. Python is a commonly used external package (it needs to be installed just to run Spack). If it was installed with Homebrew, `which python` would return `/usr/local/bin/python`, and most users would erroneously assume that `/usr/local` is the installation directory. If you peruse through #2173, you'll immediately see why this is not the case. Homebrew actually installs Python in `/usr/local/Cellar/python/2.7.12_2` and symlinks the executable to `/usr/local/bin/python`. `PYTHONHOME` (and presumably most things that need to know where Python is installed) needs to be set to the actual installation directory, not `/usr/local`.
Normally I would say, "sounds like user error, make sure to use the real installation directory in your `packages.yaml`". But I think we can make a special case for Python. That's what we decided in #2173 anyway. If we change our minds, I would be more than happy to simplify things.
To solve this problem, I created a `spec['python'].home` attribute that works the same way as `spec['python'].prefix` but queries Python to figure out where it was actually installed. @tgamblin Is there any way to overwrite `spec['python'].prefix`? I think it's currently immutable.
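A minimal sketch of how `home` might be computed for an external Python by asking the interpreter itself, rather than trusting the directory of a symlinked executable (the property body and helper usage here are assumptions):

```python
from spack.util.prefix import Prefix

@property
def home(self):
    # Hypothetical sketch: `self.command` is assumed to be an Executable
    # wrapping the python binary selected for this spec.
    real_prefix = self.command(
        '-c', 'import sys; print(sys.prefix)', output=str).strip()
    return Prefix(real_prefix)
```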
## Command
In general, Python 2 comes with both `python` and `python2` commands, while Python 3 only comes with a `python3` command. But this is up to the OS developers. For example, `/usr/bin/python` on Gentoo is actually Python 3. Worse yet, if someone is using an externally installed Python, all 3 commands may exist in the same directory! Here's what I'm thinking:
1. If the spec is for Python 3, try searching for the `python3` command.
2. If the spec is for Python 2, try searching for the `python2` command.
3. If neither is found, try searching for the `python` command (see the sketch below).
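A rough sketch of that lookup order (the helper and its arguments are illustrative):

```python
import os

def python_command(bin_dir, major_version):
    """Pick the interpreter following the search order above."""
    # Prefer the versioned name so an external prefix that ships
    # python, python2 and python3 side by side resolves unambiguously.
    for name in ('python{0}'.format(major_version), 'python'):
        candidate = os.path.join(bin_dir, name)
        if os.path.isfile(candidate):
            return candidate
    raise RuntimeError('no python executable found in ' + bin_dir)
```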
## Libraries
Spack installs Python libraries in `spec['python'].prefix.lib`. Except on openSUSE 13, where it installs to `spec['python'].prefix.lib64` (see #2295 and #2253). On my CentOS 6 machine, the Python libraries are installed in `/usr/lib64`. Both need to work.
The libraries themselves change name depending on OS and Python version. For Python 2.7 on macOS, I'm seeing:
```
lib/libpython2.7.dylib
```
For Python 3.6 on CentOS 6, I'm seeing:
```
lib/libpython3.so
lib/libpython3.6m.so.1.0
lib/libpython3.6m.so -> lib/libpython3.6m.so.1.0
```
Notice the `m` after the version number. Yeah, that's a thing.
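A sketch of how a `libs` property could cope with lib vs. lib64 and the optional ABI suffix, using Spack's `find_libraries` (the exact query is an assumption):

```python
from llnl.util.filesystem import find_libraries

@property
def libs(self):
    # e.g. 'libpython2.7' or 'libpython3.6m'; the trailing 'm' ABI flag
    # means the name cannot simply be built from the version string.
    names = ['libpython{0}{1}'.format(self.version.up_to(2), suffix)
             for suffix in ('m', '')]
    # Search the whole prefix so lib, lib64 and /usr/lib64 layouts all work.
    return find_libraries(names, root=self.home, recursive=True)
```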
## Headers
In Python 2.7, I'm seeing:
```
include/python2.7/pyconfig.h
```
In Python 3.6, I'm seeing:
```
include/python3.6m/pyconfig.h
```
It looks like all Python 3 installations have this `m`. Tested with Python 3.2 and 3.6 on macOS and CentOS 6.
Spack has really nice support for libraries (`find_libraries` and `LibraryList`), but nothing for headers. Fixed.
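A sketch of the header-side counterpart, assuming a `find_headers`/`HeaderList` pair modeled on the library helpers (the exact API is an assumption based on the description above):

```python
from llnl.util.filesystem import find_headers

@property
def headers(self):
    # Matches include/python2.7/pyconfig.h as well as
    # include/python3.6m/pyconfig.h, wherever the include dir lives.
    return find_headers('pyconfig', root=self.home, recursive=True)
```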
When a compiler was not found, a stacktrace was displayed to the
user because three arguments were being substituted into a string
with only two format placeholders.
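In other words, the failing pattern had this shape (illustrative strings, not the actual message):

```python
# Two placeholders, three arguments: %-formatting raises
# "TypeError: not all arguments converted during string formatting".
msg = 'Compiler %s not found for %s'
msg % ('gcc@7.1.0', 'linux-x86_64', 'extra-argument')
```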
Fixes #4026. #1167 updated Database.reindex to keep old installation
records in order to support external packages. However, when a user
manually removed a prefix and reindexed, those records were kept, so
the packages still appeared as installed according to "spack find",
etc. This adds a check for non-external packages to ensure they are
properly installed according to the directory layout.
- add Version.__format__ to support new-style formatting (see the
  sketch below).
- Python 3 doesn't handle this well -- it delegates to
object.__format__(), which raises an error for fancy format strings.
- not sure why it doesn't call str(self).__format__ instead, but that's
how things are.
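A minimal sketch of the delegation (the class body is illustrative, not the full Version implementation):

```python
class Version(object):
    def __init__(self, string):
        self.string = string

    def __str__(self):
        return self.string

    def __format__(self, format_spec):
        # Without this, format(v, '>10') falls back to object.__format__,
        # which raises TypeError for any non-empty spec in Python 3.
        return str(self).__format__(format_spec)

print('{0:>10}'.format(Version('3.6.1')))  # '     3.6.1'
```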
* Properly ignore flake8 F811 redefinition errors
* Add unit tests for flake8 command
* Allow spack flake8 to work on systems with older git
* Skip flake8 unit tests for Python 2.6 and 3.3
* correctly treats a change from `explicit=False` to `explicit=True` in an external package DB entry.
* added unit tests
* fixed issues raised by @tgamblin. In particular, the PR no longer changes the hash of packages that are not external.
* added a test to check the correctness of a spec/yaml round-trip for specs that involve an external package
* Don't find external module path at each step of concretization
* it's not necessary: the paths are retrieved at the end of concretization
* Don't find replacements for external packages.
* Test root of the DAG if external
* No reason not to test whether the root of the DAG is external now that
external packages are first-class citizens!
* Create `external` property for Spec (for external_path and external_module)
* Allow users to specify external package paths relative to spack
* Canonicalize external package paths so that users may specify their
locations relative to spack's directory.
* Update tests to use new external_path and external properly.
* skip license hooks on external
- Spack doesn't need eggs -- it manages its own directories
- Simplify the install layout and reduce sys.path searches by installing all
packages flat (eggs are deprecated in favor of wheels, and a flat layout is
also what wheels use).
- We now supply the --single-version-externally-managed argument to
`setup.py install` for setuptools packages and for setuptools itself
(see the sketch after this list).
- modify packages to only use setuptools args if setuptools is an
immediate dependency
- Remove setuptools from packages that do not need it.
- Some packages use setuptools *only* when certain args (like
'develop' or 'bdist') are supplied to setup.py, and they specifically
do not use setuptools for installation.
- Spack never calls setup.py this way, so just removing the setuptools
dependency works for these packages.
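A rough sketch of the resulting install call for a setuptools-based package (the argument handling here is an assumption; `python` stands for the interpreter Executable available to extensions):

```python
# Hypothetical excerpt from a package's install() method.
def install(self, spec, prefix):
    args = ['setup.py', 'install', '--prefix={0}'.format(prefix)]
    # Only pass the setuptools-specific flag when setuptools is among the
    # dependencies (the real check is for an *immediate* dependency);
    # plain-distutils packages reject it.
    if 'py-setuptools' in spec:
        args += ['--single-version-externally-managed', '--record=record.txt']
    python(*args)
```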
* fetch git submodules recursively
This is useful if the submodules have submodules themselves. On
the other hand, a recursive update doesn't hurt if there
is only one level.
* fetch submodules with depth=1 as well (fix #2190)
* use git submodule with depth only for git>=1.8.4 (see the sketch below)
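A sketch of how the fetcher might assemble the submodule arguments, gating `--depth` on the git version as described (the helper is illustrative):

```python
from spack.version import Version

def submodule_args(git_version):
    args = ['submodule', 'update', '--init', '--recursive']
    # `git submodule update --depth` is only supported from git 1.8.4 on.
    if git_version >= Version('1.8.4'):
        args.extend(['--depth', '1'])
    return args
```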
- `spack uninstall` would previously fail if it could not load the package
for the thing being uninstalled.
- This reworks uninstall to handle cases where the package is no longer
known, e.g.:
a) the package has been renamed or is no longer in Spack
b) the repository the package came from is no longer registered in
repos.yaml
- gcc on macOS says it's version 4.2.1, but it's really clang, and it's
actually the *same* clang as the system clang.
- It also doesn't respond with a full path when called with
--print-file-name=libstdc++.dylib, which is expected from gcc in abi.py.
Instead, it gives a relative path and _gcc_compiler_compare doesn't
understand what to do with it. This results in errors like:
```
lib/spack/spack/abi.py, line 71, in _gcc_get_libstdcxx_version
libpath = os.readlink(output.strip())
OSError: [Errno 2] No such file or directory: 'libstdc++.dylib'
```
- This commit does two things:
1. Ignore any gcc that's actually clang in abi.py. We can probably do
better than this, but it's not clear there is a need to, since we
should handle the compiler as clang, not gcc.
2. Don't auto-detect any "gcc" that is actually clang anymore. Ignore
it and expect people to use clang (which is the default macOS
compiler anyway).
Users can still add fake gccs to their compilers.yaml if they want, but
it's discouraged.
* Checksum code wasn't opening binary files as binary (see the sketch below).
- Fixes Python 3 issue where files are opened as unicode text by default,
and decoding fails for binary blobs.
* Simplify fetch test parametrization.
* add tests for URL fetching and checksumming
  - fix coverage on interface functions in the FetchStrategy superclass
  - add some extra crypto tests
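For the binary-mode fix above, the safe pattern is to hash byte chunks from a file opened with `'rb'` (a generic sketch, not the exact Spack crypto code):

```python
import hashlib

def checksum(path, algo='md5', blocksize=2 ** 20):
    hasher = hashlib.new(algo)
    # 'rb' is essential: on Python 3 a text-mode open would try to decode
    # the bytes as UTF-8 and fail on arbitrary binary blobs.
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(blocksize), b''):
            hasher.update(chunk)
    return hasher.hexdigest()
```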