* Parse modules in a way that works for both lmod and tcl
* added test and made method more robust
* refactoring for pythonic clarity
* Improved detection of 'module' shell function + refactored module utilities into spack.util.module_cmd
* Improved regex to reject nested parentheses we are not prepared to handle
* make tests backwards compatible with python 2.6
* Improved regex to account for sh being aliased to bash and used in bash module definition on some systems
* Improve test compatibility with lmod
* Added error for None module_cmd
* Add test for get_module_cmd_from_which()
Add -c argument to Popen call to typeset -f module in get_module_cmd_from_bash().
* Increased detection options
Included the BASH_FUNC_module() environment variable (outside of typeset) as a detection option
This should work on bash even in restricted_shell mode
Kept the typeset detection as an option in case the module function is not exported in bash
Also added try statements to tests, with environment recreation in finally blocks.
* More tests added; some hackiness
* increased test coverage for util/module_cmd
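As a rough sketch (not the actual spack.util.module_cmd code), the two detection options described above boil down to something like this:
```python
import os
import subprocess

def find_module_function():
    """Return the text of the bash `module` shell function, or None."""
    # Option 1: bash exports the function as BASH_FUNC_module() in the
    # environment; this works even in restricted-shell mode.
    exported = os.environ.get('BASH_FUNC_module()')
    if exported:
        return exported

    # Option 2: ask bash to print the function with `typeset -f module`,
    # covering the case where the function is defined but not exported.
    proc = subprocess.Popen(['bash', '-c', 'typeset -f module'],
                            stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    out, _ = proc.communicate()
    out = out.decode('utf-8', errors='replace')
    return out if 'module' in out else None
```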
* Code changes to enable system config scope in /etc
Files will go in either /etc/spack or /etc/spack/<platform>
Required minor changes to conftest.
* Updated documentation to match new config scope
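For illustration, the new scope layout can be expressed roughly like this (a sketch of the precedence described above, not the actual config/conftest code):
```python
import os.path

def system_config_dirs(platform_name):
    """System-level config directories, lowest to highest precedence."""
    system_root = '/etc/spack'
    return [system_root,                               # /etc/spack
            os.path.join(system_root, platform_name)]  # /etc/spack/<platform>

# e.g. system_config_dirs('cray') -> ['/etc/spack', '/etc/spack/cray']
```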
- previous code called `which` on $EDITOR, but that doesn't work for
EDITORs like `emacs -nw` or `emacsclient -t -nw`.
- This patch just trusts EDITOR if it is set (same as previous
behavior), and only uses the defaults if it's not.
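A minimal sketch of that behavior (Python 3's shutil.which stands in for the real lookup; the default editor list is illustrative):
```python
import os
import shutil

def editor_command(defaults=('vim', 'vi', 'emacs', 'nano')):
    editor = os.environ.get('EDITOR')
    if editor:
        return editor.split()      # trust it verbatim, flags and all
    for exe in defaults:           # only probe defaults when EDITOR is unset
        path = shutil.which(exe)
        if path:
            return [path]
    return None
```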
* issue 4492: added xfailing test, added owner to DependencyMap
* DependencyMap.concrete checks if we have unconditional dependencies
This fixes #4492 and #4107 using some heuristics to avoid an infinite
recursion among Spec.satisfies, Spec.concrete and DependencyMap.concrete
The idea, as suggested by @becker33, is to check just for unconditional
dependencies. This is not covering the whole space and a package with
just conditional dependencies can still fail in the same way. It should,
however, cover all the **real** packages we have in our repo so far.
* Check for CRAYPE_VERSION instead of path
Architecture tests would fail on Cray since it would not find
the expected path. To make the test correctly work on Cray search
for the CRAYPE version instead.
* Catch SystemExit error in case flake8 not in path
On shared systems, getting flake8 can involve starting one's own virtual env.
Skip the test if no flake8 is found to avoid failure reporting.
* Add compatibility to 1.5 svnadmin create
The flag added is needed to correctly create svn repos on NERSC systems.
This could be unnecessary for other sites. I'd like to see others
test before this change gets merged.
- Skip spack flake8 test when flake8 is not installed.
- Fix parsing of dashes in specs broken by new help parser.
- use argparse.REMAINDER instead of nargs='?'
- don't interpret parts of specs like -mpi as arguments.
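A small sketch of what argparse.REMAINDER buys here (argument names are illustrative, not the real command definitions):
```python
import argparse

parser = argparse.ArgumentParser(prog='spack')
subparsers = parser.add_subparsers(dest='command')
install = subparsers.add_parser('install')
# REMAINDER passes everything through untouched, so spec fragments that
# begin with a dash (e.g. "-mpi") are not mistaken for options.
install.add_argument('specs', nargs=argparse.REMAINDER)

args = parser.parse_args(['install', 'hdf5', '-mpi', '%gcc@5.4.0'])
assert args.specs == ['hdf5', '-mpi', '%gcc@5.4.0']
```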
* Initial version of the namd package
* Modified charm to support compiling against intel/intel-mpi
* Corrected namd to compile with intel-mkl and the Intel compiler
* Adding include64 in the prefix
* adding property for the build directory
* removing the unused build function
* During install, remove prior unfinished installs
If a user performs an installation which fails, in some cases the
install prefix is still present, and the stage path may also be
present. With this commit, unless the user specifies
'--keep-prefix', installs are guaranteed to begin with a clean
slate. The database is used to decide whether an install finished,
since a database record is not added until the end of the install
process.
* test updates
* repair_partial uses keep_prefix and keep_stage
* use of mock stage object to ensure that stage is destroyed when it should be destroyed (and otherwise not)
* add --restage option to 'install' command; when this option is not set, the default is to reuse a stage if it is found.
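A schematic of the decision (the `database.query` call and names are placeholders for Spack's real API, not the actual implementation):
```python
import os
import shutil

def prepare_prefix(prefix, spec, database, keep_prefix=False):
    # The database, not the mere existence of the prefix, says whether an
    # install finished: a record is only added at the end of the install.
    if database.query(spec):
        return 'already installed'
    if os.path.isdir(prefix) and not keep_prefix:
        shutil.rmtree(prefix)      # remove the partial install
    return 'start from a clean slate'
```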
- Add a `spack gpg` subcommand in anticipation of signed binaries.
- GPG keys are stored in var/spack/gpg, and the spack gpg command manages them.
- Docs are included on the command.
* Touch up string expansion.
I'm chasing this:
```
$ (module purge; spack install perl %gcc/5.4.0)
==> Error: No installed spec matches the hash: '%s'
```
There's something deeper going on, but the error message isn't helpful.
After this change it tells me this:
```
$ (module purge; spack install perl %gcc/5.4.0)
==> Error: No installed spec matches the hash: '5.4.0'
```
Which is weird because `5.4.0` is not a hash... Whatever is going on here, the error message needs to be fixed.
* Flake8 whitespace
* fix parser
* Removed xfails
* cleaned up debug print statements
* make use of these changes in gcc
* Added a comment explaining the unreachable line; the line is left in for added protection
* Sphinx no longer supports Python 2.6
* Update vendored sphinxcontrib.programoutput from 0.9.0 to 0.10.0
* Documentation cannot be built in parallel
* Let Travis install programoutput for us
* Remove vendored sphinxcontrib-programoutput
Recent updates to the sphinx package prevent the vendored version
from being found in sys.path. We don't vendor sphinx, so it doesn't
make sense to vendor sphinxcontrib-programoutput either.
PR #3367 inadvertently changed the semantics of _find_recursive and
_find_non_recursive so that the returned lists were no longer ordered like the
input search list. This commit restores the original semantics and adds
tests to verify it.
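A reduced sketch of the restored ordering semantics (not the real implementation): matches are grouped by the order of the input search list, not by traversal order.
```python
import fnmatch
import os

def find_non_recursive(root, search_list):
    entries = sorted(os.listdir(root))
    matches = []
    for pattern in search_list:        # results follow input order
        matches.extend(os.path.join(root, name) for name in entries
                       if fnmatch.fnmatch(name, pattern))
    return matches
```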
Added DFLAGS to the `make.inc` file being written.
These macros are also added to the language specific variables
like CFLAGS, CXXFLAGS and FCFLAGS. Replaced `spec.satisfies('foo')`
with `'foo' in spec` in `intel-mkl` (see #4135). Added a basic
build interface to `intel-mpi`.
It seems that parse_anonymous_spec may fail if more than one part
(variant, version range, etc.) is given to the function. Added tests to
code against, and fixed the problem in #4144.
- Full help is now only generated lazily, when needed.
- Executing specific commands doesn't require loading all of them.
- All commands are only loaded if we need them for help.
- There is now short and long help:
- short help (spack help) shows only basic spack options
- long help (spack help -a) shows all spack options
- Both divide help on commands into high-level sections
- Commands now specify attributes from which help is auto-generated:
- description: used in help to describe the command.
- section: help section
- level: short or long
- Clean up command descriptions
- Add a `spack docs` command to open full documentation
in the browser.
- move `spack doc` command to `spack pydoc` for clarity
- Add a `spack --spec` command to show documentation on
the spec syntax.
* SV variants are evaluated correctly in `when=` statements (fixes #4113)
The problem here was tricky:
```python
spec.satisfies(other)
```
already changes the MV variants in `other` into SV variants (where
necessary) if `spec` is concrete. If it is not concrete it does
nothing, because we may be acting at a purely syntactical level.
When evaluating a `when=` keyword the spec is definitely not concrete,
as it is in the middle of the concretization process. In this case we
have to trigger the substitution in `other` manually, so we don't end up
comparing an MV variant "foo=bar" to an SV variant "foo=bar" and getting
False in return, which is wrong.
* sv variants: improved error message for typos in "when=" statements
Modifications:
- added support for multi-valued variants
- refactored code related to variants into variant.py
- added new generic features to AutotoolsPackage that leverage multi-valued variants
- modified openmpi to use new features
- added unit tests for the new semantics
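A plain-Python sketch (not Spack's actual classes) of the semantics a multi-valued variant adds: a constraint such as `foo=bar` is satisfied when `bar` is among the values the spec carries for `foo`.
```python
class MultiValuedVariant(object):
    def __init__(self, name, values):
        self.name = name
        self.values = frozenset(values)

    def satisfies(self, other):
        # every value required by the constraint must be present here
        return self.name == other.name and other.values <= self.values

spec_variant = MultiValuedVariant('schedulers', ['slurm', 'pbs'])
constraint = MultiValuedVariant('schedulers', ['slurm'])
assert spec_variant.satisfies(constraint)
```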
This allows people on systems that don't have all the fetchers to still
run Spack tests. Mark tests that require git, subversion, or mercurial to
be skipped if they're not installed.
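A sketch of the skip idiom (shutil.which stands in for Spack's own executable lookup; the test name is illustrative):
```python
import shutil
import pytest

requires_git = pytest.mark.skipif(
    shutil.which('git') is None, reason='requires git to be installed')

@requires_git
def test_git_fetch(tmpdir):
    ...
```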
* Filter all system paths introduced by dependencies from PATH
* Make sure path filtering works *even* for trailing slashes
* Revert some of the changes to `filter_system_paths`
* Yes, `bin64` is a real thing (sigh)
* add tests: /usr, /usr/, /usr/local/../bin, etc.
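A reduced sketch of the filtering, including the trailing-slash, `bin64`, and `/usr/local/../bin` cases mentioned above (the system prefix list is illustrative):
```python
import os

SYSTEM_DIRS = ['/', '/usr', '/usr/local']
SYSTEM_PATHS = [os.path.join(d, sub) for d in SYSTEM_DIRS
                for sub in ('bin', 'bin64', 'lib', 'lib64', 'include')]

def filter_system_paths(paths):
    kept = []
    for path in paths:
        norm = os.path.normpath(path)   # handles trailing '/' and '..'
        if norm not in SYSTEM_PATHS and norm not in SYSTEM_DIRS:
            kept.append(path)
    return kept

assert filter_system_paths(
    ['/usr/bin/', '/opt/spack/bin', '/usr/local/../bin']) == ['/opt/spack/bin']
```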
* Convert from rST to Google-style docstrings
The required hash of a submodule might point to a non-HEAD commit of the
current main branch, and hence would lead to a "no such remote ref" error
at checkout in a shallow submodule.
## Motivation
Python installations are both important and unfortunately inconsistent. Depending on the Python version, OS, and the strength of the Earth's magnetic field when it was installed, the name of the Python executable, directory containing its libraries, library names, and the directory containing its headers can vary drastically.
I originally got into this mess with #3274, where I discovered that Boost could not be built with Python 3 because the executable is called `python3` and we were telling it to use `python`. I got deeper into this mess when I started hacking on #3140, where I discovered just how difficult it is to find the location and name of the Python libraries and headers.
Currently, half of the packages that depend on Python and need to know this information jump through hoops to determine the correct information. The other half are hard-coded to use `python`, `spec['python'].prefix.lib`, and `spec['python'].prefix.include`. Obviously, none of these packages would work for Python 3, and there's no reason to duplicate the effort. The Python package itself should contain all of the information necessary to use it properly. This is in line with the recent work by @alalazo and @davydden with respect to `spec['blas'].libs` and friends.
## Prefix
For most packages in Spack, we assume that the installation directory is `spec['python'].prefix`. This generally works for anything installed with Spack, but gets complicated when we include external packages. Python is a commonly used external package (it needs to be installed just to run Spack). If it was installed with Homebrew, `which python` would return `/usr/local/bin/python`, and most users would erroneously assume that `/usr/local` is the installation directory. If you peruse through #2173, you'll immediately see why this is not the case. Homebrew actually installs Python in `/usr/local/Cellar/python/2.7.12_2` and symlinks the executable to `/usr/local/bin/python`. `PYTHONHOME` (and presumably most things that need to know where Python is installed) needs to be set to the actual installation directory, not `/usr/local`.
Normally I would say, "sounds like user error, make sure to use the real installation directory in your `packages.yaml`". But I think we can make a special case for Python. That's what we decided in #2173 anyway. If we change our minds, I would be more than happy to simplify things.
To solve this problem, I created a `spec['python'].home` attribute that works the same way as `spec['python'].prefix` but queries Python to figure out where it was actually installed. @tgamblin Is there any way to overwrite `spec['python'].prefix`? I think it's currently immutable.
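As a sketch of the `home` idea (one plausible way to query it, not necessarily what the PR does): ask the interpreter itself rather than trusting the symlinked location.
```python
import subprocess

def python_home(python_exe='python'):
    # sys.prefix points at the real installation directory, e.g. the
    # Homebrew Cellar path rather than /usr/local.
    out = subprocess.check_output(
        [python_exe, '-c', 'import sys; print(sys.prefix)'])
    return out.decode('utf-8').strip()
```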
## Command
In general, Python 2 comes with both `python` and `python2` commands, while Python 3 only comes with a `python3` command. But this is up to the OS developers. For example, `/usr/bin/python` on Gentoo is actually Python 3. Worse yet, if someone is using an externally installed Python, all 3 commands may exist in the same directory! Here's what I'm thinking:
1. If the spec is for Python 3, try searching for the `python3` command.
2. If the spec is for Python 2, try searching for the `python2` command.
3. If neither is found, try searching for the `python` command.
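In code, that lookup order might look roughly like this (shutil.which stands in for the real prefix-restricted search):
```python
import shutil

def python_command(major_version):
    preferred = 'python3' if major_version >= 3 else 'python2'
    for name in (preferred, 'python'):
        path = shutil.which(name)
        if path:
            return path
    raise RuntimeError('no python executable found')
```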
## Libraries
Spack installs Python libraries in `spec['python'].prefix.lib`. Except on openSUSE 13, where it installs to `spec['python'].prefix.lib64` (see #2295 and #2253). On my CentOS 6 machine, the Python libraries are installed in `/usr/lib64`. Both need to work.
The libraries themselves change name depending on OS and Python version. For Python 2.7 on macOS, I'm seeing:
```
lib/libpython2.7.dylib
```
For Python 3.6 on CentOS 6, I'm seeing:
```
lib/libpython3.so
lib/libpython3.6m.so.1.0
lib/libpython3.6m.so -> lib/libpython3.6m.so.1.0
```
Notice the `m` after the version number. Yeah, that's a thing.
## Headers
In Python 2.7, I'm seeing:
```
include/python2.7/pyconfig.h
```
In Python 3.6, I'm seeing:
```
include/python3.6m/pyconfig.h
```
It looks like all Python 3 installations have this `m`. Tested with Python 3.2 and 3.6 on macOS and CentOS 6
Spack has really nice support for libraries (`find_libraries` and `LibraryList`), but nothing for headers. Fixed.
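A sketch of why a hard-coded `prefix.lib` cannot work: the directory (`lib` vs `lib64`), the extension (`.so` vs `.dylib`), and the ABI suffix (`m`) all vary, so the lookup has to try the combinations (paths and names here are illustrative):
```python
import glob
import os

def find_python_library(prefix, version):          # version like '3.6'
    for libdir in ('lib', 'lib64'):
        for abi_suffix in ('m', ''):
            pattern = os.path.join(
                prefix, libdir,
                'libpython{0}{1}.*'.format(version, abi_suffix))
            hits = sorted(glob.glob(pattern))
            if hits:
                return hits[0]
    return None
```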
When a compiler was not found, a stack trace was displayed to the user
because three arguments were being substituted into a string with only two
format placeholders.
Fixes #4026. #1167 updated Database.reindex to keep old installation records
to support external packages. However, when a user manually removed a
prefix and reindexed, the records were kept, so the packages still appeared
as installed according to `spack find`, etc. This adds a check
for non-external packages to ensure they are properly installed
according to the directory layout.
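Schematically, the added check looks roughly like this (the metadata directory name is illustrative; external records are kept unconditionally):
```python
import os

def keep_record(prefix, is_external, metadata_dir='.spack'):
    if is_external:
        return True      # externals have no Spack-managed prefix to verify
    return os.path.isdir(os.path.join(prefix, metadata_dir))
```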
- add Version.__format__ to support new-style formatting.
- Python3 doesn't handle this well -- it delegates to
object.__format__(), which raises an error for fancy format strings.
- not sure why it doesn't call str(self).__format__ instead, but that's
how things are.
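A minimal sketch of the workaround: delegate format specs to the string representation instead of letting object.__format__ handle them.
```python
class Version(object):
    def __init__(self, string):
        self.string = string

    def __str__(self):
        return self.string

    def __format__(self, format_spec):
        # object.__format__ raises on fancy specs under Python 3
        return str(self).__format__(format_spec)

assert '{0:>8}'.format(Version('1.2.3')) == '   1.2.3'
```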
* Properly ignore flake8 F811 redefinition errors
* Add unit tests for flake8 command
* Allow spack flake8 to work on systems with older git
* Skip flake8 unit tests for Python 2.6 and 3.3
* treats correctly a change from `explicit=False` to `explicit=True` in an external package DB entry.
* added unit tests
* fixed issues raised by @tgamblin. In particular, the PR no longer changes hashes for packages that are not external.
* added a test to check correctness of a spec/yaml round-trip for things that involve an external
* Don't find external module path at each step of concretization
* it's not necessary; the paths are retrieved at the end of concretization
* Don't find replacements for external packages.
* Test root of the DAG if external
* No reason not to test if the root of the DAG is external when external
packages are now first class citizens!
* Create `external` property for Spec (for external_path and external_module)
* Allow users to specify external package paths relative to spack
* Canonicalize external package paths so that users may specify their
locations relative to spack's directory.
* Update tests to use new external_path and external properly.
* skip license hooks on external
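A sketch of the canonicalization (the Spack root used here is a made-up stand-in):
```python
import os

def canonicalize_external_path(path, spack_root='/opt/spack'):
    path = os.path.expanduser(os.path.expandvars(path))
    if not os.path.isabs(path):
        path = os.path.join(spack_root, path)   # relative to spack's dir
    return os.path.normpath(path)

assert canonicalize_external_path('../sw/openmpi') == '/opt/sw/openmpi'
```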
- Spack doesn't need eggs -- it manages its own directories
- Simplify install layout and reduce sys.path searches by installing all
packages flat (eggs are deprecated for wheels, and this is also what
wheels do).
- We now supply the --single-version-externally-managed argument to
`setup.py install` for setuptools packages and setuptools.
- modify packages to only use setuptools args if setuptools is an
immediate dependency
- Remove setuptools from packages that do not need it.
- Some packages use setuptools *only* when certain args (like
'develop' or 'bdist') are supplied to setup.py, and they specifically
do not use setuptools for installation.
- Spack never calls setup.py this way, so just removing the setuptools
dependency works for these packages.
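Schematically, the install step described above (argument list only; `--record` is required once `--single-version-externally-managed` is passed):
```python
def install_args(prefix, uses_setuptools):
    args = ['install', '--prefix={0}'.format(prefix)]
    if uses_setuptools:
        args += ['--single-version-externally-managed',
                 '--record={0}'.format(prefix + '/record.txt')]
    return args

# e.g. python('setup.py', *install_args(spec.prefix, True))
```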
* fetch git submodules recursively
This is useful if the submodules have submodules themselves. On
the other hand doing a recursive update doesn't hurt if there
is only one level.
* fetch submodules with depth=1 as well (fix#2190)
* use git submodule with depth only for git>=1.8.4
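A sketch of the version-gated command construction (argument spelling is illustrative):
```python
def submodule_update_args(git_version):
    args = ['submodule', 'update', '--init', '--recursive']
    if git_version >= (1, 8, 4):       # --depth needs a new enough git
        args += ['--depth', '1']
    return args

assert '--depth' not in submodule_update_args((1, 7, 1))
```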
- Spack uninstall would previously fail if it could not load a package for
the thing being uninstalled.
- This reworks uninstall to handle cases where the package is no longer
known, e.g.:
a) the package has been renamed or is no longer in Spack
b) the repository the package came from is no longer registered in
repos.yaml
- gcc on macOS says it's version 4.2.1, but it's really clang, and it's
actually the *same* clang as the system clang.
- It also doesn't respond with a full path when called with
--print-file-name=libstdc++.dylib, which is expected from gcc in abi.py.
Instead, it gives a relative path and _gcc_compiler_compare doesn't
understand what to do with it. This results in errors like:
```
lib/spack/spack/abi.py, line 71, in _gcc_get_libstdcxx_version
libpath = os.readlink(output.strip())
OSError: [Errno 2] No such file or directory: 'libstdc++.dylib'
```
- This commit does two things:
1. Ignore any gcc that's actually clang in abi.py. We can probably do
better than this, but it's not clear there is a need to, since we
should handle the compiler as clang, not gcc.
2. Don't auto-detect any "gcc" that is actually clang anymore. Ignore
it and expect people to use clang (which is the default macOS
compiler anyway).
Users can still add fake gccs to their compilers.yaml if they want, but
it's discouraged.
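A sketch of how such a masquerading gcc can be recognized (the real detection lives in Spack's compiler code; this just checks the version banner):
```python
import re
import subprocess

def is_clang_masquerading_as_gcc(gcc_path='gcc'):
    out = subprocess.check_output([gcc_path, '--version'],
                                  stderr=subprocess.STDOUT).decode('utf-8')
    return bool(re.search(r'clang|Apple LLVM', out))
```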
* Checksum code wasn't opening binary files as binary.
- Fixes Python 3 issue where files are opened as unicode text by default,
and decoding fails for binary blobs.
* Simplify fetch test parametrization.
* - add tests for URL fetching and checksumming.
- fix coverage on interface functions in FetchStrategy superclass
- add some extra crypto tests.
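The essence of the fix, as a sketch: open in binary mode and hash bytes so Python 3 never tries to decode the archive as text.
```python
import hashlib

def checksum(path, algo='md5', blocksize=2 ** 20):
    hasher = hashlib.new(algo)
    with open(path, 'rb') as f:               # 'rb', not 'r'
        for block in iter(lambda: f.read(blocksize), b''):
            hasher.update(block)
    return hasher.hexdigest()
```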
* Package install remove prior unfinished installs
Depending on how spack is terminated in the middle of building a
package it may leave a partially installed package in the install
prefix. Originally Spack treated the package as being installed if
the prefix was present, in which case the user would have to
manually remove the installation prefix before restarting an
install. This commit adds a more thorough check to ensure that a
package is actually installed. If the installation prefix is present
but Spack determines that the install did not complete, it removes
the installation prefix and starts a new install; if the user has
enabled --keep-prefix, then Spack reverts to its old behavior.
* Added test for partial install handling
* Added test for restoring DB
* Style fixes
* Restoring 2.6 compatibility
* Relocated repair logic to separate function
* If --keep-prefix is set, package installs will continue an install from an existing prefix if one is present
* check metadata consistency when continuing partial install
* Added --force option to make spack reinstall a package (and all dependencies) from scratch
* Updated bash completion; removed '-f' shorthand for '--force' for install command
* don't use multiple write modes for completion file
* Add tests to mercurial package
* Add support for --insecure with mercurial fetching
* Install man pages and tab-completion scripts
* Add tests and latest version for all deps
* Flake8 fix
* Use certifi module to find CA certificate
* Flake8 fix
* Unset PYTHONPATH when running hg
* svn_fetch should use svn-test, not hg-test
* Drop Python 3 support in Mercurial
Python 3 support is a work in progress and isn't currently
recommended:
https://www.mercurial-scm.org/wiki/SupportedPythonVersions
* Test both secure and insecure hg fetching
* Test both secure and insecure git and svn fetching
`set_executable` now checks whether user/group/other has read permission
on a file and, if so, sets the corresponding executable bit.
See #1483.
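In code, that rule reads roughly like this (a sketch mirroring the description above):
```python
import os
import stat

def set_executable(path):
    mode = os.stat(path).st_mode
    if mode & stat.S_IRUSR:
        mode |= stat.S_IXUSR     # user can read  -> user can execute
    if mode & stat.S_IRGRP:
        mode |= stat.S_IXGRP     # group can read -> group can execute
    if mode & stat.S_IROTH:
        mode |= stat.S_IXOTH     # other can read -> other can execute
    os.chmod(path, mode)
```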
Fixes #2587
The concretizer falls back on using the root architecture (followed
by the default system architecture) to fill in unspecified arch
properties for a spec. It failed to check cases where the root could
be None.
* Remove fake URLs from Spack
* Ignore long lines for URLs that start with ftp:
* Preliminary changes to version regexes
* New redesign of version regexes
* Allow letters in version-only
* Fix detection of versions that end in Final
* Rearrange a few regexes and add examples
* Add tests for common download repositories
* Add test cases for common tarball naming schemes
* Finalize version regexes
* spack url test -> spack url summary
* Clean up comments
* Rearrange suffix checks
* Use query strings for name detection
* Remove no longer necessary url_for_version functions
* Strip off extraneous information after package name
* Add one more test
* Dot in square brackets does not need to be escaped
* Move renaming outside of parse_name_offset
* Fix versions for a couple more packages
* Fix flake8 and doc tests
* Correctly parse Python, Lua, and Bio++ package names
* Use effective URLs for mfem
* Add checksummed version to mitos
* Remove url_for_version from STAR-CCM+ package
* Revert changes to version numbers with underscores and dashes
* Fix name detection for tbb
* Correctly parse Ruby gems
* Reverted mfem back to shortened URLs.
* Updated instructions for better security
* Remove preferred=True from newest version
* Add tests for new `spack url list` flags
* Add tests for strip_name_suffixes
* Add unit tests for version separators
* Fix bugs related to parseable name but unparseable version
* Remove dead code, update docstring
* Ignore 'binary' at end of version string
* Remove platform from version
* Flip libedit version numbers
* Re-support weird NCO alpha/beta versions
* Rebase and remove one new fake URL
* Add / to beginning of regex to avoid picking up similarly named packages
* Ignore weird tar versions
* Fix bug in url parse --spider when no versions found
* Less strict version matching for spack versions
* Don't rename Python packages
* Be a little more selective, version must begin with a digit
* Re-add fake URLs
* Fix up several other packages
* Ignore more file endings
* Add parsing support for Miniconda
* Update tab completion
* XFAILS are now PASSES for 2 web tests
- _spider in web.py was actually failing to spider deeper than a certain
point.
- Fixed multiprocessing pools to not use daemons and to allow recursive
spawning.
- Added detailed tests for spidering and for finding archive versions.
- left some xfail URL finding exercises for the reader.
- Fix noqa annotations for some @when decorators
- Clean up spec_syntax tests: don't depend on DB order.
- spec_syntax hash parsing tests were strongly dependent on the order the
DB was traversed.
- Tests now specifically grab the specs they want from the mock DB.
- Tests are more readable as a result.
- Add Python3 versions to Travis tests.
1. Fix #2807: Can't depend on virtual and non-virtual package
- This is tested by test_my_dep_depends_on_provider_of_my_virtual_dep in
the concretize.py test.
- This was actually working in the test suite, but it depended on the
order the dependencies were resolved in. Resolving non-virtual then
virtual worked, but virtual, then non-virtual did not.
- Problem was that an unnecessary copy was made of a spec that already
had some dependencies set up, and the copy lost half of some of the
dependency relationships. This caused the "can't depend on X twice"
error.
- Fix by eliminating unnecessary copy and ensuring that dep parameter of
_merge_dependency is always safe to own -- i.e. it's a defensive copy
from somewhere else.
2. Fix bug and simplify concretization of deptypes.
- deptypes weren't being accumulated; they were being set on each
DependencySpec. This could cause concretization to get into an infinite
loop.
- Fixed by accumulating deptypes in DependencySpec.update_deptypes()
- Also simplified deptype normalization logic: deptypes are now merged in
constrain() like everything else -- there is no need to merge them
specially or to look at dependents in _merge_dependency().
- Add some docstrings to deptype tests.
- Get rid of pkgsort() usage for preferred variants.
- Concretization is now entirely based on key-based sorting.
- Remove PreferredPackages class and various spec cmp() methods.
- Replace with PackagePrefs class that implements a key function for
sorting according to packages.yaml.
- Clear package pref caches on config test.
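The key-function idea in miniature (preference data here is made up): the position of an entry in the packages.yaml preference list becomes its sort key, and unlisted entries sort last.
```python
preferred_versions = {'gcc': ['7.1.0', '5.4.0']}

def version_key(pkg, version):
    prefs = preferred_versions.get(pkg, [])
    return prefs.index(version) if version in prefs else len(prefs)

candidates = ['5.4.0', '6.3.0', '7.1.0']
assert sorted(candidates, key=lambda v: version_key('gcc', v)) == \
    ['7.1.0', '5.4.0', '6.3.0']
```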
- Explicit compare methods instead of total_ordering in Version.
- Our total_ordering backport wasn't making Python 3 happy for some
reason.
- Python 3's functools.total_ordering and spelling the operators out
fixes the problem.
- Fix unicode issues with spec hashes, json, & YAML
- Try to use str everywhere and avoid unicode objects in python 2.
- Remove ascii encoding assumption from spack_yaml
- proc.communicate() returns bytes; convert to str before adding.
- Fix various byte string/unicode issues for Python 2/3 support
- Need to decode subprocess output as utf-8 in from_sourcing_files.
- Fix comments in strify()
- convert print, StringIO, except as, octals, izip
- convert print statement to print function
- convert StringIO to six.StringIO
- remove usage of csv reader in Spec, in favor of simple regex
- csv reader only does byte strings
- convert 0755 octal literals to 0o755
- convert `except Foo, e` to `except Foo as e`
- fix a few places `str` is used.
- may need to switch everything to str later.
- convert iteritems usages to use six.iteritems
- fix urllib and HTMLParser
- port metaclasses to use six.with_metaclass
- More octal literal conversions for Python 2/3
- Fix a new octal literal.
- Convert `basestring` to `six.string_types`
- Convert xrange -> range
- Fix various issues with encoding, iteritems, and Python3 semantics.
- Convert contextlib.nested to explicitly nested context managers.
- Convert use of filter() to list comprehensions.
- Replace reduce() with list comprehensions.
- Clean up composite: replace inspect.ismethod() with callable()
- Python 3 doesn't have "method" objects; inspect.ismethod returns False.
- Need to use callable in Composite to make it work.
- Update colify to use future division.
- Fix zip() usages that need to be lists.
- Python3: Use line-buffered logging instead of unbuffered.
- Python3 raises an error with unbuffered I/O
- See https://bugs.python.org/issue17404
- Update YAML version to support Python 3
- Python 3 support for ordereddict backport
- Exclude Python3 YAML from version tests.
- Vendor six into Spack.
- Make Python version-check tests work with Python 3
- Add ability to add version check exceptions with '# nopyqver' line
comments.
* Run python setup.py test if --run-tests
* Attempt to import the Python module after installation
* Add testing support to numpy and scipy
* Remove duplicated comments
* Update to new run-tests callback methodology
* Remove unrelated changes for another PR
* perl: make extendable and add Module::Build package
* perl: allow 'spack create' to identify perl packages from their contents
* perl-module-build: fix indenting of package docstring
* perl: split install() method for extensions into phases
* perl: auto-detect build method (Makefile.PL vs Build.PL) and define a 'check' method
* PerlPackage: use import statements similar to those in AutotoolsPackage
* PerlModule: fix detection of Build.PL
* PerlPackageTemplate: remove extraneous lines to avoid flake8 warnings
* PerlPackageTemplate: split into separate templates for Makefile.PL and Build.PL
* PerlPackage: add cross-references to docstrings
* AutotoolsPackage: fix ambiguous cross-references to avoid errors in doc tests
* PerlbuildPackageTemplate: depend on perl-module-build if Build.PL exists
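The auto-detection boils down to something like this sketch (command strings are illustrative, not the exact PerlPackage phases):
```python
import os

def perl_build_system(stage_source_path):
    if os.path.exists(os.path.join(stage_source_path, 'Build.PL')):
        return {'configure': 'perl Build.PL', 'build': './Build',
                'check': './Build test', 'install': './Build install'}
    return {'configure': 'perl Makefile.PL', 'build': 'make',
            'check': 'make test', 'install': 'make install'}
```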
- Spack find would fail with "unknown namespace" for some queries when a
package from an unknown namespace was installed.
- Solve by being conservative: assume unknown packages are NOT providers
of virtual dependencies.
- deactivate -a wouldn't work if the installation's package was no longer
available.
- Fix installed_extensions_for so that it doesn't need to look at the
package.py file.
This fixes the problem described in #3374, which describes `spack find` ignoring the explicit/implicit distinction.
I believe that this was broken in #2626.
This restores the behavior of implicit/explicit for me.
I believe that it does not screw anything else up, but ....
* Order listed compiler sections
"spack compiler list" output compiler sections in an arbitrary order.
With this commit compiler sections are ordered primarily by compiler
name and then by operating system and target.
* Compiler search lists config files with compilers
If a compiler entry is already defined in a configuration file that
the user does not know about, they may be confused when that compiler
is not added by "spack compiler find". This commit adds a message at
the end of "spack compiler find" to inform the user of the locations
of all config files where compilers are defined.
Fixes #1476
Concretization uses compilers defined in config files and if those
are not available defaults to searching typical paths where the
detected operating system would have a compiler. If there is an OS
update, the detected OS can change; in this case all compilers
defined in the config files would no longer match (because they would
be associated with the previous OS version). The error message in
this case was too vague. This commit adds logic for detecting when it
is likely that the OS has been updated (in particular when that
affects compiler concretization) and improves the information provided
to the user in the error message.
* Dont propagate flags between different compilers
Fixes #2786
Previously when a spec had no parents with an equivalent compiler,
Spack would default to adding the compiler flags associated with the
root of the DAG. This eliminates that default.
* added test for compiler flag propagation
* simplify compiler flag propagation logic
Fixes #3428
Users can run 'spack compiler find' to automatically initialize their
compilers.yaml configuration file. It also turns out that Spack will
implicitly initialize the compilers configuration file as part of
detecting compilers if none are found (so if a user were to attempt to
concretize a spec without running 'spack compiler find' it would not
fail). However, in this case Spack was overlooking its own implicit
initialization of the config files and would report that no new
compilers were found. This commit removes implicit initialization when
the user calls 'spack compiler find'.
This did not surface until #2999 because the 'spack compiler' command
defaulted to using a scope 'user/platform' that was not accounted for
in get_compiler_config (where the implicit initialization logic
predates the addition of this new scope); #2999 removed the scope
specification when checking through config files, leading to the
implicit initialization.
Previously, this would fail with a NoSuchMethodError:
```python
class Package(object):
    # this is the default implementation
    def some_method(self):
        ...


class Foo(Package):
    @when('platform=cray')
    def some_method(self):
        ...

    @when('platform=linux')
    def some_method(self):
        ...
```
This fixes the implementation of `@when` so that the superclass method
will be invoked when no subclass method matches.
Adds tests to ensure this works, as well.