* Properly ignore flake8 F811 redefinition errors
* Add unit tests for flake8 command
* Allow spack flake8 to work on systems with older git
* Skip flake8 unit tests for Python 2.6 and 3.3
* correctly handles a change from `explicit=False` to `explicit=True` in an external package DB entry.
* added unit tests
* fixed issues raised by @tgamblin. In particular, the PR no longer changes hashes for packages that are not external.
* added a test to check the correctness of a spec/YAML round-trip for specs that involve an external package
* Don't find external module path at each step of concretization
* it's not necessary: the paths are retrieved at the end of concretization
* Don't find replacements for external packages.
* Test root of the DAG if external
* No reason not to test whether the root of the DAG is external, now that external
  packages are first-class citizens!
* Create `external` property for Spec (for external_path and external_module)
* Allow users to specify external package paths relative to spack
* Canonicalize external package paths so that users may specify their
locations relative to spack's directory.
* Update tests to use new external_path and external properly.
* skip license hooks on external
- Spack doesn't need eggs -- it manages its own directories
- Simplify install layout and reduce sys.path searches by installing all
packages flat (eggs are deprecated in favor of wheels, and installing flat
is also what wheels do).
- We now supply the --single-version-externally-managed argument to
`setup.py install` for setuptools packages and setuptools.
- modify packages to only use setuptools args if setuptools is an
immediate dependency
- Remove setuptools from packages that do not need it.
- Some packages use setuptools *only* when certain args (like
'develop' or 'bdist') are supplied to setup.py, and they specifically
do not use setuptools for installation.
- Spack never calls setup.py this way, so just removing the setuptools
dependency works for these packages.
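As an illustration of the conditional install arguments described in this list, a minimal sketch follows; the function name and dependency-name check are assumptions for the example, not Spack's exact implementation.
```python
# Sketch: add setuptools-only flags to `setup.py install` only when
# setuptools is a direct dependency of the package being installed.
def install_args(prefix, direct_dep_names):
    args = ['--prefix={0}'.format(prefix)]
    if 'py-setuptools' in direct_dep_names:
        # setuptools installs flat (no egg) with this flag; setuptools
        # requires --root or --record alongside it.
        args.extend(['--single-version-externally-managed', '--root=/'])
    return args

# install_args('/opt/prefix', ['python', 'py-setuptools'])
# -> ['--prefix=/opt/prefix', '--single-version-externally-managed', '--root=/']
```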
* fetch git submodules recursively
This is useful if the submodules have submodules themselves. On
the other hand doing a recursive update doesn't hurt if there
is only one level.
* fetch submodules with depth=1 as well (fix #2190)
* use git submodule with depth only for git>=1.8.4
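A rough sketch of the submodule handling these commits describe; the tuple-based version check is a simplification of how the fetcher would gate the `--depth` flag.
```python
# Sketch: always update submodules recursively; pass --depth 1 only when
# git is new enough (shallow submodule clones arrived in git 1.8.4).
def submodule_update_args(git_version):
    args = ['submodule', 'update', '--init', '--recursive']
    if git_version >= (1, 8, 4):
        args.extend(['--depth', '1'])
    return args

# submodule_update_args((2, 7, 0))
# -> ['submodule', 'update', '--init', '--recursive', '--depth', '1']
```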
- Spack uninstall would previously fail if it could not load the package for
  the thing being uninstalled.
- This reworks uninstall to handle cases where the package is no longer
known, e.g.:
a) the package has been renamed or is no longer in Spack
b) the repository the package came from is no longer registered in
repos.yaml
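The gist of the rework, sketched with a hypothetical helper rather than the actual uninstall code: fall back to the database record when the package recipe cannot be loaded.
```python
# Sketch: try to load the recipe for the spec being uninstalled; if the
# package was renamed or its repo was removed, proceed without it.
def package_for_uninstall(spec, repo):
    try:
        return repo.get(spec)
    except Exception:
        # No recipe available (renamed package, unregistered repo);
        # the database entry still tells us which prefix to remove.
        return None
```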
- gcc on macOS says it's version 4.2.1, but it's really clang, and it's
actually the *same* clang as the system clang.
- It also doesn't respond with a full path when called with
--print-file-name=libstdc++.dylib, which is expected from gcc in abi.py.
Instead, it gives a relative path and _gcc_compiler_compare doesn't
understand what to do with it. This results in errors like:
```
lib/spack/spack/abi.py, line 71, in _gcc_get_libstdcxx_version
libpath = os.readlink(output.strip())
OSError: [Errno 2] No such file or directory: 'libstdc++.dylib'
```
- This commit does two things:
1. Ignore any gcc that's actually clang in abi.py. We can probably do
better than this, but it's not clear there is a need to, since we
should handle the compiler as clang, not gcc.
2. Don't auto-detect any "gcc" that is actually clang anymore. Ignore
it and expect people to use clang (which is the default macOS
compiler anyway).
Users can still add fake gccs to their compilers.yaml if they want, but
it's discouraged.
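A minimal sketch of the kind of check involved (the function name is illustrative): inspect `--version` output to see whether a "gcc" binary is really Apple clang.
```python
import re
import subprocess

# Sketch: Apple's /usr/bin/gcc reports "Apple LLVM ... (clang-...)" in its
# version output, which is enough to tell it apart from real GCC.
def gcc_is_really_clang(gcc_path):
    out = subprocess.check_output([gcc_path, '--version'],
                                  stderr=subprocess.STDOUT)
    return bool(re.search(r'clang|LLVM', out.decode('utf-8')))
```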
* Checksum code wasn't opening binary files as binary.
- Fixes Python 3 issue where files are opened as unicode text by default,
and decoding fails for binary blobs.
* Simplify fetch test parametrization.
* - add tests for URL fetching and checksumming.
- fix coverage on interface functions in FetchStrategy superclass
- add some extra crypto tests.
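The binary-mode fix above boils down to opening files with `'rb'` before hashing; a minimal sketch:
```python
import hashlib

# Open in binary mode so Python 3 never tries to decode the bytes as text.
def checksum(path, blocksize=65536):
    hasher = hashlib.md5()
    with open(path, 'rb') as f:
        for block in iter(lambda: f.read(blocksize), b''):
            hasher.update(block)
    return hasher.hexdigest()
```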
* Package install removes prior unfinished installs
Depending on how spack is terminated in the middle of building a
package it may leave a partially installed package in the install
prefix. Originally Spack treated the package as being installed if
the prefix was present, in which case the user would have to
manually remove the installation prefix before restarting an
install. This commit adds a more thorough check to ensure that a
package is actually installed. If the installation prefix is present
but Spack determines that the install did not complete, it removes
the installation prefix and starts a new install; if the user has
enabled --keep-prefix, then Spack reverts to its old behavior.
* Added test for partial install handling
* Added test for restoring DB
* Style fixes
* Restoring 2.6 compatibility
* Relocated repair logic to separate function
* If --keep-prefix is set, package installs will continue an install from an existing prefix if one is present
* check metadata consistency when continuing partial install
* Added --force option to make spack reinstall a package (and all dependencies) from scratch
* Updated bash completion; removed '-f' shorthand for '--force' for install command
* don't use multiple write modes for the completion file
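A much-simplified sketch of the partial-install handling described at the top of this list; the real check consults Spack's install database and metadata rather than the boolean arguments used here.
```python
import os
import shutil

# Sketch: a prefix that exists but is not recorded as installed is treated
# as a failed/partial install and removed, unless --keep-prefix was given.
def prepare_prefix(prefix, installed_in_db, keep_prefix=False):
    if os.path.isdir(prefix) and not installed_in_db:
        if keep_prefix:
            return 'continue'   # continue the install in the existing prefix
        shutil.rmtree(prefix)
    return 'fresh'
```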
* Add tests to mercurial package
* Add support for --insecure with mercurial fetching
* Install man pages and tab-completion scripts
* Add tests and latest version for all deps
* Flake8 fix
* Use certifi module to find CA certificate
* Flake8 fix
* Unset PYTHONPATH when running hg
* svn_fetch should use svn-test, not hg-test
* Drop Python 3 support in Mercurial
Python 3 support is a work in progress and isn't currently
recommended:
https://www.mercurial-scm.org/wiki/SupportedPythonVersions
* Test both secure and insecure hg fetching
* Test both secure and insecure git and svn fetching
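To illustrate the `--insecure` and certifi changes, a hedged sketch of how the clone arguments might be assembled (the argument layout is an assumption, not the fetcher's exact code):
```python
import certifi

# Sketch: honor an 'insecure' option, otherwise point hg at certifi's CA
# bundle; hg itself is run with PYTHONPATH cleared so Spack's own modules
# don't leak into its environment.
def hg_clone_args(url, insecure=False):
    args = ['clone']
    if insecure:
        args.append('--insecure')
    else:
        args.extend(['--config', 'web.cacerts={0}'.format(certifi.where())])
    args.append(url)
    return args
```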
`set_executable` now checks whether the user, group, or other read
permission bit is set on a file and, if so, sets the corresponding
executable bit.
See #1483.
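A sketch of the behavior described above: mirror each read bit with the corresponding execute bit.
```python
import os
import stat

def set_executable(path):
    mode = os.stat(path).st_mode
    if mode & stat.S_IRUSR:
        mode |= stat.S_IXUSR
    if mode & stat.S_IRGRP:
        mode |= stat.S_IXGRP
    if mode & stat.S_IROTH:
        mode |= stat.S_IXOTH
    os.chmod(path, mode)
```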
Fixes #2587
The concretizer falls back on using the root architecture (followed
by the default system architecture) to fill in unspecified arch
properties for a spec. It failed to check cases where the root could
be None.
* Remove fake URLs from Spack
* Ignore long lines for URLs that start with ftp:
* Preliminary changes to version regexes
* New redesign of version regexes
* Allow letters in version-only
* Fix detection of versions that end in Final
* Rearrange a few regexes and add examples
* Add tests for common download repositories
* Add test cases for common tarball naming schemes
* Finalize version regexes
* spack url test -> spack url summary
* Clean up comments
* Rearrange suffix checks
* Use query strings for name detection
* Remove no longer necessary url_for_version functions
* Strip off extraneous information after package name
* Add one more test
* Dot in square brackets does not need to be escaped
* Move renaming outside of parse_name_offset
* Fix versions for a couple more packages
* Fix flake8 and doc tests
* Correctly parse Python, Lua, and Bio++ package names
* Use effective URLs for mfem
* Add checksummed version to mitos
* Remove url_for_version from STAR-CCM+ package
* Revert changes to version numbers with underscores and dashes
* Fix name detection for tbb
* Correctly parse Ruby gems
* Reverted mfem back to shortened URLs.
* Updated instructions for better security
* Remove preferred=True from newest version
* Add tests for new `spack url list` flags
* Add tests for strip_name_suffixes
* Add unit tests for version separators
* Fix bugs related to a parseable name but unparseable version
* Remove dead code, update docstring
* Ignore 'binary' at end of version string
* Remove platform from version
* Flip libedit version numbers
* Re-support weird NCO alpha/beta versions
* Rebase and remove one new fake URL
* Add / to beginning of regex to avoid picking up similarly named packages
* Ignore weird tar versions
* Fix bug in url parse --spider when no versions found
* Less strict version matching for spack versions
* Don't rename Python packages
* Be a little more selective, version must begin with a digit
* Re-add fake URLs
* Fix up several other packages
* Ignore more file endings
* Add parsing support for Miniconda
* Update tab completion
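The items above all refine regex-based name/version extraction from URLs. A deliberately tiny sketch of the idea (the real parser in `spack.url` uses a long, ordered list of patterns):
```python
import re

def parse_version(url):
    # strip common archive suffixes first
    url = re.sub(r'\.(tar\.(gz|bz2|xz)|tgz|zip)$', '', url)
    # the version must begin with a digit, e.g. foo-1.2.3 or foo_4.5b1
    match = re.search(r'[-_](\d[\w.]*)$', url)
    return match.group(1) if match else None

# parse_version('https://example.com/foo-1.2.3.tar.gz') -> '1.2.3'
```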
* XFAILS are now PASSES for 2 web tests
- _spider in web.py was actually failing to spider deeper than a certain
point.
- Fixed multiprocessing pools to not use daemons and to allow recursive
spawning.
- Added detailed tests for spidering and for finding archive versions.
- left some xfail URL finding exercises for the reader.
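The pool fix follows the classic non-daemonic worker recipe for the Python versions Spack supported at the time (an assumption about the approach, shown schematically; newer Pythons would route this through a multiprocessing context instead):
```python
import multiprocessing
import multiprocessing.pool

class NonDaemonProcess(multiprocessing.Process):
    """Worker that is never a daemon, so it may spawn children itself."""
    def _get_daemon(self):
        return False

    def _set_daemon(self, value):
        pass

    daemon = property(_get_daemon, _set_daemon)

class NonDaemonPool(multiprocessing.pool.Pool):
    """Pool whose workers can spawn sub-pools for recursive spidering."""
    Process = NonDaemonProcess
```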
- Fix noqa annotations for some @when decorators
- Clean up spec_syntax tests: don't depend on DB order.
- spec_syntax hash parsing tests were strongly dependent on the order the
DB was traversed.
- Tests now specifically grab the specs they want from the mock DB.
- Tests are more readable as a result.
- Add Python3 versions to Travis tests.
1. Fix#2807: Can't depend on virtual and non-virtual package
- This is tested by test_my_dep_depends_on_provider_of_my_virtual_dep in
the concretize.py test.
- This was actually working in the test suite, but it depended on the
order the dependencies were resolved in. Resolving non-virtual then
virtual worked, but virtual, then non-virtual did not.
- Problem was that an unnecessary copy was made of a spec that already
had some dependencies set up, and the copy lost half of some of the
dependency relationships. This caused the "can't depend on X twice
error".
- Fix by eliminating unnecessary copy and ensuring that dep parameter of
_merge_dependency is always safe to own -- i.e. it's a defensive copy
from somewhere else.
2. Fix bug and simplify concretization of deptypes.
- deptypes weren't being accumulated; they were being set on each
DependencySpec. This could cause concretization to get into an infinite
loop.
- Fixed by accumulating deptypes in DependencySpec.update_deptypes()
- Also simplified deptype normalization logic: deptypes are now merged in
constrain() like everything else -- there is no need to merge them
specially or to look at dependents in _merge_dependency().
- Add some docstrings to deptype tests.
- Get rid of pkgsort() usage for preferred variants.
- Concretization is now entirely based on key-based sorting.
- Remove PreferredPackages class and various spec cmp() methods.
- Replace with PackagePrefs class that implements a key function for
sorting according to packages.yaml.
- Clear package pref caches on config test.
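The move to key-based sorting can be pictured with a tiny stand-alone key function; the preference list here is a made-up example of what would come from packages.yaml.
```python
# Sketch: rank candidates that appear in the preference list first, in the
# order listed; everything else sorts after them.
def pref_key(preferred):
    rank = {value: i for i, value in enumerate(preferred)}
    return lambda candidate: (rank.get(candidate, len(preferred)), candidate)

versions = ['3.0', '1.2', '2.5']
preferred = ['2.5', '1.2']                 # hypothetical packages.yaml entry
sorted(versions, key=pref_key(preferred))  # -> ['2.5', '1.2', '3.0']
```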
- Explicit compare methods instead of total_ordering in Version.
- Our total_ordering backport wasn't making Python 3 happy for some
reason.
- Python 3's functools.total_ordering and spelling the operators out
fixes the problem.
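The fix amounts to using `functools.total_ordering` (or writing the operators out) on top of explicit `__eq__`/`__lt__`; a minimal, self-contained illustration with a toy version class:
```python
import functools

@functools.total_ordering
class SimpleVersion(object):
    def __init__(self, string):
        self.parts = tuple(int(p) for p in string.split('.'))

    def __eq__(self, other):
        return self.parts == other.parts

    def __lt__(self, other):
        return self.parts < other.parts

    def __hash__(self):
        return hash(self.parts)

# SimpleVersion('1.10') > SimpleVersion('1.9')  -> True
```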
- Fix unicode issues with spec hashes, json, & YAML
- Try to use str everywhere and avoid unicode objects in python 2.
- Remove ascii encoding assumption from spack_yaml
- proc.communicate() returns bytes; convert to str before adding.
- Fix various byte string/unicode issues for Python 2/3 support
- Need to decode subprocess output as utf-8 in from_sourcing_files.
- Fix comments in strify()
- convert print, StringIO, except as, octals, izip
- convert print statement to print function
- convert StringIO to six.StringIO
- remove usage of csv reader in Spec, in favor of simple regex
- csv reader only does byte strings
- convert 0755 octal literals to 0o755
- convert `except Foo, e` to `except Foo as e`
- fix a few places `str` is used.
- may need to switch everything to str later.
- convert iteritems usages to use six.iteritems
- fix urllib and HTMLParser
- port metaclasses to use six.with_metaclass
- More octal literal conversions for Python 2/3
- Fix a new octal literal.
- Convert `basestring` to `six.string_types`
- Convert xrange -> range
- Fix various issues with encoding, iteritems, and Python3 semantics.
- Convert contextlib.nested to explicitly nested context managers.
- Convert use of filter() to list comprehensions.
- Replace reduce() with list comprehensions.
- Clean up composite: replace inspect.ismethod() with callable()
- Python 3 doesn't have "method" objects; inspect.ismethod returns False.
- Need to use callable in Composite to make it work.
- Update colify to use future division.
- Fix zip() usages that need to be lists.
- Python3: Use line-buffered logging instead of unbuffered.
- Python3 raises an error with unbuffered I/O
- See https://bugs.python.org/issue17404
- Update YAML version to support Python 3
- Python 3 support for ordereddict backport
- Exclude Python3 YAML from version tests.
- Vendor six into Spack.
- Make Python version-check tests work with Python 3
- Add ability to add version check exceptions with '# nopyqver' line
comments.
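For reference, the mechanical shape of several of the conversions listed above, shown with toy code rather than excerpts from Spack:
```python
import six
from six import StringIO, iteritems, string_types

data = {'a': 1, 'b': 2}
for key, value in iteritems(data):      # was: data.iteritems()
    pass

isinstance('name', string_types)        # was: isinstance(x, basestring)

buf = StringIO()                        # was: StringIO.StringIO()
buf.write('hello')

class Meta(type):
    pass

class Base(six.with_metaclass(Meta, object)):   # was: __metaclass__ = Meta
    pass

mode = 0o755                            # was: 0755

try:
    raise ValueError('boom')
except ValueError as e:                 # was: except ValueError, e
    pass
```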
* Run python setup.py test if --run-tests
* Attempt to import the Python module after installation
* Add testing support to numpy and scipy
* Remove duplicated comments
* Update to new run-tests callback methodology
* Remove unrelated changes for another PR
* perl: make extendable and add Module::Build package
* perl: allow 'spack create' to identify perl packages from their contents
* perl-module-build: fix indenting of package docstring
* perl: split install() method for extensions into phases
* perl: auto-detect build method (Makefile.PL vs Build.PL) and define a 'check' method
* PerlPackage: use import statements similar to those in AutotoolsPackage
* PerlModule: fix detection of Build.PL
* PerlPackageTemplate: remove extraneous lines to avoid flake8 warnings
* PerlPackageTemplate: split into separate templates for Makefile.PL and Build.PL
* PerlPackage: add cross-references to docstrings
* AutotoolsPackage: fix ambiguous cross-references to avoid errors in doc tests
* PerlbuildPackageTemplate: depend on perl-module-build if Build.PL exists
- Spack find would fail with "unknown namespace" for some queries when a
package from an unknown namespace was installed.
- Solve by being conservative: assume unknown packages are NOT providers
of virtual dependencies.
- deactivate -a wouldn't work if the installation's package was no longer
available.
- Fix installed_extensions_for so that it doesn't need to look at the
package.py file.
This fixes the problem described in #3374, where `spack find` ignores the explicit/implicit distinction.
I believe that this was broken in #2626.
This restores the behavior of implicit/explicit for me.
I believe that it does not screw anything else up, but ....
* Order listed compiler sections
"spack compiler list" output compiler sections in an arbitrary order.
With this commit compiler sections are ordered primarily by compiler
name and then by operating system and target.
* Compiler search lists config files with compilers
If a compiler entry is already defined in a configuration file that
the user does not know about, they may be confused when that compiler
is not added by "spack compiler find". This commit adds a message at
the end of "spack compiler find" to inform the user of the locations
of all config files where compilers are defined.
Fixes #1476
Concretization uses compilers defined in config files and if those
are not available defaults to searching typical paths where the
detected operating system would have a compiler. If there is an OS
update, the detected OS can change; in this case all compilers
defined in the config files would no longer match (because they would
be associated with the previous OS version). The error message in
this case was too vague. This commit adds logic for detecting when it
is likely that the OS has been updated (in particular when that
affects compiler concretization) and improves the information provided
to the user in the error message.
* Don't propagate flags between different compilers
Fixes #2786
Previously when a spec had no parents with an equivalent compiler,
Spack would default to adding the compiler flags associated with the
root of the DAG. This eliminates that default.
* added test for compiler flag propagation
* simplify compiler flag propagation logic
Fixes #3428
Users can run 'spack compiler find' to automatically initialize their
compilers.yaml configuration file. It also turns out that Spack will
implicitly initialize the compilers configuration file as part of
detecting compilers if none are found (so if a user were to attempt to
concretize a spec without running 'spack compiler find' it would not
fail). However, in this case Spack was overlooking its own implicit
initialization of the config files and would report that no new
compilers were found. This commit removes implicit initialization when
the user calls 'spack compiler find'.
This did not surface until #2999 because the 'spack compiler' command
defaulted to using a scope 'user/platform' that was not accounted for
in get_compiler_config (where the implicit initialization logic
predates the addition of this new scope); #2999 removed the scope
specification when checking through config files, leading to the
implicit initialization.
Previously, this would fail with a NoSuchMethodError:
```
class Package(object):
    # this is the default implementation
    def some_method(self):
        ...

class Foo(Package):
    @when('platform=cray')
    def some_method(self):
        ...

    @when('platform=linux')
    def some_method(self):
        ...
```
This fixes the implementation of `@when` so that the superclass method
will be invoked when no subclass method matches.
Adds tests to ensure this works, as well.
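A toy dispatcher that captures the intended semantics — try the condition-specific variants, then fall back to the inherited implementation instead of erroring. This is only an illustration, not Spack's multimethod code.
```python
# dispatch() tries each (predicate, function) pair and, if none matches,
# calls the inherited method -- the fallback this change adds.
def dispatch(instance, variants, inherited):
    for predicate, function in variants:
        if predicate(instance):
            return function(instance)
    return inherited(instance)

class BasePackage(object):
    def some_method(self):
        return 'default implementation'

class Foo(BasePackage):
    platform = 'linux'

    def _cray_impl(self):
        return 'cray implementation'

    def some_method(self):
        return dispatch(self,
                        [(lambda s: s.platform == 'cray', Foo._cray_impl)],
                        BasePackage.some_method)

# Foo().some_method() -> 'default implementation' (platform is not cray)
```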
* default scope for config command is made consistent with cmd/__init__ default
* dont specify a scope when looking for compilers with a matching spec (since compiler concretization is scope-independent)
* config edit should default to platform-specific file only for compilers
* when duplicate compiler specs are detected, the exception raised now points the user to the files where the duplicates appear
* updated error message to emphasize that a spec is duplicated (since multiple specs can reference the same compiler)
* 'spack compilers' is now also broken down into sections by os and target
* Added tests for new compiler methods
Modifications:
- `dump_packages` copies build dependencies into `$prefix/.spack`, as well as the link/run dependencies that we already copied there.
- fake installs copy dependency packages into `$prefix/.spack` as well
- Added a new interface for Specs to pass build information
- Calls forwarded from Spec to Package are now explicit
- Added descriptor within Spec to manage forwarding
- Added state in Spec to maintain query information
- Modified a few packages (the ones involved in `spack install pexsi`) to showcase the changes
- This uses an object wrapper around `spec` to implement the `libs` sub-calls.
- wrapper is returned from `__getitem__` only if spec is concrete
- allows packagers to access build information easily
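From a packager's point of view, the new interface looks roughly like this (assuming a concrete spec; the dependency name and configure flag are made up):
```python
# Inside a package recipe, querying a dependency for its libraries:
def configure_args(spec):
    lapack_libs = spec['lapack'].libs    # wrapper returned for concrete specs
    return ['--with-lapack-libs={0}'.format(lapack_libs.ld_flags)]
```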
It seems the tests in `packages.py` only worked because of a specific order of execution. This should fix the problem and make the test suite more resilient to the running order.
- Fix format printing to match command line for hashes and full name formats
- Update spack graph to use new format
- Changed format string signifier for hashes from `$#` to `$/`
Modules generated by the module creation machinery currently print out
a notice that warns the user that things are being autoloaded. In
some situations those warnings are problematic. See #2754 for
discussion.
This is a first cut at optionally disabling the warning messages:
- adds a helper to the EnvModule base class that encapsulates the
config file variable;
- adds a method to the base class that provides a default (empty)
code fragment for generating a warning message;
- passes the warning fragment into the bit that formats the autoload
string;
- adds specialized autoload_warner() methods in the tcl and lmod
subclasses; and finally
- touches up the autoload_format strings in the specialized classes.
Add the ability, in the module generation process, to blacklist
packages that were installed implicitly. Modules for implicitly
installed packages can still be whitelisted.
This change adds a `blacklist_implicits` boolean as a peer to the
`whitelist` and `blacklist` arrays, e.g.:
```
modules:
enable::
- lmod
lmod:
whitelist:
- 'lua'
- 'py-setuptools'
blacklist:
- '%gcc@4.8.3'
blacklist_implicits: True
```
It adds a small helper in `spec.py` and then touches up the package
filtering code in `modules.py`.
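A sketch of the resulting filtering logic in `modules.py`; the boolean arguments stand in for the real whitelist/blacklist matching and for the small helper added to `spec.py`.
```python
# Whitelist wins, then blacklist, then the new implicit-install blacklist.
def generate_module_for(whitelisted, blacklisted, blacklist_implicits,
                        installed_explicitly):
    if whitelisted:
        return True
    if blacklisted:
        return False
    if blacklist_implicits and not installed_explicitly:
        return False
    return True
```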