Commit graph

4680 commits

Author SHA1 Message Date
Robert Cohn
ab018c2081
intel-oneapi packages: support root installs (#23401)
When installing OneAPI packages as root (e.g. in a container), the
installer places cache files in /var/intel/installercache that
interfere with future Spack installs. This change removes that cache
when an installation is run as the root user.
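A minimal sketch of the kind of cleanup described here, assuming the cache lives at /var/intel/installercache as stated above (the function name is illustrative, not the actual package code):

```python
import os
import shutil

# Illustrative sketch only: drop the cache the Intel installer leaves
# behind after a root install, so later Spack installs are not confused.
def remove_intel_installer_cache():
    cache_dir = "/var/intel/installercache"
    if os.path.isdir(cache_dir):
        shutil.rmtree(cache_dir, ignore_errors=True)
```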
2021-05-05 14:00:34 -07:00
Massimiliano Culpo
8a65bcb7c9
archspec: updated external dependency (#23311)
Added support for Apple M1, extended support
for zen3 with more compiler flags.
2021-05-03 22:27:37 -07:00
Massimiliano Culpo
10389b2b51
Use Python's built-in machinery to import compilers (#23290) 2021-05-03 22:26:48 -07:00
Michael Kuhn
21ad8d4372
cmd: improve shell support help message (#23410)
Users sometimes set up Spack's shell support but still call `bin/spack`,
which results in the help message showing up again.
2021-05-03 18:28:28 -04:00
Harmen Stoppels
9c1c7ab6ca
Use an environment variable to set the default stacktrace behavior (#23357) 2021-05-03 16:22:30 +02:00
Massimiliano Culpo
5b12568c4f
Make Spack able to apply gz compressed remote patches (#22823)
Modified ncbi-rmblastn to retrieve patches from remote
2021-04-28 17:00:58 +02:00
BenWeber42
5cb5aac57e
Fix intersection if a version is a prefix of another (#22941)
* Added test for version intersections when one is prefix of another

* Fix version intersection for prefixes
2021-04-28 08:28:09 -06:00
Harmen Stoppels
3f4c9aeca7
Read colorization from environment variable, if command line is not set (#23130) 2021-04-28 13:03:25 +00:00
Massimiliano Culpo
985e101507
Import hooks using Python's built-in machinery (#23288)
The function we coded in Spack to load Python modules with arbitrary
names from a file seems to have issues with local imports. For loading
hooks, though, such functions are unnecessary, since we neither need to
bind a custom name to a module nor load it from an unknown location
(see the sketch after the list below).

This PR thus modifies spack.hook in the following ways:

- Use __import__ instead of spack.util.imp.load_source (this
  addresses #20005)
- Sync module docstring with all the hooks we have
- Avoid using memoization in a module function
- Mark all names that are supposed to stay local with a leading
  underscore
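A minimal sketch of the built-in approach, assuming hooks live in a regular package such as spack.hooks; it uses importlib/pkgutil for clarity, while the change itself uses __import__ as noted above:

```python
import importlib
import pkgutil

# Illustrative only: import every public hook module in a package with
# the standard import machinery, instead of loading source files under
# arbitrary names.
def load_hook_modules(package_name="spack.hooks"):
    package = importlib.import_module(package_name)
    modules = []
    for _, name, _ispkg in pkgutil.iter_modules(package.__path__):
        if not name.startswith("_"):  # a leading underscore means "local"
            modules.append(importlib.import_module(package_name + "." + name))
    return modules
```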
2021-04-27 16:55:07 -07:00
Zack Galbreath
295377b2b4
Don't report configure errors to CDash for successful packages (#23286)
Convert configure errors detected by our log scraper into warnings when
the package being installed reports that it was successful.
2021-04-27 12:20:32 -06:00
Tim Haines
c23ffd89ff
Dyninst: Add dependencies for v11.0.0 (#23121)
Also update the mpileaks unit test to avoid a conflict on CentOS 6
where Dyninst >=11.0.0 no longer builds due to a compiler version
conflict.
2021-04-26 13:53:53 -07:00
George Hartzell
6d789a5835
docs: be more precise on what spack add ... does (#23204)
This is as much a question as it is a minor fine-tuning of the docs.  I've been known to add things to an environment by editing the `spack.yaml` file directly.  When I read the previous version of this sentence, I was afraid that `spack add` was actually doing *two* things, modifying the `spack.yaml` and updating something else that defined the roots of the Environment.  A bit of experimentation suggests that editing the `spack.yaml` file is sufficient to change the roots.

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-04-23 13:29:19 +00:00
Tamara Dahlgren
6f25e5242e
Docs: Updated copyrights in files still using 2020 as ending year (#23215) 2021-04-22 22:23:09 -07:00
Massimiliano Culpo
3325eff486
ASP-based solve: minimize compiler mismatches (#23016)
fixes #22718

Instead of trying to maximize the number of
matches (preferred behavior), try to minimize
the number of mismatches (unwanted behavior).
2021-04-21 01:02:43 -07:00
Massimiliano Culpo
9a473d6ab3
ASP-based solver: suppress warnings when constructing facts (#23090)
fixes #22786

Trying to get optimization flags for a specific target from
a compiler may trigger warnings. In the context of constructing
facts for the ASP-based solver we don't want to show these
warnings to the user, so here we simply ignore them.
2021-04-21 01:02:10 -07:00
Chris White
58a897be0e
check for package in spec not variant (#23157) 2021-04-20 18:13:15 -06:00
Glenn Johnson
09e80604f5
Catch rstudio based URL for cran attribute in create.py (#23072) 2021-04-20 19:03:42 -05:00
Vanessasaurus
b9a2b1c096
Fixing typo tty.fail -> tty.die and monitor docstrings (#23152)
This isn't a significant issue, but I noticed that the docstring incorrectly references "tty.fail" and I wanted to quickly fix it to reflect the correct command, tty.die. I also wanted to fix the docstrings so that they are not large clumps, following what @tgamblin suggested after I wrote this: one line at the top as a quick summary, and more detail after that.
2021-04-20 14:53:30 -07:00
Vanessasaurus
e6de04d149
docs: spack does not have a variant debug for libelf (#23021)
Signed-off-by: vsoch <vsoch@users.noreply.github.com>

Co-authored-by: vsoch <vsoch@users.noreply.github.com>
2021-04-16 09:29:42 +02:00
Zack Galbreath
080d9b094f
Return non-zero from CDash reporter when errors are detected (#22962) 2021-04-15 15:11:08 -06:00
Vanessasaurus
7f91c1a510
Merge pull request #21930 from vsoch/add/spack-monitor
This provides initial support for [spack monitor](https://github.com/spack/spack-monitor), a web application that stores information and analysis about Spack installations.  Spack can now contact a monitor server and upload analysis -- even after a build is already done.

Specifically, this adds:
- [x] monitor options for `spack install`
- [x] `spack analyze` command
- [x] hook architecture for analyzers
- [x] separate build logs (in addition to the existing combined log)
- [x] docs for spack analyze
- [x] reworked developer docs, with hook docs
- [x] analyzers for:
  - [x] config args
  - [x] environment variables
  - [x] installed files
  - [x] libabigail

There is a lot more information in the docs contained in this PR, so consult those for full details on this feature.

Additional tests will be added in a future PR.
2021-04-15 00:38:36 -07:00
vsoch
613348ec90 Use gethostname() instead of getfqdn() for lock debug mode
In debug mode, processes taking an exclusive lock write out their node name to
the lock file. We were using `getfqdn()` for this, but it seems to produce
inconsistent results when used from within some github actions containers.

We get this error because getfqdn() seems to return a short name in one place
and a fully qualified name in another:

```
  File "/home/runner/work/spack/spack/lib/spack/spack/test/llnl/util/lock.py", line 1211, in p1
    assert lock.host == self.host
AssertionError: assert 'fv-az290-764....cloudapp.net' == 'fv-az290-764'
  - fv-az290-764.internal.cloudapp.net
  + fv-az290-764
!!!!!!!!!!!!!!!!!!!! Interrupted: stopping after 1 failures !!!!!!!!!!!!!!!!!!!!
== 1 failed, 2547 passed, 7 skipped, 22 xfailed, 2 xpassed in 1238.67 seconds ==
```

This seems to stem from https://bugs.python.org/issue5004.

We don't really need to get a fully qualified hostname for debugging, so use
`gethostname()` because its results are more consistent. This seems to fix the
issue.
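For reference, the relevant standard-library call (a sketch, not the exact Spack code):

```python
import socket

# gethostname() returns a short, stable node name, e.g. "fv-az290-764";
# getfqdn() may or may not append a domain suffix depending on the
# resolver, which is what made the lock-file assertion flaky.
host = socket.gethostname()
```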

Signed-off-by: vsoch <vsoch@users.noreply.github.com>
2021-04-15 00:01:41 -07:00
Gregory Becker
393542064d updates for new tutorial
update s3 bucket
update tutorial branch
2021-04-14 23:53:07 -07:00
Tiziano Müller
dee030618f
Documentation: update intel-parallel-studio instructions (#22248)
* Clarify stub compiler definition in compilers.yaml
* Update explanation of why stub compiler definition is needed
* Add note about required module definition when using Spack-installed
  intel-parallel-studio as intel-compiler
* Add suggestion about updating package config preferences based on
  choice of variants when installing intel-parallel-studio to avoid
  reinstallation
2021-04-13 13:31:14 -07:00
Tiziano Müller
a580788d86
intel-parallel-studio: fix vtune installation for 2020+ (#22255)
vtune_amplifier got renamed to vtune_profiler for the 2020+ suite
2021-04-13 13:01:58 +02:00
Peter Scheibel
51df9b0c9c
Externals with merged prefixes (#22653)
We remove system paths from search variables like PATH and 
from -L options because they may contain many packages and
could interfere with Spack-built packages. External packages 
may be installed to prefixes that are not actually system paths 
but are still "merged" in the sense that many other packages are
installed there. To avoid conflicts, this PR places all external
packages at the end of search paths.
2021-04-12 11:19:29 +02:00
Massimiliano Culpo
215d194482
ASP-based solver: assign OS correctly with inheritance from parent (#22896)
fixes #22871

When in the presence of multiple choices for the operating system
we were lacking a rule to derive the node OS if it was
inherited.
2021-04-11 01:01:09 -06:00
Peter Scheibel
f624ce0834
Build process output: handle UTF-8 for python 3.x to 3.7 (#22888)
We set LC_ALL=C to encourage a build process to generate ASCII
output (so our logger daemon can decode it). Most packages
respect this but it appears that intel-oneapi-compilers does
not in some cases (see #22813). This reads the output of the build
process as UTF-8, which still works if the build process respects
LC_ALL=C but also works if the process generates UTF-8 output.

For Python >= 3.7 all files are opened with UTF-8 encoding by
default. Python 2 does not support the encoding argument on
'open', so to support Python 2 the files would have to be
opened in byte mode and explicitly decoded (as a side note,
this would be the only way to handle other encodings without
being informed of them in advance).
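A sketch of reading build output that works on both interpreters, assuming io.open (which accepts an encoding argument on Python 2 as well); the error-handling policy shown is illustrative:

```python
import io

# Illustrative: decode the build log as UTF-8.  ASCII output produced
# under LC_ALL=C is still valid UTF-8, so this also covers the normal
# case where the build process respects the locale.
def read_build_output(log_path):
    with io.open(log_path, "r", encoding="utf-8", errors="replace") as f:
        return f.read()
```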
2021-04-09 18:10:01 -07:00
Todd Gamblin
19b6d3589a
bugfix: spack config blame should print all lines of config (#22598)
* bugfix: fix representation of null in spack_yaml output

Nulls were previously printed differently by `spack config blame config`
and `spack config get config`.  Fix this in the `spack_yaml` dumpers.

* bugfix: `spack config blame` should print all lines of config

`spack config blame` was not printing all lines of configuration because
there were no annotations for empty lines in the YAML dump output. Fix
this by removing empty lines.
2021-04-08 16:37:16 +02:00
Toyohisa Kameyama
d805be02ec
autotools: ensure config.guess and config.sub are writeable before patching them (#19837) 2021-04-08 14:07:43 +02:00
Robert Cohn
c8b4414230
[oneapi] fix mkl deps, externally installed, and docs (#22607) 2021-04-07 10:31:08 -06:00
Harmen Stoppels
bbc666a1d2
meson: added variants, changed defaults for the build system (#22715)
- Use debugoptimized as default build type, just like RelWithDebInfo for cmake
- Do not strip by default, and add a default_library variant which conveniently supports both shared and static libraries (see the sketch below)
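A sketch of what such declarations can look like in a package that uses the Meson build system (hypothetical class; variant names and values mirror the bullets above, not necessarily the exact MesonPackage code):

```python
from spack import *

class MyMesonApp(MesonPackage):
    """Hypothetical package used only to illustrate the new defaults."""

    # debugoptimized plays the same role as CMake's RelWithDebInfo
    variant("buildtype", default="debugoptimized",
            values=("plain", "debug", "debugoptimized", "release", "minsize"),
            description="Meson build type")

    # do not strip installed binaries unless asked to
    variant("strip", default=False, description="Strip targets on install")

    # allow building shared libraries, static libraries, or both
    variant("default_library", default="shared",
            values=("shared", "static"), multi=True,
            description="Default library type")
```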
2021-04-06 17:57:31 +02:00
Harmen Stoppels
0fcda35a71
spack location: fix usage without args (#22755) 2021-04-06 08:17:58 +00:00
Adam J. Stewart
3336fff229
Remove erroneous warnings about quotes for from_source_file (#22767) 2021-04-06 07:13:54 +02:00
Zack Galbreath
7cc2db1b4e
Check against a list of known-broken specs in ci generate (#22690)
* Strip leading slash from S3 key in url_exists()

* Check against a list of known-broken specs in `ci generate`
2021-04-02 17:40:47 -06:00
Harmen Stoppels
69d123a1a0
Document unzip (#22723) 2021-04-02 20:56:24 +00:00
Todd Gamblin
0d387678b7
concretizer: improve display of optimization criteria (#22433)
By default, clingo doesn't show any optimization criteria (maximized or
minimized sums) if the set they aggregate is empty. Per the clingo
mailing list, we can get around that by adding, e.g.:

```
 #minimize{ 0@2 : #true }.
```

for the 2nd criterion. This forces clingo to print out the criterion but
does not affect the optimization.

This PR adds directives as above for all of our optimization criteria, as
well as facts with descriptions of each criterion, like this:

```
opt_criterion(2, "number of non-default variants")
```

We use facts in `concretize.lp` rather than hard-coding these in `asp.py`
so that the names can be maintained in the same place as the other
optimization criteria.

The now-displayed weights and the names are used to display optimization
output like this:

```console
(spackle):solver> spack solve --show opt zlib
==> Best of 0 answers.
==> Optimization Criteria:
  Priority  Criterion                                            Value
  1         version weight                                           0
  2         number of non-default variants (roots)                   0
  3         multi-valued variants + preferred providers for roots    0
  4         number of non-default variants (non-roots)               0
  5         number of non-default providers (non-roots)              0
  6         count of non-root multi-valued variants                  0
  7         compiler matches + number of nodes                       1
  8         version badness                                          0
  9         non-preferred compilers                                  0
  10        target matches                                           0
  11        non-preferred targets                                    0

zlib@1.2.11%apple-clang@12.0.0+optimize+pic+shared arch=darwin-catalina-skylake
```

Note that this is all hidden behind a `--show opt` option to `spack
solve`. Optimization weights are no longer shown by default, but you can
at least inspect them and more easily understand what is going on.

- [x] always show optimization criteria in `clingo` output
- [x] add `opt_criterion()` facts for all optimization criteria
- [x] make display of opt criteria optional in `spack solve`
- [x] rework how optimization criteria are displayed, and add a `--show opt`
      option to `spack solve`
2021-04-02 08:54:49 +00:00
Greg Becker
fb062428f9 add CachedCMakePackage for using CMake initial config files
CachedCMakePackage is a CMakePackage subclass for using CMake initial
cache. This feature of CMake allows packages to increase reproducibility,
especially between Spack builds and manual builds. It also allows
packages to sidestep certain parsing bugs in extremely long cmake
commands, and to avoid system limits on the length of the command line.
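The gist of the initial-cache mechanism, as a hedged sketch (helper name and file layout are illustrative, not the CachedCMakePackage API):

```python
# Illustrative: write CMake options to an initial cache file instead of
# passing them all on the command line, then configure with `cmake -C`.
def write_initial_cache(path, options):
    with open(path, "w") as f:
        for name, (value, cmake_type) in sorted(options.items()):
            f.write('set({0} "{1}" CACHE {2} "")\n'.format(name, value, cmake_type))

# e.g. write_initial_cache("initconfig.cmake",
#                          {"CMAKE_BUILD_TYPE": ("Release", "STRING")})
# followed by: cmake -C initconfig.cmake <source-dir> ...
```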

Co-authored by: Chris White <white238@llnl.gov>
2021-04-01 20:06:39 -07:00
Todd Gamblin
fc48c63355 Revert "CachedCMakePackage for using *.cmake initial config files (#19316)""
This reverts commit 7daf582357.
2021-04-01 20:06:39 -07:00
Elizabeth Fischer
82e97124c8
bugfix: compiler wrappers should handle extra spaces between arguments (#22725)
In the face of two consecutive spaces in the command line, the compiler wrapper would skip all remaining arguments, causing problems building py-scipy with the Intel compiler. This PR solves the problem.
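A Python rendering of the failure mode and the fix (the real wrapper is a shell script; names here are illustrative):

```python
# A loop that treats an empty argument as "end of arguments" silently
# drops everything after it; looping until the list is exhausted and
# skipping empty strings does not.
def parse_wrapper_args(argv):
    parsed = []
    remaining = list(argv)
    while remaining:            # run until the argument list is exhausted
        arg = remaining.pop(0)
        if arg == "":
            continue            # skip empty arguments instead of stopping
        parsed.append(arg)
    return parsed
```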

* Fixed compiler wrapper in the face of extra spaces between arguments

Co-authored-by: Elizabeth Fischer <elizabeth.fischer@alaska.edu>
2021-04-01 18:39:06 +00:00
Greg Becker
7daf582357 CachedCMakePackage for using *.cmake initial config files (#19316)"
Original commit message:
This feature of CMake allows packages to increase reproducibility, especially between
Spack- and manual builds. It also allows packages to sidestep certain parsing bugs in
extremely long ``cmake`` commands, and to avoid system limits on the length of the
command line.

Adding:
Co-authored by: Chris White <white238@llnl.gov>

This reverts commit c4f0a3cf6c.
2021-03-31 18:38:22 -07:00
Chris White
c4f0a3cf6c Revert "CachedCMakePackage for using *.cmake initial config files (#19316)"
This reverts commit 764c170530.
2021-03-31 18:34:45 -07:00
Greg Becker
764c170530
CachedCMakePackage for using *.cmake initial config files (#19316)
CachedCMakePackage is a specialized class for packages built using CMake initial cache.

This feature of CMake allows packages to increase reproducibility, especially between
Spack- and manual builds. It also allows packages to sidestep certain parsing bugs in
extremely long ``cmake`` commands, and to avoid system limits on the length of the
command line.
2021-03-31 23:55:19 +00:00
Todd Gamblin
cf9adfd748
hotfix: make ifx work with autoconf <= 2.69 in Spack (#22683)
Autoconf before 2.70 will erroneously pass ifx's -loopopt argument to the
linker, requiring all packages to use autoconf 2.70 or newer to use ifx.

This is a hotfix enabling ifx to be used in Spack. Instead of bothering
to upgrade autoconf for every package, we'll just strip out the
problematic flag if we're in `ld` mode.

- [x] Add a conditional to the `cc` wrapper to skip `-loopopt` in `ld`
      mode. This can probably be generalized in the future to strip more
      things (e.g., via an environment variable we can control from
      Spack) but it's good enough for now.

- [x] Add a test ensuring that `-loopopt` arguments are stripped in link
      mode, but not in compile mode.
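A hedged Python sketch of the stripping logic described above (the actual change lives in the shell `cc` wrapper; function and mode names are illustrative):

```python
# Drop ifx's -loopopt flag only when the wrapper is acting as the
# linker; compile-mode invocations keep it.
def filter_wrapper_args(args, mode):
    if mode != "ld":
        return list(args)
    return [a for a in args if not a.startswith("-loopopt")]
```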
2021-03-31 21:47:38 +00:00
Todd Gamblin
a1d9a56a43 specs: remove "or ''" from Spec comparisons
Since `lazy_lexicographic_ordering` handles `None` comparison for us, we
don't need to adjust the spec comparators to return empty strings or
other type-specific empty types. We can just leverage the None-awareness
of `lazy_lexicographic_ordering`.

- [x] remove "or ''" from `_cmp_iter` in `Spec`
- [x] remove setting of `self.namespace` to `''` in `MockPackage`
2021-03-31 14:39:23 -07:00
Todd Gamblin
01a6adb5f7 specs: use lazy lexicographic comparison instead of key_ordering
We have been using the `@llnl.util.lang.key_ordering` decorator for specs
and most of their components. This leverages the fact that in Python,
tuple comparison is lexicographic. It allows you to implement a
`_cmp_key` method on your class, and have `__eq__`, `__lt__`, etc.
implemented automatically using that key. For example, you might use
tuple keys to implement comparison, e.g.:

```python
class Widget:
    # author implements this
    def _cmp_key(self):
        return (
            self.a,
            self.b,
            (self.c, self.d),
            self.e
        )

    # operators are generated by @key_ordering
    def __eq__(self, other):
        return self._cmp_key() == other._cmp_key()

    def __lt__(self, other):
        return self._cmp_key() < other._cmp_key()

    # etc.
```

The issue there for simple comparators is that we have to build the
tuples *and* we have to generate all the values in them up front. When
implementing comparisons for large data structures, this can be costly.

This PR replaces `@key_ordering` with a new decorator,
`@lazy_lexicographic_ordering`. Lazy lexicographic comparison maps the
tuple comparison shown above to generator functions. Instead of comparing
based on pre-constructed tuple keys, users of this decorator can compare
using elements from a generator. So, you'd write:

```python
@lazy_lexicographic_ordering
class Widget:
    def _cmp_iter(self):
        yield self.a
        yield self.b
        def cd_fun():
            yield self.c
            yield self.d
        yield cd_fun
        yield self.e

    # operators are added by decorator (but are a bit more complex)
```

There are no tuples that have to be pre-constructed, and the generator
does not have to complete. Instead of tuples, we simply make functions
that lazily yield what would've been in the tuple. If a yielded value is
a `callable`, the comparison functions will call it and recursively
compare it. The comparator just walks the data structure like you'd expect
it to.

The ``@lazy_lexicographic_ordering`` decorator handles the details of
implementing comparison operators, and the ``Widget`` implementor only
has to worry about writing ``_cmp_iter``, and making sure the elements in
it are also comparable.

Using this PR shaves another 1.5 sec off the runtime of `spack buildcache
list`, and it also speeds up Spec comparison by about 30%. The runtime
improvement comes mostly from *not* calling `hash()` on `_cmp_iter()`.
2021-03-31 14:39:23 -07:00
Todd Gamblin
fd12cba18b specs: speed up traversal by avoiding redundant canonicalization 2021-03-31 14:39:23 -07:00
Harmen Stoppels
1db6cd5d16
Make -j flag less exceptional (#22360)
* Make -j flag less exceptional

The -j flag in spack behaves differently from make, ctest, ninja, etc.,
because it caps the number of jobs at an arbitrary value of 16.
Spack will behave like other tools if `spack install` uses a reasonable
default, and `spack install -j <num>` *overrides* that default.

This will be particularly useful for Spack usage outside of a traditional
HPC context and for HPC centers that encourage users to compile on
login nodes with many cores instead of on compute nodes, which has
become increasingly common as individual nodes have more cores.

This maintains the existing default value of min(num_cpus, 16). However, 
as it is right now, Spack does a poor job of determining the number of
CPUs on Linux, since it doesn't take cgroups into account. This is
particularly problematic when using distributed builds with slurm. This PR
also introduces `spack.util.cpus.cpus_available()` to consolidate
knowledge on determining the number of available cores, and improves
core detection for linux. This should also improve core detection for Docker/
Kubernetes, which also use cgroups.
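A minimal sketch of container-friendly core detection, assuming Linux's scheduler affinity mask as the signal (not necessarily the exact logic of `spack.util.cpus.cpus_available()`):

```python
import multiprocessing
import os

# Prefer the affinity mask, which reflects cpuset/cgroup restrictions
# on Linux; fall back to the raw CPU count where it is unavailable.
def cpus_available():
    try:
        return len(os.sched_getaffinity(0))
    except AttributeError:
        return multiprocessing.cpu_count()
```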
2021-03-30 12:03:50 -07:00
Harmen Stoppels
b848fab3ec
SpackCommand objects can set global args (#22318)
This commit extends the API of the __call__ method of the
SpackCommand class to permit passing global arguments 
like those interposed between the main "spack" command 
and the subsequent subcommand.

The functionality is used to fix an issue where running

```spack -e . location -b some_package```

ends up printing the name of the environment instead of 
the build directory of the package, because the location arg 
parser also stores this value as `arg.env`.
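A hedged usage sketch of the extended API; the `global_args` keyword name is an assumption based on the description above, not verified against the code:

```python
from spack.main import SpackCommand

# Equivalent of running: spack -e . location -b some_package
location = SpackCommand("location")
output = location("-b", "some_package",
                  global_args=["-e", "."])  # keyword name assumed
```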
2021-03-30 18:47:36 +02:00
Massimiliano Culpo
c3bab11ee1
Bootstrapping: swap store before configuration (#22631)
fixes #22294

A combination of the swapping order for global variables and
the fact that most of them are lazily evaluated resulted in
custom install tree not being taken into account if clingo
had to be bootstrapped.

This commit fixes that particular issue, but a broader refactor
may be needed to ensure that similar situations won't affect us
in the future.
2021-03-30 17:23:32 +02:00