Merge branch 'externals' into crayport

This commit is contained in:
Gregory Becker 2016-01-04 10:46:25 -08:00
commit ff82e41404
23 changed files with 1061 additions and 190 deletions

View file

@ -357,7 +357,7 @@ Spack, you can simply run ``spack compiler add`` with the path to
  where the compiler is installed. For example::

      $ spack compiler add /usr/local/tools/ic-13.0.079
-     ==> Added 1 new compiler to /Users/gamblin2/.spackconfig
+     ==> Added 1 new compiler to /Users/gamblin2/.spack/compilers.yaml
          intel@13.0.079

  Or you can run ``spack compiler add`` with no arguments to force
@ -367,7 +367,7 @@ installed, but you know that new compilers have been added to your
      $ module load gcc-4.9.0
      $ spack compiler add
-     ==> Added 1 new compiler to /Users/gamblin2/.spackconfig
+     ==> Added 1 new compiler to /Users/gamblin2/.spack/compilers.yaml
          gcc@4.9.0

  This loads the environment module for gcc-4.9.0 to get it into the
@ -398,27 +398,34 @@ Manual compiler configuration
  ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

  If auto-detection fails, you can manually configure a compiler by
- editing your ``~/.spackconfig`` file. You can do this by running
- ``spack config edit``, which will open the file in your ``$EDITOR``.
+ editing your ``~/.spack/compilers.yaml`` file. You can do this by running
+ ``spack config edit compilers``, which will open the file in your ``$EDITOR``.

  Each compiler configuration in the file looks like this::

      ...
-     [compiler "intel@15.0.0"]
-         cc = /usr/local/bin/icc-15.0.024-beta
-         cxx = /usr/local/bin/icpc-15.0.024-beta
-         f77 = /usr/local/bin/ifort-15.0.024-beta
-         fc = /usr/local/bin/ifort-15.0.024-beta
-     ...
+     chaos_5_x86_64_ib:
+       ...
+       intel@15.0.0:
+         cc: /usr/local/bin/icc-15.0.024-beta
+         cxx: /usr/local/bin/icpc-15.0.024-beta
+         f77: /usr/local/bin/ifort-15.0.024-beta
+         fc: /usr/local/bin/ifort-15.0.024-beta
+       ...

+ The chaos_5_x86_64_ib string is an architecture string, and multiple
+ compilers can be listed underneath an architecture.  The architecture
+ string may be replaced with the string 'all' to signify compilers that
+ work on all architectures.

  For compilers, like ``clang``, that do not support Fortran, put
  ``None`` for ``f77`` and ``fc``::

-     [compiler "clang@3.3svn"]
-         cc = /usr/bin/clang
-         cxx = /usr/bin/clang++
-         f77 = None
-         fc = None
+     clang@3.3svn:
+         cc: /usr/bin/clang
+         cxx: /usr/bin/clang++
+         f77: None
+         fc: None

  Once you save the file, the configured compilers will show up in the
  list displayed by ``spack compilers``.
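A quick way to sanity-check a hand-edited file with this layout is to load it outside of Spack. The sketch below is an editor's example, not part of this commit: it assumes PyYAML is available and that ``compilers.yaml`` is rooted at a ``compilers:`` key (matching the ``mirrors:`` and ``packages:`` files shown elsewhere in this change), and simply prints each architecture/compiler entry.

.. code-block:: python

   # Illustrative only: read a compilers.yaml-shaped mapping and list it.
   import yaml

   COMPILERS_YAML = """
   compilers:
     chaos_5_x86_64_ib:
       intel@15.0.0:
         cc:  /usr/local/bin/icc-15.0.024-beta
         cxx: /usr/local/bin/icpc-15.0.024-beta
         f77: /usr/local/bin/ifort-15.0.024-beta
         fc:  /usr/local/bin/ifort-15.0.024-beta
     all:
       clang@3.3svn:
         cc:  /usr/bin/clang
         cxx: /usr/bin/clang++
         f77: None
         fc:  None
   """

   config = yaml.safe_load(COMPILERS_YAML)
   for arch, compilers in config['compilers'].items():
       for spec, paths in compilers.items():
           # one line per architecture/compiler pair
           print("%-20s %-15s cc=%s" % (arch, spec, paths['cc']))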

View file

@ -205,12 +205,11 @@ And, if you want to remove a mirror, just remove it by name::
  Mirror precedence
  ----------------------------

- Adding a mirror really just adds a section in ``~/.spackconfig``::
+ Adding a mirror really just adds a section in ``~/.spack/mirrors.yaml``::

-     [mirror "local_filesystem"]
-         url = file:///Users/gamblin2/spack-mirror-2014-06-24
-     [mirror "remote_server"]
-         url = https://example.com/some/web-hosted/directory/spack-mirror-2014-06-24
+     mirrors:
+       - local_filesystem: file:///Users/gamblin2/spack-mirror-2014-06-24
+       - remote_server: https://example.com/some/web-hosted/directory/spack-mirror-2014-06-24

  If you want to change the order in which mirrors are searched for
  packages, you can edit this file and reorder the sections. Spack will

View file

@ -632,7 +632,7 @@ Default
  revision instead.

  Revisions
-   Add ``hg`` and ``revision``parameters:
+   Add ``hg`` and ``revision`` parameters:

  .. code-block:: python
@ -1524,6 +1524,70 @@ This is useful when you want to know exactly what Spack will do when
you ask for a particular spec.
``Concretization Policies``
~~~~~~~~~~~~~~~~~~~~~~~~~~~

A user may have certain preferences for how packages should
be concretized on their system. For example, one user may prefer packages
built with OpenMPI and the Intel compiler. Another user may prefer
packages be built with MVAPICH and GCC.

Spack's ``preferred`` configuration can be used to set defaults for sites or users.
Spack uses this configuration to make decisions about which compilers, package
versions, dependencies, and variants it should prefer during concretization.

The preferred configuration can be controlled by editing the
``~/.spack/preferred.yaml`` file for user configurations, or the
``etc/spack/preferred.yaml`` file for site-wide configuration.

Here's an example preferred.yaml file:

.. code-block:: yaml

   preferred:
     dyninst:
       compiler: gcc@4.9
       variants: +debug
     gperftools:
       version: 2.2, 2.4, 2.3
     all:
       compiler: gcc@4.4.7, gcc@4.6:, intel, clang, pgi
       providers:
         mpi: mvapich, mpich, openmpi

At a high level, this example is specifying how packages should be
concretized. The dyninst package should prefer using gcc 4.9 and
be built with debug options. The gperftools package should prefer version
2.2 over 2.4. Every package on the system should prefer mvapich for
its MPI and gcc 4.4.7 (except for Dyninst, which prefers gcc 4.9).
These options are used to fill in implicit defaults. Any of them can be overwritten
on the command line if explicitly requested.

Each preferred.yaml file begins with the string ``preferred:``, and
each subsequent entry is indented underneath it. The next layer contains
package names or the special string ``all`` (which applies to
every package). Underneath each package name is
one or more components: ``compiler``, ``variants``, ``version``,
or ``providers``. Each component has an ordered list of spec
``constraints``, with earlier entries in the list being preferred over
later entries.

Sometimes a package installation may have constraints that forbid
the first concretization rule, in which case Spack will use the first
legal concretization rule. Going back to the example, if a user
requests gperftools 2.3 or later, then Spack will install version 2.4,
as the 2.4 version of gperftools is preferred over 2.3.

An explicit concretization rule in the preferred section will always
take preference over unlisted concretizations. In the above example,
xlc isn't listed in the compiler list. Every listed compiler from
gcc to pgi will thus be preferred over the xlc compiler.

The syntax for the ``providers`` section differs slightly from other
concretization rules. A provider lists a value that packages may
``depend_on`` (e.g., mpi) and an ordered list of rules for fulfilling that
dependency.
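To make the ``providers`` rule concrete, here is a small editor's sketch (not Spack's implementation) of how an ordered list like ``mpi: mvapich, mpich, openmpi`` can be applied: candidates named earlier in the list win, and anything unlisted sorts last.

.. code-block:: python

   # Illustrative only: pick the most-preferred provider from a candidate list.
   def choose_provider(candidates, preference):
       def rank(name):
           # unlisted names sort after everything in the preference list
           return preference.index(name) if name in preference else len(preference)
       return sorted(candidates, key=rank)[0]

   preference = ['mvapich', 'mpich', 'openmpi']      # from "providers: mpi: ..."
   print(choose_provider(['openmpi', 'mvapich2', 'mvapich'], preference))
   # -> mvapich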
.. _install-method:

Implementing the ``install`` method

View file

@ -54,88 +54,78 @@ more elements to the list to indicate where your own site's temporary
  directory is.

- .. _concretization-policies:
-
- Concretization policies
- ----------------------------
-
- When a user asks for a package like ``mpileaks`` to be installed,
- Spack has to make decisions like what version should be installed,
- what compiler to use, and how its dependencies should be configured.
- This process is called *concretization*, and it's covered in detail in
- :ref:`its own section <abstract-and-concrete>`.
-
- The default concretization policies are in the
- :py:mod:`spack.concretize` module, specifically in the
- :py:class:`spack.concretize.DefaultConcretizer` class. These are the
- important methods used in the concretization process:
-
- * :py:meth:`concretize_version(self, spec) <spack.concretize.DefaultConcretizer.concretize_version>`
- * :py:meth:`concretize_architecture(self, spec) <spack.concretize.DefaultConcretizer.concretize_architecture>`
- * :py:meth:`concretize_compiler(self, spec) <spack.concretize.DefaultConcretizer.concretize_compiler>`
- * :py:meth:`choose_provider(self, spec, providers) <spack.concretize.DefaultConcretizer.choose_provider>`
-
- The first three take a :py:class:`Spec <spack.spec.Spec>` object and
- modify it by adding constraints for the version. For example, if the
- input spec had a version range like `1.0:5.0.3`, then the
- ``concretize_version`` method should set the spec's version to a
- *single* version in that range. Likewise, ``concretize_architecture``
- selects an architecture when the input spec does not have one, and
- ``concretize_compiler`` needs to set both a concrete compiler and a
- concrete compiler version.
-
- ``choose_provider()`` affects how concrete implementations are chosen
- based on a virtual dependency spec. The input spec is some virtual
- dependency and the ``providers`` index is a :py:class:`ProviderIndex
- <spack.packages.ProviderIndex>` object. The ``ProviderIndex`` maps
- the virtual spec to specs for possible implementations, and
- ``choose_provider()`` should simply choose one of these. The
- ``concretize_*`` methods will be called on the chosen implementation
- later, so there is no need to fully concretize the spec when returning
- it.
-
- The ``DefaultConcretizer`` is intended to provide sensible defaults
- for each policy, but there are certain choices that it can't know
- about. For example, one site might prefer ``OpenMPI`` over ``MPICH``,
- or another might prefer an old version of some packages. These types
- of special cases can be integrated with custom concretizers.
-
- Writing a custom concretizer
- ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-
- To write your own concretizer, you need only subclass
- ``DefaultConcretizer`` and override the methods you want to change.
- For example, you might write a class like this to change *only* the
- ``concretize_version()`` behavior:
-
- .. code-block:: python
-
-    from spack.concretize import DefaultConcretizer
-
-    class MyConcretizer(DefaultConcretizer):
-        def concretize_version(self, spec):
-            # implement custom logic here.
-
- Once you have written your custom concretizer, you can make Spack use
- it by editing ``globals.py``.  Find this part of the file:
-
- .. code-block:: python
-
-    #
-    # This controls how things are concretized in spack.
-    # Replace it with a subclass if you want different
-    # policies.
-    #
-    concretizer = DefaultConcretizer()
-
- Set concretizer to *your own* class instead of the default:
-
- .. code-block:: python
-
-    concretizer = MyConcretizer()
-
- The next time you run Spack, your changes should take effect.
+ External Packages
+ ~~~~~~~~~~~~~~~~~~~~~
+
+ It's possible for Spack to use certain externally-installed
+ packages rather than always rebuilding packages.  This may be desirable
+ if machines ship with system packages, such as a customized MPI
+ that should be used instead of Spack building its own MPI.
+
+ External packages are configured through the ``packages.yaml`` file found
+ in a Spack installation's ``etc/spack/`` or a user's ``~/.spack/``
+ directory.  Here's an example of an external configuration:
+
+ .. code-block:: yaml
+
+    packages:
+      - openmpi@1.4.3%gcc@4.4.7=chaos_5_x86_64_ib:
+          path: /opt/openmpi-1.4.3
+      - openmpi@1.4.3%gcc@4.4.7=chaos_5_x86_64_ib+debug:
+          path: /opt/openmpi-1.4.3-debug
+      - openmpi@1.6.5%intel@10.1=chaos_5_x86_64_ib:
+          path: /opt/openmpi-1.6.5-intel
+
+ This example lists three installations of OpenMPI, one built with gcc,
+ one built with gcc and debug information, and another built with Intel.
+ If Spack is asked to build a package that uses one of these MPIs as a
+ dependency, it will link the package to the pre-installed OpenMPI in
+ the given directory.
+
+ Each ``packages.yaml`` should begin with a ``packages:`` token, followed
+ by a list of package specs.  Specs in the ``packages.yaml`` have at most
+ one ``path`` tag, which specifies the top-level directory where the
+ spec is installed.
+
+ Each spec should be as well-defined as reasonably possible.  If a
+ package lacks a spec component, such as missing a compiler or
+ package version, then Spack will guess the missing component based
+ on its most-favored packages, and it may guess incorrectly.
+
+ All package versions and compilers listed in ``packages.yaml`` should
+ have entries in Spack's packages and compiler configuration, even
+ though the package and compiler may not actually be used.
+
+ The packages configuration can tell Spack to use an external location
+ for certain package versions, but it does not restrict Spack to using
+ external packages.  In the above example, if an OpenMPI 1.8.4 became
+ available Spack may choose to start building and linking with that version
+ rather than continue using the pre-installed OpenMPI versions.
+
+ To prevent this, the ``packages.yaml`` configuration also allows packages
+ to be flagged as non-buildable.  The previous example could be modified to
+ be:
+
+ .. code-block:: yaml
+
+    packages:
+      - openmpi:
+          nobuild: True
+      - openmpi@1.4.3%gcc@4.4.7=chaos_5_x86_64_ib:
+          path: /opt/openmpi-1.4.3
+      - openmpi@1.4.3%gcc@4.4.7=chaos_5_x86_64_ib+debug:
+          path: /opt/openmpi-1.4.3-debug
+      - openmpi@1.6.5%intel@10.1=chaos_5_x86_64_ib:
+          path: /opt/openmpi-1.6.5-intel
+
+ The addition of the ``nobuild`` flag tells Spack that it should never build
+ its own version of OpenMPI, and it will instead always rely on a pre-built
+ OpenMPI.  Similar to ``path``, ``nobuild`` is specified as a property under
+ a spec and will prevent building of anything that satisfies that spec.
+
+ The ``nobuild`` flag does not need to be paired with external packages.
+ It could also be used alone to forbid versions of packages that may be
+ buggy or otherwise undesirable.
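To illustrate how such a file can be interpreted, here is an editor's sketch, not Spack code: it assumes PyYAML is available, and it matches entries by package name only, whereas Spack matches whole specs with ``satisfies()``.

.. code-block:: python

   # Illustrative only: collect external paths and the nobuild flag for a package.
   import yaml

   PACKAGES_YAML = """
   packages:
     - openmpi:
         nobuild: True
     - openmpi@1.4.3%gcc@4.4.7=chaos_5_x86_64_ib:
         path: /opt/openmpi-1.4.3
     - openmpi@1.6.5%intel@10.1=chaos_5_x86_64_ib:
         path: /opt/openmpi-1.6.5-intel
   """

   entries = yaml.safe_load(PACKAGES_YAML)['packages']
   name, externals, nobuild = 'openmpi', [], False
   for entry in entries:
       for spec_string, conf in entry.items():
           # crude name match: strip version/compiler parts of the spec string
           if spec_string.split('@')[0].split('%')[0] != name:
               continue
           nobuild = nobuild or conf.get('nobuild', False)
           if 'path' in conf:
               externals.append((spec_string, conf['path']))
   print(nobuild, externals)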
  Profiling
  ~~~~~~~~~~~~~~~~~~~~~

View file

@ -70,6 +70,20 @@
from spack.directory_layout import YamlDirectoryLayout
install_layout = YamlDirectoryLayout(install_path)
#
# This controls how packages are sorted when trying to choose
# the most preferred package. More preferred packages are sorted
# first.
#
from spack.preferred_packages import PreferredPackages
pkgsort = PreferredPackages()
#
# This tests ABI compatibility between packages
#
from spack.abi import ABI
abi = ABI()
#
# This controls how things are concretized in spack.
# Replace it with a subclass if you want different

lib/spack/spack/abi.py (new file, 128 lines)
View file

@ -0,0 +1,128 @@
##############################################################################
# Copyright (c) 2015, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://scalability-llnl.github.io/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import os
import spack
import spack.spec
from spack.spec import CompilerSpec
from spack.util.executable import Executable, ProcessError
from llnl.util.lang import memoized
class ABI(object):
"""This class provides methods to test ABI compatibility between specs.
The current implementation is rather rough and could be improved."""
def architecture_compatible(self, parent, child):
"""Returns true iff the parent and child specs have ABI compatible architectures."""
return not parent.architecture or not child.architecture or parent.architecture == child.architecture
@memoized
def _gcc_get_libstdcxx_version(self, version):
"""Returns gcc ABI compatibility info by getting the library version of
a compiler's libstdc++.so or libgcc_s.so"""
spec = CompilerSpec("gcc", version)
compilers = spack.compilers.compilers_for_spec(spec)
if not compilers:
return None
compiler = compilers[0]
rungcc = None
libname = None
output = None
if compiler.cxx:
rungcc = Executable(compiler.cxx)
libname = "libstdc++.so"
elif compiler.cc:
rungcc = Executable(compiler.cc)
libname = "libgcc_s.so"
else:
return None
try:
output = rungcc("--print-file-name=%s" % libname, return_output=True)
except ProcessError, e:
return None
if not output:
return None
libpath = os.readlink(output.strip())
if not libpath:
return None
return os.path.basename(libpath)
@memoized
def _gcc_compiler_compare(self, pversion, cversion):
"""Returns true iff the gcc version pversion and cversion
are ABI compatible."""
plib = self._gcc_get_libstdcxx_version(pversion)
clib = self._gcc_get_libstdcxx_version(cversion)
if not plib or not clib:
return False
return plib == clib
def _intel_compiler_compare(self, pversion, cversion):
"""Returns true iff the intel version pversion and cversion
are ABI compatible"""
#Test major and minor versions. Ignore build version.
if (len(pversion.version) < 2 or len(cversion.version) < 2):
return False
return (pversion.version[0] == cversion.version[0]) and \
(pversion.version[1] == cversion.version[1])
def compiler_compatible(self, parent, child, **kwargs):
"""Returns true iff the compilers for parent and child specs are ABI compatible"""
if not parent.compiler or not child.compiler:
return True
if parent.compiler.name != child.compiler.name:
#Different compiler families are assumed ABI incompatible
return False
if kwargs.get('loose', False):
return True
for pversion in parent.compiler.versions:
for cversion in child.compiler.versions:
#For a few compilers use specialized comparisons. Otherwise
# match on version match.
if pversion.satisfies(cversion):
return True
elif parent.compiler.name == "gcc" and \
self._gcc_compiler_compare(pversion, cversion):
return True
elif parent.compiler.name == "intel" and \
self._intel_compiler_compare(pversion, cversion):
return True
return False
def compatible(self, parent, child, **kwargs):
"""Returns true iff a parent and child spec are ABI compatible"""
loosematch = kwargs.get('loose', False)
return self.architecture_compatible(parent, child) and \
self.compiler_compatible(parent, child, loose=loosematch)
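# Editor's sketch, not Spack code: the rule that _intel_compiler_compare
# applies above. Two Intel versions are treated as ABI compatible when their
# major and minor components match; the build number is ignored.
def intel_abi_compatible(a, b):
    a_parts, b_parts = a.split('.'), b.split('.')
    return (len(a_parts) >= 2 and len(b_parts) >= 2
            and a_parts[:2] == b_parts[:2])

print(intel_abi_compatible('15.0.1', '15.0.3'))   # True: same 15.0 series
print(intel_abi_compatible('15.0.1', '14.0.1'))   # False: different major version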

View file

@ -75,8 +75,8 @@ def mirror_add(args):
  if url.startswith('/'):
      url = 'file://' + url

- mirror_dict = { args.name : url }
- spack.config.add_to_mirror_config({ args.name : url })
+ newmirror = [ { args.name : url } ]
+ spack.config.add_to_mirror_config(newmirror)

def mirror_remove(args):

@ -90,15 +90,15 @@ def mirror_remove(args):
def mirror_list(args):
    """Print out available mirrors to the console."""
-   sec_names = spack.config.get_mirror_config()
-   if not sec_names:
+   mirrors = spack.config.get_mirror_config()
+   if not mirrors:
        tty.msg("No mirrors configured.")
        return
-   max_len = max(len(s) for s in sec_names)
+   max_len = max(len(name) for name,path in mirrors)
    fmt = "%%-%ds%%s" % (max_len + 4)
-   for name, val in sec_names.iteritems():
+   for name, val in mirrors:
        print fmt % (name, val)

View file

@ -33,12 +33,16 @@
TODO: make this customizable and allow users to configure
      concretization policies.
"""
import spack
import spack.spec
import spack.compilers
import spack.architecture
import spack.error
from spack.version import *
from functools import partial
from spec import DependencyMap
from itertools import chain
from spack.config import *


class DefaultConcretizer(object):

@ -47,10 +51,112 @@ class DefaultConcretizer(object):
   default concretization strategies, or you can override all of them.
   """
def _find_other_spec(self, spec, condition):
"""Searches the dag from spec in an intelligent order and looks
for a spec that matches a condition"""
dagiter = chain(spec.traverse(direction='parents'), spec.traverse(direction='children'))
found = next((x for x in dagiter if x is not spec and condition(x)), None)
if found:
return found
dagiter = chain(spec.traverse(direction='parents'), spec.traverse(direction='children'))
searched = list(dagiter)
found = next((x for x in spec.root.traverse() if x not in searched and x is not spec and condition(x)), None)
if found:
return found
if condition(spec):
return spec
return None
def _valid_virtuals_and_externals(self, spec):
"""Returns a list of spec/external-path pairs for both virtuals and externals
that can concretize this spec."""
# Get a list of candidate packages that could satisfy this spec
packages = []
if spec.virtual:
providers = spack.db.providers_for(spec)
if not providers:
raise UnsatisfiableProviderSpecError(providers[0], spec)
spec_w_preferred_providers = self._find_other_spec(spec, \
lambda(x): spack.pkgsort.spec_has_preferred_provider(x.name, spec.name))
if not spec_w_preferred_providers:
spec_w_preferred_providers = spec
provider_cmp = partial(spack.pkgsort.provider_compare, spec_w_preferred_providers.name, spec.name)
packages = sorted(providers, cmp=provider_cmp)
else:
if spec.external:
return False
packages = [spec]
# For each candidate package, if it has externals add those to the candidates
# if it's a nobuild, then only add the externals.
result = []
all_compilers = spack.compilers.all_compilers()
for pkg in packages:
externals = spec_externals(pkg)
buildable = not is_spec_nobuild(pkg)
if buildable:
result.append((pkg, None))
if externals:
sorted_externals = sorted(externals, cmp=lambda a,b: a[0].__cmp__(b[0]))
for external in sorted_externals:
if external[0].satisfies(spec):
result.append(external)
if not result:
raise NoBuildError(spec)
return result
def concretize_virtual_and_external(self, spec):
"""From a list of candidate virtual and external packages, concretize to one that
is ABI compatible with the rest of the DAG."""
candidates = self._valid_virtuals_and_externals(spec)
if not candidates:
return False
#Find the another spec in the dag that has a compiler. We'll use that
# spec to test compiler compatibility.
other_spec = self._find_other_spec(spec, lambda(x): x.compiler)
if not other_spec:
other_spec = spec.root
#Choose an ABI-compatible candidate, or the first match otherwise.
candidate = None
if other_spec:
candidate = next((c for c in candidates if spack.abi.compatible(c[0], other_spec)), None)
if not candidate:
#Try a looser ABI matching
candidate = next((c for c in candidates if spack.abi.compatible(c[0], other_spec, loose=True)), None)
if not candidate:
#No ABI matches. Pick the top choice based on the orignal preferences.
candidate = candidates[0]
external = candidate[1]
candidate_spec = candidate[0]
#Refine this spec to the candidate.
changed = False
if spec.virtual:
spec._replace_with(candidate_spec)
changed = True
if spec._dup(candidate_spec, deps=False, cleardeps=False):
changed = True
if not spec.external and external:
spec.external = external
changed = True
#If we're external then trim the dependencies
if external and spec.dependencies:
changed = True
spec.dependencies = DependencyMap()
return changed
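# Editor's sketch, not Spack code: the candidate selection order used above.
# Prefer a strictly ABI-compatible candidate, then fall back to a loose match,
# and finally to the most-preferred candidate overall.  The compatibility
# test below is a stand-in for spack.abi.compatible().
def pick_candidate(candidates, compatible):
    strict = next((c for c in candidates if compatible(c, loose=False)), None)
    if strict is not None:
        return strict
    loose = next((c for c in candidates if compatible(c, loose=True)), None)
    return loose if loose is not None else candidates[0]

toy_candidates = [('openmpi', 'intel'), ('mvapich', 'gcc')]
toy_compatible = lambda c, loose: loose or c[1] == 'gcc'
print(pick_candidate(toy_candidates, toy_compatible))   # -> ('mvapich', 'gcc')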
  def concretize_version(self, spec):
      """If the spec is already concrete, return.  Otherwise take
-        the most recent available version, and default to the package's
-        version if there are no avaialble versions.
+        the preferred version from spackconfig, and default to the package's
+        version if there are no available versions.

         TODO: In many cases we probably want to look for installed
               versions of each package and use an installed version

@ -68,12 +174,14 @@ def concretize_version(self, spec):
      # If there are known available versions, return the most recent
      # version that satisfies the spec
      pkg = spec.package
+     cmp_versions = partial(spack.pkgsort.version_compare, spec.name)
      valid_versions = sorted(
          [v for v in pkg.versions
-          if any(v.satisfies(sv) for sv in spec.versions)])
+          if any(v.satisfies(sv) for sv in spec.versions)],
+         cmp=cmp_versions)

      if valid_versions:
-         spec.versions = ver([valid_versions[-1]])
+         spec.versions = ver([valid_versions[0]])
      else:
          # We don't know of any SAFE versions that match the given
          # spec.  Grab the spec's versions and grab the highest

@ -140,55 +248,59 @@ def concretize_compiler(self, spec):
      """If the spec already has a compiler, we're done.  If not, then take
         the compiler used for the nearest ancestor with a compiler
         spec and use that.  If the ancestor's compiler is not
-        concrete, then give it a valid version.  If there is no
-        ancestor with a compiler, use the system default compiler.
+        concrete, then use the preferred compiler as specified in
+        spackconfig.

-        Intuition: Use the system default if no package that depends on
+        Intuition: Use the spackconfig default if no package that depends on
         this one has a strict compiler requirement.  Otherwise, try to
         build with the compiler that will be used by libraries that
         link to this one, to maximize compatibility.
      """
      all_compilers = spack.compilers.all_compilers()

      if (spec.compiler and
          spec.compiler.concrete and
          spec.compiler in all_compilers):
          return False

-     try:
-         nearest = next(p for p in spec.traverse(direction='parents')
-                        if p.compiler is not None).compiler
-
-         if not nearest in all_compilers:
-             # Take the newest compiler that saisfies the spec
-             matches = sorted(spack.compilers.find(nearest))
-             if not matches:
-                 raise UnavailableCompilerVersionError(nearest)
-
-             # copy concrete version into nearest spec
-             nearest.versions = matches[-1].versions.copy()
-             assert(nearest.concrete)
-
-         spec.compiler = nearest.copy()
-
-     except StopIteration:
-         spec.compiler = spack.compilers.default_compiler().copy()
+     # Find another spec that has a compiler, or the root if none do
+     other_spec = self._find_other_spec(spec, lambda(x) : x.compiler)
+     if not other_spec:
+         other_spec = spec.root
+     other_compiler = other_spec.compiler
+     assert(other_spec)
+
+     # Check if the compiler is already fully specified
+     if other_compiler in all_compilers:
+         spec.compiler = other_compiler.copy()
+         return True
+
+     # Filter the compilers into a sorted list based on the compiler_order from spackconfig
+     compiler_list = all_compilers if not other_compiler else spack.compilers.find(other_compiler)
+     cmp_compilers = partial(spack.pkgsort.compiler_compare, other_spec.name)
+     matches = sorted(compiler_list, cmp=cmp_compilers)
+     if not matches:
+         raise UnavailableCompilerVersionError(other_compiler)
+
+     # copy concrete version into other_compiler
+     spec.compiler = matches[0].copy()
+     assert(spec.compiler.concrete)

      return True   # things changed.


- def choose_provider(self, spec, providers):
+ def choose_provider(self, package_spec, spec, providers):
      """This is invoked for virtual specs.  Given a spec with a virtual name,
         say "mpi", and a list of specs of possible providers of that spec,
         select a provider and return it.
      """
      assert(spec.virtual)
      assert(providers)

-     index = spack.spec.index_specs(providers)
-     first_key = sorted(index.keys())[0]
-     latest_version = sorted(index[first_key])[-1]
-     return latest_version
+     provider_cmp = partial(spack.pkgsort.provider_compare, package_spec.name, spec.name)
+     sorted_providers = sorted(providers, cmp=provider_cmp)
+     first_key = sorted_providers[0]
+     return first_key
class UnavailableCompilerVersionError(spack.error.SpackError):

@ -206,3 +318,12 @@ class NoValidVersionError(spack.error.SpackError):
    def __init__(self, spec):
        super(NoValidVersionError, self).__init__(
            "There are no valid versions for %s that match '%s'" % (spec.name, spec.versions))
class NoBuildError(spack.error.SpackError):
"""Raised when a package is configured with the nobuild option, but
no satisfactory external versions can be found"""
def __init__(self, spec):
super(NoBuildError, self).__init__(
"The spec '%s' is configured as nobuild, and no matching external installs were found" % spec.name)

View file

@ -89,15 +89,19 @@
  import os
  import exceptions
  import sys
+ import copy
+ import inspect
+ import glob
+ import imp
- from external.ordereddict import OrderedDict
- from llnl.util.lang import memoized
+ import spack.spec
  import spack.error
+ from llnl.util.lang import memoized
  from external import yaml
  from external.yaml.error import MarkedYAMLError
  import llnl.util.tty as tty
  from llnl.util.filesystem import mkdirp
- import copy

  _config_sections = {}
  class _ConfigCategory:

@ -114,8 +118,10 @@ def __init__(self, n, f, m):
  _ConfigCategory('compilers', 'compilers.yaml', True)
  _ConfigCategory('mirrors', 'mirrors.yaml', True)
+ _ConfigCategory('preferred', 'preferred.yaml', True)
  _ConfigCategory('view', 'views.yaml', True)
- _ConfigCategory('order', 'orders.yaml', True)
+ _ConfigCategory('preferred', 'preferred.yaml', True)
+ _ConfigCategory('packages', 'packages.yaml', True)

  """Names of scopes and their corresponding configuration files."""
  config_scopes = [('site', os.path.join(spack.etc_path, 'spack')),
@ -156,25 +162,27 @@ def _merge_dicts(d1, d2):
"""Recursively merges two configuration trees, with entries """Recursively merges two configuration trees, with entries
in d2 taking precedence over d1""" in d2 taking precedence over d1"""
if not d1: if not d1:
return d2.copy() return copy.copy(d2)
if not d2: if not d2:
return d1 return d1
for key2, val2 in d2.iteritems(): if (type(d1) is list) and (type(d2) is list):
if not key2 in d1: d1.extend(d2)
d1[key2] = val2 return d1
continue
val1 = d1[key2] if (type(d1) is dict) and (type(d2) is dict):
if isinstance(val1, dict) and isinstance(val2, dict): for key2, val2 in d2.iteritems():
d1[key2] = _merge_dicts(val1, val2) if not key2 in d1:
continue d1[key2] = val2
if isinstance(val1, list) and isinstance(val2, list): elif type(d1[key2]) is dict and type(val2) is dict:
val1.extend(val2) d1[key2] = _merge_dicts(d1[key2], val2)
seen = set() elif (type(d1) is list) and (type(d2) is list):
d1[key2] = [ x for x in val1 if not (x in seen or seen.add(x)) ] d1.extend(d2)
continue else:
d1[key2] = val2 d1[key2] = val2
return d1 return d1
return d2
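# Editor's sketch, not Spack code: the merge semantics used above, written as
# a standalone function.  Lists from both trees are concatenated, nested dicts
# merge recursively, and scalar entries from d2 win.
def merge(d1, d2):
    if not d1:
        return d2
    if not d2:
        return d1
    if isinstance(d1, list) and isinstance(d2, list):
        return d1 + d2
    if isinstance(d1, dict) and isinstance(d2, dict):
        merged = dict(d1)
        for key, val in d2.items():
            merged[key] = merge(merged[key], val) if key in merged else val
        return merged
    return d2

site = {'mirrors': [{'a': 'file:///a'}], 'view': {'x': 1}}
user = {'mirrors': [{'b': 'file:///b'}], 'view': {'y': 2}}
print(merge(site, user))
# {'mirrors': [{'a': 'file:///a'}, {'b': 'file:///b'}], 'view': {'x': 1, 'y': 2}}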
  def get_config(category_name):

@ -227,10 +235,64 @@ def get_compilers_config(arch=None):
  def get_mirror_config():
-     """Get the mirror configuration from config files"""
-     return get_config('mirrors')
+     """Get the mirror configuration from config files as a list of name/location tuples"""
+     return [x.items()[0] for x in get_config('mirrors')]
def get_preferred_config():
"""Get the preferred configuration from config files"""
return get_config('preferred')
@memoized
def get_packages_config():
"""Get the externals configuration from config files"""
package_config = get_config('packages')
if not package_config:
return {}
indexed_packages = {}
for p in package_config:
package_name = spack.spec.Spec(p.keys()[0]).name
if package_name not in indexed_packages:
indexed_packages[package_name] = []
indexed_packages[package_name].append({ spack.spec.Spec(key) : val for key, val in p.iteritems() })
return indexed_packages
def is_spec_nobuild(spec):
"""Return true if the spec pkgspec is configured as nobuild"""
allpkgs = get_packages_config()
name = spec.name
if not name in allpkgs:
return False
for itm in allpkgs[name]:
for pkg,conf in itm.iteritems():
if pkg.satisfies(spec):
if conf.get('nobuild', False):
return True
return False
def spec_externals(spec):
"""Return a list of spec, directory pairs for each external location for spec"""
allpkgs = get_packages_config()
name = spec.name
spec_locations = []
if not name in allpkgs:
return []
for itm in allpkgs[name]:
for pkg,conf in itm.iteritems():
if not pkg.satisfies(spec):
continue
path = conf.get('path', None)
if not path:
continue
spec_locations.append( (pkg, path) )
return spec_locations
def get_config_scope_dirname(scope):
    """For a scope return the config directory"""
    global config_scopes
@ -303,7 +365,7 @@ def add_to_mirror_config(addition_dict, scope=None):
  def add_to_compiler_config(addition_dict, scope=None, arch=None):
-     """Add compilerss to the configuration files"""
+     """Add compilers to the configuration files"""
      if not arch:
          arch = spack.architecture.sys_type()
      add_to_config('compilers', { str(arch) : addition_dict }, scope)

View file

@ -187,6 +187,14 @@ def hidden_file_paths(self):
def relative_path_for_spec(self, spec):
    _check_concrete(spec)
if spec.external:
return spec.external
enabled_variants = (
'-' + v.name for v in spec.variants.values()
if v.enabled)
dir_name = "%s-%s-%s" % ( dir_name = "%s-%s-%s" % (
spec.name, spec.name,
spec.version, spec.version,

View file

@ -752,6 +752,9 @@ def do_install(self,
if not self.spec.concrete:
    raise ValueError("Can only install concrete packages.")

if self.spec.external:
    return

if os.path.exists(self.prefix):
    tty.msg("%s is already installed in %s." % (self.name, self.prefix))
    return

View file

@ -0,0 +1,177 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://scalability-llnl.github.io/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import spack
from spack.version import *
import llnl.util.tty as tty
class PreferredPackages(object):
_default_order = {'compiler' : [ 'gcc', 'intel', 'clang', 'pgi', 'xlc' ] }   # Arbitrary, but consistent
def __init__(self):
self.preferred = spack.config.get_preferred_config()
self._spec_for_pkgname_cache = {}
# Given a package name, a sort component (e.g., version, compiler, ...), and
# a second_key (used by providers), return the preference-ordered list from the config.
def _order_for_package(self, pkgname, component, second_key, test_all=True):
pkglist = [pkgname]
if test_all:
pkglist.append('all')
for pkg in pkglist:
if not pkg in self.preferred:
continue
orders = self.preferred[pkg]
if not type(orders) is dict:
continue
if not component in orders:
continue
order = orders[component]
if type(order) is dict:
if not second_key in order:
continue;
order = order[second_key]
if not type(order) is str:
tty.die('Expected version list in preferred config, but got %s' % str(order))
order_list = order.split(',')
return [s.strip() for s in order_list]
return []
# A generic sorting function. Given a package name and sort
# component, return less-than-0, 0, or greater-than-0 if
# a is respectively less-than, equal to, or greater than b.
def _component_compare(self, pkgname, component, a, b, reverse_natural_compare, second_key):
orderlist = self._order_for_package(pkgname, component, second_key)
a_in_list = str(a) in orderlist
b_in_list = str(b) in orderlist
if a_in_list and not b_in_list:
return -1
elif b_in_list and not a_in_list:
return 1
cmp_a = None
cmp_b = None
reverse = None
if not a_in_list and not b_in_list:
cmp_a = a
cmp_b = b
reverse = -1 if reverse_natural_compare else 1
else:
cmp_a = orderlist.index(str(a))
cmp_b = orderlist.index(str(b))
reverse = 1
if cmp_a < cmp_b:
return -1 * reverse
elif cmp_a > cmp_b:
return 1 * reverse
else:
return 0
# A sorting function for specs. Similar to component_compare, but
# a and b are considered to match entries in the sorting list if they
# satisfy the list component.
def _spec_compare(self, pkgname, component, a, b, reverse_natural_compare, second_key):
specs = self._spec_for_pkgname(pkgname, component, second_key)
a_index = None
b_index = None
reverse = -1 if reverse_natural_compare else 1
for i, cspec in enumerate(specs):
if a_index == None and cspec.satisfies(a):
a_index = i
if b_index:
break
if b_index == None and cspec.satisfies(b):
b_index = i
if a_index:
break
if a_index != None and b_index == None: return -1
elif a_index == None and b_index != None: return 1
elif a_index != None and b_index == a_index: return -1 * cmp(a, b)
elif a_index != None and b_index != None and a_index != b_index: return cmp(a_index, b_index)
else: return cmp(a, b) * reverse
# Given a sort order specified by the pkgname/component/second_key, return
# a list of CompilerSpecs, VersionLists, or Specs for that sorting list.
def _spec_for_pkgname(self, pkgname, component, second_key):
key = (pkgname, component, second_key)
if not key in self._spec_for_pkgname_cache:
pkglist = self._order_for_package(pkgname, component, second_key)
if not pkglist:
if component in self._default_order:
pkglist = self._default_order[component]
if component == 'compiler':
self._spec_for_pkgname_cache[key] = [spack.spec.CompilerSpec(s) for s in pkglist]
elif component == 'version':
self._spec_for_pkgname_cache[key] = [VersionList(s) for s in pkglist]
else:
self._spec_for_pkgname_cache[key] = [spack.spec.Spec(s) for s in pkglist]
return self._spec_for_pkgname_cache[key]
def provider_compare(self, pkgname, provider_str, a, b):
"""Return less-than-0, 0, or greater than 0 if a is respecively less-than, equal-to, or
greater-than b. A and b are possible implementations of provider_str.
One provider is less-than another if it is preferred over the other.
For example, provider_compare('scorep', 'mpi', 'mvapich', 'openmpi') would return -1 if
mvapich should be preferred over openmpi for scorep."""
return self._spec_compare(pkgname, 'providers', a, b, False, provider_str)
def spec_has_preferred_provider(self, pkgname, provider_str):
"""Return True iff the named package has a list of preferred provider"""
return bool(self._order_for_package(pkgname, 'providers', provider_str, False))
def version_compare(self, pkgname, a, b):
"""Return less-than-0, 0, or greater than 0 if version a of pkgname is
respectively less-than, equal-to, or greater-than version b of pkgname.
One version is less-than another if it is preferred over the other."""
return self._spec_compare(pkgname, 'version', a, b, True, None)
def variant_compare(self, pkgname, a, b):
"""Return less-than-0, 0, or greater than 0 if variant a of pkgname is
respectively less-than, equal-to, or greater-than variant b of pkgname.
One variant is less-than another if it is preferred over the other."""
return self._component_compare(pkgname, 'variant', a, b, False, None)
def architecture_compare(self, pkgname, a, b):
"""Return less-than-0, 0, or greater than 0 if architecture a of pkgname is
respectively less-than, equal-to, or greater-than architecture b of pkgname.
One architecture is less-than another if it is preferred over the other."""
return self._component_compare(pkgname, 'architecture', a, b, False, None)
def compiler_compare(self, pkgname, a, b):
"""Return less-than-0, 0, or greater than 0 if compiler a of pkgname is
respectively less-than, equal-to, or greater-than compiler b of pkgname.
One compiler is less-than another if it is preferred over the other."""
return self._spec_compare(pkgname, 'compiler', a, b, False, None)
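# Editor's sketch, not Spack code: how cmp-style comparators like the ones
# above are used.  Names on the preference list sort first, in list order,
# and everything else falls back to natural ordering.  functools.cmp_to_key
# makes the same comparator usable on Python 3; the Python 2 code above
# passes cmp= to sorted() directly.
from functools import cmp_to_key

def preference_cmp(order):
    def compare(a, b):
        ia = order.index(a) if a in order else len(order)
        ib = order.index(b) if b in order else len(order)
        if ia != ib:
            return -1 if ia < ib else 1
        return (a > b) - (a < b)
    return compare

order = ['gcc', 'intel', 'clang', 'pgi']          # e.g. from preferred.yaml
compilers = ['xlc', 'clang', 'gcc', 'pgi', 'intel']
print(sorted(compilers, key=cmp_to_key(preference_cmp(order))))
# ['gcc', 'intel', 'clang', 'pgi', 'xlc']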

View file

@ -419,6 +419,7 @@ def __init__(self, spec_like, *dep_like, **kwargs):
# package.py files for.
self._normal = kwargs.get('normal', False)
self._concrete = kwargs.get('concrete', False)
self.external = None

# This allows users to construct a spec DAG with literals.
# Note that given two specs a and b, Spec(a) copies a, but

@ -426,7 +427,7 @@ def __init__(self, spec_like, *dep_like, **kwargs):
for dep in dep_like:
    spec = dep if isinstance(dep, Spec) else Spec(dep)
    self._add_dependency(spec)

#
# Private routines here are called by the parser when building a spec.
@ -754,12 +755,11 @@ def _concretize_helper(self, presets=None, visited=None):
  # Concretize virtual dependencies last.  Because they're added
  # to presets below, their constraints will all be merged, but we'll
  # still need to select a concrete package later.
- if not self.virtual:
-     changed |= any(
-         (spack.concretizer.concretize_architecture(self),
-          spack.concretizer.concretize_compiler(self),
-          spack.concretizer.concretize_version(self),
-          spack.concretizer.concretize_variants(self)))
+ changed |= any(
+     (spack.concretizer.concretize_architecture(self),
+      spack.concretizer.concretize_compiler(self),
+      spack.concretizer.concretize_version(self),
+      spack.concretizer.concretize_variants(self)))
  presets[self.name] = self
  visited.add(self.name)
@ -792,21 +792,18 @@ def _expand_virtual_packages(self):
  a problem.
  """
  changed = False
- while True:
-     virtuals =[v for v in self.traverse() if v.virtual]
-     if not virtuals:
-         return changed
-
-     for spec in virtuals:
-         providers = spack.db.providers_for(spec)
-         concrete = spack.concretizer.choose_provider(spec, providers)
-         concrete = concrete.copy()
-         spec._replace_with(concrete)
-         changed = True
-
-     # If there are duplicate providers or duplicate provider deps, this
-     # consolidates them and merge constraints.
-     changed |= self.normalize(force=True)
+ done = False
+ while not done:
+     done = True
+     for spec in list(self.traverse()):
+         if spack.concretizer.concretize_virtual_and_external(spec):
+             done = False
+             changed = True
+
+ # If there are duplicate providers or duplicate provider deps, this
+ # consolidates them and merge constraints.
+ changed |= self.normalize(force=True)
+ return changed


def concretize(self):
@ -833,7 +830,6 @@ def concretize(self):
self._concretize_helper())
changed = any(changes)
force=True
self._concrete = True
@ -1029,7 +1025,7 @@ def _normalize_helper(self, visited, spec_deps, provider_index):
  # if we descend into a virtual spec, there's nothing more
  # to normalize.  Concretize will finish resolving it later.
- if self.virtual:
+ if self.virtual or self.external:
      return False

  # Combine constraints from package deps with constraints from
@ -1349,15 +1345,26 @@ def _dup(self, other, **kwargs):
Whether deps should be copied too.  Set to false to copy a
spec but not its dependencies.
"""
# We don't count dependencies as changes here
changed = True
if hasattr(self, 'name'):
changed = (self.name != other.name and self.versions != other.versions and \
self.architecture != other.architecture and self.compiler != other.compiler and \
self.variants != other.variants and self._normal != other._normal and \
self.concrete != other.concrete and self.external != other.external)
  # Local node attributes get copied first.
  self.name = other.name
  self.versions = other.versions.copy()
  self.architecture = other.architecture
  self.compiler = other.compiler.copy() if other.compiler else None
- self.dependents = DependencyMap()
- self.dependencies = DependencyMap()
+ if kwargs.get('cleardeps', True):
+     self.dependents = DependencyMap()
+     self.dependencies = DependencyMap()
  self.variants = other.variants.copy()
  self.variants.spec = self
+ self.external = other.external

  # If we copy dependencies, preserve DAG structure in the new spec
  if kwargs.get('deps', True):
@ -1375,6 +1382,8 @@ def _dup(self, other, **kwargs):
# Since we preserved structure, we can copy _normal safely.
self._normal = other._normal
self._concrete = other._concrete
self.external = other.external
return changed


def copy(self, **kwargs):
@ -1506,14 +1515,28 @@ def format(self, format_string='$_$@$%@$+$=', **kwargs):
  in the format string.  The format strings you can provide are::

      $_   Package name
-     $@   Version
-     $%   Compiler
-     $%@  Compiler & compiler version
+     $@   Version with '@' prefix
+     $%   Compiler with '%' prefix
+     $%@  Compiler with '%' prefix & compiler version with '@' prefix
      $+   Options
-     $=   Architecture
-     $#   7-char prefix of DAG hash
+     $=   Architecture with '=' prefix
+     $#   7-char prefix of DAG hash with '-' prefix
      $$   $
You can also use full-string versions, which leave off the prefixes:
${PACKAGE} Package name
${VERSION} Version
${COMPILER} Full compiler string
${COMPILERNAME} Compiler name
${COMPILERVER} Compiler version
${OPTIONS} Options
${ARCHITECTURE} Architecture
${SHA1} Dependencies 8-char sha1 prefix
${SPACK_ROOT} The spack root directory
${SPACK_INSTALL} The default spack install directory, ${SPACK_PREFIX}/opt
Optionally you can provide a width, e.g. $20_ for a 20-wide name.
Like printf, you can provide '-' for left justification, e.g.
$-20_ for a left-justified name.
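# Editor's sketch, not Spack code: the ${...} tokens documented above can be
# thought of as simple named lookups.  This stand-in expands them from a plain
# dictionary instead of a Spec.
import re

def expand(format_string, values):
    return re.sub(r'\$\{([A-Z_]+)\}',
                  lambda m: str(values.get(m.group(1), '')),
                  format_string)

toy_values = {'PACKAGE': 'mpileaks', 'VERSION': '1.0', 'COMPILERNAME': 'gcc'}
print(expand('${PACKAGE}-${VERSION}-${COMPILERNAME}', toy_values))
# mpileaks-1.0-gcc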
@ -1529,7 +1552,8 @@ def format(self, format_string='$_$@$%@$+$=', **kwargs):
  color = kwargs.get('color', False)
  length = len(format_string)
  out = StringIO()
- escape = compiler = False
+ named = escape = compiler = False
+ named_str = fmt = ''

  def write(s, c):
      if color:

@ -1569,9 +1593,12 @@ def write(s, c):
      elif c == '#':
          out.write('-' + fmt % (self.dag_hash(7)))
      elif c == '$':
-         if fmt != '':
+         if fmt != '%s':
              raise ValueError("Can't use format width with $$.")
          out.write('$')
elif c == '{':
named = True
named_str = ''
escape = False

elif compiler:

@ -1585,6 +1612,43 @@ def write(s, c):
out.write(c)
compiler = False
elif named:
if not c == '}':
if i == length - 1:
raise ValueError("Error: unterminated ${ in format: '%s'"
% format_string)
named_str += c
continue;
if named_str == 'PACKAGE':
write(fmt % self.name, '@')
if named_str == 'VERSION':
if self.versions and self.versions != _any_version:
write(fmt % str(self.versions), '@')
elif named_str == 'COMPILER':
if self.compiler:
write(fmt % self.compiler, '%')
elif named_str == 'COMPILERNAME':
if self.compiler:
write(fmt % self.compiler.name, '%')
elif named_str == 'COMPILERVER':
if self.compiler:
write(fmt % self.compiler.versions, '%')
elif named_str == 'OPTIONS':
if self.variants:
write(fmt % str(self.variants), '+')
elif named_str == 'ARCHITECTURE':
if self.architecture:
write(fmt % str(self.architecture), '=')
elif named_str == 'SHA1':
if self.dependencies:
out.write(fmt % str(self.dep_hash(8)))
elif named_str == 'SPACK_ROOT':
out.write(fmt % spack.prefix)
elif named_str == 'SPACK_INSTALL':
out.write(fmt % spack.install_path)
named = False
elif c == '$':
    escape = True
    if i == length - 1:
@ -1601,6 +1665,40 @@ def dep_string(self):
return ''.join("^" + dep.format() for dep in self.sorted_deps())
def __cmp__(self, other):
#Package name sort order is not configurable, always goes alphabetical
if self.name != other.name:
return cmp(self.name, other.name)
#Package version is second in compare order
pkgname = self.name
if self.versions != other.versions:
return spack.pkgsort.version_compare(pkgname,
self.versions, other.versions)
#Compiler is third
if self.compiler != other.compiler:
return spack.pkgsort.compiler_compare(pkgname,
self.compiler, other.compiler)
#Variants
if self.variants != other.variants:
return spack.pkgsort.variant_compare(pkgname,
self.variants, other.variants)
#Architecture
if self.architecture != other.architecture:
return spack.pkgsort.architecture_compare(pkgname,
self.architecture, other.architecture)
#Dependency is not configurable
if self.dep_hash() != other.dep_hash():
return -1 if self.dep_hash() < other.dep_hash() else 1
#Equal specs
return 0
def __str__(self):
    return self.format() + self.dep_string()
@ -1710,6 +1808,7 @@ def spec(self):
spec.variants = VariantMap(spec)
spec.architecture = None
spec.compiler = None
spec.external = None
spec.dependents = DependencyMap()
spec.dependencies = DependencyMap()

View file

@ -26,6 +26,7 @@
  import re
  import shutil
  import tempfile
+ import sys

  import llnl.util.tty as tty
  from llnl.util.filesystem import *

@ -344,9 +345,7 @@ def destroy(self):
  def _get_mirrors():
      """Get mirrors from spack configuration."""
-     config = spack.config.get_mirror_config()
-     return [val for name, val in config.iteritems()]
+     return [path for name, path in spack.config.get_mirror_config()]
def ensure_access(file=spack.stage_path): def ensure_access(file=spack.stage_path):

View file

@ -192,3 +192,31 @@ def test_compiler_inheritance(self):
# TODO: not exactly the syntax I would like.
self.assertTrue(spec['libdwarf'].compiler.satisfies('clang'))
self.assertTrue(spec['libelf'].compiler.satisfies('clang'))
def test_external_package(self):
spec = Spec('externaltool')
spec.concretize()
self.assertEqual(spec['externaltool'].external, '/path/to/external_tool')
self.assertFalse('externalprereq' in spec)
self.assertTrue(spec['externaltool'].compiler.satisfies('gcc'))
def test_nobuild_package(self):
got_error = False
spec = Spec('externaltool%clang')
try:
spec.concretize()
except spack.concretize.NoBuildError:
got_error = True
self.assertTrue(got_error)
def test_external_and_virtual(self):
spec = Spec('externaltest')
spec.concretize()
self.assertTrue(spec['externaltool'].external, '/path/to/external_tool')
self.assertTrue(spec['stuff'].external, '/path/to/external_virtual_gcc')
self.assertTrue(spec['externaltool'].compiler.satisfies('gcc'))
self.assertTrue(spec['stuff'].compiler.satisfies('gcc'))

View file

@ -0,0 +1,13 @@
packages:
  - externaltool:
      nobuild: True
  - externaltool@1.0%gcc@4.5.0:
      path: /path/to/external_tool
  - externalvirtual@2.0%clang@3.3:
      path: /path/to/external_virtual_clang
      nobuild: True
  - externalvirtual@1.0%gcc@4.5.0:
      path: /path/to/external_virtual_gcc
      nobuild: True

View file

@ -0,0 +1,34 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://scalability-llnl.github.io/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
from spack import *
class Externalprereq(Package):
    homepage = "http://somewhere.com"
    url      = "http://somewhere.com/prereq-1.0.tar.gz"

    version('1.4', 'f1234567890abcdef1234567890abcde')

    def install(self, spec, prefix):
        pass

View file

@ -0,0 +1,37 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://scalability-llnl.github.io/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
from spack import *
class Externaltest(Package):
homepage = "http://somewhere.com"
url = "http://somewhere.com/test-1.0.tar.gz"
version('1.0', '1234567890abcdef1234567890abcdef')
depends_on('stuff')
depends_on('externaltool')
def install(self, spec, prefix):
pass

View file

@@ -0,0 +1,36 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://scalability-llnl.github.io/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
from spack import *
class Externaltool(Package):
homepage = "http://somewhere.com"
url = "http://somewhere.com/tool-1.0.tar.gz"
version('1.0', '1234567890abcdef1234567890abcdef')
depends_on('externalprereq')
def install(self, spec, prefix):
pass

View file

@@ -0,0 +1,37 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://scalability-llnl.github.io/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
from spack import *
class Externalvirtual(Package):
homepage = "http://somewhere.com"
url = "http://somewhere.com/stuff-1.0.tar.gz"
version('1.0', '1234567890abcdef1234567890abcdef')
version('2.0', '234567890abcdef1234567890abcdef1')
provides('stuff')
def install(self, spec, prefix):
pass
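``Externalvirtual`` is the provider behind the ``stuff`` virtual that ``externaltest`` depends on, and the ``packages.yaml`` entries above pin its 1.0 and 2.0 versions to external prefixes. The expectation in ``test_external_and_virtual`` can be restated as a small interactive check (purely illustrative)::

    spec = Spec('externaltest')
    spec.concretize()
    assert spec['stuff'].name == 'externalvirtual'
    assert spec['stuff'].external == '/path/to/external_virtual_gcc'
    assert spec['stuff'].compiler.satisfies('gcc')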

View file

@@ -38,6 +38,7 @@ class Mpich(Package):
version('3.0.2', 'foobarbaz') version('3.0.2', 'foobarbaz')
version('3.0.1', 'foobarbaz') version('3.0.1', 'foobarbaz')
version('3.0', 'foobarbaz') version('3.0', 'foobarbaz')
version('1.0', 'foobarbas')
provides('mpi@:3', when='@3:') provides('mpi@:3', when='@3:')
provides('mpi@:1', when='@:1') provides('mpi@:1', when='@:1')
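The added ``1.0`` version gives the mock repo an MPICH that, through the existing ``provides`` rules, can only stand in for ``mpi@:1``, letting concretization tests force a choice between MPI-1 and MPI-3 providers. For example (illustrative, assuming the mock ``mpileaks`` package from the test repository)::

    spec = Spec('mpileaks ^mpich@1.0')
    spec.concretize()
    # mpich@1.0 provides only mpi@:1, per provides('mpi@:1', when='@:1'),
    # so a dependent that required mpi@3 would need a different provider.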

View file

@@ -46,6 +46,7 @@ def setup_dependent_environment(self, module, spec, dep_spec):
os.environ['MPICH_F77'] = 'f77' os.environ['MPICH_F77'] = 'f77'
os.environ['MPICH_F90'] = 'f90' os.environ['MPICH_F90'] = 'f90'
module.mpicc = join_path(self.prefix.bin, 'mpicc')
def install(self, spec, prefix): def install(self, spec, prefix):
config_args = ["--prefix=" + prefix, config_args = ["--prefix=" + prefix,

View file

@@ -11,10 +11,17 @@ class Mvapich2(Package):
version('2.0', '9fbb68a4111a8b6338e476dc657388b4', version('2.0', '9fbb68a4111a8b6338e476dc657388b4',
url='http://mvapich.cse.ohio-state.edu/download/mvapich/mv2/mvapich2-2.0.tar.gz') url='http://mvapich.cse.ohio-state.edu/download/mvapich/mv2/mvapich2-2.0.tar.gz')
version('2.1', '0095ceecb19bbb7fb262131cb9c2cdd6',
url='http://mvapich.cse.ohio-state.edu/download/mvapich/mv2/mvapich2-2.1.tar.gz')
provides('mpi@:2.2', when='@1.9') # MVAPICH2-1.9 supports MPI 2.2 provides('mpi@:2.2', when='@1.9') # MVAPICH2-1.9 supports MPI 2.2
provides('mpi@:3.0', when='@2.0') # MVAPICH2-2.0 supports MPI 3.0 provides('mpi@:3.0', when='@2.0') # MVAPICH2-2.0 supports MPI 3.0
variant('psm', default=False, description="build with psm")
variant('pmi', default=False, description="build with pmi")
depends_on('pmgr_collective', when='+pmi')
def install(self, spec, prefix): def install(self, spec, prefix):
# we'll set different configure flags depending on our environment # we'll set different configure flags depending on our environment
@@ -80,7 +87,13 @@ def install(self, spec, prefix):
configure_args.append("--with-device=ch3:psm") configure_args.append("--with-device=ch3:psm")
else: else:
# throw this flag on IB systems # throw this flag on IB systems
configure_args.append("--with-device=ch3:mrail", "--with-rdma=gen2") configure_args.append("--with-device=ch3:mrail")
configure_args.append("--with-rdma=gen2")
if "+pmi" in spec:
configure_args.append("--with-pmi=pmgr_collective" % spec['pmgr_collective'].prefix)
else:
configure_args.append("--with-pmi=slurm")
# TODO: shared-memory build # TODO: shared-memory build
@@ -93,7 +106,7 @@ def install(self, spec, prefix):
"--enable-f77", "--enable-fc", "--enable-cxx", "--enable-f77", "--enable-fc", "--enable-cxx",
"--enable-shared", "--enable-sharedlibs=gcc", "--enable-shared", "--enable-sharedlibs=gcc",
"--enable-debuginfo", "--enable-debuginfo",
"--with-pm=no", "--with-pmi=slurm", "--with-pm=no",
"--enable-romio", "--with-file-system=lustre+nfs+ufs", "--enable-romio", "--with-file-system=lustre+nfs+ufs",
"--disable-mpe", "--without-mpe", "--disable-mpe", "--without-mpe",
"--disable-silent-rules", "--disable-silent-rules",