re-merged mainline develop

This commit is contained in:
Gregory Becker 2016-05-27 13:13:19 -07:00
commit 9dad7c2ace
27 changed files with 1241 additions and 802 deletions


@ -8,6 +8,9 @@
# - E221: multiple spaces before operator
# - E241: multiple spaces after ,
#
# Let people use terse Python features:
# - E731 : lambda expressions
#
# Spack allows wildcard imports:
# - F403: disable wildcard import
#
@ -16,5 +19,5 @@
# - F999: name may be undefined or defined from star imports.
#
[flake8]
ignore = E221,E241,E731,F403,F821,F999
max-line-length = 79


@ -102,8 +102,8 @@ that the package is installed:
==> adept-utils is already installed in /home/gamblin2/spack/opt/chaos_5_x86_64_ib/gcc@4.4.7/adept-utils@1.0-5adef8da.
==> Trying to fetch from https://github.com/hpc/mpileaks/releases/download/v1.0/mpileaks-1.0.tar.gz
######################################################################## 100.0%
==> Staging archive: /home/gamblin2/spack/var/spack/stage/mpileaks@1.0%gcc@4.4.7 arch=chaos_5_x86_64_ib-59f6ad23/mpileaks-1.0.tar.gz
==> Created stage in /home/gamblin2/spack/var/spack/stage/mpileaks@1.0%gcc@4.4.7 arch=chaos_5_x86_64_ib-59f6ad23.
==> No patches needed for mpileaks.
==> Building mpileaks.
@ -132,10 +132,10 @@ sites, as installing a version that one user needs will not disrupt
existing installations for other users.

In addition to different versions, Spack can customize the compiler,
compile-time options (variants), compiler flags, and platform (for
cross compiles) of an installation. Spack is unique in that it can
also configure the *dependencies* a package is built with. For example,
two configurations of the same version of a package, one built with boost
1.39.0, and the other built with version 1.43.0, can coexist.

This can all be done on the command line using the *spec* syntax.
@ -246,6 +246,12 @@ Packages are divided into groups according to their architecture and
compiler. Within each group, Spack tries to keep the view simple, and
only shows the version of installed packages.

``spack find`` can filter the package list based on the package name, spec, or
a number of properties of their installation status. For example, missing
dependencies of a spec can be shown with ``-m``, packages which were
explicitly installed with ``spack install <package>`` can be singled out with
``-e``, and those which have been pulled in only as dependencies with ``-E``.

In some cases, there may be different configurations of the *same*
version of a package installed. For example, there are two
installations of ``libdwarf@20130729`` above. We can look at them
@ -328,6 +334,11 @@ of libelf would look like this:
-- chaos_5_x86_64_ib / gcc@4.4.7 --------------------------------
libdwarf@20130729-d9b90962
We can also search for packages that have a certain attribute. For example,
``spack find -l libdwarf +debug`` will show only installations of libdwarf
with the 'debug' compile-time option enabled, while ``spack find -l +debug``
will find every installed package with a 'debug' compile-time option enabled.
The full spec syntax is discussed in detail in :ref:`sec-specs`.
@ -457,6 +468,26 @@ For compilers, like ``clang``, that do not support Fortran, put
Once you save the file, the configured compilers will show up in the
list displayed by ``spack compilers``.

You can also add compiler flags to manually configured compilers. The
valid flags are ``cflags``, ``cxxflags``, ``fflags``, ``cppflags``,
``ldflags``, and ``ldlibs``. For example::

   ...
   chaos_5_x86_64_ib:
     ...
     intel@15.0.0:
       cc: /usr/local/bin/icc-15.0.024-beta
       cxx: /usr/local/bin/icpc-15.0.024-beta
       f77: /usr/local/bin/ifort-15.0.024-beta
       fc: /usr/local/bin/ifort-15.0.024-beta
       cppflags: -O3 -fPIC
   ...

These flags will be treated by Spack as if they were entered on
the command line each time this compiler is used. The compiler wrappers
then inject those flags into the compiler command. Compiler flags
entered on the command line will be discussed in more detail in the
following section.
.. _sec-specs:
@ -474,7 +505,7 @@ the full syntax of specs.
Here is an example of a much longer spec than we've seen thus far::

   mpileaks @1.2:1.4 %gcc@4.7.5 +debug -qt arch=bgq_os ^callpath @1.1 %gcc@4.7.2

If provided to ``spack install``, this will install the ``mpileaks``
library at some version between ``1.2`` and ``1.4`` (inclusive),
@ -492,8 +523,12 @@ More formally, a spec consists of the following pieces:
* ``%`` Optional compiler specifier, with an optional compiler version
  (``gcc`` or ``gcc@4.7.3``)
* ``+`` or ``-`` or ``~`` Optional variant specifiers (``+debug``,
  ``-qt``, or ``~qt``) for boolean variants
* ``name=<value>`` Optional variant specifiers that are not restricted to
  boolean variants
* ``name=<value>`` Optional compiler flag specifiers. Valid flag names are
  ``cflags``, ``cxxflags``, ``fflags``, ``cppflags``, ``ldflags``, and ``ldlibs``.
* ``arch=<value>`` Optional architecture specifier (``arch=bgq_os``)
* ``^`` Dependency specs (``^callpath@1.1``)

There are two things to notice here. The first is that specs are
@ -573,7 +608,7 @@ compilers, variants, and architectures just like any other spec.
Specifiers are associated with the nearest package name to their left.
For example, above, ``@1.1`` and ``%gcc@4.7.2`` associate with the
``callpath`` package, while ``@1.2:1.4``, ``%gcc@4.7.5``, ``+debug``,
``-qt``, and ``arch=bgq_os`` all associate with the ``mpileaks`` package.

In the diagram above, ``mpileaks`` depends on ``mpich`` with an
unspecified version, but packages can depend on other packages with
@ -629,22 +664,25 @@ based on site policies.
Variants
~~~~~~~~~~~~~~~~~~~~~~~

Variants are named options associated with a particular package. They are
optional, as each package must provide default values for each variant it
makes available. Variants can be specified using
a flexible parameter syntax ``name=<value>``. For example,
``spack install libelf debug=True`` will install libelf built with debug
flags. The names of particular variants available for a package depend on
what was provided by the package author. ``spack info <package>`` will
provide information on what build variants are available.

For compatibility with earlier versions, variants which happen to be
boolean in nature can be specified by a syntax that represents turning
options on and off. For example, in the previous spec we could have
supplied ``libelf +debug`` with the same effect of enabling the debug
compile-time option for the libelf package.

Depending on the package, a variant may have any default value. For
``libelf`` here, ``debug`` is ``False`` by default, and we turned it on
with ``debug=True`` or ``+debug``. If a variant is ``True`` by default
you can turn it off by either adding ``-name`` or ``~name`` to the spec.

There are two syntaxes here because, depending on context, ``~`` and
``-`` may mean different things. In most shells, the following will
@ -656,7 +694,7 @@ result in the shell performing home directory substitution:
   mpileaks~debug # use this instead

If there is a user called ``debug``, the ``~`` will be incorrectly
expanded. In this situation, you would want to write ``libelf
-debug``. However, ``-`` can be ambiguous when included after a
package name without spaces:
@ -671,12 +709,35 @@ package, not a request for ``mpileaks`` built without ``debug``
options. In this scenario, you should write ``mpileaks~debug`` to
avoid ambiguity.

When Spack normalizes specs, it prints them out with no spaces, prints
boolean variants using the backwards-compatible syntax, and uses only
``~`` for disabled boolean variants. ``-`` and spaces are allowed on the
command line for convenience and legibility.
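
The equivalence between the two variant syntaxes can be illustrated with a
small, hypothetical parser. This is an illustrative sketch, not Spack's
actual implementation; the function name and normalization rules are
assumptions for the example:

```python
def parse_variant(token):
    """Parse one spec variant token into a (name, value) pair.

    Handles both the boolean shorthand (+name, ~name, -name) and the
    general name=<value> parameter syntax described above.
    """
    if token.startswith('+'):
        return token[1:], True
    if token[0] in '~-':
        return token[1:], False
    name, _, value = token.partition('=')
    # Normalize boolean-looking values so debug=True matches +debug.
    if value in ('True', 'False'):
        return name, value == 'True'
    return name, value

# The boolean shorthand and the parameter syntax yield the same result:
assert parse_variant('+debug') == parse_variant('debug=True')
assert parse_variant('~debug') == parse_variant('debug=False')
```

Non-boolean values simply pass through, so ``cppflags=-g`` parses to the
pair ``('cppflags', '-g')``.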
Compiler Flags
~~~~~~~~~~~~~~~~~~~~~~~

Compiler flags are specified using the same syntax as non-boolean variants,
but fulfill a different purpose. While the function of a variant is set by
the package, compiler flags are used by the compiler wrappers to inject
flags into the compile line of the build. Additionally, compiler flags are
inherited by dependencies. ``spack install libdwarf cppflags=\"-g\"`` will
install both libdwarf and libelf with the ``-g`` flag injected into their
compile line.

Notice that the value of the compiler flags must be escape-quoted on the
command line. From within Python files, the same spec would be specified
as ``libdwarf cppflags="-g"``. This is necessary because of how the shell
handles the quote symbols.

The six compiler flags are injected in the order of the implicit make commands
in GNU Autotools. If all flags are set, the order is
``$cppflags $cflags|$cxxflags $ldflags command $ldlibs`` for C and C++ and
``$fflags $cppflags $ldflags command $ldlibs`` for Fortran.
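
The ordering above can be sketched as a small helper that assembles a C
compile line from a flag dictionary. This is an illustrative model of the
stated order, not Spack's wrapper code; the function name and the flag
dictionary shape are assumptions for the example:

```python
def c_compile_line(compiler, flags, inputs):
    """Assemble a C compile line in the implicit-make order:
    $cppflags $cflags $ldflags <inputs> $ldlibs.

    Per the order above, preprocessor, compile, and link flags precede
    the input files, while ldlibs always come last.
    """
    parts = [compiler]
    parts += flags.get('cppflags', [])
    parts += flags.get('cflags', [])
    parts += flags.get('ldflags', [])
    parts += inputs
    parts += flags.get('ldlibs', [])
    return ' '.join(parts)

line = c_compile_line('cc',
                      {'cppflags': ['-g'], 'cflags': ['-O2'],
                       'ldlibs': ['-lm']},
                      ['main.c'])
# line == 'cc -g -O2 main.c -lm'
```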
Architecture specifiers
~~~~~~~~~~~~~~~~~~~~~~~

.. Note::
@ -684,12 +745,9 @@ Architecture specifier
   Architecture specifiers are part of specs but are not yet
   functional. They will be in Spack version 1.0, due in Q3 2015.

The architecture specifier looks identical to a variant specifier for a
non-boolean variant. The architecture can be specified only using the
reserved name ``arch`` (``arch=bgq_os``).
.. _sec-virtual-dependencies:
@ -767,6 +825,23 @@ any MPI implementation will do. If another package depends on
error. Likewise, if you try to plug in some package that doesn't
provide MPI, Spack will raise an error.
Specifying Specs by Hash
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Complicated specs can become cumbersome to enter on the command line,
especially when many of the qualifications are necessary to
distinguish between similar installs, for example when using the
``uninstall`` command. To avoid this, when referencing an existing spec,
Spack allows you to reference specs by their hash. We previously
discussed the spec hash that Spack computes. In place of a spec in any
command, substitute ``/<hash>`` where ``<hash>`` is any prefix taken from
the beginning of a spec hash. If the given spec hash is sufficient
to be unique, Spack will replace the reference with the spec to which
it refers. Otherwise, it will prompt for a more qualified hash.

Note that this will not work to reinstall a dependency uninstalled by
``spack uninstall -f``.
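
The prefix-matching behavior can be modeled in a few lines of Python. This
is a sketch of the lookup logic only, not Spack's resolver; the function
name is an assumption, and the hashes are short examples borrowed from the
output shown earlier in this document:

```python
def match_hash_prefix(prefix, installed_hashes):
    """Resolve a /<hash> reference: return the unique installed hash
    starting with `prefix`, or raise if it is ambiguous or unknown."""
    matches = [h for h in installed_hashes if h.startswith(prefix)]
    if len(matches) == 1:
        return matches[0]
    if matches:
        raise ValueError('ambiguous hash prefix: /%s' % prefix)
    raise ValueError('no installed spec matches /%s' % prefix)

hashes = ['5adef8da', '59f6ad23', '66733244']
assert match_hash_prefix('66', hashes) == '66733244'
# '5' alone matches two hashes, so a longer prefix is required:
assert match_hash_prefix('5a', hashes) == '5adef8da'
```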
.. _spack-providers:

``spack providers``
@ -996,8 +1071,8 @@ than one installed package matches it), then Spack will warn you:
   $ spack load libelf
   ==> Error: Multiple matches for spec libelf. Choose one:
   libelf@0.8.13%gcc@4.4.7 arch=chaos_5_x86_64_ib
   libelf@0.8.13%intel@15.0.0 arch=chaos_5_x86_64_ib

You can either type the ``spack load`` command again with a fully
qualified argument, or you can add just enough extra constraints to
@ -1276,7 +1351,7 @@ You can find extensions for your Python installation like this:
.. code-block:: sh

   $ spack extensions python
   ==> python@2.7.8%gcc@4.4.7 arch=chaos_5_x86_64_ib-703c7a96
   ==> 36 extensions:
   geos py-ipython py-pexpect py-pyside py-sip
   py-basemap py-libxml2 py-pil py-pytz py-six
@ -1366,9 +1441,9 @@ installation:
.. code-block:: sh

   $ spack activate py-numpy
   ==> Activated extension py-setuptools@11.3.1%gcc@4.4.7 arch=chaos_5_x86_64_ib-3c74eb69 for python@2.7.8%gcc@4.4.7.
   ==> Activated extension py-nose@1.3.4%gcc@4.4.7 arch=chaos_5_x86_64_ib-5f70f816 for python@2.7.8%gcc@4.4.7.
   ==> Activated extension py-numpy@1.9.1%gcc@4.4.7 arch=chaos_5_x86_64_ib-66733244 for python@2.7.8%gcc@4.4.7.

Several things have happened here. The user requested that
``py-numpy`` be activated in the ``python`` installation it was built
@ -1383,7 +1458,7 @@ packages listed as activated:
.. code-block:: sh

   $ spack extensions python
   ==> python@2.7.8%gcc@4.4.7 arch=chaos_5_x86_64_ib-703c7a96
   ==> 36 extensions:
   geos py-ipython py-pexpect py-pyside py-sip
   py-basemap py-libxml2 py-pil py-pytz py-six
@ -1431,7 +1506,7 @@ dependencies, you can use ``spack activate -f``:
.. code-block:: sh

   $ spack activate -f py-numpy
   ==> Activated extension py-numpy@1.9.1%gcc@4.4.7 arch=chaos_5_x86_64_ib-66733244 for python@2.7.8%gcc@4.4.7.
.. _spack-deactivate:


@ -1,6 +1,6 @@
.. _configuration:

Configuration
===================================
.. _temp-space:
@ -55,7 +55,7 @@ directory is.
External Packages
----------------------------

Spack can be configured to use externally-installed
packages rather than building its own packages. This may be desirable
if machines ship with system packages, such as a customized MPI
@ -70,15 +70,15 @@ directory. Here's an example of an external configuration:
   packages:
     openmpi:
       paths:
         openmpi@1.4.3%gcc@4.4.7 arch=chaos_5_x86_64_ib: /opt/openmpi-1.4.3
         openmpi@1.4.3%gcc@4.4.7 arch=chaos_5_x86_64_ib+debug: /opt/openmpi-1.4.3-debug
         openmpi@1.6.5%intel@10.1 arch=chaos_5_x86_64_ib: /opt/openmpi-1.6.5-intel

This example lists three installations of OpenMPI, one built with gcc,
one built with gcc and debug information, and another built with Intel.
If Spack is asked to build a package that uses one of these MPIs as a
dependency, it will use the pre-installed OpenMPI in
the given directory.

Each ``packages.yaml`` begins with a ``packages:`` token, followed
by a list of package names. To specify externals, add a ``paths``
@ -108,9 +108,9 @@ be:
   packages:
     openmpi:
       paths:
         openmpi@1.4.3%gcc@4.4.7 arch=chaos_5_x86_64_ib: /opt/openmpi-1.4.3
         openmpi@1.4.3%gcc@4.4.7 arch=chaos_5_x86_64_ib+debug: /opt/openmpi-1.4.3-debug
         openmpi@1.6.5%intel@10.1 arch=chaos_5_x86_64_ib: /opt/openmpi-1.6.5-intel
       buildable: False

The addition of the ``buildable`` flag tells Spack that it should never build
@ -118,13 +118,73 @@ its own version of OpenMPI, and it will instead always rely on a pre-built
OpenMPI. Similar to ``paths``, ``buildable`` is specified as a property under
a package name.

The ``buildable`` flag does not need to be paired with external packages.
It can also be used alone to forbid packages that may be
buggy or otherwise undesirable.

.. _concretization-preferences:

Concretization Preferences
--------------------------------

Spack can be configured to prefer certain compilers, package
versions, dependencies, and variants during concretization.
The preferred configuration can be controlled via the
``~/.spack/packages.yaml`` file for user configurations, or the
``etc/spack/packages.yaml`` site configuration.

Here's an example packages.yaml file that sets preferred packages:

.. code-block:: sh

   packages:
     dyninst:
       compiler: [gcc@4.9]
       variants: +debug
     gperftools:
       version: [2.2, 2.4, 2.3]
     all:
       compiler: [gcc@4.4.7, gcc@4.6:, intel, clang, pgi]
       providers:
         mpi: [mvapich, mpich, openmpi]

At a high level, this example is specifying how packages should be
concretized. The dyninst package should prefer using gcc 4.9 and
be built with debug options. The gperftools package should prefer version
2.2 over 2.4. Every package on the system should prefer mvapich for
its MPI and gcc 4.4.7 (except for Dyninst, which overrides this by
preferring gcc 4.9). These options are used to fill in implicit defaults.
Any of them can be overwritten on the command line if explicitly requested.

Each packages.yaml file begins with the string ``packages:`` and
package names are specified on the next level. The special string ``all``
applies settings to each package. Underneath each package name is
one or more components: ``compiler``, ``variants``, ``version``,
or ``providers``. Each component has an ordered list of spec
``constraints``, with earlier entries in the list being preferred over
later entries.

Sometimes a package installation may have constraints that forbid
the first concretization rule, in which case Spack will use the first
legal concretization rule. Going back to the example, if a user
requests gperftools 2.3 or later, then Spack will install version 2.4,
as the 2.4 version of gperftools is preferred over 2.3.

An explicit concretization rule in the preferred section will always
take preference over unlisted concretizations. In the above example,
xlc isn't listed in the compiler list. Every listed compiler from
gcc to pgi will thus be preferred over the xlc compiler.

The syntax for the ``providers`` section differs slightly from other
concretization rules. A provider lists a value that packages may
``depend_on`` (e.g., ``mpi``) and a list of rules for fulfilling that
dependency.
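
The "earlier entries are preferred" rule can be sketched as a selection
over candidates. This is an illustrative model of the ordering only, not
Spack's concretizer; the function name is an assumption, and the version
lists mirror the gperftools example above:

```python
def pick_preferred(candidates, preferences):
    """Choose the candidate appearing earliest in the preference list.

    Candidates absent from the list rank after every listed entry,
    matching the rule that listed entries beat unlisted ones.
    """
    def rank(c):
        return preferences.index(c) if c in preferences else len(preferences)
    return min(candidates, key=rank)

version_prefs = ['2.2', '2.4', '2.3']
# A user constraint of "2.3 or later" leaves 2.3 and 2.4; 2.4 wins
# because it appears earlier in the preference list:
assert pick_preferred(['2.3', '2.4'], version_prefs) == '2.4'
# An unlisted candidate loses to any listed one:
assert pick_preferred(['2.5', '2.3'], version_prefs) == '2.3'
```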
Profiling
------------------

Spack has some limited built-in support for profiling, and can report
statistics using standard Python timing tools. To use this feature,
@ -133,7 +193,7 @@ supply ``-p`` to Spack on the command line, before any subcommands.
.. _spack-p:

``spack -p``
~~~~~~~~~~~~~~~~~

``spack -p`` output looks like this:


@ -31,14 +31,21 @@ platform, all on the command line.
   # Specify a compiler (and its version), with %
   $ spack install mpileaks@1.1.2 %gcc@4.7.3
   # Add special compile-time options by name
   $ spack install mpileaks@1.1.2 %gcc@4.7.3 debug=True

   # Add special boolean compile-time options with +
   $ spack install mpileaks@1.1.2 %gcc@4.7.3 +debug

   # Add compiler flags using the conventional names
   $ spack install mpileaks@1.1.2 %gcc@4.7.3 cppflags=\"-O3 -floop-block\"

   # Cross-compile for a different architecture with arch=
   $ spack install mpileaks@1.1.2 arch=bgqos_0

Users can specify as many or few options as they care about. Spack
will fill in the unspecified values with sensible defaults. The two listed
syntaxes for variants are identical when the value is boolean.

Customize dependencies


@ -47,7 +47,7 @@ Table of Contents
   basic_usage
   packaging_guide
   mirrors
   configuration
   developer_guide
   command_index
   package_list


@ -1221,11 +1221,13 @@ just as easily provide a version range:
   depends_on("libelf@0.8.2:0.8.4:")

Or a requirement for a particular variant or compiler flags:

.. code-block:: python

   depends_on("libelf@0.8+debug")
   depends_on('libelf debug=True')
   depends_on('libelf cppflags="-fPIC"')

Both users *and* package authors can use the same spec syntax to refer
to different package configurations. Users use the spec syntax on the
@ -1623,21 +1625,21 @@ the user runs ``spack install`` and the time the ``install()`` method
is called. The concretized version of the spec above might look like
this::

   mpileaks@2.3%gcc@4.7.3 arch=linux-ppc64
       ^callpath@1.0%gcc@4.7.3+debug arch=linux-ppc64
           ^dyninst@8.1.2%gcc@4.7.3 arch=linux-ppc64
               ^libdwarf@20130729%gcc@4.7.3 arch=linux-ppc64
                   ^libelf@0.8.11%gcc@4.7.3 arch=linux-ppc64
           ^mpich@3.0.4%gcc@4.7.3 arch=linux-ppc64

.. graphviz::

   digraph {
       "mpileaks@2.3\n%gcc@4.7.3\n arch=linux-ppc64" -> "mpich@3.0.4\n%gcc@4.7.3\n arch=linux-ppc64"
       "mpileaks@2.3\n%gcc@4.7.3\n arch=linux-ppc64" -> "callpath@1.0\n%gcc@4.7.3+debug\n arch=linux-ppc64" -> "mpich@3.0.4\n%gcc@4.7.3\n arch=linux-ppc64"
       "callpath@1.0\n%gcc@4.7.3+debug\n arch=linux-ppc64" -> "dyninst@8.1.2\n%gcc@4.7.3\n arch=linux-ppc64"
       "dyninst@8.1.2\n%gcc@4.7.3\n arch=linux-ppc64" -> "libdwarf@20130729\n%gcc@4.7.3\n arch=linux-ppc64" -> "libelf@0.8.11\n%gcc@4.7.3\n arch=linux-ppc64"
       "dyninst@8.1.2\n%gcc@4.7.3\n arch=linux-ppc64" -> "libelf@0.8.11\n%gcc@4.7.3\n arch=linux-ppc64"
   }

Here, all versions, compilers, and platforms are filled in, and there
@ -1648,8 +1650,8 @@ point will Spack call the ``install()`` method for your package.
Concretization in Spack is based on certain selection policies that
tell Spack how to select, e.g., a version, when one is not specified
explicitly. Concretization policies are discussed in more detail in
:ref:`configuration`. Sites using Spack can customize them to match
the preferences of their own users.
.. _spack-spec:
@ -1666,9 +1668,9 @@ running ``spack spec``. For example:
       ^libdwarf
           ^libelf

   dyninst@8.0.1%gcc@4.7.3 arch=linux-ppc64
       ^libdwarf@20130729%gcc@4.7.3 arch=linux-ppc64
           ^libelf@0.8.13%gcc@4.7.3 arch=linux-ppc64

This is useful when you want to know exactly what Spack will do when
you ask for a particular spec.
@ -1682,60 +1684,8 @@ be concretized on their system. For example, one user may prefer packages
built with OpenMPI and the Intel compiler. Another user may prefer
packages be built with MVAPICH and GCC.
Spack can be configured to prefer certain compilers, package See the `documentation in the config section <concretization-preferences_>`_
versions, depends_on, and variants during concretization. for more details.
The preferred configuration can be controlled via the
``~/.spack/packages.yaml`` file for user configuations, or the
``etc/spack/packages.yaml`` site configuration.
Here's an example packages.yaml file that sets preferred packages:
.. code-block:: sh
packages:
dyninst:
compiler: [gcc@4.9]
variants: +debug
gperftools:
version: [2.2, 2.4, 2.3]
all:
compiler: [gcc@4.4.7, gcc@4.6:, intel, clang, pgi]
providers:
mpi: [mvapich, mpich, openmpi]
At a high level, this example is specifying how packages should be
concretized. The dyninst package should prefer using gcc 4.9 and
be built with debug options. The gperftools package should prefer version
2.2 over 2.4. Every package on the system should prefer mvapich for
its MPI and gcc 4.4.7 (except for Dyninst, which overrides this by preferring gcc 4.9).
These options are used to fill in implicit defaults. Any of them can be overwritten
on the command line if explicitly requested.
Each packages.yaml file begins with the string ``packages:`` and
package names are specified on the next level. The special string ``all``
applies settings to each package. Underneath each package name is
one or more components: ``compiler``, ``variants``, ``version``,
or ``providers``. Each component has an ordered list of spec
``constraints``, with earlier entries in the list being preferred over
later entries.
Sometimes a package installation may have constraints that forbid
the first concretization rule, in which case Spack will use the first
legal concretization rule. Going back to the example, if a user
requests gperftools 2.3 or later, then Spack will install version 2.4
as the 2.4 version of gperftools is preferred over 2.3.
An explicit concretization rule in the preferred section will always
take preference over unlisted concretizations. In the above example,
xlc isn't listed in the compiler list. Every listed compiler from
gcc to pgi will thus be preferred over the xlc compiler.
The syntax for the ``provider`` section differs slightly from other
concretization rules. A provider lists a value that packages may
``depend_on`` (e.g, mpi) and a list of rules for fulfilling that
dependency.
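At a glance, the preference rules described above amount to picking the first entry in the ordered list that is also legal for the request. A minimal sketch of that selection logic (illustrative only; `pick_preferred` is a hypothetical name, not Spack's actual concretizer):

```python
def pick_preferred(preferences, candidates, requested=None):
    """Return the first preferred entry that is also a legal candidate.

    `preferences` is an ordered list (earlier entries win); `requested`
    optionally restricts the legal candidates, mimicking a user constraint.
    """
    legal = [c for c in candidates if requested is None or c in requested]
    for pref in preferences:
        if pref in legal:
            return pref
    # No listed preference is legal: fall back to any legal candidate.
    return legal[0] if legal else None

# gperftools prefers 2.2, then 2.4, then 2.3:
prefs = ['2.2', '2.4', '2.3']
print(pick_preferred(prefs, ['2.2', '2.3', '2.4']))            # -> 2.2
print(pick_preferred(prefs, ['2.2', '2.3', '2.4'],
                     requested=['2.3', '2.4']))                # -> 2.4
```

The second call mirrors the example in the text: a user requesting gperftools 2.3 or later gets 2.4, since 2.4 is preferred over 2.3.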
.. _install-method: .. _install-method:
@ -1960,6 +1910,12 @@ the command line.
``$rpath_flag`` can be overridden on a compiler-specific basis in ``$rpath_flag`` can be overridden on a compiler-specific basis in
``lib/spack/spack/compilers/$compiler.py``. ``lib/spack/spack/compilers/$compiler.py``.
The compiler wrappers also pass the compiler flags specified by the user from
the command line (``cflags``, ``cxxflags``, ``fflags``, ``cppflags``, ``ldflags``,
and/or ``ldlibs``). They do not override the canonical autotools flags with the
same names (but in ALL-CAPS) that may be passed into the build by particularly
challenging package scripts.
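The flag-injection behavior can be pictured as the wrapper splicing the spec's flags into the real compiler invocation while leaving the autotools-style ALL-CAPS variables to the build system. A rough sketch under that assumption (`wrap_compile_args` and its argument ordering are illustrative, not the wrapper's real internals):

```python
def wrap_compile_args(real_compiler, user_args, spec_flags):
    """Build the command a wrapper might exec: injected spec flags
    (e.g. cflags from the spack command line) come before the package's
    own arguments, so later package flags can still take effect."""
    cmd = [real_compiler]
    cmd += spec_flags.get('cppflags', [])
    cmd += spec_flags.get('cflags', [])
    cmd += user_args                      # whatever the build system passed
    cmd += spec_flags.get('ldflags', [])
    cmd += spec_flags.get('ldlibs', [])
    return cmd

print(wrap_compile_args('gcc', ['-c', 'foo.c'], {'cflags': ['-O2']}))
# -> ['gcc', '-O2', '-c', 'foo.c']
```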
Compiler flags Compiler flags
~~~~~~~~~~~~~~ ~~~~~~~~~~~~~~
In rare circumstances such as compiling and running small unit tests, a package In rare circumstances such as compiling and running small unit tests, a package
@ -2206,12 +2162,12 @@ example:
def install(self, prefix): def install(self, prefix):
# Do default install # Do default install
@when('=chaos_5_x86_64_ib') @when('arch=chaos_5_x86_64_ib')
def install(self, prefix): def install(self, prefix):
# This will be executed instead of the default install if # This will be executed instead of the default install if
# the package's sys_type() is chaos_5_x86_64_ib. # the package's sys_type() is chaos_5_x86_64_ib.
@when('=bgqos_0') @when('arch=bgqos_0')
def install(self, prefix): def install(self, prefix):
# This will be executed if the package's sys_type is bgqos_0 # This will be executed if the package's sys_type is bgqos_0
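A minimal sketch of how ``@when``-style dispatch can work: each decorated variant registers a predicate, and the call resolves to the variant whose predicate matches at call time. This is a toy with a plain string predicate, not Spack's actual multimethod machinery:

```python
class when_arch(object):
    """Toy @when-like decorator: dispatch on a global 'arch' string."""
    registry = {}                       # maps function name -> [(arch, func)]
    current_arch = 'linux-ppc64'

    def __init__(self, arch):
        self.arch = arch

    def __call__(self, func):
        variants = when_arch.registry.setdefault(func.__name__, [])
        variants.append((self.arch, func))

        def dispatcher(*args, **kwargs):
            # Scan registered variants for one matching the current arch.
            for arch, variant in when_arch.registry[func.__name__]:
                if arch == when_arch.current_arch:
                    return variant(*args, **kwargs)
            raise NotImplementedError('no variant for this arch')
        return dispatcher


@when_arch('bgqos_0')
def install():
    return 'bgq install'


@when_arch('linux-ppc64')
def install():
    return 'linux install'


print(install())   # -> linux install
```

Redefining ``install`` does not clobber the earlier variant: both stay in the registry, and the dispatcher picks whichever matches the current architecture.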
@ -2801,11 +2757,11 @@ build it:
$ spack stage libelf $ spack stage libelf
==> Trying to fetch from http://www.mr511.de/software/libelf-0.8.13.tar.gz ==> Trying to fetch from http://www.mr511.de/software/libelf-0.8.13.tar.gz
######################################################################## 100.0% ######################################################################## 100.0%
==> Staging archive: /Users/gamblin2/src/spack/var/spack/stage/libelf@0.8.13%gcc@4.8.3=linux-ppc64/libelf-0.8.13.tar.gz ==> Staging archive: /Users/gamblin2/src/spack/var/spack/stage/libelf@0.8.13%gcc@4.8.3 arch=linux-ppc64/libelf-0.8.13.tar.gz
==> Created stage in /Users/gamblin2/src/spack/var/spack/stage/libelf@0.8.13%gcc@4.8.3=linux-ppc64. ==> Created stage in /Users/gamblin2/src/spack/var/spack/stage/libelf@0.8.13%gcc@4.8.3 arch=linux-ppc64.
$ spack cd libelf $ spack cd libelf
$ pwd $ pwd
/Users/gamblin2/src/spack/var/spack/stage/libelf@0.8.13%gcc@4.8.3=linux-ppc64/libelf-0.8.13 /Users/gamblin2/src/spack/var/spack/stage/libelf@0.8.13%gcc@4.8.3 arch=linux-ppc64/libelf-0.8.13
``spack cd`` here changed the current working directory to the ``spack cd`` here changed the current working directory to the
directory containing the expanded ``libelf`` source code. There are a directory containing the expanded ``libelf`` source code. There are a
@ -38,71 +38,59 @@
def setup_parser(subparser): def setup_parser(subparser):
format_group = subparser.add_mutually_exclusive_group() format_group = subparser.add_mutually_exclusive_group()
format_group.add_argument('-s', format_group.add_argument('-s', '--short',
'--short',
action='store_const', action='store_const',
dest='mode', dest='mode',
const='short', const='short',
help='Show only specs (default)') help='Show only specs (default)')
format_group.add_argument('-p', format_group.add_argument('-p', '--paths',
'--paths',
action='store_const', action='store_const',
dest='mode', dest='mode',
const='paths', const='paths',
help='Show paths to package install directories') help='Show paths to package install directories')
format_group.add_argument( format_group.add_argument(
'-d', '-d', '--deps',
'--deps',
action='store_const', action='store_const',
dest='mode', dest='mode',
const='deps', const='deps',
help='Show full dependency DAG of installed packages') help='Show full dependency DAG of installed packages')
subparser.add_argument('-l', subparser.add_argument('-l', '--long',
'--long',
action='store_true', action='store_true',
dest='long', dest='long',
help='Show dependency hashes as well as versions.') help='Show dependency hashes as well as versions.')
subparser.add_argument('-L', subparser.add_argument('-L', '--very-long',
'--very-long',
action='store_true', action='store_true',
dest='very_long', dest='very_long',
help='Show dependency hashes as well as versions.') help='Show dependency hashes as well as versions.')
subparser.add_argument('-f', subparser.add_argument('-f', '--show-flags',
'--show-flags',
action='store_true', action='store_true',
dest='show_flags', dest='show_flags',
help='Show spec compiler flags.') help='Show spec compiler flags.')
subparser.add_argument( subparser.add_argument(
'-e', '-e', '--explicit',
'--explicit',
action='store_true', action='store_true',
help='Show only specs that were installed explicitly') help='Show only specs that were installed explicitly')
subparser.add_argument( subparser.add_argument(
'-E', '-E', '--implicit',
'--implicit',
action='store_true', action='store_true',
help='Show only specs that were installed as dependencies') help='Show only specs that were installed as dependencies')
subparser.add_argument( subparser.add_argument(
'-u', '-u', '--unknown',
'--unknown',
action='store_true', action='store_true',
dest='unknown', dest='unknown',
help='Show only specs Spack does not have a package for.') help='Show only specs Spack does not have a package for.')
subparser.add_argument( subparser.add_argument(
'-m', '-m', '--missing',
'--missing',
action='store_true', action='store_true',
dest='missing', dest='missing',
help='Show missing dependencies as well as installed specs.') help='Show missing dependencies as well as installed specs.')
subparser.add_argument('-M', subparser.add_argument('-M', '--only-missing',
'--only-missing',
action='store_true', action='store_true',
dest='only_missing', dest='only_missing',
help='Show only missing dependencies.') help='Show only missing dependencies.')
subparser.add_argument('-N', subparser.add_argument('-N', '--namespace',
'--namespace',
action='store_true', action='store_true',
help='Show fully qualified package names.') help='Show fully qualified package names.')
@ -188,7 +176,32 @@ def fmt(s):
print(hsh + spec.format(format_string, color=True) + '\n') print(hsh + spec.format(format_string, color=True) + '\n')
else: else:
raise ValueError("Invalid mode for display_specs: %s. Must be one of (paths, deps, short)." % mode) # NOQA: ignore=E501 raise ValueError(
"Invalid mode for display_specs: %s. Must be one of (paths, "
"deps, short)." % mode) # NOQA: ignore=E501
def query_arguments(args):
# Check arguments
if args.explicit and args.implicit:
tty.error('You can\'t pass -E and -e options simultaneously.')
raise SystemExit(1)
# Set up query arguments.
installed, known = True, any
if args.only_missing:
installed = False
elif args.missing:
installed = any
if args.unknown:
known = False
explicit = any
if args.explicit:
explicit = True
if args.implicit:
explicit = False
q_args = {'installed': installed, 'known': known, "explicit": explicit}
return q_args
def find(parser, args): def find(parser, args):
@ -205,22 +218,7 @@ def find(parser, args):
if not query_specs: if not query_specs:
return return
# Set up query arguments. q_args = query_arguments(args)
installed, known = True, any
if args.only_missing:
installed = False
elif args.missing:
installed = any
if args.unknown:
known = False
explicit = any
if args.explicit:
explicit = False
if args.implicit:
explicit = True
q_args = {'installed': installed, 'known': known, "explicit": explicit}
# Get all the specs the user asked for # Get all the specs the user asked for
if not query_specs: if not query_specs:
@ -40,7 +40,6 @@
""" """
import os import os
import time
import socket import socket
import yaml import yaml
@ -56,6 +55,7 @@
from spack.error import SpackError from spack.error import SpackError
from spack.repository import UnknownPackageError from spack.repository import UnknownPackageError
# DB goes in this directory underneath the root # DB goes in this directory underneath the root
_db_dirname = '.spack-db' _db_dirname = '.spack-db'
@ -69,10 +69,12 @@
def _autospec(function): def _autospec(function):
"""Decorator that automatically converts the argument of a single-arg """Decorator that automatically converts the argument of a single-arg
function to a Spec.""" function to a Spec."""
def converter(self, spec_like, *args, **kwargs): def converter(self, spec_like, *args, **kwargs):
if not isinstance(spec_like, spack.spec.Spec): if not isinstance(spec_like, spack.spec.Spec):
spec_like = spack.spec.Spec(spec_like) spec_like = spack.spec.Spec(spec_like)
return function(self, spec_like, *args, **kwargs) return function(self, spec_like, *args, **kwargs)
return converter return converter
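The ``_autospec`` pattern above, coercing a string argument into the richer type before the method body runs, generalizes to any value type. A standalone analog with a hypothetical ``Version`` class for illustration:

```python
import functools


class Version(object):
    def __init__(self, string):
        self.parts = tuple(int(p) for p in str(string).split('.'))


def autoversion(function):
    """Decorator: convert a single string argument to Version, so callers
    may pass '1.2.3' or Version('1.2.3') interchangeably."""
    @functools.wraps(function)
    def converter(self, version_like, *args, **kwargs):
        if not isinstance(version_like, Version):
            version_like = Version(version_like)
        return function(self, version_like, *args, **kwargs)
    return converter


class Catalog(object):
    @autoversion
    def lookup(self, version):
        # The body can assume a real Version object.
        return version.parts


cat = Catalog()
print(cat.lookup('1.2.3'))   # -> (1, 2, 3)
```

Same idea as ``_autospec``: callers of ``Database`` methods get to pass spec strings, and every method body sees a proper ``Spec``.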
@ -92,6 +94,7 @@ class InstallRecord(object):
dependents left. dependents left.
""" """
def __init__(self, spec, path, installed, ref_count=0, explicit=False): def __init__(self, spec, path, installed, ref_count=0, explicit=False):
self.spec = spec self.spec = spec
self.path = str(path) self.path = str(path)
@ -100,16 +103,19 @@ def __init__(self, spec, path, installed, ref_count=0, explicit=False):
self.explicit = explicit self.explicit = explicit
def to_dict(self): def to_dict(self):
return { 'spec' : self.spec.to_node_dict(), return {
'path' : self.path, 'spec': self.spec.to_node_dict(),
'installed' : self.installed, 'path': self.path,
'ref_count' : self.ref_count, 'installed': self.installed,
'explicit' : self.explicit } 'ref_count': self.ref_count,
'explicit': self.explicit
}
@classmethod @classmethod
def from_dict(cls, spec, dictionary): def from_dict(cls, spec, dictionary):
d = dictionary d = dictionary
return InstallRecord(spec, d['path'], d['installed'], d['ref_count'], d.get('explicit', False)) return InstallRecord(spec, d['path'], d['installed'], d['ref_count'],
d.get('explicit', False))
class Database(object): class Database(object):
@ -144,7 +150,7 @@ def __init__(self, root, db_dir=None):
# Set up layout of database files within the db dir # Set up layout of database files within the db dir
self._index_path = join_path(self._db_dir, 'index.yaml') self._index_path = join_path(self._db_dir, 'index.yaml')
self._lock_path = join_path(self._db_dir, 'lock') self._lock_path = join_path(self._db_dir, 'lock')
# Create needed directories and files # Create needed directories and files
if not os.path.exists(self._db_dir): if not os.path.exists(self._db_dir):
@ -157,17 +163,14 @@ def __init__(self, root, db_dir=None):
self.lock = Lock(self._lock_path) self.lock = Lock(self._lock_path)
self._data = {} self._data = {}
def write_transaction(self, timeout=_db_lock_timeout): def write_transaction(self, timeout=_db_lock_timeout):
"""Get a write lock context manager for use in a `with` block.""" """Get a write lock context manager for use in a `with` block."""
return WriteTransaction(self, self._read, self._write, timeout) return WriteTransaction(self, self._read, self._write, timeout)
def read_transaction(self, timeout=_db_lock_timeout): def read_transaction(self, timeout=_db_lock_timeout):
"""Get a read lock context manager for use in a `with` block.""" """Get a read lock context manager for use in a `with` block."""
return ReadTransaction(self, self._read, None, timeout) return ReadTransaction(self, self._read, None, timeout)
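The transaction objects returned here are context managers that take the lock on entry and run a refresh or flush callback around the ``with`` body. A stripped-down sketch of the pattern, using ``threading.Lock`` in place of Spack's file lock (names and callback wiring are illustrative):

```python
import threading


class SketchTransaction(object):
    """Minimal transaction: acquire a lock, run callbacks, release."""

    def __init__(self, lock, acquire_fn=None, release_fn=None):
        self.lock = lock
        self.acquire_fn = acquire_fn
        self.release_fn = release_fn

    def __enter__(self):
        self.lock.acquire()
        if self.acquire_fn:
            self.acquire_fn()       # e.g. Database._read: refresh the index

    def __exit__(self, exc_type, exc_value, tb):
        if self.release_fn and exc_type is None:
            self.release_fn()       # e.g. Database._write: flush changes
        self.lock.release()


events = []
lock = threading.Lock()
with SketchTransaction(lock, acquire_fn=lambda: events.append('read')):
    events.append('work')
print(events)   # -> ['read', 'work']
```

A write transaction would additionally pass a ``release_fn`` so the index is written back only when the body exits without an exception.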
def _write_to_yaml(self, stream): def _write_to_yaml(self, stream):
"""Write out the databsae to a YAML file. """Write out the databsae to a YAML file.
@ -183,9 +186,9 @@ def _write_to_yaml(self, stream):
# different paths, it can't differentiate. # different paths, it can't differentiate.
# TODO: fix this before we support multiple install locations. # TODO: fix this before we support multiple install locations.
database = { database = {
'database' : { 'database': {
'installs' : installs, 'installs': installs,
'version' : str(_db_version) 'version': str(_db_version)
} }
} }
@ -194,15 +197,11 @@ def _write_to_yaml(self, stream):
except YAMLError as e: except YAMLError as e:
raise SpackYAMLError("error writing YAML database:", str(e)) raise SpackYAMLError("error writing YAML database:", str(e))
def _read_spec_from_yaml(self, hash_key, installs, parent_key=None): def _read_spec_from_yaml(self, hash_key, installs, parent_key=None):
"""Recursively construct a spec from a hash in a YAML database. """Recursively construct a spec from a hash in a YAML database.
Does not do any locking. Does not do any locking.
""" """
if hash_key not in installs:
parent = read_spec(installs[parent_key]['path'])
spec_dict = installs[hash_key]['spec'] spec_dict = installs[hash_key]['spec']
# Install records don't include hash with spec, so we add it in here # Install records don't include hash with spec, so we add it in here
@ -224,7 +223,6 @@ def _read_spec_from_yaml(self, hash_key, installs, parent_key=None):
spec._mark_concrete() spec._mark_concrete()
return spec return spec
def _read_from_yaml(self, stream): def _read_from_yaml(self, stream):
""" """
Fill database from YAML, do not maintain old data Fill database from YAML, do not maintain old data
@ -246,15 +244,15 @@ def _read_from_yaml(self, stream):
return return
def check(cond, msg): def check(cond, msg):
if not cond: raise CorruptDatabaseError(self._index_path, msg) if not cond:
raise CorruptDatabaseError(self._index_path, msg)
check('database' in yfile, "No 'database' attribute in YAML.") check('database' in yfile, "No 'database' attribute in YAML.")
# High-level file checks # High-level file checks
db = yfile['database'] db = yfile['database']
check('installs' in db, "No 'installs' in YAML DB.") check('installs' in db, "No 'installs' in YAML DB.")
check('version' in db, "No 'version' in YAML DB.") check('version' in db, "No 'version' in YAML DB.")
installs = db['installs'] installs = db['installs']
@ -277,25 +275,25 @@ def check(cond, msg):
# hashes are the same. # hashes are the same.
spec_hash = spec.dag_hash() spec_hash = spec.dag_hash()
if not spec_hash == hash_key: if not spec_hash == hash_key:
tty.warn("Hash mismatch in database: %s -> spec with hash %s" tty.warn(
% (hash_key, spec_hash)) "Hash mismatch in database: %s -> spec with hash %s" %
continue # TODO: is skipping the right thing to do? (hash_key, spec_hash))
continue # TODO: is skipping the right thing to do?
# Insert the brand new spec in the database. Each # Insert the brand new spec in the database. Each
# spec has its own copies of its dependency specs. # spec has its own copies of its dependency specs.
# TODO: would a more immutable spec implementation simplify this? # TODO: would a more immutable spec implementation simplify
# this?
data[hash_key] = InstallRecord.from_dict(spec, rec) data[hash_key] = InstallRecord.from_dict(spec, rec)
except Exception as e: except Exception as e:
tty.warn("Invalid database reecord:", tty.warn("Invalid database reecord:",
"file: %s" % self._index_path, "file: %s" % self._index_path,
"hash: %s" % hash_key, "hash: %s" % hash_key, "cause: %s" % str(e))
"cause: %s" % str(e))
raise raise
self._data = data self._data = data
def reindex(self, directory_layout): def reindex(self, directory_layout):
"""Build database index from scratch based from a directory layout. """Build database index from scratch based from a directory layout.
@ -320,7 +318,6 @@ def reindex(self, directory_layout):
self._data = old_data self._data = old_data
raise raise
def _check_ref_counts(self): def _check_ref_counts(self):
"""Ensure consistency of reference counts in the DB. """Ensure consistency of reference counts in the DB.
@ -342,9 +339,8 @@ def _check_ref_counts(self):
found = rec.ref_count found = rec.ref_count
if not expected == found: if not expected == found:
raise AssertionError( raise AssertionError(
"Invalid ref_count: %s: %d (expected %d), in DB %s" "Invalid ref_count: %s: %d (expected %d), in DB %s" %
% (key, found, expected, self._index_path)) (key, found, expected, self._index_path))
def _write(self): def _write(self):
"""Write the in-memory database index to its file path. """Write the in-memory database index to its file path.
@ -366,7 +362,6 @@ def _write(self):
os.remove(temp_file) os.remove(temp_file)
raise raise
def _read(self): def _read(self):
"""Re-read Database from the data in the set location. """Re-read Database from the data in the set location.
@ -381,7 +376,6 @@ def _read(self):
# reindex() takes its own write lock, so no lock here. # reindex() takes its own write lock, so no lock here.
self.reindex(spack.install_layout) self.reindex(spack.install_layout)
def _add(self, spec, path, directory_layout=None, explicit=False): def _add(self, spec, path, directory_layout=None, explicit=False):
"""Add an install record for spec at path to the database. """Add an install record for spec at path to the database.
@ -404,11 +398,11 @@ def _add(self, spec, path, directory_layout=None, explicit=False):
rec.path = path rec.path = path
else: else:
self._data[key] = InstallRecord(spec, path, True, explicit=explicit) self._data[key] = InstallRecord(spec, path, True,
explicit=explicit)
for dep in spec.dependencies.values(): for dep in spec.dependencies.values():
self._increment_ref_count(dep, directory_layout) self._increment_ref_count(dep, directory_layout)
def _increment_ref_count(self, spec, directory_layout=None): def _increment_ref_count(self, spec, directory_layout=None):
"""Recursively examine dependencies and update their DB entries.""" """Recursively examine dependencies and update their DB entries."""
key = spec.dag_hash() key = spec.dag_hash()
@ -438,28 +432,25 @@ def add(self, spec, path, explicit=False):
with self.write_transaction(): with self.write_transaction():
self._add(spec, path, explicit=explicit) self._add(spec, path, explicit=explicit)
def _get_matching_spec_key(self, spec, **kwargs): def _get_matching_spec_key(self, spec, **kwargs):
"""Get the exact spec OR get a single spec that matches.""" """Get the exact spec OR get a single spec that matches."""
key = spec.dag_hash() key = spec.dag_hash()
if not key in self._data: if key not in self._data:
match = self.query_one(spec, **kwargs) match = self.query_one(spec, **kwargs)
if match: if match:
return match.dag_hash() return match.dag_hash()
raise KeyError("No such spec in database! %s" % spec) raise KeyError("No such spec in database! %s" % spec)
return key return key
@_autospec @_autospec
def get_record(self, spec, **kwargs): def get_record(self, spec, **kwargs):
key = self._get_matching_spec_key(spec, **kwargs) key = self._get_matching_spec_key(spec, **kwargs)
return self._data[key] return self._data[key]
def _decrement_ref_count(self, spec): def _decrement_ref_count(self, spec):
key = spec.dag_hash() key = spec.dag_hash()
if not key in self._data: if key not in self._data:
# TODO: print something here? DB is corrupt, but # TODO: print something here? DB is corrupt, but
# not much we can do. # not much we can do.
return return
@ -472,7 +463,6 @@ def _decrement_ref_count(self, spec):
for dep in spec.dependencies.values(): for dep in spec.dependencies.values():
self._decrement_ref_count(dep) self._decrement_ref_count(dep)
def _remove(self, spec): def _remove(self, spec):
"""Non-locking version of remove(); does real work. """Non-locking version of remove(); does real work.
""" """
@ -491,7 +481,6 @@ def _remove(self, spec):
# query spec was passed in. # query spec was passed in.
return rec.spec return rec.spec
@_autospec @_autospec
def remove(self, spec): def remove(self, spec):
"""Removes a spec from the database. To be called on uninstall. """Removes a spec from the database. To be called on uninstall.
@ -508,7 +497,6 @@ def remove(self, spec):
with self.write_transaction(): with self.write_transaction():
return self._remove(spec) return self._remove(spec)
@_autospec @_autospec
def installed_extensions_for(self, extendee_spec): def installed_extensions_for(self, extendee_spec):
""" """
@ -519,12 +507,11 @@ def installed_extensions_for(self, extendee_spec):
try: try:
if s.package.extends(extendee_spec): if s.package.extends(extendee_spec):
yield s.package yield s.package
except UnknownPackageError as e: except UnknownPackageError:
continue continue
# skips unknown packages # skips unknown packages
# TODO: conditional way to do this instead of catching exceptions # TODO: conditional way to do this instead of catching exceptions
def query(self, query_spec=any, known=any, installed=True, explicit=any): def query(self, query_spec=any, known=any, installed=True, explicit=any):
"""Run a query on the database. """Run a query on the database.
@ -567,14 +554,14 @@ def query(self, query_spec=any, known=any, installed=True, explicit=any):
continue continue
if explicit is not any and rec.explicit != explicit: if explicit is not any and rec.explicit != explicit:
continue continue
if known is not any and spack.repo.exists(rec.spec.name) != known: if known is not any and spack.repo.exists(
rec.spec.name) != known:
continue continue
if query_spec is any or rec.spec.satisfies(query_spec): if query_spec is any or rec.spec.satisfies(query_spec):
results.append(rec.spec) results.append(rec.spec)
return sorted(results) return sorted(results)
def query_one(self, query_spec, known=any, installed=True): def query_one(self, query_spec, known=any, installed=True):
"""Query for exactly one spec that matches the query spec. """Query for exactly one spec that matches the query spec.
@ -586,10 +573,9 @@ def query_one(self, query_spec, known=any, installed=True):
assert len(concrete_specs) <= 1 assert len(concrete_specs) <= 1
return concrete_specs[0] if concrete_specs else None return concrete_specs[0] if concrete_specs else None
def missing(self, spec): def missing(self, spec):
with self.read_transaction(): with self.read_transaction():
key = spec.dag_hash() key = spec.dag_hash()
return key in self._data and not self._data[key].installed return key in self._data and not self._data[key].installed
@ -601,7 +587,10 @@ class _Transaction(object):
Timeout for lock is customizable. Timeout for lock is customizable.
""" """
def __init__(self, db, acquire_fn=None, release_fn=None,
def __init__(self, db,
acquire_fn=None,
release_fn=None,
timeout=_db_lock_timeout): timeout=_db_lock_timeout):
self._db = db self._db = db
self._timeout = timeout self._timeout = timeout
@ -636,11 +625,11 @@ def _exit(self):
class CorruptDatabaseError(SpackError): class CorruptDatabaseError(SpackError):
def __init__(self, path, msg=''): def __init__(self, path, msg=''):
super(CorruptDatabaseError, self).__init__( super(CorruptDatabaseError, self).__init__(
"Spack database is corrupt: %s. %s" %(path, msg)) "Spack database is corrupt: %s. %s" % (path, msg))
class InvalidDatabaseVersionError(SpackError): class InvalidDatabaseVersionError(SpackError):
def __init__(self, expected, found): def __init__(self, expected, found):
super(InvalidDatabaseVersionError, self).__init__( super(InvalidDatabaseVersionError, self).__init__(
"Expected database version %s but found version %s" "Expected database version %s but found version %s" %
% (expected, found)) (expected, found))
@ -1,4 +1,4 @@
############################################################################## #
# Copyright (c) 2013-2016, Lawrence Livermore National Security, LLC. # Copyright (c) 2013-2016, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory. # Produced at the Lawrence Livermore National Laboratory.
# #
@ -21,7 +21,7 @@
# You should have received a copy of the GNU Lesser General Public # You should have received a copy of the GNU Lesser General Public
# License along with this program; if not, write to the Free Software # License along with this program; if not, write to the Free Software
# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
############################################################################## #
""" """
Fetch strategies are used to download source code into a staging area Fetch strategies are used to download source code into a staging area
in order to build it. They need to define the following methods: in order to build it. They need to define the following methods:
@ -75,11 +75,13 @@ def wrapper(self, *args, **kwargs):
class FetchStrategy(object): class FetchStrategy(object):
"""Superclass of all fetch strategies.""" """Superclass of all fetch strategies."""
enabled = False # Non-abstract subclasses should be enabled. enabled = False # Non-abstract subclasses should be enabled.
required_attributes = None # Attributes required in version() args. required_attributes = None # Attributes required in version() args.
class __metaclass__(type): class __metaclass__(type):
"""This metaclass registers all fetch strategies in a list.""" """This metaclass registers all fetch strategies in a list."""
def __init__(cls, name, bases, dict): def __init__(cls, name, bases, dict):
@ -126,6 +128,7 @@ def matches(cls, args):
@pattern.composite(interface=FetchStrategy) @pattern.composite(interface=FetchStrategy)
class FetchStrategyComposite(object): class FetchStrategyComposite(object):
""" """
Composite for a FetchStrategy object. Implements the GoF composite pattern. Composite for a FetchStrategy object. Implements the GoF composite pattern.
""" """
@ -134,6 +137,7 @@ class FetchStrategyComposite(object):
class URLFetchStrategy(FetchStrategy): class URLFetchStrategy(FetchStrategy):
"""FetchStrategy that pulls source code from a URL for an archive, """FetchStrategy that pulls source code from a URL for an archive,
checks the archive against a checksum, and decompresses the archive. checks the archive against a checksum, and decompresses the archive.
""" """
@ -235,12 +239,13 @@ def fetch(self):
# redirects properly. # redirects properly.
content_types = re.findall(r'Content-Type:[^\r\n]+', headers) content_types = re.findall(r'Content-Type:[^\r\n]+', headers)
if content_types and 'text/html' in content_types[-1]: if content_types and 'text/html' in content_types[-1]:
tty.warn( tty.warn("The contents of ",
"The contents of " + self.archive_file + " look like HTML.", (self.archive_file if self.archive_file is not None
"The checksum will likely be bad. If it is, you can use", else "the archive"),
"'spack clean <package>' to remove the bad archive, then fix", " look like HTML.",
"your internet gateway issue and install again.") "The checksum will likely be bad. If it is, you can use",
"'spack clean <package>' to remove the bad archive, then",
"fix your internet gateway issue and install again.")
if save_file: if save_file:
os.rename(partial_file, save_file) os.rename(partial_file, save_file)
@ -353,6 +358,7 @@ def __str__(self):
class VCSFetchStrategy(FetchStrategy): class VCSFetchStrategy(FetchStrategy):
def __init__(self, name, *rev_types, **kwargs): def __init__(self, name, *rev_types, **kwargs):
super(VCSFetchStrategy, self).__init__() super(VCSFetchStrategy, self).__init__()
self.name = name self.name = name
@ -407,6 +413,7 @@ def __repr__(self):
class GoFetchStrategy(VCSFetchStrategy): class GoFetchStrategy(VCSFetchStrategy):
""" """
Fetch strategy that employs the `go get` infrastructure Fetch strategy that employs the `go get` infrastructure
Use like this in a package: Use like this in a package:
@ -466,6 +473,7 @@ def __str__(self):
class GitFetchStrategy(VCSFetchStrategy): class GitFetchStrategy(VCSFetchStrategy):
""" """
Fetch strategy that gets source code from a git repository. Fetch strategy that gets source code from a git repository.
Use like this in a package: Use like this in a package:
@ -586,6 +594,7 @@ def __str__(self):
class SvnFetchStrategy(VCSFetchStrategy): class SvnFetchStrategy(VCSFetchStrategy):
"""Fetch strategy that gets source code from a subversion repository. """Fetch strategy that gets source code from a subversion repository.
Use like this in a package: Use like this in a package:
@ -662,6 +671,7 @@ def __str__(self):
class HgFetchStrategy(VCSFetchStrategy): class HgFetchStrategy(VCSFetchStrategy):
""" """
Fetch strategy that gets source code from a Mercurial repository. Fetch strategy that gets source code from a Mercurial repository.
Use like this in a package: Use like this in a package:
@ -806,11 +816,13 @@ def for_package_version(pkg, version):
class FetchError(spack.error.SpackError): class FetchError(spack.error.SpackError):
def __init__(self, msg, long_msg=None): def __init__(self, msg, long_msg=None):
super(FetchError, self).__init__(msg, long_msg) super(FetchError, self).__init__(msg, long_msg)
class FailedDownloadError(FetchError): class FailedDownloadError(FetchError):
"""Raised wen a download fails.""" """Raised wen a download fails."""
def __init__(self, url, msg=""): def __init__(self, url, msg=""):
@ -820,16 +832,19 @@ def __init__(self, url, msg=""):
class NoArchiveFileError(FetchError): class NoArchiveFileError(FetchError):
def __init__(self, msg, long_msg): def __init__(self, msg, long_msg):
super(NoArchiveFileError, self).__init__(msg, long_msg) super(NoArchiveFileError, self).__init__(msg, long_msg)
class NoDigestError(FetchError): class NoDigestError(FetchError):
def __init__(self, msg, long_msg=None): def __init__(self, msg, long_msg=None):
super(NoDigestError, self).__init__(msg, long_msg) super(NoDigestError, self).__init__(msg, long_msg)
class InvalidArgsError(FetchError): class InvalidArgsError(FetchError):
def __init__(self, pkg, version): def __init__(self, pkg, version):
msg = ("Could not construct a fetch strategy for package %s at " msg = ("Could not construct a fetch strategy for package %s at "
"version %s") "version %s")
@ -838,6 +853,7 @@ def __init__(self, pkg, version):
class ChecksumError(FetchError): class ChecksumError(FetchError):
"""Raised when archive fails to checksum.""" """Raised when archive fails to checksum."""
def __init__(self, message, long_msg=None): def __init__(self, message, long_msg=None):
@ -845,6 +861,7 @@ def __init__(self, message, long_msg=None):
class NoStageError(FetchError): class NoStageError(FetchError):
"""Raised when fetch operations are called before set_stage().""" """Raised when fetch operations are called before set_stage()."""
def __init__(self, method): def __init__(self, method):
File diff suppressed because it is too large
@ -39,7 +39,7 @@
'svn_fetch', 'hg_fetch', 'mirror', 'modules', 'url_extrapolate', 'svn_fetch', 'hg_fetch', 'mirror', 'modules', 'url_extrapolate',
'cc', 'link_tree', 'spec_yaml', 'optional_deps', 'cc', 'link_tree', 'spec_yaml', 'optional_deps',
'make_executable', 'configure_guess', 'lock', 'database', 'make_executable', 'configure_guess', 'lock', 'database',
'namespace_trie', 'yaml', 'sbang', 'environment', 'namespace_trie', 'yaml', 'sbang', 'environment', 'cmd.find',
'cmd.uninstall', 'cmd.test_install'] 'cmd.uninstall', 'cmd.test_install']
@ -0,0 +1,60 @@
##############################################################################
# Copyright (c) 2013-2016, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Created by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://github.com/llnl/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU Lesser General Public License (as
# published by the Free Software Foundation) version 2.1, February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this program; if not, write to the Free Software
# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import spack.cmd.find
import unittest


class Bunch(object):
    def __init__(self, **kwargs):
        self.__dict__.update(kwargs)


class FindTest(unittest.TestCase):
    def test_query_arguments(self):
        query_arguments = spack.cmd.find.query_arguments

        # Default arguments
        args = Bunch(only_missing=False, missing=False,
                     unknown=False, explicit=False, implicit=False)
        q_args = query_arguments(args)
        self.assertTrue('installed' in q_args)
        self.assertTrue('known' in q_args)
        self.assertTrue('explicit' in q_args)
        self.assertEqual(q_args['installed'], True)
        self.assertEqual(q_args['known'], any)
        self.assertEqual(q_args['explicit'], any)

        # Check that explicit works correctly
        args.explicit = True
        q_args = query_arguments(args)
        self.assertEqual(q_args['explicit'], True)

        args.explicit = False
        args.implicit = True
        q_args = query_arguments(args)
        self.assertEqual(q_args['explicit'], False)

        args.explicit = True
        self.assertRaises(SystemExit, query_arguments, args)


@@ -25,8 +25,10 @@
 from spack import *
 import sys


 class Dealii(Package):
-    """C++ software library providing well-documented tools to build finite element codes for a broad variety of PDEs."""
+    """C++ software library providing well-documented tools to build finite
+    element codes for a broad variety of PDEs."""

     homepage = "https://www.dealii.org"
     url = "https://github.com/dealii/dealii/releases/download/v8.4.0/dealii-8.4.0.tar.gz"
@@ -36,7 +38,7 @@ class Dealii(Package):
     variant('mpi', default=True, description='Compile with MPI')
     variant('arpack', default=True, description='Compile with Arpack and PArpack (only with MPI)')
     variant('doc', default=False, description='Compile with documentation')
-    variant('gsl' , default=True, description='Compile with GSL')
+    variant('gsl', default=True, description='Compile with GSL')
     variant('hdf5', default=True, description='Compile with HDF5 (only with MPI)')
     variant('metis', default=True, description='Compile with Metis')
     variant('netcdf', default=True, description='Compile with Netcdf (only with MPI)')
@@ -47,38 +49,40 @@ class Dealii(Package):
     variant('trilinos', default=True, description='Compile with Trilinos (only with MPI)')

     # required dependencies, light version
-    depends_on ("blas")
-    # Boost 1.58 is blacklisted, see https://github.com/dealii/dealii/issues/1591
-    # require at least 1.59
-    depends_on ("boost@1.59.0:", when='~mpi')
-    depends_on ("boost@1.59.0:+mpi", when='+mpi')
-    depends_on ("bzip2")
-    depends_on ("cmake")
-    depends_on ("lapack")
-    depends_on ("muparser")
-    depends_on ("suite-sparse")
-    depends_on ("tbb")
-    depends_on ("zlib")
+    depends_on("blas")
+    # Boost 1.58 is blacklisted, see
+    # https://github.com/dealii/dealii/issues/1591
+    # Require at least 1.59
+    depends_on("boost@1.59.0:+thread+system+serialization+iostreams", when='~mpi')  # NOQA: ignore=E501
+    depends_on("boost@1.59.0:+mpi+thread+system+serialization+iostreams", when='+mpi')  # NOQA: ignore=E501
+    depends_on("bzip2")
+    depends_on("cmake")
+    depends_on("lapack")
+    depends_on("muparser")
+    depends_on("suite-sparse")
+    depends_on("tbb")
+    depends_on("zlib")

     # optional dependencies
-    depends_on ("mpi", when="+mpi")
-    depends_on ("arpack-ng+mpi", when='+arpack+mpi')
-    depends_on ("doxygen", when='+doc')
-    depends_on ("gsl", when='@8.5.0:+gsl')
-    depends_on ("gsl", when='@dev+gsl')
-    depends_on ("hdf5+mpi~cxx", when='+hdf5+mpi')  # FIXME NetCDF declares dependency with ~cxx, why?
-    depends_on ("metis@5:", when='+metis')
-    depends_on ("netcdf+mpi", when="+netcdf+mpi")
-    depends_on ("netcdf-cxx", when='+netcdf+mpi')
-    depends_on ("oce", when='+oce')
-    depends_on ("p4est", when='+p4est+mpi')
-    depends_on ("petsc+mpi", when='+petsc+mpi')
-    depends_on ("slepc", when='+slepc+petsc+mpi')
-    depends_on ("trilinos", when='+trilinos+mpi')
+    depends_on("mpi", when="+mpi")
+    depends_on("arpack-ng+mpi", when='+arpack+mpi')
+    depends_on("doxygen+graphviz", when='+doc')
+    depends_on("graphviz", when='+doc')
+    depends_on("gsl", when='@8.5.0:+gsl')
+    depends_on("gsl", when='@dev+gsl')
+    depends_on("hdf5+mpi", when='+hdf5+mpi')
+    depends_on("metis@5:", when='+metis')
+    depends_on("netcdf+mpi", when="+netcdf+mpi")
+    depends_on("netcdf-cxx", when='+netcdf+mpi')
+    depends_on("oce", when='+oce')
+    depends_on("p4est", when='+p4est+mpi')
+    depends_on("petsc+mpi", when='+petsc+mpi')
+    depends_on("slepc", when='+slepc+petsc+mpi')
+    depends_on("trilinos", when='+trilinos+mpi')

     # developer dependencies
-    depends_on ("numdiff", when='@dev')
-    depends_on ("astyle@2.04", when='@dev')
+    depends_on("numdiff", when='@dev')
+    depends_on("astyle@2.04", when='@dev')

     def install(self, spec, prefix):
         options = []
@@ -96,17 +100,17 @@ def install(self, spec, prefix):
             '-DDEAL_II_WITH_THREADS:BOOL=ON',
             '-DBOOST_DIR=%s' % spec['boost'].prefix,
             '-DBZIP2_DIR=%s' % spec['bzip2'].prefix,
-            # CMake's FindBlas/Lapack may pick up system's blas/lapack instead of Spack's.
-            # Be more specific to avoid this.
-            # Note that both lapack and blas are provided in -DLAPACK_XYZ variables
+            # CMake's FindBlas/Lapack may pick up system's blas/lapack instead
+            # of Spack's. Be more specific to avoid this.
+            # Note that both lapack and blas are provided in -DLAPACK_XYZ.
             '-DLAPACK_FOUND=true',
             '-DLAPACK_INCLUDE_DIRS=%s;%s' %
             (spec['lapack'].prefix.include,
              spec['blas'].prefix.include),
             '-DLAPACK_LIBRARIES=%s;%s' %
-            (join_path(spec['lapack'].prefix.lib, 'liblapack.%s' % dsuf),  # FIXME don't hardcode names
-             join_path(spec['blas'].prefix.lib, 'libblas.%s' % dsuf)),  # FIXME don't hardcode names
-            '-DMUPARSER_DIR=%s ' % spec['muparser'].prefix,
+            (spec['lapack'].lapack_shared_lib,
+             spec['blas'].blas_shared_lib),
+            '-DMUPARSER_DIR=%s' % spec['muparser'].prefix,
             '-DUMFPACK_DIR=%s' % spec['suite-sparse'].prefix,
             '-DTBB_DIR=%s' % spec['tbb'].prefix,
             '-DZLIB_DIR=%s' % spec['zlib'].prefix
@@ -116,33 +120,34 @@ def install(self, spec, prefix):
         if '+mpi' in spec:
             options.extend([
                 '-DDEAL_II_WITH_MPI:BOOL=ON',
-                '-DCMAKE_C_COMPILER=%s' % join_path(self.spec['mpi'].prefix.bin, 'mpicc'),  # FIXME: avoid hardcoding mpi wrapper names
-                '-DCMAKE_CXX_COMPILER=%s' % join_path(self.spec['mpi'].prefix.bin, 'mpic++'),
-                '-DCMAKE_Fortran_COMPILER=%s' % join_path(self.spec['mpi'].prefix.bin, 'mpif90'),
+                '-DCMAKE_C_COMPILER=%s' % spec['mpi'].mpicc,
+                '-DCMAKE_CXX_COMPILER=%s' % spec['mpi'].mpicxx,
+                '-DCMAKE_Fortran_COMPILER=%s' % spec['mpi'].mpifc,
             ])
         else:
             options.extend([
                 '-DDEAL_II_WITH_MPI:BOOL=OFF',
             ])

-        # Optional dependencies for which library names are the same as CMake variables
-        for library in ('gsl','hdf5','p4est','petsc','slepc','trilinos','metis'):
+        # Optional dependencies for which library names are the same as CMake
+        # variables:
+        for library in ('gsl', 'hdf5', 'p4est', 'petsc', 'slepc', 'trilinos', 'metis'):  # NOQA: ignore=E501
             if library in spec:
                 options.extend([
-                    '-D{library}_DIR={value}'.format(library=library.upper(), value=spec[library].prefix),
-                    '-DDEAL_II_WITH_{library}:BOOL=ON'.format(library=library.upper())
+                    '-D%s_DIR=%s' % (library.upper(), spec[library].prefix),
+                    '-DDEAL_II_WITH_%s:BOOL=ON' % library.upper()
                 ])
             else:
                 options.extend([
-                    '-DDEAL_II_WITH_{library}:BOOL=OFF'.format(library=library.upper())
+                    '-DDEAL_II_WITH_%s:BOOL=OFF' % library.upper()
                 ])

         # doxygen
         options.extend([
-            '-DDEAL_II_COMPONENT_DOCUMENTATION=%s' % ('ON' if '+doc' in spec else 'OFF'),
+            '-DDEAL_II_COMPONENT_DOCUMENTATION=%s' %
+            ('ON' if '+doc' in spec else 'OFF'),
         ])

         # arpack
         if '+arpack' in spec:
             options.extend([
@@ -160,11 +165,13 @@ def install(self, spec, prefix):
             options.extend([
                 '-DNETCDF_FOUND=true',
                 '-DNETCDF_LIBRARIES=%s;%s' %
-                (join_path(spec['netcdf-cxx'].prefix.lib, 'libnetcdf_c++.%s' % dsuf),
-                 join_path(spec['netcdf'].prefix.lib, 'libnetcdf.%s' % dsuf)),
+                (join_path(spec['netcdf-cxx'].prefix.lib,
+                           'libnetcdf_c++.%s' % dsuf),
+                 join_path(spec['netcdf'].prefix.lib,
+                           'libnetcdf.%s' % dsuf)),
                 '-DNETCDF_INCLUDE_DIRS=%s;%s' %
                 (spec['netcdf-cxx'].prefix.include,
                  spec['netcdf'].prefix.include),
             ])
         else:
             options.extend([
@@ -200,7 +207,7 @@ def install(self, spec, prefix):
         with working_dir('examples/step-3'):
             cmake('.')
             make('release')
-            make('run',parallel=False)
+            make('run', parallel=False)

         # An example which uses Metis + PETSc
         # FIXME: switch step-18 to MPI
@@ -213,7 +220,7 @@ def install(self, spec, prefix):
             if '^petsc' in spec and '^metis' in spec:
                 cmake('.')
                 make('release')
-                make('run',parallel=False)
+                make('run', parallel=False)

         # take step-40 which can use both PETSc and Trilinos
         # FIXME: switch step-40 to MPI run
@@ -222,43 +229,49 @@ def install(self, spec, prefix):
             print('=====================================')
             print('========== Step-40 PETSc ============')
             print('=====================================')
             # list the number of cycles to speed up
-            filter_file(r'(const unsigned int n_cycles = 8;)', ('const unsigned int n_cycles = 2;'), 'step-40.cc')
+            filter_file(r'(const unsigned int n_cycles = 8;)',
+                        ('const unsigned int n_cycles = 2;'), 'step-40.cc')
             cmake('.')
             if '^petsc' in spec:
                 make('release')
-                make('run',parallel=False)
+                make('run', parallel=False)

             print('=====================================')
             print('========= Step-40 Trilinos ==========')
             print('=====================================')
             # change Linear Algebra to Trilinos
-            filter_file(r'(\/\/ #define FORCE_USE_OF_TRILINOS.*)', ('#define FORCE_USE_OF_TRILINOS'), 'step-40.cc')
+            filter_file(r'(\/\/ #define FORCE_USE_OF_TRILINOS.*)',
+                        ('#define FORCE_USE_OF_TRILINOS'), 'step-40.cc')
             if '^trilinos+hypre' in spec:
                 make('release')
-                make('run',parallel=False)
+                make('run', parallel=False)

             print('=====================================')
             print('=== Step-40 Trilinos SuperluDist ====')
             print('=====================================')
             # change to direct solvers
-            filter_file(r'(LA::SolverCG solver\(solver_control\);)', ('TrilinosWrappers::SolverDirect::AdditionalData data(false,"Amesos_Superludist"); TrilinosWrappers::SolverDirect solver(solver_control,data);'), 'step-40.cc')
-            filter_file(r'(LA::MPI::PreconditionAMG preconditioner;)', (''), 'step-40.cc')
-            filter_file(r'(LA::MPI::PreconditionAMG::AdditionalData data;)', (''), 'step-40.cc')
-            filter_file(r'(preconditioner.initialize\(system_matrix, data\);)', (''), 'step-40.cc')
-            filter_file(r'(solver\.solve \(system_matrix, completely_distributed_solution, system_rhs,)', ('solver.solve (system_matrix, completely_distributed_solution, system_rhs);'), 'step-40.cc')
+            filter_file(r'(LA::SolverCG solver\(solver_control\);)', ('TrilinosWrappers::SolverDirect::AdditionalData data(false,"Amesos_Superludist"); TrilinosWrappers::SolverDirect solver(solver_control,data);'), 'step-40.cc')  # NOQA: ignore=E501
+            filter_file(r'(LA::MPI::PreconditionAMG preconditioner;)',
+                        (''), 'step-40.cc')
+            filter_file(r'(LA::MPI::PreconditionAMG::AdditionalData data;)',
+                        (''), 'step-40.cc')
+            filter_file(r'(preconditioner.initialize\(system_matrix, data\);)',
+                        (''), 'step-40.cc')
+            filter_file(r'(solver\.solve \(system_matrix, completely_distributed_solution, system_rhs,)', ('solver.solve (system_matrix, completely_distributed_solution, system_rhs);'), 'step-40.cc')  # NOQA: ignore=E501
             filter_file(r'(preconditioner\);)', (''), 'step-40.cc')
             if '^trilinos+superlu-dist' in spec:
                 make('release')
-                make('run',paralle=False)
+                make('run', parallel=False)

             print('=====================================')
             print('====== Step-40 Trilinos MUMPS =======')
             print('=====================================')
             # switch to Mumps
-            filter_file(r'(Amesos_Superludist)', ('Amesos_Mumps'), 'step-40.cc')
+            filter_file(r'(Amesos_Superludist)',
+                        ('Amesos_Mumps'), 'step-40.cc')
             if '^trilinos+mumps' in spec:
                 make('release')
-                make('run',parallel=False)
+                make('run', parallel=False)

             print('=====================================')
             print('============ Step-36 ================')
@@ -267,7 +280,7 @@ def install(self, spec, prefix):
             if 'slepc' in spec:
                 cmake('.')
                 make('release')
-                make('run',parallel=False)
+                make('run', parallel=False)

             print('=====================================')
             print('============ Step-54 ================')
@@ -276,7 +289,7 @@ def install(self, spec, prefix):
             if 'oce' in spec:
                 cmake('.')
                 make('release')
-                make('run',parallel=False)
+                make('run', parallel=False)

     def setup_environment(self, spack_env, env):
         env.set('DEAL_II_DIR', self.prefix)


@@ -24,29 +24,19 @@
 ##############################################################################
 from spack import *


 class Launchmon(Package):
     """Software infrastructure that enables HPC run-time tools to
        co-locate tool daemons with a parallel job."""
-    homepage = "http://sourceforge.net/projects/launchmon"
-    url = "http://downloads.sourceforge.net/project/launchmon/launchmon/1.0.1%20release/launchmon-1.0.1.tar.gz"
+    homepage = "https://github.com/LLNL/LaunchMON"
+    url = "https://github.com/LLNL/LaunchMON/releases/download/v1.0.2/launchmon-v1.0.2.tar.gz"  # NOQA: ignore=E501

-    version('1.0.1', '2f12465803409fd07f91174a4389eb2b')
-    version('1.0.1-2', git='https://github.com/llnl/launchmon.git', commit='ff7e22424b8f375318951eb1c9282fcbbfa8aadf')
+    version('1.0.2', '8d6ba77a0ec2eff2fde2c5cc8fa7ff7a')

     depends_on('autoconf')
     depends_on('automake')
     depends_on('libtool')

-    def patch(self):
-        # This patch makes libgcrypt compile correctly with newer gcc versions.
-        mf = FileFilter('tools/libgcrypt/tests/Makefile.in')
-        mf.filter(r'(basic_LDADD\s*=\s*.*)', r'\1 -lgpg-error')
-        mf.filter(r'(tsexp_LDADD\s*=\s*.*)', r'\1 -lgpg-error')
-        mf.filter(r'(keygen_LDADD\s*=\s*.*)', r'\1 -lgpg-error')
-        mf.filter(r'(benchmark_LDADD\s*=\s*.*)', r'\1 -lgpg-error')
-
     def install(self, spec, prefix):
         configure(
             "--prefix=" + prefix,


@@ -23,7 +23,7 @@
 # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
 ##############################################################################
 from spack import *
-import os, shutil
+import os, glob


 class Llvm(Package):
@@ -46,7 +46,9 @@ class Llvm(Package):
     variant('libcxx', default=True, description="Build the LLVM C++ standard library")
     variant('compiler-rt', default=True, description="Build the LLVM compiler runtime, including sanitizers")
     variant('gold', default=True, description="Add support for LTO with the gold linker plugin")
+    variant('shared_libs', default=False, description="Build all components as shared libraries, faster, less memory to build, less stable")
+    variant('link_dylib', default=False, description="Build and link the libLLVM shared library rather than static")
+    variant('all_targets', default=True, description="Build all supported targets, default targets <current arch>,NVPTX,AMDGPU,CppBackend")

     # Build dependency
     depends_on('cmake @2.8.12.2:')
@@ -257,6 +259,28 @@ def install(self, spec, prefix):
         if '+compiler-rt' not in spec:
             cmake_args.append('-DLLVM_EXTERNAL_COMPILER_RT_BUILD:Bool=OFF')

+        if '+shared_libs' in spec:
+            cmake_args.append('-DBUILD_SHARED_LIBS:Bool=ON')
+
+        if '+link_dylib' in spec:
+            cmake_args.append('-DLLVM_LINK_LLVM_DYLIB:Bool=ON')
+
+        if '+all_targets' not in spec:  # all is default on cmake
+            targets = ['CppBackend', 'NVPTX', 'AMDGPU']
+            if 'x86' in spec.architecture.lower():
+                targets.append('X86')
+            elif 'arm' in spec.architecture.lower():
+                targets.append('ARM')
+            elif 'aarch64' in spec.architecture.lower():
+                targets.append('AArch64')
+            elif 'sparc' in spec.architecture.lower():
+                targets.append('sparc')
+            elif ('ppc' in spec.architecture.lower() or
+                  'power' in spec.architecture.lower()):
+                targets.append('PowerPC')
+
+            cmake_args.append('-DLLVM_TARGETS_TO_BUILD:Bool=' + ';'.join(targets))
+
         if '+clang' not in spec:
             if '+clang_extra' in spec:
                 raise SpackException('The clang_extra variant requires the clang variant to be selected')
@@ -267,7 +291,5 @@ def install(self, spec, prefix):
             cmake(*cmake_args)
             make()
             make("install")
-            query_path = os.path.join('bin', 'clang-query')
-            # Manually install clang-query, because llvm doesn't...
-            if os.path.exists(query_path):
-                shutil.copy(query_path, os.path.join(prefix, 'bin'))
+            cp = which('cp')
+            cp('-a', 'bin/', prefix)


@ -24,55 +24,61 @@
############################################################################## ##############################################################################
from spack import * from spack import *
import glob, sys, os import glob
import sys
import os
class Metis(Package): class Metis(Package):
""" """METIS is a set of serial programs for partitioning graphs, partitioning
METIS is a set of serial programs for partitioning graphs, partitioning finite element meshes, and producing fill finite element meshes, and producing fill reducing orderings for sparse
reducing orderings for sparse matrices. The algorithms implemented in METIS are based on the multilevel matrices. The algorithms implemented in METIS are based on the
recursive-bisection, multilevel k-way, and multi-constraint partitioning schemes. multilevel recursive-bisection, multilevel k-way, and multi-constraint
""" partitioning schemes."""
homepage = 'http://glaros.dtc.umn.edu/gkhome/metis/metis/overview' homepage = "http://glaros.dtc.umn.edu/gkhome/metis/metis/overview"
url = "http://glaros.dtc.umn.edu/gkhome/fetch/sw/metis/metis-5.1.0.tar.gz" base_url = "http://glaros.dtc.umn.edu/gkhome/fetch/sw/metis"
version('5.1.0', '5465e67079419a69e0116de24fce58fe', version('5.1.0', '5465e67079419a69e0116de24fce58fe')
url='http://glaros.dtc.umn.edu/gkhome/fetch/sw/metis/metis-5.1.0.tar.gz') version('5.0.2', 'acb521a4e8c2e6dd559a7f9abd0468c5')
version('4.0.3', '5efa35de80703c1b2c4d0de080fafbcf4e0d363a21149a1ad2f96e0144841a55', version('4.0.3', 'd3848b454532ef18dc83e4fb160d1e10')
url='http://glaros.dtc.umn.edu/gkhome/fetch/sw/metis/OLD/metis-4.0.3.tar.gz')
variant('shared', default=True, description='Enables the build of shared libraries') variant('shared', default=True, description='Enables the build of shared libraries')
variant('debug', default=False, description='Builds the library in debug mode') variant('debug', default=False, description='Builds the library in debug mode')
variant('gdb', default=False, description='Enables gdb support') variant('gdb', default=False, description='Enables gdb support')
variant('idx64', default=False, description='Use int64_t as default index type') variant('idx64', default=False, description='Use int64_t as default index type')
variant('double', default=False, description='Use double precision floating point types') variant('real64', default=False, description='Use double precision floating point types')
depends_on('cmake @2.8:', when='@5:') # build-time dependency depends_on('cmake@2.8:', when='@5:') # build-time dependency
depends_on('gdb', when='+gdb')
patch('install_gklib_defs_rename.patch', when='@5:') patch('install_gklib_defs_rename.patch', when='@5:')
def url_for_version(self, version):
verdir = 'OLD/' if version < Version('4.0.3') else ''
return '%s/%smetis-%s.tar.gz' % (Metis.base_url, verdir, version)
@when('@4:4.0.3') @when('@:4')
def install(self, spec, prefix): def install(self, spec, prefix):
# Process library spec and options
if '+gdb' in spec: unsupp_vars = [v for v in ('+gdb', '+idx64', '+real64') if v in spec]
raise InstallError('gdb support not implemented in METIS 4!') if unsupp_vars:
if '+idx64' in spec: msg = 'Given variants %s are unsupported by METIS 4!' % unsupp_vars
raise InstallError('idx64 option not implemented in METIS 4!') raise InstallError(msg)
if '+double' in spec:
raise InstallError('double option not implemented for METIS 4!')
options = ['COPTIONS=-fPIC'] options = ['COPTIONS=-fPIC']
if '+debug' in spec: if '+debug' in spec:
options.append('OPTFLAGS=-g -O0') options.append('OPTFLAGS=-g -O0')
make(*options) make(*options)
# Compile and install library files
ccompile = Executable(self.compiler.cc)
mkdir(prefix.bin) mkdir(prefix.bin)
for x in ('pmetis', 'kmetis', 'oemetis', 'onmetis', 'partnmesh', binfiles = ('pmetis', 'kmetis', 'oemetis', 'onmetis', 'partnmesh',
'partdmesh', 'mesh2nodal', 'mesh2dual', 'graphchk'): 'partdmesh', 'mesh2nodal', 'mesh2dual', 'graphchk')
install(x, prefix.bin) for binfile in binfiles:
install(binfile, prefix.bin)
mkdir(prefix.lib) mkdir(prefix.lib)
install('libmetis.a', prefix.lib) install('libmetis.a', prefix.lib)
@ -82,106 +88,120 @@ def install(self, spec, prefix):
install(h, prefix.include) install(h, prefix.include)
mkdir(prefix.share) mkdir(prefix.share)
for f in (join_path(*p) sharefiles = (('Graphs', '4elt.graph'), ('Graphs', 'metis.mesh'),
for p in (('Programs', 'io.c'), ('Graphs', 'test.mgraph'))
('Test','mtest.c'), for sharefile in tuple(join_path(*sf) for sf in sharefiles):
('Graphs','4elt.graph'), install(sharefile, prefix.share)
('Graphs', 'metis.mesh'),
('Graphs', 'test.mgraph'))):
install(f, prefix.share)
if '+shared' in spec: if '+shared' in spec:
shared_flags = ['-fPIC', '-shared']
if sys.platform == 'darwin': if sys.platform == 'darwin':
lib_dsuffix = 'dylib' shared_suffix = 'dylib'
load_flag = '-Wl,-all_load' shared_flags.extend(['-Wl,-all_load', 'libmetis.a'])
no_load_flag = ''
else: else:
lib_dsuffix = 'so' shared_suffix = 'so'
load_flag = '-Wl,-whole-archive' shared_flags.extend(['-Wl,-whole-archive', 'libmetis.a',
no_load_flag = '-Wl,-no-whole-archive' '-Wl,-no-whole-archive'])
os.system(spack_cc + ' -fPIC -shared ' + load_flag + shared_out = '%s/libmetis.%s' % (prefix.lib, shared_suffix)
' libmetis.a ' + no_load_flag + ' -o libmetis.' + shared_flags.extend(['-o', shared_out])
lib_dsuffix)
install('libmetis.' + lib_dsuffix, prefix.lib) ccompile(*shared_flags)
# Set up and run tests on installation # Set up and run tests on installation
symlink(join_path(prefix.share, 'io.c'), 'io.c') ccompile('-I%s' % prefix.include, '-L%s' % prefix.lib,
symlink(join_path(prefix.share, 'mtest.c'), 'mtest.c') '-Wl,-rpath=%s' % (prefix.lib if '+shared' in spec else ''),
os.system(spack_cc + ' -I%s' % prefix.include + ' -c io.c') join_path('Programs', 'io.o'), join_path('Test', 'mtest.c'),
os.system(spack_cc + ' -I%s' % prefix.include + '-o', '%s/mtest' % prefix.bin, '-lmetis', '-lm')
' -L%s' % prefix.lib + ' -lmetis mtest.c io.o -o mtest')
_4eltgraph = join_path(prefix.share, '4elt.graph')
test_mgraph = join_path(prefix.share, 'test.mgraph')
metis_mesh = join_path(prefix.share, 'metis.mesh')
kmetis = join_path(prefix.bin, 'kmetis')
os.system('./mtest ' + _4eltgraph)
os.system(kmetis + ' ' + _4eltgraph + ' 40')
os.system(join_path(prefix.bin, 'onmetis') + ' ' + _4eltgraph)
os.system(join_path(prefix.bin, 'pmetis') + ' ' + test_mgraph + ' 2')
os.system(kmetis + ' ' + test_mgraph + ' 2')
os.system(kmetis + ' ' + test_mgraph + ' 5')
os.system(join_path(prefix.bin, 'partnmesh') + metis_mesh + ' 10')
os.system(join_path(prefix.bin, 'partdmesh') + metis_mesh + ' 10')
os.system(join_path(prefix.bin, 'mesh2dual') + metis_mesh)
test_bin = lambda testname: join_path(prefix.bin, testname)
test_graph = lambda graphname: join_path(prefix.share, graphname)
graph = test_graph('4elt.graph')
os.system('%s %s' % (test_bin('mtest'), graph))
os.system('%s %s 40' % (test_bin('kmetis'), graph))
os.system('%s %s' % (test_bin('onmetis'), graph))
graph = test_graph('test.mgraph')
os.system('%s %s 2' % (test_bin('pmetis'), graph))
os.system('%s %s 2' % (test_bin('kmetis'), graph))
os.system('%s %s 5' % (test_bin('kmetis'), graph))
graph = test_graph('metis.mesh')
os.system('%s %s 10' % (test_bin('partnmesh'), graph))
os.system('%s %s 10' % (test_bin('partdmesh'), graph))
os.system('%s %s' % (test_bin('mesh2dual'), graph))
# FIXME: The following code should replace the testing code in the
# block above since it causes installs to fail when one or more of the
# Metis tests fail, but it currently doesn't work because the 'mtest',
# 'onmetis', and 'partnmesh' tests return error codes that trigger
# false positives for failure.
"""
Executable(test_bin('mtest'))(test_graph('4elt.graph'))
Executable(test_bin('kmetis'))(test_graph('4elt.graph'), '40')
Executable(test_bin('onmetis'))(test_graph('4elt.graph'))
Executable(test_bin('pmetis'))(test_graph('test.mgraph'), '2')
Executable(test_bin('kmetis'))(test_graph('test.mgraph'), '2')
Executable(test_bin('kmetis'))(test_graph('test.mgraph'), '5')
Executable(test_bin('partnmesh'))(test_graph('metis.mesh'), '10')
Executable(test_bin('partdmesh'))(test_graph('metis.mesh'), '10')
Executable(test_bin('mesh2dual'))(test_graph('metis.mesh'))
"""
@when('@5:') @when('@5:')
def install(self, spec, prefix): def install(self, spec, prefix):
options = [] options = []
options.extend(std_cmake_args) options.extend(std_cmake_args)
build_directory = join_path(self.stage.path, 'spack-build') build_directory = join_path(self.stage.path, 'spack-build')
source_directory = self.stage.source_path source_directory = self.stage.source_path
options.append('-DGKLIB_PATH:PATH={metis_source}/GKlib'.format(metis_source=source_directory)) options.append('-DGKLIB_PATH:PATH=%s/GKlib' % source_directory)
options.append('-DCMAKE_INSTALL_NAME_DIR:PATH=%s/lib' % prefix) options.append('-DCMAKE_INSTALL_NAME_DIR:PATH=%s/lib' % prefix)
        if '+shared' in spec:
            options.append('-DSHARED:BOOL=ON')
        if '+debug' in spec:
            options.extend(['-DDEBUG:BOOL=ON',
                            '-DCMAKE_BUILD_TYPE:STRING=Debug'])
        if '+gdb' in spec:
            options.append('-DGDB:BOOL=ON')

        metis_header = join_path(source_directory, 'include', 'metis.h')

        if '+idx64' in spec:
            filter_file('IDXTYPEWIDTH 32', 'IDXTYPEWIDTH 64', metis_header)

        if '+double' in spec:
            filter_file('REALTYPEWIDTH 32', 'REALTYPEWIDTH 64', metis_header)

        # Make clang 7.3 happy.
        # Prevents "ld: section __DATA/__thread_bss extends beyond end of file"
        # See upstream LLVM issue https://llvm.org/bugs/show_bug.cgi?id=27059
        # and https://github.com/Homebrew/homebrew-science/blob/master/metis.rb
        if spec.satisfies('%clang@7.3.0'):
            filter_file('#define MAX_JBUFS 128', '#define MAX_JBUFS 24',
                        join_path(source_directory, 'GKlib', 'error.c'))

        with working_dir(build_directory, create=True):
            cmake(source_directory, *options)
            make()
            make('install')

            # now run some tests:
            for f in ['4elt', 'copter2', 'mdual']:
                graph = join_path(source_directory, 'graphs', '%s.graph' % f)
                Executable(join_path(prefix.bin, 'graphchk'))(graph)
                Executable(join_path(prefix.bin, 'gpmetis'))(graph, '2')
                Executable(join_path(prefix.bin, 'ndmetis'))(graph)

            graph = join_path(source_directory, 'graphs', 'test.mgraph')
            Executable(join_path(prefix.bin, 'gpmetis'))(graph, '2')
            graph = join_path(source_directory, 'graphs', 'metis.mesh')
            Executable(join_path(prefix.bin, 'mpmetis'))(graph, '2')

            # install GKlib headers, which will be needed for ParMETIS
            GKlib_dist = join_path(prefix.include, 'GKlib')
            mkdirp(GKlib_dist)
            hfiles = glob.glob(join_path(source_directory, 'GKlib', '*.h'))
            for hfile in hfiles:
                install(hfile, GKlib_dist)
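The smoke tests above call installed binaries through Spack's `Executable` wrapper. A minimal sketch of that calling pattern, with `subprocess` standing in for Spack's real implementation (the class name is borrowed purely for illustration):

```python
import subprocess


class Executable(object):
    """Minimal sketch of a callable program wrapper, mimicking how the
    smoke tests above invoke graphchk/gpmetis. Not Spack's real class."""

    def __init__(self, path):
        self.path = path

    def __call__(self, *args):
        # Run the program with the given arguments; a non-zero exit status
        # raises CalledProcessError, which is exactly how a failed smoke
        # test should abort the install. Captured stdout is returned.
        return subprocess.check_output([self.path] + list(args)).decode()
```

Called like `Executable('/bin/echo')('hello')`, it runs the program and hands back its output.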

View file

@@ -23,6 +23,8 @@
# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
from spack import *
import sys


class Octave(Package):
    """GNU Octave is a high-level language, primarily intended for numerical
@@ -34,7 +36,8 @@ class Octave(Package):
    homepage = "https://www.gnu.org/software/octave/"
    url = "ftp://ftp.gnu.org/gnu/octave/octave-4.0.0.tar.gz"

    version('4.0.2', 'c2a5cacc6e4c52f924739cdf22c2c687')
    version('4.0.0', 'a69f8320a4f20a8480c1b278b1adb799')

    # Variants
    variant('readline', default=True)
@@ -62,33 +65,35 @@ class Octave(Package):
    # Required dependencies
    depends_on('blas')
    depends_on('lapack')
    # Octave does not configure with sed from darwin:
    depends_on('sed', sys.platform == 'darwin')
    depends_on('pcre')
    depends_on('pkg-config')

    # Strongly recommended dependencies
    depends_on('readline', when='+readline')

    # Optional dependencies
    depends_on('arpack', when='+arpack')
    depends_on('curl', when='+curl')
    depends_on('fftw', when='+fftw')
    depends_on('fltk', when='+fltk')
    depends_on('fontconfig', when='+fontconfig')
    depends_on('freetype', when='+freetype')
    depends_on('glpk', when='+glpk')
    depends_on('gl2ps', when='+gl2ps')
    depends_on('gnuplot', when='+gnuplot')
    depends_on('ImageMagick', when='+magick')
    depends_on('hdf5', when='+hdf5')
    depends_on('jdk', when='+jdk')
    depends_on('llvm', when='+llvm')
    # depends_on('opengl', when='+opengl')  # TODO: add package
    depends_on('qhull', when='+qhull')
    depends_on('qrupdate', when='+qrupdate')
    # depends_on('qscintilla', when='+qscintilla')  # TODO: add package
    depends_on('qt', when='+qt')
    depends_on('suite-sparse', when='+suitesparse')
    depends_on('zlib', when='+zlib')

    def install(self, spec, prefix):
        config_args = [
@@ -154,7 +159,8 @@ def install(self, spec, prefix):
            config_args.append("--without-glpk")

        if '+magick' in spec:
            config_args.append("--with-magick=%s"
                               % spec['ImageMagick'].prefix.lib)

        if '+hdf5' in spec:
            config_args.extend([
@@ -187,7 +193,8 @@ def install(self, spec, prefix):
        if '+qrupdate' in spec:
            config_args.extend([
                "--with-qrupdate-includedir=%s"
                % spec['qrupdate'].prefix.include,
                "--with-qrupdate-libdir=%s" % spec['qrupdate'].prefix.lib
            ])
        else:

View file

@@ -27,6 +27,26 @@
import os

from spack import *


def _verbs_dir():
    """
    Try to find the directory where the OpenFabrics verbs package is
    installed. Return None if not found.
    """
    try:
        # Try to locate Verbs by looking for a utility in the path
        ibv_devices = which("ibv_devices")
        # Run it (silently) to ensure it works
        ibv_devices(output=str, error=str)
        # Get path to executable
        path = ibv_devices.exe[0]
        # Remove executable name and "bin" directory
        path = os.path.dirname(path)
        path = os.path.dirname(path)
        return path
    except:
        return None
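The prefix-recovery trick in `_verbs_dir()` is just two `dirname` calls on the located executable. That step in isolation (the function name here is mine, for illustration):

```python
import os


def prefix_from_executable(exe_path):
    # Given e.g. /usr/local/bin/ibv_devices, drop the file name and then
    # the "bin" directory to recover the install prefix /usr/local,
    # exactly as _verbs_dir() does above.
    path = os.path.dirname(exe_path)  # strip the executable name
    return os.path.dirname(path)      # strip the "bin" directory
```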
class Openmpi(Package):
    """Open MPI is a project combining technologies and resources from
    several other projects (FT-MPI, LA-MPI, LAM/MPI, and PACX-MPI)
@@ -54,7 +74,7 @@ class Openmpi(Package):
    variant('psm', default=False, description='Build support for the PSM library.')
    variant('psm2', default=False, description='Build support for the Intel PSM2 library.')
    variant('pmi', default=False, description='Build support for PMI-based launchers')
    variant('verbs', default=_verbs_dir() is not None, description='Build support for OpenFabrics verbs.')
    variant('mxm', default=False, description='Build Mellanox Messaging support')
    variant('thread_multiple', default=False, description='Enable MPI_THREAD_MULTIPLE support')
@@ -113,7 +133,6 @@ def install(self, spec, prefix):
            # Fabrics
            '--with-psm' if '+psm' in spec else '--without-psm',
            '--with-psm2' if '+psm2' in spec else '--without-psm2',
            '--with-mxm' if '+mxm' in spec else '--without-mxm',
            # Other options
            '--enable-mpi-thread-multiple' if '+thread_multiple' in spec else '--disable-mpi-thread-multiple',
@@ -121,6 +140,14 @@ def install(self, spec, prefix):
            '--with-sqlite3' if '+sqlite3' in spec else '--without-sqlite3',
            '--enable-vt' if '+vt' in spec else '--disable-vt'
        ])

        if '+verbs' in spec:
            path = _verbs_dir()
            if path is not None:
                config_args.append('--with-%s=%s' % (self.verbs, path))
            else:
                config_args.append('--with-%s' % self.verbs)
        else:
            config_args.append('--without-%s' % self.verbs)

        # TODO: use variants for this, e.g. +lanl, +llnl, etc.
        # use this for LANL builds, but for LLNL builds, we need:
View file

@@ -26,33 +26,36 @@
from spack import *
import sys


class Parmetis(Package):
    """ParMETIS is an MPI-based parallel library that implements a variety of
    algorithms for partitioning unstructured graphs, meshes, and for
    computing fill-reducing orderings of sparse matrices."""

    homepage = 'http://glaros.dtc.umn.edu/gkhome/metis/parmetis/overview'
    base_url = 'http://glaros.dtc.umn.edu/gkhome/fetch/sw/parmetis'

    version('4.0.3', 'f69c479586bf6bb7aff6a9bc0c739628')
    version('4.0.2', '0912a953da5bb9b5e5e10542298ffdce')

    variant('shared', default=True, description='Enables the build of shared libraries')
    variant('debug', default=False, description='Builds the library in debug mode')
    variant('gdb', default=False, description='Enables gdb support')

    depends_on('cmake@2.8:')  # build dependency
    depends_on('mpi')
    depends_on('metis@5:')

    patch('enable_external_metis.patch')
    # bug fixes from PETSc developers
    # https://bitbucket.org/petsc/pkg-parmetis/commits/1c1a9fd0f408dc4d42c57f5c3ee6ace411eb222b/raw/  # NOQA: ignore=E501
    patch('pkg-parmetis-1c1a9fd0f408dc4d42c57f5c3ee6ace411eb222b.patch')
    # https://bitbucket.org/petsc/pkg-parmetis/commits/82409d68aa1d6cbc70740d0f35024aae17f7d5cb/raw/  # NOQA: ignore=E501
    patch('pkg-parmetis-82409d68aa1d6cbc70740d0f35024aae17f7d5cb.patch')

    def url_for_version(self, version):
        verdir = 'OLD/' if version < Version('3.2.0') else ''
        return '%s/%sparmetis-%s.tar.gz' % (Parmetis.base_url, verdir, version)

    def install(self, spec, prefix):
        options = []
@@ -60,30 +63,27 @@ def install(self, spec, prefix):
        build_directory = join_path(self.stage.path, 'spack-build')
        source_directory = self.stage.source_path

        options.extend([
            '-DGKLIB_PATH:PATH=%s/GKlib' % spec['metis'].prefix.include,
            '-DMETIS_PATH:PATH=%s' % spec['metis'].prefix,
            '-DCMAKE_C_COMPILER:STRING=%s' % spec['mpi'].mpicc,
            '-DCMAKE_CXX_COMPILER:STRING=%s' % spec['mpi'].mpicxx
        ])

        if '+shared' in spec:
            options.append('-DSHARED:BOOL=ON')
        if '+debug' in spec:
            options.extend(['-DDEBUG:BOOL=ON',
                            '-DCMAKE_BUILD_TYPE:STRING=Debug'])
        if '+gdb' in spec:
            options.append('-DGDB:BOOL=ON')

        with working_dir(build_directory, create=True):
            cmake(source_directory, *options)
            make()
            make('install')

        # The shared library is not installed correctly on Darwin; fix this
        if (sys.platform == 'darwin') and ('+shared' in spec):
            fix_darwin_install_name(prefix.lib)
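The `url_for_version` hook above routes pre-3.2.0 tarballs to an `OLD/` subdirectory on the download server. A standalone sketch of that logic, with tuple comparison standing in for Spack's `Version` ordering (function and constant names are mine):

```python
BASE_URL = 'http://glaros.dtc.umn.edu/gkhome/fetch/sw/parmetis'


def parmetis_url(version):
    # Tuple comparison stands in for Spack's Version ordering; releases
    # older than 3.2.0 live under OLD/ on the download server.
    parts = tuple(int(p) for p in version.split('.'))
    verdir = 'OLD/' if parts < (3, 2, 0) else ''
    return '%s/%sparmetis-%s.tar.gz' % (BASE_URL, verdir, version)
```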

View file

@@ -0,0 +1,16 @@
from spack import *
class PyAutopep8(Package):
"""Automatic pep8 formatter"""
homepage = "https://github.com/hhatto/autopep8"
url = "https://github.com/hhatto/autopep8/archive/ver1.2.2.tar.gz"
version('1.2.2', 'def3d023fc9dfd1b7113602e965ad8e1')
extends('python')
depends_on('py-setuptools')
depends_on('py-pep8')
def install(self, spec, prefix):
python('setup.py', 'install', '--prefix=%s' % prefix)

View file

@@ -0,0 +1,15 @@
from spack import *
class PyPep8(Package):
"""python pep8 format checker"""
homepage = "https://github.com/PyCQA/pycodestyle"
url = "https://github.com/PyCQA/pycodestyle/archive/1.7.0.tar.gz"
version('1.7.0', '31070a3a6391928893cbf5fa523eb8d9')
extends('python')
depends_on('py-setuptools')
def install(self, spec, prefix):
python('setup.py', 'install', '--prefix=%s' % prefix)

View file

@@ -0,0 +1,18 @@
from spack import *
import os
class RustBindgen(Package):
"""The rust programming language toolchain"""
homepage = "http://www.rust-lang.org"
url = "https://github.com/crabtw/rust-bindgen"
version('0.16', tag='0.16', git='https://github.com/crabtw/rust-bindgen')
extends("rust")
depends_on("llvm")
def install(self, spec, prefix):
env = dict(os.environ)
env['LIBCLANG_PATH'] = os.path.join(spec['llvm'].prefix, 'lib')
cargo('install', '--root', prefix, env=env)

View file

@@ -0,0 +1,63 @@
from spack import *
import os
def get_submodules():
git = which('git')
git('submodule', 'update', '--init', '--recursive')
class Rust(Package):
"""The rust programming language toolchain"""
homepage = "http://www.rust-lang.org"
url = "https://github.com/rust-lang/rust"
version('1.8.0', tag='1.8.0', git="https://github.com/rust-lang/rust")
resource(name='cargo',
git="https://github.com/rust-lang/cargo.git",
tag='0.10.0',
destination='cargo')
extendable = True
# Rust
depends_on("llvm")
depends_on("curl")
depends_on("git")
depends_on("cmake")
depends_on("python@:2.8")
# Cargo
depends_on("openssl")
def install(self, spec, prefix):
configure('--prefix=%s' % prefix,
'--llvm-root=' + spec['llvm'].prefix)
make()
make("install")
# Install cargo, rust package manager
with working_dir(os.path.join('cargo', 'cargo')):
get_submodules()
configure('--prefix=' + prefix,
'--local-rust-root=' + prefix)
make()
make("install")
def setup_dependent_package(self, module, ext_spec):
"""
Called before rust extensions' install() methods.
In most cases, extensions will only need to have one or two lines::
cargo('build')
cargo('install', '--root', prefix)
or
cargo('install', '--root', prefix)
"""
# Rust extension builds can have a global cargo executable function
module.cargo = Executable(join_path(self.spec.prefix.bin, 'cargo'))

View file

@@ -1,5 +0,0 @@
esmumps : scotch
	(cd esmumps ; $(MAKE) scotch && $(MAKE) install)

ptesmumps : ptscotch
	(cd esmumps ; $(MAKE) ptscotch && $(MAKE) ptinstall)

View file

@@ -23,7 +23,6 @@
# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import os

from spack import *
@@ -32,7 +31,7 @@ class Scotch(Package):
    partitioning, graph clustering, and sparse matrix ordering."""

    homepage = "http://www.labri.fr/perso/pelegrin/scotch/"
    url = "http://gforge.inria.fr/frs/download.php/latestfile/298/scotch_6.0.3.tar.gz"
    base_url = "http://gforge.inria.fr/frs/download.php/latestfile/298"
    list_url = "http://gforge.inria.fr/frs/?group_id=248"
@@ -44,6 +43,7 @@ class Scotch(Package):
    variant('compression', default=True, description='Activate the possibility to use compressed files')
    variant('esmumps', default=False, description='Activate the compilation of esmumps needed by mumps')
    variant('shared', default=True, description='Build a shared version of the library')
    variant('metis', default=True, description='Build metis and parmetis wrapper libraries')

    depends_on('flex')
    depends_on('bison')
@@ -56,37 +56,15 @@ class Scotch(Package):
    # from the Scotch hosting site. These alternative archives include a
    # superset of the behavior in their default counterparts, so we choose to
    # always grab these versions for older Scotch versions for simplicity.
    def url_for_version(self, version):
        return super(Scotch, self).url_for_version(version)

    @when('@:6.0.0')
    def url_for_version(self, version):
        return '%s/scotch_%s_esmumps.tar.gz' % (Scotch.base_url, version)

    def patch(self):
        self.configure()

    # NOTE: Configuration of Scotch is achieved by writing a 'Makefile.inc'
    # file that contains all of the configuration variables and their desired
@@ -103,13 +81,13 @@ def configure(self):
        ]

        # Library Build Type #
        if '+shared' in self.spec:
            makefile_inc.extend([
                # todo change for Darwin systems
                'LIB = .so',
                'CLIBFLAGS = -shared -fPIC',
                'RANLIB = echo',
                'AR = $(CC)',
                'ARFLAGS = -shared $(LDFLAGS) -o'
            ])
            cflags.append('-fPIC')
@@ -118,7 +96,7 @@ def configure(self):
                'LIB = .a',
                'CLIBFLAGS = ',
                'RANLIB = ranlib',
                'AR = ar',
                'ARFLAGS = -ruv '
            ])
@@ -171,20 +149,49 @@ def configure(self):
            fh.write('\n'.join(makefile_inc))

    def install(self, spec, prefix):
        targets = ['scotch']
        if '+mpi' in self.spec:
            targets.append('ptscotch')

        if self.spec.version >= Version('6.0.0'):
            if '+esmumps' in self.spec:
                targets.append('esmumps')
                if '+mpi' in self.spec:
                    targets.append('ptesmumps')

        with working_dir('src'):
            for target in targets:
                # It seems that building ptesmumps in parallel fails. For
                # versions prior to 6.0.0 there is no separate target for
                # ptesmumps; that library is built by the ptscotch target.
                # This explains the test for the can_make_parallel variable.
                can_make_parallel = \
                    not (target == 'ptesmumps' or
                         (self.spec.version < Version('6.0.0') and
                          target == 'ptscotch'))
                make(target, parallel=can_make_parallel)

        # todo: change this to take Darwin systems into account
        lib_ext = '.so' if '+shared' in self.spec else '.a'

        # It seems easier to remove the metis wrappers from the folder that
        # will be installed than to tweak the Makefiles
        if '+metis' not in self.spec:
            with working_dir('lib'):
                force_remove('libscotchmetis{0}'.format(lib_ext))
                force_remove('libptscotchparmetis{0}'.format(lib_ext))

            with working_dir('include'):
                force_remove('metis.h')
                force_remove('parmetis.h')

        if '~esmumps' in self.spec and self.spec.version < Version('6.0.0'):
            with working_dir('lib'):
                force_remove('libesmumps{0}'.format(lib_ext))
                force_remove('libptesmumps{0}'.format(lib_ext))

            with working_dir('include'):
                force_remove('esmumps.h')

        install_tree('bin', prefix.bin)
        install_tree('lib', prefix.lib)
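The parallel-build guard computed in `install()` above can be isolated as a small predicate. A sketch, with tuples standing in for Spack `Version` objects (the function name is mine):

```python
def can_make_parallel(target, version):
    # ptesmumps never builds in parallel; before 6.0.0 there is no
    # separate ptesmumps target (ptscotch builds it), so ptscotch must
    # be built serially on those versions as well.
    return not (target == 'ptesmumps' or
                (version < (6, 0, 0) and target == 'ptscotch'))
```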

View file

@@ -0,0 +1,62 @@
##############################################################################
# Copyright (c) 2013-2016, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Created by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://github.com/llnl/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU Lesser General Public License (as
# published by the Free Software Foundation) version 2.1, February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this program; if not, write to the Free Software
# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
from spack import *
class Stream(Package):
"""The STREAM benchmark is a simple synthetic benchmark program that
measures sustainable memory bandwidth (in MB/s) and the corresponding
computation rate for simple vector kernels."""
homepage = "https://www.cs.virginia.edu/stream/ref.html"
version('5.10', git='https://github.com/jeffhammond/STREAM.git')
variant('openmp', default=False, description='Build with OpenMP support')
def patch(self):
makefile = FileFilter('Makefile')
# Use the Spack compiler wrappers
makefile.filter('CC = .*', 'CC = cc')
makefile.filter('FC = .*', 'FC = f77')
cflags = '-O2'
fflags = '-O2'
if '+openmp' in self.spec:
cflags += ' ' + self.compiler.openmp_flag
fflags += ' ' + self.compiler.openmp_flag
# Set the appropriate flags for this compiler
makefile.filter('CFLAGS = .*', 'CFLAGS = {0}'.format(cflags))
makefile.filter('FFLAGS = .*', 'FFLAGS = {0}'.format(fflags))
def install(self, spec, prefix):
make()
# Manual installation
mkdir(prefix.bin)
install('stream_c.exe', prefix.bin)
install('stream_f.exe', prefix.bin)
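The `patch()` method above rewrites the STREAM Makefile with `FileFilter`, which is essentially a per-line regex substitution. A sketch of the same rewrite applied to an in-memory string instead of a file on disk (the helper name is mine; Spack's real `FileFilter` edits files in place):

```python
import re


def filter_text(text, pattern, repl):
    # What makefile.filter(...) above does, applied to a string rather
    # than to the Makefile on disk.
    return re.sub(pattern, repl, text)


makefile = 'CC = gcc\nFFLAGS = -O3\n'
makefile = filter_text(makefile, 'CC = .*', 'CC = cc')
makefile = filter_text(makefile, 'FFLAGS = .*', 'FFLAGS = {0}'.format('-O2'))
```

After the two substitutions the build uses the Spack compiler wrapper `cc` and the flags chosen by the package.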

View file

@@ -23,9 +23,9 @@
# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
from spack import *
import glob


class Tbb(Package):
    """Widely used C++ template library for task parallelism.
    Intel Threading Building Blocks (Intel TBB) lets you easily write parallel
@@ -35,35 +35,39 @@ class Tbb(Package):
    homepage = "http://www.threadingbuildingblocks.org/"

    # Only version-specific URL's work for TBB
    version('4.4.4', 'd4cee5e4ca75cab5181834877738619c56afeb71', url='https://www.threadingbuildingblocks.org/sites/default/files/software_releases/source/tbb44_20160413oss_src.tgz')  # NOQA: ignore=E501
    version('4.4.3', '80707e277f69d9b20eeebdd7a5f5331137868ce1', url='https://www.threadingbuildingblocks.org/sites/default/files/software_releases/source/tbb44_20160128oss_src_0.tgz')  # NOQA: ignore=E501

    def coerce_to_spack(self, tbb_build_subdir):
        for compiler in ["icc", "gcc", "clang"]:
            fs = glob.glob(join_path(tbb_build_subdir,
                                     "*.%s.inc" % compiler))
            for f in fs:
                lines = open(f).readlines()
                of = open(f, "w")
                for l in lines:
                    if l.strip().startswith("CPLUS ="):
                        of.write("# coerced to spack\n")
                        of.write("CPLUS = $(CXX)\n")
                    elif l.strip().startswith("CONLY ="):
                        of.write("# coerced to spack\n")
                        of.write("CONLY = $(CC)\n")
                    else:
                        of.write(l)

    def install(self, spec, prefix):
        if spec.satisfies('%gcc@6.1:') and spec.satisfies('@:4.4.3'):
            raise InstallError('Only TBB 4.4.4 and above build with GCC 6.1!')

        # We need to follow TBB's compiler selection logic to get the proper
        # build + link flags but we still need to use spack's compiler wrappers
        # to accomplish this, we do two things:
        #
        # * Look at the spack spec to determine which compiler we should pass
        #   to tbb's Makefile;
        #
        # * patch tbb's build system to use the compiler wrappers (CC, CXX) for
        #   icc, gcc, clang (see coerce_to_spack());
        #
        self.coerce_to_spack("build")
@@ -74,7 +78,6 @@ def install(self, spec, prefix):
        else:
            tbb_compiler = "gcc"

        mkdirp(prefix)
        mkdirp(prefix.lib)
@@ -82,10 +85,10 @@ def install(self, spec, prefix):
        #
        # tbb does not have a configure script or make install target
        # we simply call make, and try to put the pieces together
        #
        make("compiler=%s" % (tbb_compiler))

        # install headers to {prefix}/include
        install_tree('include', prefix.include)

        # install libs to {prefix}/lib
        tbb_lib_names = ["libtbb",
@@ -94,10 +97,10 @@ def install(self, spec, prefix):
        for lib_name in tbb_lib_names:
            # install release libs
            fs = glob.glob(join_path("build", "*release", lib_name + ".*"))
            for f in fs:
                install(f, prefix.lib)
            # install debug libs if they exist
            fs = glob.glob(join_path("build", "*debug", lib_name + "_debug.*"))
            for f in fs:
                install(f, prefix.lib)