Merge branch 'develop' into features/cflags

Todd Gamblin 2016-05-05 10:49:15 -07:00
commit 9fb1a9537d
78 changed files with 1592 additions and 471 deletions


@ -372,25 +372,32 @@ how this is done is in :ref:`sec-specs`.
 ``spack compiler add``
 ~~~~~~~~~~~~~~~~~~~~~~~
+
+An alias for ``spack compiler find``.
+
+.. _spack-compiler-find:
+
+``spack compiler find``
+~~~~~~~~~~~~~~~~~~~~~~~
 
 If you do not see a compiler in this list, but you want to use it with
-Spack, you can simply run ``spack compiler add`` with the path to
+Spack, you can simply run ``spack compiler find`` with the path to
 where the compiler is installed. For example::
 
-    $ spack compiler add /usr/local/tools/ic-13.0.079
+    $ spack compiler find /usr/local/tools/ic-13.0.079
     ==> Added 1 new compiler to /Users/gamblin2/.spack/compilers.yaml
         intel@13.0.079
 
-Or you can run ``spack compiler add`` with no arguments to force
+Or you can run ``spack compiler find`` with no arguments to force
 auto-detection. This is useful if you do not know where compilers are
 installed, but you know that new compilers have been added to your
 ``PATH``. For example, using dotkit, you might do this::
 
     $ module load gcc-4.9.0
-    $ spack compiler add
+    $ spack compiler find
     ==> Added 1 new compiler to /Users/gamblin2/.spack/compilers.yaml
         gcc@4.9.0
 
-This loads the environment module for gcc-4.9.0 to get it into the
+This loads the environment module for gcc-4.9.0 to add it to
 ``PATH``, and then it adds the compiler to Spack.
 
 .. _spack-compiler-info:
@ -807,17 +814,22 @@ Environment Modules, you can get it with Spack:
 1. Install with::
 
+   .. code-block:: sh
+
       spack install environment-modules
 
 2. Activate with::
 
-      MODULES_HOME=`spack location -i environment-modules`
-      MODULES_VERSION=`ls -1 $MODULES_HOME/Modules | head -1`
-      ${MODULES_HOME}/Modules/${MODULES_VERSION}/bin/add.modules
-
-This adds to your ``.bashrc`` (or similar) files, enabling Environment
-Modules when you log in. It will ask your permission before changing
-any files.
+   Add the following two lines to your ``.bashrc`` profile (or similar):
+
+   .. code-block:: sh
+
+      MODULES_HOME=`spack location -i environment-modules`
+      source ${MODULES_HOME}/Modules/init/bash
+
+   In case you use a Unix shell other than bash, substitute ``bash`` by
+   the appropriate file in ``${MODULES_HOME}/Modules/init/``.
 
 Spack and Environment Modules
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~


@ -1831,6 +1831,23 @@ successfully find ``libdwarf.h`` and ``libdwarf.so``, without the
 packager having to provide ``--with-libdwarf=/path/to/libdwarf`` on
 the command line.
 
+Message Passing Interface (MPI)
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+It is common for high performance computing software to use ``MPI``.
+As a result of concretization, a given package can be built against different
+MPI implementations, such as ``OpenMPI``, ``MPICH``, or ``IntelMPI``.
+In some scenarios, configuring a package requires passing it the appropriate MPI
+compiler wrappers, such as ``mpicc`` and ``mpic++``.
+However, different ``MPI`` implementations may name these wrappers differently.
+To keep a package's ``install()`` method independent of the choice of ``MPI``
+implementation, each package that provides ``MPI`` sets
+``self.spec.mpicc``, ``self.spec.mpicxx``, ``self.spec.mpifc``, and ``self.spec.mpif77``
+to point to the ``C``, ``C++``, ``Fortran 90``, and ``Fortran 77`` ``MPI`` wrappers.
+For the reasons outlined above, package developers are advised to use these
+variables, for example ``self.spec['mpi'].mpicc``, instead of hard-coding
+``join_path(self.spec['mpi'].prefix.bin, 'mpicc')``.
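
As an illustration (an editor's sketch, not part of this commit), a package that
depends on ``mpi`` might consume these attributes in its ``install()`` method as
follows; the package name, URL, checksum, and configure arguments are hypothetical::

    from spack import *

    class ExampleMpiApp(Package):
        """Hypothetical package, shown only to illustrate the MPI wrapper attributes."""

        homepage = "http://www.example.com/example-mpi-app"
        url      = "http://www.example.com/example-mpi-app-1.0.tar.gz"

        version('1.0', '0123456789abcdef0123456789abcdef')

        depends_on('mpi')

        def install(self, spec, prefix):
            # Use the wrappers advertised by the concrete MPI implementation
            # rather than hard-coding join_path(spec['mpi'].prefix.bin, 'mpicc').
            configure('--prefix=%s' % prefix,
                      'CC=%s' % spec['mpi'].mpicc,
                      'CXX=%s' % spec['mpi'].mpicxx,
                      'F77=%s' % spec['mpi'].mpif77,
                      'FC=%s' % spec['mpi'].mpifc)
            make()
            make('install')
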
 Forking ``install()``
 ~~~~~~~~~~~~~~~~~~~~~

lib/spack/env/cc (vendored)

@ -166,7 +166,7 @@ fi
# It doesn't work with -rpath. # It doesn't work with -rpath.
# This variable controls whether they are added. # This variable controls whether they are added.
add_rpaths=true add_rpaths=true
if [[ mode == ld && $OSTYPE == darwin* ]]; then if [[ $mode == ld && "$SPACK_SHORT_SPEC" =~ "darwin" ]]; then
for arg in "$@"; do for arg in "$@"; do
if [[ $arg == -r ]]; then if [[ $arg == -r ]]; then
add_rpaths=false add_rpaths=false


@ -44,10 +44,10 @@ def setup_parser(subparser):
scopes = spack.config.config_scopes scopes = spack.config.config_scopes
# Add # Find
add_parser = sp.add_parser('add', help='Add compilers to the Spack configuration.') find_parser = sp.add_parser('find', aliases=['add'], help='Search the system for compilers to add to the Spack configuration.')
add_parser.add_argument('add_paths', nargs=argparse.REMAINDER) find_parser.add_argument('add_paths', nargs=argparse.REMAINDER)
add_parser.add_argument('--scope', choices=scopes, default=spack.cmd.default_modify_scope, find_parser.add_argument('--scope', choices=scopes, default=spack.cmd.default_modify_scope,
help="Configuration scope to modify.") help="Configuration scope to modify.")
# Remove # Remove
@ -70,7 +70,7 @@ def setup_parser(subparser):
help="Configuration scope to read from.") help="Configuration scope to read from.")
def compiler_add(args): def compiler_find(args):
"""Search either $PATH or a list of paths for compilers and add them """Search either $PATH or a list of paths for compilers and add them
to Spack's configuration.""" to Spack's configuration."""
paths = args.add_paths paths = args.add_paths
@ -136,7 +136,8 @@ def compiler_list(args):
def compiler(parser, args): def compiler(parser, args):
action = { 'add' : compiler_add, action = { 'add' : compiler_find,
'find' : compiler_find,
'remove' : compiler_remove, 'remove' : compiler_remove,
'rm' : compiler_remove, 'rm' : compiler_remove,
'info' : compiler_info, 'info' : compiler_info,


@ -23,87 +23,106 @@
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA # Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
############################################################################## ##############################################################################
import argparse import argparse
import xml.etree.ElementTree as ET
import itertools
import re
import os
import codecs import codecs
import os
import time
import xml.dom.minidom
import xml.etree.ElementTree as ET
import llnl.util.tty as tty import llnl.util.tty as tty
from llnl.util.filesystem import *
import spack import spack
import spack.cmd
from llnl.util.filesystem import *
from spack.build_environment import InstallError from spack.build_environment import InstallError
from spack.fetch_strategy import FetchError from spack.fetch_strategy import FetchError
import spack.cmd
description = "Run package installation as a unit test, output formatted results." description = "Run package installation as a unit test, output formatted results."
def setup_parser(subparser): def setup_parser(subparser):
subparser.add_argument( subparser.add_argument('-j',
'-j', '--jobs', action='store', type=int, '--jobs',
help="Explicitly set number of make jobs. Default is #cpus.") action='store',
type=int,
help="Explicitly set number of make jobs. Default is #cpus.")
subparser.add_argument( subparser.add_argument('-n',
'-n', '--no-checksum', action='store_true', dest='no_checksum', '--no-checksum',
help="Do not check packages against checksum") action='store_true',
dest='no_checksum',
help="Do not check packages against checksum")
subparser.add_argument( subparser.add_argument('-o', '--output', action='store', help="test output goes in this file")
'-o', '--output', action='store', help="test output goes in this file")
subparser.add_argument( subparser.add_argument('package', nargs=argparse.REMAINDER, help="spec of package to install")
'package', nargs=argparse.REMAINDER, help="spec of package to install")
class JunitResultFormat(object):
def __init__(self):
self.root = ET.Element('testsuite')
self.tests = []
def add_test(self, buildId, testResult, buildInfo=None):
self.tests.append((buildId, testResult, buildInfo))
def write_to(self, stream):
self.root.set('tests', '{0}'.format(len(self.tests)))
for buildId, testResult, buildInfo in self.tests:
testcase = ET.SubElement(self.root, 'testcase')
testcase.set('classname', buildId.name)
testcase.set('name', buildId.stringId())
if testResult == TestResult.FAILED:
failure = ET.SubElement(testcase, 'failure')
failure.set('type', "Build Error")
failure.text = buildInfo
elif testResult == TestResult.SKIPPED:
skipped = ET.SubElement(testcase, 'skipped')
skipped.set('type', "Skipped Build")
skipped.text = buildInfo
ET.ElementTree(self.root).write(stream)
class TestResult(object): class TestResult(object):
PASSED = 0 PASSED = 0
FAILED = 1 FAILED = 1
SKIPPED = 2 SKIPPED = 2
ERRORED = 3
class BuildId(object): class TestSuite(object):
def __init__(self, spec): def __init__(self, filename):
self.name = spec.name self.filename = filename
self.version = spec.version self.root = ET.Element('testsuite')
self.hashId = spec.dag_hash() self.tests = []
def stringId(self): def __enter__(self):
return "-".join(str(x) for x in (self.name, self.version, self.hashId)) return self
def __hash__(self): def append(self, item):
return hash((self.name, self.version, self.hashId)) if not isinstance(item, TestCase):
raise TypeError('only TestCase instances may be appended to a TestSuite instance')
self.tests.append(item) # Append the item to the list of tests
def __eq__(self, other): def __exit__(self, exc_type, exc_val, exc_tb):
if not isinstance(other, BuildId): # Prepare the header for the entire test suite
return False number_of_errors = sum(x.result_type == TestResult.ERRORED for x in self.tests)
self.root.set('errors', str(number_of_errors))
number_of_failures = sum(x.result_type == TestResult.FAILED for x in self.tests)
self.root.set('failures', str(number_of_failures))
self.root.set('tests', str(len(self.tests)))
return ((self.name, self.version, self.hashId) == for item in self.tests:
(other.name, other.version, other.hashId)) self.root.append(item.element)
with open(self.filename, 'wb') as file:
xml_string = ET.tostring(self.root)
xml_string = xml.dom.minidom.parseString(xml_string).toprettyxml()
file.write(xml_string)
class TestCase(object):
results = {
TestResult.PASSED: None,
TestResult.SKIPPED: 'skipped',
TestResult.FAILED: 'failure',
TestResult.ERRORED: 'error',
}
def __init__(self, classname, name, time=None):
self.element = ET.Element('testcase')
self.element.set('classname', str(classname))
self.element.set('name', str(name))
if time is not None:
self.element.set('time', str(time))
self.result_type = None
def set_result(self, result_type, message=None, error_type=None, text=None):
self.result_type = result_type
result = TestCase.results[self.result_type]
if result is not None and result is not TestResult.PASSED:
subelement = ET.SubElement(self.element, result)
if error_type is not None:
subelement.set('type', error_type)
if message is not None:
subelement.set('message', str(message))
if text is not None:
subelement.text = text
def fetch_log(path): def fetch_log(path):
@ -114,46 +133,76 @@ def fetch_log(path):
def failed_dependencies(spec): def failed_dependencies(spec):
return set(childSpec for childSpec in spec.dependencies.itervalues() if not return set(item for item in spec.dependencies.itervalues() if not spack.repo.get(item).installed)
spack.repo.get(childSpec).installed)
def create_test_output(topSpec, newInstalls, output, getLogFunc=fetch_log): def get_top_spec_or_die(args):
# Post-order traversal is not strictly required but it makes sense to output specs = spack.cmd.parse_specs(args.package, concretize=True)
# tests for dependencies first. if len(specs) > 1:
for spec in topSpec.traverse(order='post'): tty.die("Only 1 top-level package can be specified")
if spec not in newInstalls: top_spec = iter(specs).next()
continue return top_spec
failedDeps = failed_dependencies(spec)
package = spack.repo.get(spec)
if failedDeps:
result = TestResult.SKIPPED
dep = iter(failedDeps).next()
depBID = BuildId(dep)
errOutput = "Skipped due to failed dependency: {0}".format(
depBID.stringId())
elif (not package.installed) and (not package.stage.source_path):
result = TestResult.FAILED
errOutput = "Failure to fetch package resources."
elif not package.installed:
result = TestResult.FAILED
lines = getLogFunc(package.build_log_path)
errMessages = list(line for line in lines if
re.search('error:', line, re.IGNORECASE))
errOutput = errMessages if errMessages else lines[-10:]
errOutput = '\n'.join(itertools.chain(
[spec.to_yaml(), "Errors:"], errOutput,
["Build Log:", package.build_log_path]))
else:
result = TestResult.PASSED
errOutput = None
bId = BuildId(spec) def install_single_spec(spec, number_of_jobs):
output.add_test(bId, result, errOutput) package = spack.repo.get(spec)
# If it is already installed, skip the test
if spack.repo.get(spec).installed:
testcase = TestCase(package.name, package.spec.short_spec, time=0.0)
testcase.set_result(TestResult.SKIPPED, message='Skipped [already installed]', error_type='already_installed')
return testcase
# If it relies on dependencies that did not install, skip
if failed_dependencies(spec):
testcase = TestCase(package.name, package.spec.short_spec, time=0.0)
testcase.set_result(TestResult.SKIPPED, message='Skipped [failed dependencies]', error_type='dep_failed')
return testcase
# Otherwise try to install the spec
try:
start_time = time.time()
package.do_install(keep_prefix=False,
keep_stage=True,
ignore_deps=False,
make_jobs=number_of_jobs,
verbose=True,
fake=False)
duration = time.time() - start_time
testcase = TestCase(package.name, package.spec.short_spec, duration)
testcase.set_result(TestResult.PASSED)
except InstallError:
# An InstallError is considered a failure (the recipe didn't work correctly)
duration = time.time() - start_time
# Try to get the log
lines = fetch_log(package.build_log_path)
text = '\n'.join(lines)
testcase = TestCase(package.name, package.spec.short_spec, duration)
testcase.set_result(TestResult.FAILED, message='Installation failure', text=text)
except FetchError:
# A FetchError is considered an error (we didn't even start building)
duration = time.time() - start_time
testcase = TestCase(package.name, package.spec.short_spec, duration)
testcase.set_result(TestResult.ERRORED, message='Unable to fetch package')
return testcase
def get_filename(args, top_spec):
if not args.output:
fname = 'test-{x.name}-{x.version}-{hash}.xml'.format(x=top_spec, hash=top_spec.dag_hash())
output_directory = join_path(os.getcwd(), 'test-output')
if not os.path.exists(output_directory):
os.mkdir(output_directory)
output_filename = join_path(output_directory, fname)
else:
output_filename = args.output
return output_filename
def test_install(parser, args): def test_install(parser, args):
# Check the input
if not args.package: if not args.package:
tty.die("install requires a package argument") tty.die("install requires a package argument")
@ -162,50 +211,15 @@ def test_install(parser, args):
tty.die("The -j option must be a positive integer!") tty.die("The -j option must be a positive integer!")
if args.no_checksum: if args.no_checksum:
spack.do_checksum = False # TODO: remove this global. spack.do_checksum = False # TODO: remove this global.
specs = spack.cmd.parse_specs(args.package, concretize=True) # Get the one and only top spec
if len(specs) > 1: top_spec = get_top_spec_or_die(args)
tty.die("Only 1 top-level package can be specified") # Get the filename of the test
topSpec = iter(specs).next() output_filename = get_filename(args, top_spec)
# TEST SUITE
newInstalls = set() with TestSuite(output_filename) as test_suite:
for spec in topSpec.traverse(): # Traverse in post order : each spec is a test case
package = spack.repo.get(spec) for spec in top_spec.traverse(order='post'):
if not package.installed: test_case = install_single_spec(spec, args.jobs)
newInstalls.add(spec) test_suite.append(test_case)
if not args.output:
bId = BuildId(topSpec)
outputDir = join_path(os.getcwd(), "test-output")
if not os.path.exists(outputDir):
os.mkdir(outputDir)
outputFpath = join_path(outputDir, "test-{0}.xml".format(bId.stringId()))
else:
outputFpath = args.output
for spec in topSpec.traverse(order='post'):
# Calling do_install for the top-level package would be sufficient but
# this attempts to keep going if any package fails (other packages which
# are not dependents may succeed)
package = spack.repo.get(spec)
if (not failed_dependencies(spec)) and (not package.installed):
try:
package.do_install(
keep_prefix=False,
keep_stage=True,
ignore_deps=False,
make_jobs=args.jobs,
verbose=True,
fake=False)
except InstallError:
pass
except FetchError:
pass
jrf = JunitResultFormat()
handled = {}
create_test_output(topSpec, newInstalls, jrf)
with open(outputFpath, 'wb') as F:
jrf.write_to(F)


@ -97,6 +97,9 @@ class Compiler(object):
# argument used to get C++11 options # argument used to get C++11 options
cxx11_flag = "-std=c++11" cxx11_flag = "-std=c++11"
# argument used to get C++14 options
cxx14_flag = "-std=c++1y"
def __init__(self, cspec, cc, cxx, f77, fc, **kwargs): def __init__(self, cspec, cc, cxx, f77, fc, **kwargs):
def check(exe): def check(exe):


@ -54,9 +54,16 @@ def cxx11_flag(self):
if self.version < ver('4.3'): if self.version < ver('4.3'):
tty.die("Only gcc 4.3 and above support c++11.") tty.die("Only gcc 4.3 and above support c++11.")
elif self.version < ver('4.7'): elif self.version < ver('4.7'):
return "-std=gnu++0x" return "-std=c++0x"
else: else:
return "-std=gnu++11" return "-std=c++11"
@property
def cxx14_flag(self):
if self.version < ver('4.8'):
tty.die("Only gcc 4.8 and above support c++14.")
else:
return "-std=c++14"
@classmethod @classmethod
def fc_version(cls, fc): def fc_version(cls, fc):


@ -157,12 +157,26 @@ def fetch(self):
tty.msg("Already downloaded %s" % self.archive_file) tty.msg("Already downloaded %s" % self.archive_file)
return return
possible_files = self.stage.expected_archive_files
save_file = None
partial_file = None
if possible_files:
save_file = self.stage.expected_archive_files[0]
partial_file = self.stage.expected_archive_files[0] + '.part'
tty.msg("Trying to fetch from %s" % self.url) tty.msg("Trying to fetch from %s" % self.url)
curl_args = ['-O', # save file to disk if partial_file:
save_args = ['-C', '-', # continue partial downloads
'-o', partial_file] # use a .part file
else:
save_args = ['-O']
curl_args = save_args + [
'-f', # fail on >400 errors '-f', # fail on >400 errors
'-D', '-', # print out HTML headers '-D', '-', # print out HTML headers
'-L', self.url, ] '-L', # resolve 3xx redirects
self.url, ]
if sys.stdout.isatty(): if sys.stdout.isatty():
curl_args.append('-#') # status bar when using a tty curl_args.append('-#') # status bar when using a tty
@ -178,6 +192,9 @@ def fetch(self):
if self.archive_file: if self.archive_file:
os.remove(self.archive_file) os.remove(self.archive_file)
if partial_file and os.path.exists(partial_file):
os.remove(partial_file)
if spack.curl.returncode == 22: if spack.curl.returncode == 22:
# This is a 404. Curl will print the error. # This is a 404. Curl will print the error.
raise FailedDownloadError( raise FailedDownloadError(
@ -209,6 +226,9 @@ def fetch(self):
"'spack clean <package>' to remove the bad archive, then fix", "'spack clean <package>' to remove the bad archive, then fix",
"your internet gateway issue and install again.") "your internet gateway issue and install again.")
if save_file:
os.rename(partial_file, save_file)
if not self.archive_file: if not self.archive_file:
raise FailedDownloadError(self.url) raise FailedDownloadError(self.url)


@ -210,6 +210,18 @@ def _need_to_create_path(self):
return False return False
@property
def expected_archive_files(self):
"""Possible archive file paths."""
paths = []
if isinstance(self.fetcher, fs.URLFetchStrategy):
paths.append(os.path.join(self.path, os.path.basename(self.fetcher.url)))
if self.mirror_path:
paths.append(os.path.join(self.path, os.path.basename(self.mirror_path)))
return paths
@property @property
def archive_file(self): def archive_file(self):
"""Path to the source archive within this stage directory.""" """Path to the source archive within this stage directory."""


@ -61,15 +61,14 @@
'optional_deps', 'optional_deps',
'make_executable', 'make_executable',
'configure_guess', 'configure_guess',
'unit_install',
'lock', 'lock',
'database', 'database',
'namespace_trie', 'namespace_trie',
'yaml', 'yaml',
'sbang', 'sbang',
'environment', 'environment',
'cmd.uninstall'] 'cmd.uninstall',
# 'cflags'] 'cmd.test_install']
def list_tests(): def list_tests():


@ -219,3 +219,27 @@ def test_ld_deps(self):
' '.join(test_command)) ' '.join(test_command))
def test_ld_deps_reentrant(self):
"""Make sure ld -r is handled correctly on OS's where it doesn't
support rpaths."""
os.environ['SPACK_DEPENDENCIES'] = ':'.join([self.dep1])
os.environ['SPACK_SHORT_SPEC'] = "foo@1.2=linux-x86_64"
reentrant_test_command = ['-r'] + test_command
self.check_ld('dump-args', reentrant_test_command,
'ld ' +
'-rpath ' + self.prefix + '/lib ' +
'-rpath ' + self.prefix + '/lib64 ' +
'-L' + self.dep1 + '/lib ' +
'-rpath ' + self.dep1 + '/lib ' +
'-r ' +
' '.join(test_command))
os.environ['SPACK_SHORT_SPEC'] = "foo@1.2=darwin-x86_64"
self.check_ld('dump-args', reentrant_test_command,
'ld ' +
'-L' + self.dep1 + '/lib ' +
'-r ' +
' '.join(test_command))


@ -1,91 +0,0 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://scalability-llnl.github.io/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import os
import unittest
import shutil
import tempfile
from llnl.util.filesystem import *
import spack
from spack.stage import Stage
from spack.fetch_strategy import URLFetchStrategy
from spack.directory_layout import YamlDirectoryLayout
from spack.util.executable import which
from spack.test.mock_packages_test import *
from spack.test.mock_repo import MockArchive
class CflagsTest(MockPackagesTest):
"""Tests install and uninstall on a trivial package."""
def setUp(self):
super(CflagsTest, self).setUp()
# create a simple installable package directory and tarball
self.repo = MockArchive()
# We use a fake package, so skip the checksum.
spack.do_checksum = False
# Use a fake install directory to avoid conflicts bt/w
# installed pkgs and mock packages.
self.tmpdir = tempfile.mkdtemp()
self.orig_layout = spack.install_layout
spack.install_layout = YamlDirectoryLayout(self.tmpdir)
def tearDown(self):
super(CflagsTest, self).tearDown()
if self.repo.stage is not None:
self.repo.stage.destroy()
# Turn checksumming back on
spack.do_checksum = True
# restore spack's layout.
spack.install_layout = self.orig_layout
shutil.rmtree(self.tmpdir, ignore_errors=True)
def test_compiler_calls(self):
# Get a basic concrete spec for the trivial install package.
spec = Spec('cflags_test_package')
spec.concretize()
self.assertTrue(spec.concrete)
# Get the package
pkg = spack.db.get(spec)
# Fake the URL for the package so it downloads from a file.
pkg.fetcher = URLFetchStrategy(self.repo.url)
try:
pkg.do_install()
pkg.do_uninstall()
except Exception, e:
pkg.remove_prefix()
raise


@ -0,0 +1,190 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://github.com/llnl/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import collections
from contextlib import contextmanager
import StringIO
FILE_REGISTRY = collections.defaultdict(StringIO.StringIO)
# Monkey-patch open to write module files to a StringIO instance
@contextmanager
def mock_open(filename, mode):
if not mode == 'wb':
raise RuntimeError('test.test_install : unexpected opening mode for monkey-patched open')
FILE_REGISTRY[filename] = StringIO.StringIO()
try:
yield FILE_REGISTRY[filename]
finally:
handle = FILE_REGISTRY[filename]
FILE_REGISTRY[filename] = handle.getvalue()
handle.close()
import os
import itertools
import unittest
import spack
import spack.cmd
# The use of __import__ is necessary to maintain a name with hyphen (which cannot be an identifier in python)
test_install = __import__("spack.cmd.test-install", fromlist=['test_install'])
class MockSpec(object):
def __init__(self, name, version, hashStr=None):
self.dependencies = {}
self.name = name
self.version = version
self.hash = hashStr if hashStr else hash((name, version))
def traverse(self, order=None):
for _, spec in self.dependencies.items():
yield spec
yield self
#allDeps = itertools.chain.from_iterable(i.traverse() for i in self.dependencies.itervalues())
#return set(itertools.chain([self], allDeps))
def dag_hash(self):
return self.hash
@property
def short_spec(self):
return '-'.join([self.name, str(self.version), str(self.hash)])
class MockPackage(object):
def __init__(self, spec, buildLogPath):
self.name = spec.name
self.spec = spec
self.installed = False
self.build_log_path = buildLogPath
def do_install(self, *args, **kwargs):
self.installed = True
class MockPackageDb(object):
def __init__(self, init=None):
self.specToPkg = {}
if init:
self.specToPkg.update(init)
def get(self, spec):
return self.specToPkg[spec]
def mock_fetch_log(path):
return []
specX = MockSpec('X', "1.2.0")
specY = MockSpec('Y', "2.3.8")
specX.dependencies['Y'] = specY
pkgX = MockPackage(specX, 'logX')
pkgY = MockPackage(specY, 'logY')
class MockArgs(object):
def __init__(self, package):
self.package = package
self.jobs = None
self.no_checksum = False
self.output = None
# TODO: add test(s) where Y fails to install
class TestInstallTest(unittest.TestCase):
"""
Tests test-install where X->Y
"""
def setUp(self):
super(TestInstallTest, self).setUp()
# Monkey patch parse specs
def monkey_parse_specs(x, concretize):
if x == 'X':
return [specX]
elif x == 'Y':
return [specY]
return []
self.parse_specs = spack.cmd.parse_specs
spack.cmd.parse_specs = monkey_parse_specs
# Monkey patch os.mkdirp
self.os_mkdir = os.mkdir
os.mkdir = lambda x: True
# Monkey patch open
test_install.open = mock_open
# Clean FILE_REGISTRY
FILE_REGISTRY = collections.defaultdict(StringIO.StringIO)
pkgX.installed = False
pkgY.installed = False
# Monkey patch pkgDb
self.saved_db = spack.repo
pkgDb = MockPackageDb({specX: pkgX, specY: pkgY})
spack.repo = pkgDb
def tearDown(self):
# Remove the monkey patched test_install.open
test_install.open = open
# Remove the monkey patched os.mkdir
os.mkdir = self.os_mkdir
del self.os_mkdir
# Remove the monkey patched parse_specs
spack.cmd.parse_specs = self.parse_specs
del self.parse_specs
super(TestInstallTest, self).tearDown()
spack.repo = self.saved_db
def test_installing_both(self):
test_install.test_install(None, MockArgs('X') )
self.assertEqual(len(FILE_REGISTRY), 1)
for _, content in FILE_REGISTRY.items():
self.assertTrue('tests="2"' in content)
self.assertTrue('failures="0"' in content)
self.assertTrue('errors="0"' in content)
def test_dependency_already_installed(self):
pkgX.installed = True
pkgY.installed = True
test_install.test_install(None, MockArgs('X'))
self.assertEqual(len(FILE_REGISTRY), 1)
for _, content in FILE_REGISTRY.items():
self.assertTrue('tests="2"' in content)
self.assertTrue('failures="0"' in content)
self.assertTrue('errors="0"' in content)
self.assertEqual(sum('skipped' in line for line in content.split('\n')), 2)


@ -1,126 +0,0 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://github.com/llnl/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import itertools
import unittest
import spack
test_install = __import__("spack.cmd.test-install",
fromlist=["BuildId", "create_test_output", "TestResult"])
class MockOutput(object):
def __init__(self):
self.results = {}
def add_test(self, buildId, passed=True, buildInfo=None):
self.results[buildId] = passed
def write_to(self, stream):
pass
class MockSpec(object):
def __init__(self, name, version, hashStr=None):
self.dependencies = {}
self.name = name
self.version = version
self.hash = hashStr if hashStr else hash((name, version))
def traverse(self, order=None):
allDeps = itertools.chain.from_iterable(i.traverse() for i in
self.dependencies.itervalues())
return set(itertools.chain([self], allDeps))
def dag_hash(self):
return self.hash
def to_yaml(self):
return "<<<MOCK YAML {0}>>>".format(test_install.BuildId(self).stringId())
class MockPackage(object):
def __init__(self, buildLogPath):
self.installed = False
self.build_log_path = buildLogPath
specX = MockSpec("X", "1.2.0")
specY = MockSpec("Y", "2.3.8")
specX.dependencies['Y'] = specY
pkgX = MockPackage('logX')
pkgY = MockPackage('logY')
bIdX = test_install.BuildId(specX)
bIdY = test_install.BuildId(specY)
class UnitInstallTest(unittest.TestCase):
"""Tests test-install where X->Y"""
def setUp(self):
super(UnitInstallTest, self).setUp()
pkgX.installed = False
pkgY.installed = False
self.saved_db = spack.repo
pkgDb = MockPackageDb({specX:pkgX, specY:pkgY})
spack.repo = pkgDb
def tearDown(self):
super(UnitInstallTest, self).tearDown()
spack.repo = self.saved_db
def test_installing_both(self):
mo = MockOutput()
pkgX.installed = True
pkgY.installed = True
test_install.create_test_output(specX, [specX, specY], mo, getLogFunc=mock_fetch_log)
self.assertEqual(mo.results,
{bIdX:test_install.TestResult.PASSED,
bIdY:test_install.TestResult.PASSED})
def test_dependency_already_installed(self):
mo = MockOutput()
pkgX.installed = True
pkgY.installed = True
test_install.create_test_output(specX, [specX], mo, getLogFunc=mock_fetch_log)
self.assertEqual(mo.results, {bIdX:test_install.TestResult.PASSED})
#TODO: add test(s) where Y fails to install
class MockPackageDb(object):
def __init__(self, init=None):
self.specToPkg = {}
if init:
self.specToPkg.update(init)
def get(self, spec):
return self.specToPkg[spec]
def mock_fetch_log(path):
return []


@ -1,4 +1,4 @@
############################################################################## #####################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC. # Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory. # Produced at the Lawrence Livermore National Laboratory.
# #
@ -84,7 +84,10 @@ function spack {
if [ "$_sp_arg" = "-h" ]; then if [ "$_sp_arg" = "-h" ]; then
command spack cd -h command spack cd -h
else else
cd $(spack location $_sp_arg "$@") LOC="$(spack location $_sp_arg "$@")"
if [[ -d "$LOC" ]] ; then
cd "$LOC"
fi
fi fi
return return
;; ;;


@ -0,0 +1,13 @@
diff --git a/ADOL-C/examples/additional_examples/openmp_exam/liborpar.cpp b/ADOL-C/examples/additional_examples/openmp_exam/liborpar.cpp
index fc6fc28..14103d2 100644
--- a/ADOL-C/examples/additional_examples/openmp_exam/liborpar.cpp
+++ b/ADOL-C/examples/additional_examples/openmp_exam/liborpar.cpp
@@ -27,7 +27,7 @@ using namespace std;
#include <ctime>
#include <cmath>
-#include "adolc.h"
+#include <adolc/adolc.h>
#ifdef _OPENMP
#include <omp.h>


@ -0,0 +1,80 @@
from spack import *
import sys
class AdolC(Package):
"""A package for the automatic differentiation of first and higher derivatives of vector functions in C and C++ programs by operator overloading."""
homepage = "https://projects.coin-or.org/ADOL-C"
url = "http://www.coin-or.org/download/source/ADOL-C/ADOL-C-2.6.1.tgz"
version('head', svn='https://projects.coin-or.org/svn/ADOL-C/trunk/')
version('2.6.1', '1032b28427d6e399af4610e78c0f087b')
variant('doc', default=True, description='Install documentation')
variant('openmp', default=False, description='Enable OpenMP support')
variant('sparse', default=False, description='Enable sparse drivers')
variant('tests', default=True, description='Build all included examples as a test case')
patch('openmp_exam.patch')
def install(self, spec, prefix):
make_args = ['--prefix=%s' % prefix]
# --with-cflags=FLAGS use CFLAGS=FLAGS (default: -O3 -Wall -ansi)
# --with-cxxflags=FLAGS use CXXFLAGS=FLAGS (default: -O3 -Wall)
if '+openmp' in spec:
if spec.satisfies('%gcc'):
make_args.extend([
'--with-openmp-flag=-fopenmp' # FIXME: Is this required? -I <path to omp.h> -L <LLVM OpenMP library path>
])
else:
raise InstallError("OpenMP flags for compilers other than GCC are not implemented.")
if '+sparse' in spec:
make_args.extend([
'--enable-sparse'
])
# We can simply use the bundled examples to check
# whether Adol-C works as expected
if '+tests' in spec:
make_args.extend([
'--enable-docexa', # Documented examples
'--enable-addexa' # Additional examples
])
if '+openmp' in spec:
make_args.extend([
'--enable-parexa' # Parallel examples
])
configure(*make_args)
make()
make("install")
# Copy the config.h file, as some packages might require it
source_directory = self.stage.source_path
config_h = join_path(source_directory,'ADOL-C','src','config.h')
install(config_h, join_path(prefix.include,'adolc'))
# Install documentation to {prefix}/share
if '+doc' in spec:
install_tree(join_path('ADOL-C','doc'),
join_path(prefix.share,'doc'))
# Install examples to {prefix}/share
if '+tests' in spec:
install_tree(join_path('ADOL-C','examples'),
join_path(prefix.share,'examples'))
# Run some examples that don't require user input
# TODO: Check that bundled examples produce the correct results
with working_dir(join_path(source_directory,'ADOL-C','examples')):
Executable('./tapeless_scalar')()
Executable('./tapeless_vector')()
with working_dir(join_path(source_directory,'ADOL-C','examples','additional_examples')):
Executable('./checkpointing/checkpointing')()
if '+openmp' in spec:
with working_dir(join_path(source_directory,'ADOL-C','examples','additional_examples')):
Executable('./checkpointing/checkpointing')()


@ -14,4 +14,5 @@ def install(self, spec, prefix):
make('-f', make('-f',
join_path(self.stage.source_path,'build','clang','Makefile'), join_path(self.stage.source_path,'build','clang','Makefile'),
parallel=False) parallel=False)
mkdirp(self.prefix.bin)
install(join_path(self.stage.source_path, 'src','bin','astyle'), self.prefix.bin) install(join_path(self.stage.source_path, 'src','bin','astyle'), self.prefix.bin)


@ -8,6 +8,8 @@ class Autoconf(Package):
version('2.69', '82d05e03b93e45f5a39b828dc9c6c29b') version('2.69', '82d05e03b93e45f5a39b828dc9c6c29b')
version('2.62', '6c1f3b3734999035d77da5024aab4fbd') version('2.62', '6c1f3b3734999035d77da5024aab4fbd')
depends_on("m4")
def install(self, spec, prefix): def install(self, spec, prefix):
configure("--prefix=%s" % prefix) configure("--prefix=%s" % prefix)


@ -0,0 +1,17 @@
from spack import *
class Bbcp(Package):
"""Securely and quickly copy data from source to target"""
homepage = "http://www.slac.stanford.edu/~abh/bbcp/"
version('git', git='http://www.slac.stanford.edu/~abh/bbcp/bbcp.git', branch="master")
def install(self, spec, prefix):
cd("src")
make()
# BBCP wants to build the executable in a directory whose name depends on the system type
makesname = Executable("../MakeSname")
bbcp_executable_path = "../bin/%s/bbcp" % makesname(output=str).rstrip("\n")
destination_path = "%s/bin/" % prefix
mkdirp(destination_path)
install(bbcp_executable_path, destination_path)


@ -0,0 +1,12 @@
from spack import *
class Cnmem(Package):
"""CNMem mempool for CUDA devices"""
homepage = "https://github.com/NVIDIA/cnmem"
version('git', git='https://github.com/NVIDIA/cnmem.git', branch="master")
def install(self, spec, prefix):
cmake('.',*std_cmake_args)
make()
make('install')


@ -12,6 +12,7 @@ class Dealii(Package):
variant('mpi', default=True, description='Compile with MPI') variant('mpi', default=True, description='Compile with MPI')
variant('arpack', default=True, description='Compile with Arpack and PArpack (only with MPI)') variant('arpack', default=True, description='Compile with Arpack and PArpack (only with MPI)')
variant('doc', default=False, description='Compile with documentation') variant('doc', default=False, description='Compile with documentation')
variant('gsl' , default=True, description='Compile with GSL')
variant('hdf5', default=True, description='Compile with HDF5 (only with MPI)') variant('hdf5', default=True, description='Compile with HDF5 (only with MPI)')
variant('metis', default=True, description='Compile with Metis') variant('metis', default=True, description='Compile with Metis')
variant('netcdf', default=True, description='Compile with Netcdf (only with MPI)') variant('netcdf', default=True, description='Compile with Netcdf (only with MPI)')
@ -39,6 +40,8 @@ class Dealii(Package):
depends_on ("mpi", when="+mpi") depends_on ("mpi", when="+mpi")
depends_on ("arpack-ng+mpi", when='+arpack+mpi') depends_on ("arpack-ng+mpi", when='+arpack+mpi')
depends_on ("doxygen", when='+doc') depends_on ("doxygen", when='+doc')
depends_on ("gsl", when='@8.5.0:+gsl')
depends_on ("gsl", when='@dev+gsl')
depends_on ("hdf5+mpi~cxx", when='+hdf5+mpi') #FIXME NetCDF declares dependency with ~cxx, why? depends_on ("hdf5+mpi~cxx", when='+hdf5+mpi') #FIXME NetCDF declares dependency with ~cxx, why?
depends_on ("metis@5:", when='+metis') depends_on ("metis@5:", when='+metis')
depends_on ("netcdf+mpi", when="+netcdf+mpi") depends_on ("netcdf+mpi", when="+netcdf+mpi")
@ -50,8 +53,8 @@ class Dealii(Package):
depends_on ("trilinos", when='+trilinos+mpi') depends_on ("trilinos", when='+trilinos+mpi')
# developer dependnecies # developer dependnecies
#depends_on ("numdiff") #FIXME depends_on ("numdiff", when='@dev')
#depends_on ("astyle") #FIXME depends_on ("astyle@2.04", when='@dev')
def install(self, spec, prefix): def install(self, spec, prefix):
options = [] options = []
@ -80,7 +83,6 @@ def install(self, spec, prefix):
(join_path(spec['lapack'].prefix.lib,'liblapack.%s' % dsuf), # FIXME don't hardcode names (join_path(spec['lapack'].prefix.lib,'liblapack.%s' % dsuf), # FIXME don't hardcode names
join_path(spec['blas'].prefix.lib,'libblas.%s' % dsuf)), # FIXME don't hardcode names join_path(spec['blas'].prefix.lib,'libblas.%s' % dsuf)), # FIXME don't hardcode names
'-DMUPARSER_DIR=%s ' % spec['muparser'].prefix, '-DMUPARSER_DIR=%s ' % spec['muparser'].prefix,
'-DP4EST_DIR=%s' % spec['p4est'].prefix,
'-DUMFPACK_DIR=%s' % spec['suite-sparse'].prefix, '-DUMFPACK_DIR=%s' % spec['suite-sparse'].prefix,
'-DTBB_DIR=%s' % spec['tbb'].prefix, '-DTBB_DIR=%s' % spec['tbb'].prefix,
'-DZLIB_DIR=%s' % spec['zlib'].prefix '-DZLIB_DIR=%s' % spec['zlib'].prefix
@ -100,7 +102,7 @@ def install(self, spec, prefix):
]) ])
# Optional dependencies for which librariy names are the same as CMake variables # Optional dependencies for which librariy names are the same as CMake variables
for library in ('hdf5', 'p4est','petsc', 'slepc','trilinos','metis'): for library in ('gsl','hdf5','p4est','petsc','slepc','trilinos','metis'):
if library in spec: if library in spec:
options.extend([ options.extend([
'-D{library}_DIR={value}'.format(library=library.upper(), value=spec[library].prefix), '-D{library}_DIR={value}'.format(library=library.upper(), value=spec[library].prefix),
@ -251,3 +253,6 @@ def install(self, spec, prefix):
cmake('.') cmake('.')
make('release') make('release')
make('run',parallel=False) make('run',parallel=False)
def setup_environment(self, spack_env, env):
env.set('DEAL_II_DIR', self.prefix)


@ -45,6 +45,7 @@ class Eigen(Package):
# TODO : dependency on googlehash, superlu, adolc missing # TODO : dependency on googlehash, superlu, adolc missing
depends_on('cmake')
depends_on('metis@5:', when='+metis') depends_on('metis@5:', when='+metis')
depends_on('scotch', when='+scotch') depends_on('scotch', when='+scotch')
depends_on('fftw', when='+fftw') depends_on('fftw', when='+fftw')


@ -6,6 +6,7 @@ class Flex(Package):
homepage = "http://flex.sourceforge.net/" homepage = "http://flex.sourceforge.net/"
url = "http://download.sourceforge.net/flex/flex-2.5.39.tar.gz" url = "http://download.sourceforge.net/flex/flex-2.5.39.tar.gz"
version('2.6.0', '5724bcffed4ebe39e9b55a9be80859ec')
version('2.5.39', 'e133e9ead8ec0a58d81166b461244fde') version('2.5.39', 'e133e9ead8ec0a58d81166b461244fde')
def install(self, spec, prefix): def install(self, spec, prefix):


@ -38,6 +38,7 @@ class Gcc(Package):
list_url = 'http://open-source-box.org/gcc/' list_url = 'http://open-source-box.org/gcc/'
list_depth = 2 list_depth = 2
version('6.1.0', '8fb6cb98b8459f5863328380fbf06bd1')
version('5.3.0', 'c9616fd448f980259c31de613e575719') version('5.3.0', 'c9616fd448f980259c31de613e575719')
version('5.2.0', 'a51bcfeb3da7dd4c623e27207ed43467') version('5.2.0', 'a51bcfeb3da7dd4c623e27207ed43467')
version('4.9.3', '6f831b4d251872736e8e9cc09746f327') version('4.9.3', '6f831b4d251872736e8e9cc09746f327')


@ -7,7 +7,8 @@ class Git(Package):
homepage = "http://git-scm.com" homepage = "http://git-scm.com"
url = "https://github.com/git/git/tarball/v2.7.1" url = "https://github.com/git/git/tarball/v2.7.1"
version('2.8.0-rc2', 'c2cf9f2cc70e35f2fafbaf9258f82e4c') version('2.8.1', '1308448d95afa41a4135903f22262fc8')
version('2.8.0', 'eca687e46e9750121638f258cff8317b')
version('2.7.3', 'fa1c008b56618c355a32ba4a678305f6') version('2.7.3', 'fa1c008b56618c355a32ba4a678305f6')
version('2.7.1', 'bf0706b433a8dedd27a63a72f9a66060') version('2.7.1', 'bf0706b433a8dedd27a63a72f9a66060')
@ -23,18 +24,10 @@ class Git(Package):
#version('2.2.1', 'ff41fdb094eed1ec430aed8ee9b9849c') #version('2.2.1', 'ff41fdb094eed1ec430aed8ee9b9849c')
# Git compiles with curl support by default on but if your system
# does not have it you will not be able to clone https repos
variant("curl", default=False, description="Add the internal support of curl for https clone")
# Git compiles with expat support by default on but if your system
# does not have it you will not be able to push https repos
variant("expat", default=False, description="Add the internal support of expat for https push")
depends_on("openssl") depends_on("openssl")
depends_on("autoconf") depends_on("autoconf")
depends_on("curl", when="+curl") depends_on("curl")
depends_on("expat", when="+expat") depends_on("expat")
# Also depends_on gettext: apt-get install gettext (Ubuntu) # Also depends_on gettext: apt-get install gettext (Ubuntu)
@ -49,23 +42,12 @@ def install(self, spec, prefix):
"--prefix=%s" % prefix, "--prefix=%s" % prefix,
"--without-pcre", "--without-pcre",
"--with-openssl=%s" % spec['openssl'].prefix, "--with-openssl=%s" % spec['openssl'].prefix,
"--with-zlib=%s" % spec['zlib'].prefix "--with-zlib=%s" % spec['zlib'].prefix,
"--with-curl=%s" % spec['curl'].prefix,
"--with-expat=%s" % spec['expat'].prefix,
] ]
if '+curl' in spec:
configure_args.append("--with-curl=%s" % spec['curl'].prefix)
if '+expat' in spec:
configure_args.append("--with-expat=%s" % spec['expat'].prefix)
which('autoreconf')('-i') which('autoreconf')('-i')
configure(*configure_args) configure(*configure_args)
make() make()
make("install") make("install")


@ -11,6 +11,8 @@ class Glm(Package):
url = "https://github.com/g-truc/glm/archive/0.9.7.1.tar.gz" url = "https://github.com/g-truc/glm/archive/0.9.7.1.tar.gz"
version('0.9.7.1', '61af6639cdf652d1cdd7117190afced8') version('0.9.7.1', '61af6639cdf652d1cdd7117190afced8')
depends_on ("cmake")
def install(self, spec, prefix): def install(self, spec, prefix):
with working_dir('spack-build', create=True): with working_dir('spack-build', create=True):


@ -35,6 +35,8 @@ class Gmp(Package):
version('6.0.0a', 'b7ff2d88cae7f8085bd5006096eed470') version('6.0.0a', 'b7ff2d88cae7f8085bd5006096eed470')
version('6.0.0' , '6ef5869ae735db9995619135bd856b84') version('6.0.0' , '6ef5869ae735db9995619135bd856b84')
depends_on("m4")
def install(self, spec, prefix): def install(self, spec, prefix):
configure("--prefix=%s" % prefix) configure("--prefix=%s" % prefix)
make() make()


@ -38,7 +38,7 @@ class Hdf5(Package):
list_depth = 3 list_depth = 3
version('1.10.0', 'bdc935337ee8282579cd6bc4270ad199') version('1.10.0', 'bdc935337ee8282579cd6bc4270ad199')
version('1.8.16', 'b8ed9a36ae142317f88b0c7ef4b9c618') version('1.8.16', 'b8ed9a36ae142317f88b0c7ef4b9c618', preferred=True)
version('1.8.15', '03cccb5b33dbe975fdcd8ae9dc021f24') version('1.8.15', '03cccb5b33dbe975fdcd8ae9dc021f24')
version('1.8.13', 'c03426e9e77d7766944654280b467289') version('1.8.13', 'c03426e9e77d7766944654280b467289')


@ -0,0 +1,21 @@
from spack import *
class Hydra(Package):
"""Hydra is a process management system for starting parallel jobs.
Hydra is designed to natively work with existing launcher daemons
(such as ssh, rsh, fork), as well as natively integrate with resource
management systems (such as slurm, pbs, sge)."""
homepage = "http://www.mpich.org"
url = "http://www.mpich.org/static/downloads/3.2/hydra-3.2.tar.gz"
list_url = "http://www.mpich.org/static/downloads/"
list_depth = 2
version('3.2', '4d670916695bf7e3a869cc336a881b39')
def install(self, spec, prefix):
configure('--prefix=%s' % prefix)
make()
make("install")


@ -0,0 +1,42 @@
from spack import *
import os
class Ior(Package):
"""The IOR software is used for benchmarking parallel file systems
using POSIX, MPI-IO, or HDF5 interfaces."""
homepage = "https://github.com/LLNL/ior"
url = "https://github.com/LLNL/ior/archive/3.0.1.tar.gz"
version('3.0.1', '71150025e0bb6ea1761150f48b553065')
variant('hdf5', default=False, description='support IO with HDF5 backend')
variant('ncmpi', default=False, description='support IO with NCMPI backend')
depends_on('mpi')
depends_on('hdf5+mpi', when='+hdf5')
depends_on('netcdf+mpi', when='+ncmpi')
def install(self, spec, prefix):
os.system('./bootstrap')
config_args = [
'MPICC=%s' % spec['mpi'].prefix.bin + '/mpicc',
'--prefix=%s' % prefix,
]
if '+hdf5' in spec:
config_args.append('--with-hdf5')
else:
config_args.append('--without-hdf5')
if '+ncmpi' in spec:
config_args.append('--with-ncmpi')
else:
config_args.append('--without-ncmpi')
configure(*config_args)
make()
make('install')


@ -8,6 +8,8 @@ class Libtool(Package):
version('2.4.6' , 'addf44b646ddb4e3919805aa88fa7c5e') version('2.4.6' , 'addf44b646ddb4e3919805aa88fa7c5e')
version('2.4.2' , 'd2f3b7d4627e69e13514a40e72a24d50') version('2.4.2' , 'd2f3b7d4627e69e13514a40e72a24d50')
depends_on('m4')
def install(self, spec, prefix): def install(self, spec, prefix):
configure("--prefix=%s" % prefix) configure("--prefix=%s" % prefix)


@ -0,0 +1,125 @@
from spack import *
import glob, string
class Mfem(Package):
"""Free, lightweight, scalable C++ library for finite element methods."""
homepage = 'http://www.mfem.org'
url = 'https://github.com/mfem/mfem'
# version('3.1', git='https://github.com/mfem/mfem.git',
# commit='dbae60fe32e071989b52efaaf59d7d0eb2a3b574')
version('3.1', '841ea5cf58de6fae4de0f553b0e01ebaab9cd9c67fa821e8a715666ecf18fc57',
url='http://goo.gl/xrScXn', expand=False)
variant('metis', default=False, description='Activate support for metis')
variant('hypre', default=False, description='Activate support for hypre')
variant('suite-sparse', default=False,
description='Activate support for SuiteSparse')
variant('mpi', default=False, description='Activate support for MPI')
variant('lapack', default=False, description='Activate support for LAPACK')
variant('debug', default=False, description='Build debug version')
depends_on('blas', when='+lapack')
depends_on('lapack', when='+lapack')
depends_on('mpi', when='+mpi')
depends_on('metis', when='+mpi')
depends_on('hypre', when='+mpi')
depends_on('hypre', when='+hypre')
depends_on('metis@4:', when='+metis')
depends_on('suite-sparse', when='+suite-sparse')
depends_on('blas', when='+suite-sparse')
depends_on('lapack', when='+suite-sparse')
depends_on('metis@5:', when='+suite-sparse ^suite-sparse@4.5:')
depends_on('cmake', when='^metis@5:')
def check_variants(self, spec):
if '+mpi' in spec and ('+hypre' not in spec or '+metis' not in spec):
raise InstallError('mfem+mpi must be built with +hypre ' +
'and +metis!')
if '+suite-sparse' in spec and ('+metis' not in spec or
'+lapack' not in spec):
raise InstallError('mfem+suite-sparse must be built with ' +
'+metis and +lapack!')
if 'metis@5:' in spec and '%clang' in spec and ('^cmake %gcc' not in spec):
raise InstallError('To work around CMake bug with clang, must ' +
'build mfem with mfem[+variants] %clang ' +
'^cmake %gcc to force CMake to build with gcc')
return
def install(self, spec, prefix):
self.check_variants(spec)
options = ['PREFIX=%s' % prefix]
if '+lapack' in spec:
lapack_lib = '-L{0} -llapack -L{1} -lblas'.format(
spec['lapack'].prefix.lib, spec['blas'].prefix.lib)
options.extend(['MFEM_USE_LAPACK=YES',
'LAPACK_OPT=-I%s' % spec['lapack'].prefix.include,
'LAPACK_LIB=%s' % lapack_lib])
if '+hypre' in spec:
options.extend(['HYPRE_DIR=%s' % spec['hypre'].prefix,
'HYPRE_OPT=-I%s' % spec['hypre'].prefix.include,
'HYPRE_LIB=-L%s' % spec['hypre'].prefix.lib +
' -lHYPRE'])
if '+metis' in spec:
metis_lib = '-L%s -lmetis' % spec['metis'].prefix.lib
if spec['metis'].satisfies('@5:'):
metis_str = 'MFEM_USE_METIS_5=YES'
else:
metis_str = 'MFEM_USE_METIS_5=NO'
options.extend([metis_str,
'METIS_DIR=%s' % spec['metis'].prefix,
'METIS_OPT=-I%s' % spec['metis'].prefix.include,
'METIS_LIB=%s' % metis_lib])
if '+mpi' in spec: options.extend(['MFEM_USE_MPI=YES'])
if '+suite-sparse' in spec:
ssp = spec['suite-sparse'].prefix
ss_lib = '-L%s' % ssp.lib
ss_lib += (' -lumfpack -lcholmod -lcolamd -lamd -lcamd' +
' -lccolamd -lsuitesparseconfig')
no_librt_archs = ['darwin-i686', 'darwin-x86_64']
no_rt = any(map(lambda a: spec.satisfies('='+a), no_librt_archs))
if not no_rt: ss_lib += ' -lrt'
ss_lib += (' ' + metis_lib + ' ' + lapack_lib)
options.extend(['MFEM_USE_SUITESPARSE=YES',
'SUITESPARSE_DIR=%s' % ssp,
'SUITESPARSE_OPT=-I%s' % ssp.include,
'SUITESPARSE_LIB=%s' % ss_lib])
if '+debug' in spec: options.extend(['MFEM_DEBUG=YES'])
# Dirty hack to cope with URL redirect
tgz_file = string.split(self.url,'/')[-1]
tar = which('tar')
tar('xzvf', tgz_file)
cd(glob.glob('mfem*')[0])
# End dirty hack to cope with URL redirect
make('config', *options)
make('all')
# Run a small test before installation
args = ['-m', join_path('data','star.mesh'), '--no-visualization']
if '+mpi' in spec:
Executable(join_path(spec['mpi'].prefix.bin,
'mpirun'))('-np',
'4',
join_path('examples','ex1p'),
*args)
else:
Executable(join_path('examples', 'ex1'))(*args)
make('install')


@ -55,9 +55,10 @@ def setup_dependent_environment(self, spack_env, run_env, dependent_spec):
spack_env.set('MPICH_FC', spack_fc) spack_env.set('MPICH_FC', spack_fc)
def setup_dependent_package(self, module, dep_spec): def setup_dependent_package(self, module, dep_spec):
"""For dependencies, make mpicc's use spack wrapper.""" self.spec.mpicc = join_path(self.prefix.bin, 'mpicc')
# FIXME : is this necessary ? Shouldn't this be part of a contract with MPI providers? self.spec.mpicxx = join_path(self.prefix.bin, 'mpic++')
module.mpicc = join_path(self.prefix.bin, 'mpicc') self.spec.mpifc = join_path(self.prefix.bin, 'mpif90')
self.spec.mpif77 = join_path(self.prefix.bin, 'mpif77')
def install(self, spec, prefix): def install(self, spec, prefix):
config_args = ["--prefix=" + prefix, config_args = ["--prefix=" + prefix,


@ -6,6 +6,7 @@ class Mrnet(Package):
url = "ftp://ftp.cs.wisc.edu/paradyn/mrnet/mrnet_5.0.1.tar.gz" url = "ftp://ftp.cs.wisc.edu/paradyn/mrnet/mrnet_5.0.1.tar.gz"
list_url = "http://ftp.cs.wisc.edu/paradyn/mrnet" list_url = "http://ftp.cs.wisc.edu/paradyn/mrnet"
version('5.0.1-2', git='https://github.com/dyninst/mrnet.git', commit='20b1eacfc6d680d9f6472146d2dfaa0f900cc2e9')
version('5.0.1', '17f65738cf1b9f9b95647ff85f69ecdd') version('5.0.1', '17f65738cf1b9f9b95647ff85f69ecdd')
version('4.1.0', '5a248298b395b329e2371bf25366115c') version('4.1.0', '5a248298b395b329e2371bf25366115c')
version('4.0.0', 'd00301c078cba57ef68613be32ceea2f') version('4.0.0', 'd00301c078cba57ef68613be32ceea2f')


@ -147,6 +147,12 @@ def setup_dependent_environment(self, spack_env, run_env, extension_spec):
spack_env.set('MPICH_F90', spack_fc) spack_env.set('MPICH_F90', spack_fc)
spack_env.set('MPICH_FC', spack_fc) spack_env.set('MPICH_FC', spack_fc)
def setup_dependent_package(self, module, dep_spec):
self.spec.mpicc = join_path(self.prefix.bin, 'mpicc')
self.spec.mpicxx = join_path(self.prefix.bin, 'mpicxx')
self.spec.mpifc = join_path(self.prefix.bin, 'mpif90')
self.spec.mpif77 = join_path(self.prefix.bin, 'mpif77')
def install(self, spec, prefix): def install(self, spec, prefix):
# we'll set different configure flags depending on our environment # we'll set different configure flags depending on our environment
configure_args = [ configure_args = [


@ -0,0 +1,20 @@
from spack import *
class Ncview(Package):
"""Simple viewer for NetCDF files."""
homepage = "http://meteora.ucsd.edu/~pierce/ncview_home_page.html"
url = "ftp://cirrus.ucsd.edu/pub/ncview/ncview-2.1.7.tar.gz"
version('2.1.7', 'debd6ca61410aac3514e53122ab2ba07')
depends_on("netcdf")
depends_on("udunits2")
# OS Dependencies
# Ubuntu: apt-get install libxaw7-dev
# CentOS 7: yum install libXaw-devel
def install(self, spec, prefix):
configure('--prefix=%s' % prefix)
make()
make("install")

View file

@@ -12,15 +12,19 @@ class Netcdf(Package):
version('4.4.0', 'cffda0cbd97fdb3a06e9274f7aef438e') version('4.4.0', 'cffda0cbd97fdb3a06e9274f7aef438e')
version('4.3.3', '5fbd0e108a54bd82cb5702a73f56d2ae') version('4.3.3', '5fbd0e108a54bd82cb5702a73f56d2ae')
variant('mpi', default=True, description='Enables MPI parallelism') variant('mpi', default=True, description='Enables MPI parallelism')
variant('hdf4', default=False, description="Enable HDF4 support") variant('hdf4', default=False, description='Enable HDF4 support')
# Dependencies: depends_on("m4")
depends_on("curl") # required for DAP support
depends_on("hdf", when='+hdf4') depends_on("hdf", when='+hdf4')
depends_on("hdf5+mpi~cxx", when='+mpi') # required for NetCDF-4 support
depends_on("hdf5~mpi", when='~mpi') # required for NetCDF-4 support # Required for DAP support
depends_on("zlib") # required for NetCDF-4 support depends_on("curl")
# Required for NetCDF-4 support
depends_on("zlib")
depends_on("hdf5+mpi", when='+mpi')
depends_on("hdf5~mpi", when='~mpi')
def install(self, spec, prefix): def install(self, spec, prefix):
# Environment variables # Environment variables
@@ -48,7 +52,7 @@ def install(self, spec, prefix):
# /usr/lib/x86_64-linux-gnu/libcurl.so: undefined reference to `SSL_CTX_use_certificate_chain_file@OPENSSL_1.0.0' # /usr/lib/x86_64-linux-gnu/libcurl.so: undefined reference to `SSL_CTX_use_certificate_chain_file@OPENSSL_1.0.0'
LIBS.append("-lcurl") LIBS.append("-lcurl")
CPPFLAGS.append("-I%s" % spec['curl'].prefix.include) CPPFLAGS.append("-I%s" % spec['curl'].prefix.include)
LDFLAGS.append ("-L%s" % spec['curl'].prefix.lib) LDFLAGS.append( "-L%s" % spec['curl'].prefix.lib)
if '+mpi' in spec: if '+mpi' in spec:
config_args.append('--enable-parallel4') config_args.append('--enable-parallel4')
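
The reworked NetCDF recipe above relies on Spack's variant-conditional dependencies (depends_on(..., when='+mpi') versus when='~mpi'). As a minimal sketch of that pattern, assuming a hypothetical package named demo-netcdf-client with a placeholder URL and checksum (none of which are part of this commit):

from spack import *

class DemoNetcdfClient(Package):
    """Hypothetical package illustrating variant-conditional dependencies."""
    homepage = "http://example.com/demo"
    url      = "http://example.com/demo-netcdf-client-1.0.tar.gz"

    version('1.0', '00000000000000000000000000000000')  # placeholder md5

    variant('mpi', default=True, description='Enable MPI parallelism')

    depends_on('zlib')                    # always required
    depends_on('hdf5+mpi', when='+mpi')   # only for MPI-enabled builds
    depends_on('hdf5~mpi', when='~mpi')   # serial HDF5 otherwise

    def install(self, spec, prefix):
        config_args = ['--prefix=%s' % prefix]
        if '+mpi' in spec:
            config_args.append('--enable-parallel4')  # mirrors the flag used above
        configure(*config_args)
        make()
        make('install')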

View file

@@ -0,0 +1,35 @@
diff --git a/Makefile.system b/Makefile.system
index b89f60e..2dbdad0 100644
--- a/Makefile.system
+++ b/Makefile.system
@@ -139,6 +139,10 @@ NO_PARALLEL_MAKE=0
endif
GETARCH_FLAGS += -DNO_PARALLEL_MAKE=$(NO_PARALLEL_MAKE)
+ifdef MAKE_NO_J
+GETARCH_FLAGS += -DMAKE_NO_J=$(MAKE_NO_J)
+endif
+
ifdef MAKE_NB_JOBS
GETARCH_FLAGS += -DMAKE_NB_JOBS=$(MAKE_NB_JOBS)
endif
diff --git a/getarch.c b/getarch.c
index f9c49e6..dffad70 100644
--- a/getarch.c
+++ b/getarch.c
@@ -1012,6 +1012,7 @@ int main(int argc, char *argv[]){
#endif
#endif
+#ifndef MAKE_NO_J
#ifdef MAKE_NB_JOBS
printf("MAKE += -j %d\n", MAKE_NB_JOBS);
#elif NO_PARALLEL_MAKE==1
@@ -1021,6 +1022,7 @@ int main(int argc, char *argv[]){
printf("MAKE += -j %d\n", get_num_cores());
#endif
#endif
+#endif
break;

View file

@@ -1,6 +1,7 @@
from spack import * from spack import *
import sys import sys
import os import os
import shutil
class Openblas(Package): class Openblas(Package):
"""OpenBLAS: An optimized BLAS library""" """OpenBLAS: An optimized BLAS library"""
@@ -13,18 +14,21 @@ class Openblas(Package):
variant('shared', default=True, description="Build shared libraries as well as static libs.") variant('shared', default=True, description="Build shared libraries as well as static libs.")
variant('openmp', default=True, description="Enable OpenMP support.") variant('openmp', default=True, description="Enable OpenMP support.")
variant('fpic', default=True, description="Build position independent code")
# virtual dependency # virtual dependency
provides('blas') provides('blas')
provides('lapack') provides('lapack')
patch('make.patch')
def install(self, spec, prefix): def install(self, spec, prefix):
# Openblas is picky about compilers. Configure fails with # Openblas is picky about compilers. Configure fails with
# FC=/abs/path/to/f77, whereas FC=f77 works fine. # FC=/abs/path/to/f77, whereas FC=f77 works fine.
# To circumvent this, provide basename only: # To circumvent this, provide basename only:
make_defs = ['CC=%s' % os.path.basename(spack_cc), make_defs = ['CC=%s' % os.path.basename(spack_cc),
'FC=%s' % os.path.basename(spack_f77)] 'FC=%s' % os.path.basename(spack_f77),
'MAKE_NO_J=1']
make_targets = ['libs', 'netlib'] make_targets = ['libs', 'netlib']
@@ -32,6 +36,8 @@ def install(self, spec, prefix):
if '+shared' in spec: if '+shared' in spec:
make_targets += ['shared'] make_targets += ['shared']
else: else:
if '+fpic' in spec:
make_defs.extend(['CFLAGS=-fPIC', 'FFLAGS=-fPIC'])
make_defs += ['NO_SHARED=1'] make_defs += ['NO_SHARED=1']
# fix missing _dggsvd_ and _sggsvd_ # fix missing _dggsvd_ and _sggsvd_
@@ -64,6 +70,10 @@ def install(self, spec, prefix):
if '+shared' in spec: if '+shared' in spec:
symlink('libopenblas.%s' % dso_suffix, 'liblapack.%s' % dso_suffix) symlink('libopenblas.%s' % dso_suffix, 'liblapack.%s' % dso_suffix)
# Openblas may pass its own test but still fail to compile Lapack
# symbols. To make sure we get working Blas and Lapack, do a small test.
self.check_install(spec)
def setup_dependent_package(self, module, dspec): def setup_dependent_package(self, module, dspec):
# This is WIP for a prototype interface for virtual packages. # This is WIP for a prototype interface for virtual packages.
@@ -76,3 +86,60 @@ def setup_dependent_package(self, module, dspec):
if '+shared' in self.spec: if '+shared' in self.spec:
self.spec.blas_shared_lib = join_path(libdir, 'libopenblas.%s' % dso_suffix) self.spec.blas_shared_lib = join_path(libdir, 'libopenblas.%s' % dso_suffix)
self.spec.lapack_shared_lib = self.spec.blas_shared_lib self.spec.lapack_shared_lib = self.spec.blas_shared_lib
def check_install(self, spec):
"Build and run a small program to test that we have Lapack symbols"
print "Checking Openblas installation..."
checkdir = "spack-check"
with working_dir(checkdir, create=True):
source = r"""
#include <cblas.h>
#include <stdio.h>
int main(void) {
int i=0;
double A[6] = {1.0, 2.0, 1.0, -3.0, 4.0, -1.0};
double B[6] = {1.0, 2.0, 1.0, -3.0, 4.0, -1.0};
double C[9] = {.5, .5, .5, .5, .5, .5, .5, .5, .5};
cblas_dgemm(CblasColMajor, CblasNoTrans, CblasTrans,
3, 3, 2, 1, A, 3, B, 3, 2, C, 3);
for (i = 0; i < 9; i++)
printf("%f\n", C[i]);
return 0;
}
"""
expected = """\
11.000000
-9.000000
5.000000
-9.000000
21.000000
-1.000000
5.000000
-1.000000
3.000000
"""
with open("check.c", 'w') as f:
f.write(source)
cc = which('cc')
# TODO: Automate these path and library settings
cc('-c', "-I%s" % join_path(spec.prefix, "include"), "check.c")
cc('-o', "check", "check.o",
"-L%s" % join_path(spec.prefix, "lib"), "-llapack", "-lblas", "-lpthread")
try:
check = Executable('./check')
output = check(return_output=True)
except:
output = ""
success = output == expected
if not success:
print "Produced output does not match expected output."
print "Expected output:"
print '-'*80
print expected
print '-'*80
print "Produced output:"
print '-'*80
print output
print '-'*80
raise RuntimeError("Openblas install check failed")
shutil.rmtree(checkdir)
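
Besides the install check, setup_dependent_package above attaches blas_shared_lib and lapack_shared_lib paths to the OpenBLAS spec as part of a work-in-progress interface for the blas/lapack virtual packages. A minimal sketch of how a dependent package's install method might consume those attributes, assuming the package declares depends_on('blas') and depends_on('lapack'); the --with-blas/--with-lapack configure flags are hypothetical:

def install(self, spec, prefix):
    # spec['blas'] / spec['lapack'] resolve to openblas when it is the chosen
    # provider; the attributes below are set in setup_dependent_package above.
    configure('--prefix=%s' % prefix,
              '--with-blas=%s' % spec['blas'].blas_shared_lib,
              '--with-lapack=%s' % spec['lapack'].lapack_shared_lib)
    make()
    make('install')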

View file

@@ -0,0 +1,13 @@
#include <cblas.h>
#include <stdio.h>
int main(void) {
int i=0;
double A[6] = {1.0, 2.0, 1.0, -3.0, 4.0, -1.0};
double B[6] = {1.0, 2.0, 1.0, -3.0, 4.0, -1.0};
double C[9] = {.5, .5, .5, .5, .5, .5, .5, .5, .5};
cblas_dgemm(CblasColMajor, CblasNoTrans, CblasTrans,
3, 3, 2, 1, A, 3, B, 3, 2, C, 3);
for (i = 0; i < 9; i++)
printf("%f\n", C[i]);
return 0;
}

View file

@@ -0,0 +1,9 @@
11.000000
-9.000000
5.000000
-9.000000
21.000000
-1.000000
5.000000
-1.000000
3.000000

View file

@@ -1,7 +1,5 @@
import os
from spack import * from spack import *
import os
class Openmpi(Package): class Openmpi(Package):
"""Open MPI is a project combining technologies and resources from """Open MPI is a project combining technologies and resources from
@@ -38,6 +36,7 @@ class Openmpi(Package):
depends_on('hwloc') depends_on('hwloc')
def url_for_version(self, version): def url_for_version(self, version):
return "http://www.open-mpi.org/software/ompi/v%s/downloads/openmpi-%s.tar.bz2" % (version.up_to(2), version) return "http://www.open-mpi.org/software/ompi/v%s/downloads/openmpi-%s.tar.bz2" % (version.up_to(2), version)
@@ -48,6 +47,12 @@ def setup_dependent_environment(self, spack_env, run_env, dependent_spec):
spack_env.set('OMPI_FC', spack_fc) spack_env.set('OMPI_FC', spack_fc)
spack_env.set('OMPI_F77', spack_f77) spack_env.set('OMPI_F77', spack_f77)
def setup_dependent_package(self, module, dep_spec):
self.spec.mpicc = join_path(self.prefix.bin, 'mpicc')
self.spec.mpicxx = join_path(self.prefix.bin, 'mpic++')
self.spec.mpifc = join_path(self.prefix.bin, 'mpif90')
self.spec.mpif77 = join_path(self.prefix.bin, 'mpif77')
def install(self, spec, prefix): def install(self, spec, prefix):
config_args = ["--prefix=%s" % prefix, config_args = ["--prefix=%s" % prefix,

View file

@@ -3,6 +3,7 @@
from spack import * from spack import *
class Openssl(Package): class Openssl(Package):
"""The OpenSSL Project is a collaborative effort to develop a """The OpenSSL Project is a collaborative effort to develop a
robust, commercial-grade, full-featured, and Open Source robust, commercial-grade, full-featured, and Open Source
@@ -14,10 +15,12 @@ class Openssl(Package):
version('1.0.1h', '8d6d684a9430d5cc98a62a5d8fbda8cf') version('1.0.1h', '8d6d684a9430d5cc98a62a5d8fbda8cf')
version('1.0.1r', '1abd905e079542ccae948af37e393d28') version('1.0.1r', '1abd905e079542ccae948af37e393d28')
version('1.0.1t', '9837746fcf8a6727d46d22ca35953da1')
version('1.0.2d', '38dd619b2e77cbac69b99f52a053d25a') version('1.0.2d', '38dd619b2e77cbac69b99f52a053d25a')
version('1.0.2e', '5262bfa25b60ed9de9f28d5d52d77fc5') version('1.0.2e', '5262bfa25b60ed9de9f28d5d52d77fc5')
version('1.0.2f', 'b3bf73f507172be9292ea2a8c28b659d') version('1.0.2f', 'b3bf73f507172be9292ea2a8c28b659d')
version('1.0.2g', 'f3c710c045cdee5fd114feb69feba7aa') version('1.0.2g', 'f3c710c045cdee5fd114feb69feba7aa')
version('1.0.2h', '9392e65072ce4b614c1392eefc1f23d0')
depends_on("zlib") depends_on("zlib")
parallel = False parallel = False
@@ -30,26 +33,14 @@ def url_for_version(self, version):
# Same idea, but just to avoid issuing the same message multiple times # Same idea, but just to avoid issuing the same message multiple times
warnings_given_to_user = getattr(Openssl, '_warnings_given', {}) warnings_given_to_user = getattr(Openssl, '_warnings_given', {})
if openssl_url is None: if openssl_url is None:
latest = 'http://www.openssl.org/source/openssl-{version}.tar.gz' if self.spec.satisfies('@external'):
older = 'http://www.openssl.org/source/old/{version_number}/openssl-{version_full}.tar.gz' # The version @external is reserved to system openssl. In that case return a fake url and exit
# Try to use the url where the latest tarballs are stored. If the url does not exist (404), then openssl_url = '@external (reserved version for system openssl)'
# return the url for older format
version_number = '.'.join([str(x) for x in version[:-1]])
older_url = older.format(version_number=version_number, version_full=version)
latest_url = latest.format(version=version)
response = urllib.urlopen(latest.format(version=version))
if response.getcode() == 404:
openssl_url = older_url
# Checks if we already warned the user for this particular version of OpenSSL.
# If not we display a warning message and mark this version
if not warnings_given_to_user.get(version, False): if not warnings_given_to_user.get(version, False):
tty.warn('This installation depends on an old version of OpenSSL, which may have known security issues. ') tty.msg('Using openssl@external : the version @external is reserved for system openssl')
tty.warn('Consider updating to the latest version of this package.')
tty.warn('More details at {homepage}'.format(homepage=Openssl.homepage))
warnings_given_to_user[version] = True warnings_given_to_user[version] = True
else: else:
openssl_url = latest_url openssl_url = self.check_for_outdated_release(version, warnings_given_to_user) # Store the computed URL
# Store the computed URL
openssl_urls[version] = openssl_url openssl_urls[version] = openssl_url
# Store the updated dictionary of URLS # Store the updated dictionary of URLS
Openssl._openssl_url = openssl_urls Openssl._openssl_url = openssl_urls
@@ -58,6 +49,28 @@ def url_for_version(self, version):
return openssl_url return openssl_url
def check_for_outdated_release(self, version, warnings_given_to_user):
latest = 'ftp://ftp.openssl.org/source/openssl-{version}.tar.gz'
older = 'http://www.openssl.org/source/old/{version_number}/openssl-{version_full}.tar.gz'
# Try to use the url where the latest tarballs are stored. If the url does not exist (404), then
# return the url for older format
version_number = '.'.join([str(x) for x in version[:-1]])
try:
openssl_url = latest.format(version=version)
urllib.urlopen(openssl_url)
except IOError:
openssl_url = older.format(version_number=version_number, version_full=version)
# Checks if we already warned the user for this particular version of OpenSSL.
# If not we display a warning message and mark this version
if not warnings_given_to_user.get(version, False):
tty.warn(
'This installation depends on an old version of OpenSSL, which may have known security issues. ')
tty.warn('Consider updating to the latest version of this package.')
tty.warn('More details at {homepage}'.format(homepage=Openssl.homepage))
warnings_given_to_user[version] = True
return openssl_url
def install(self, spec, prefix): def install(self, spec, prefix):
# OpenSSL uses a variable APPS in its Makefile. If it happens to be set # OpenSSL uses a variable APPS in its Makefile. If it happens to be set
# in the environment, then this will override what is set in the # in the environment, then this will override what is set in the

View file

@@ -0,0 +1,38 @@
from spack import *
class OsuMicroBenchmarks(Package):
"""The Ohio MicroBenchmark suite is a collection of independent MPI
message passing performance microbenchmarks developed and written at
The Ohio State University. It includes traditional benchmarks and
performance measures such as latency, bandwidth and host overhead
and can be used for both traditional and GPU-enhanced nodes."""
homepage = "http://mvapich.cse.ohio-state.edu/benchmarks/"
url = "http://mvapich.cse.ohio-state.edu/download/mvapich/osu-micro-benchmarks-5.3.tar.gz"
version('5.3', '42e22b931d451e8bec31a7424e4adfc2')
variant('cuda', default=False, description="Enable CUDA support")
depends_on('mpi')
depends_on('cuda', when='+cuda')
def install(self, spec, prefix):
config_args = [
'CC=%s' % spec['mpi'].prefix.bin + '/mpicc',
'CXX=%s' % spec['mpi'].prefix.bin + '/mpicxx',
'LDFLAGS=-lrt',
'--prefix=%s' % prefix
]
if '+cuda' in spec:
config_args.extend([
'--enable-cuda',
'--with-cuda=%s' % spec['cuda'].prefix,
])
configure(*config_args)
make()
make('install')

View file

@@ -7,10 +7,17 @@ class P4est(Package):
version('1.1', '37ba7f4410958cfb38a2140339dbf64f') version('1.1', '37ba7f4410958cfb38a2140339dbf64f')
# disable by default to make it work on frontend of clusters variant('tests', default=True, description='Run small tests')
variant('tests', default=False, description='Run small tests')
# build dependencies
depends_on('automake')
depends_on('autoconf')
depends_on('libtool@2.4.2:')
# other dependencies
depends_on('lua') # Needed for the submodule sc
depends_on('mpi') depends_on('mpi')
depends_on('zlib')
def install(self, spec, prefix): def install(self, spec, prefix):
options = ['--enable-mpi', options = ['--enable-mpi',
@@ -19,16 +26,20 @@ def install(self, spec, prefix):
'--without-blas', '--without-blas',
'CPPFLAGS=-DSC_LOG_PRIORITY=SC_LP_ESSENTIAL', 'CPPFLAGS=-DSC_LOG_PRIORITY=SC_LP_ESSENTIAL',
'CFLAGS=-O2', 'CFLAGS=-O2',
'CC=%s' % join_path(self.spec['mpi'].prefix.bin, 'mpicc'), # TODO: use ENV variables or MPI class wrappers 'CC=%s' % self.spec['mpi'].mpicc,
'CXX=%s' % join_path(self.spec['mpi'].prefix.bin, 'mpic++'), 'CXX=%s' % self.spec['mpi'].mpicxx,
'FC=%s' % join_path(self.spec['mpi'].prefix.bin, 'mpif90'), 'FC=%s' % self.spec['mpi'].mpifc,
'F77=%s' % join_path(self.spec['mpi'].prefix.bin, 'mpif77'), 'F77=%s' % self.spec['mpi'].mpif77
] ]
configure('--prefix=%s' % prefix, *options) configure('--prefix=%s' % prefix, *options)
make() make()
# Make tests optional as sometimes mpiexec can't be run with an error:
# mpiexec has detected an attempt to run as root.
# Running at root is *strongly* discouraged as any mistake (e.g., in
# defining TMPDIR) or bug can result in catastrophic damage to the OS
# file system, leaving your system in an unusable state.
if '+tests' in self.spec: if '+tests' in self.spec:
make("check") make("check")
make("install") make("install")

View file

@@ -0,0 +1,14 @@
diff --git a/eo/src/CMakeLists.txt b/eo/src/CMakeLists.txt
index b2b445a..d45ddc7 100644
--- a/eo/src/CMakeLists.txt
+++ b/eo/src/CMakeLists.txt
@@ -47,7 +47,7 @@ install(DIRECTORY do es ga gp other utils
add_subdirectory(es)
add_subdirectory(ga)
add_subdirectory(utils)
-#add_subdirectory(serial)
+add_subdirectory(serial) # Required when including <paradiseo/eo/utils/eoTimer.h>, which is needed by <paradiseo/eo/mpi/eoMpi.h>
if(ENABLE_PYEO)
add_subdirectory(pyeo)

View file

@@ -0,0 +1,13 @@
diff --git a/cmake/Config.cmake b/cmake/Config.cmake
index 02593ba..d198ca9 100644
--- a/cmake/Config.cmake
+++ b/cmake/Config.cmake
@@ -6,7 +6,7 @@ if(${CMAKE_SYSTEM_NAME} MATCHES "Darwin")
# detect OS X version. (use '/usr/bin/sw_vers -productVersion' to extract V from '10.V.x'.)
execute_process (COMMAND /usr/bin/sw_vers -productVersion OUTPUT_VARIABLE MACOSX_VERSION_RAW)
- string(REGEX REPLACE "10\\.([0-9]).*" "\\1" MACOSX_VERSION "${MACOSX_VERSION_RAW}")
+ string(REGEX REPLACE "10\\.([0-9]+).*" "\\1" MACOSX_VERSION "${MACOSX_VERSION_RAW}")
if(${MACOSX_VERSION} LESS 5)
message(FATAL_ERROR "Unsupported version of OS X : ${MACOSX_VERSION_RAW}")
return()

View file

@@ -0,0 +1,13 @@
diff --git a/moeo/test/t-moeo2DMinHypervolumeArchive.cpp b/moeo/test/t-moeo2DMinHypervolumeArchive.cpp
index 994a9a4..c4ba77b 100644
--- a/moeo/test/t-moeo2DMinHypervolumeArchive.cpp
+++ b/moeo/test/t-moeo2DMinHypervolumeArchive.cpp
@@ -41,7 +41,7 @@
#include <moeo>
#include <cassert>
-#include<archive/moeo2DMinHyperVolumeArchive.h>
+#include<archive/moeo2DMinHypervolumeArchive.h>
//-----------------------------------------------------------------------------

View file

@@ -0,0 +1,13 @@
diff --git a/eo/tutorial/Lesson3/exercise3.1.cpp b/eo/tutorial/Lesson3/exercise3.1.cpp
index dc37479..d178941 100644
--- a/eo/tutorial/Lesson3/exercise3.1.cpp
+++ b/eo/tutorial/Lesson3/exercise3.1.cpp
@@ -289,7 +289,7 @@ void main_function(int argc, char **argv)
checkpoint.add(fdcStat);
// The Stdout monitor will print parameters to the screen ...
- eoStdoutMonitor monitor(false);
+ eoStdoutMonitor monitor;
// when called by the checkpoint (i.e. at every generation)
checkpoint.add(monitor);

View file

@@ -0,0 +1,66 @@
from spack import *
import sys
class Paradiseo(Package):
"""A C++ white-box object-oriented framework dedicated to the reusable design of metaheuristics."""
homepage = "http://paradiseo.gforge.inria.fr/"
# Installing from the development version is a better option at this
# point than using the very old supplied packages
version('head', git='https://gforge.inria.fr/git/paradiseo/paradiseo.git')
# This is a version that the package formula author has tested successfully.
# However, the clone is very large (~1Gb git history). The history in the
# head version has been trimmed significantly.
version('dev-safe', git='https://gforge.inria.fr/git/paradiseo/paradiseo.git',
commit='dbb8fbe9a786efd4d1c26408ac1883442e7643a6')
variant('mpi', default=True, description='Compile with parallel and distributed metaheuristics module')
variant('smp', default=True, description='Compile with symmetric multi-processing module ')
variant('edo', default=True, description='Compile with (Experimental) EDO module')
#variant('tests', default=False, description='Compile with build tests')
#variant('doc', default=False, description='Compile with documentation')
variant('debug', default=False, description='Builds a debug version of the libraries')
variant('openmp', default=False, description='Enable OpenMP support')
variant('gnuplot', default=False, description='Enable GnuPlot support')
# Required dependencies
depends_on ("cmake")
# Optional dependencies
depends_on ("mpi", when="+mpi")
depends_on ("doxygen", when='+doc')
depends_on ("gnuplot", when='+gnuplot')
depends_on ("eigen", when='+edo')
depends_on ("boost~mpi", when='+edo~mpi')
depends_on ("boost+mpi", when='+edo+mpi')
# Patches
patch('enable_eoserial.patch')
patch('fix_osx_detection.patch')
patch('fix_tests.patch')
patch('fix_tutorials.patch')
def install(self, spec, prefix):
options = []
options.extend(std_cmake_args)
options.extend([
'-DCMAKE_BUILD_TYPE:STRING=%s' % ('Debug' if '+debug' in spec else 'Release'),
'-DINSTALL_TYPE:STRING=MIN',
'-DMPI:BOOL=%s' % ('TRUE' if '+mpi' in spec else 'FALSE'),
'-DSMP:BOOL=%s' % ('TRUE' if '+smp' in spec else 'FALSE'), # Note: This requires a C++11 compatible compiler
'-DEDO:BOOL=%s' % ('TRUE' if '+edo' in spec else 'FALSE'),
'-DENABLE_CMAKE_TESTING:BOOL=%s' % ('TRUE' if '+tests' in spec else 'FALSE'),
'-DENABLE_OPENMP:BOOL=%s' % ('TRUE' if '+openmp' in spec else 'FALSE'),
'-DENABLE_GNUPLOT:BOOL=%s' % ('TRUE' if '+gnuplot' in spec else 'FALSE')
])
with working_dir('spack-build', create=True):
# Configure
cmake('..', *options)
# Build, test and install
make("VERBOSE=1")
if '+tests' in spec:
make("test")
make("install")

View file

@@ -0,0 +1,14 @@
from spack import *
class PySqlalchemy(Package):
"""The Python SQL Toolkit and Object Relational Mapper"""
homepage = 'http://www.sqlalchemy.org/'
url = "https://pypi.python.org/packages/source/S/SQLAlchemy/SQLAlchemy-1.0.12.tar.gz"
version('1.0.12', '6d19ef29883bbebdcac6613cf391cac4')
extends('python')
def install(self, spec, prefix):
python('setup.py', 'install', '--prefix=%s' % prefix)

View file

@@ -7,7 +7,7 @@ class PyBottleneck(Package):
version('1.0.0', '380fa6f275bd24f27e7cf0e0d752f5d2') version('1.0.0', '380fa6f275bd24f27e7cf0e0d752f5d2')
extends('python', ignore=r'bin/f2py$') extends('python')
depends_on('py-numpy') depends_on('py-numpy')
def install(self, spec, prefix): def install(self, spec, prefix):

View file

@@ -0,0 +1,22 @@
from spack import *
class PyCsvkit(Package):
"""A library of utilities for working with CSV, the king of tabular file
formats"""
homepage = 'http://csvkit.rtfd.org/'
url = "https://pypi.python.org/packages/source/c/csvkit/csvkit-0.9.1.tar.gz"
version('0.9.1', '48d78920019d18846933ee969502fff6')
extends('python')
depends_on('py-dateutil')
depends_on('py-dbf')
depends_on('py-xlrd')
depends_on('py-sqlalchemy')
depends_on('py-six')
depends_on('py-openpyxl')
def install(self, spec, prefix):
python('setup.py', 'install', '--prefix=%s' % prefix)

View file

@@ -0,0 +1,15 @@
from spack import *
class PyDbf(Package):
"""Pure python package for reading/writing dBase, FoxPro, and Visual FoxPro
.dbf files (including memos)"""
homepage = 'https://pypi.python.org/pypi/dbf'
url = "https://pypi.python.org/packages/source/d/dbf/dbf-0.96.005.tar.gz"
version('0.96.005', 'bce1a1ed8b454a30606e7e18dd2f8277')
extends('python')
def install(self, spec, prefix):
python('setup.py', 'install', '--prefix=%s' % prefix)

View file

@@ -0,0 +1,14 @@
from spack import *
class PyJdcal(Package):
"""Julian dates from proleptic Gregorian and Julian calendars"""
homepage = 'http://github.com/phn/jdcal'
url = "https://pypi.python.org/packages/source/j/jdcal/jdcal-1.2.tar.gz"
version('1.2', 'ab8d5ba300fd1eb01514f363d19b1eb9')
extends('python')
def install(self, spec, prefix):
python('setup.py', 'install', '--prefix=%s' % prefix)

View file

@@ -12,7 +12,7 @@ class PyMatplotlib(Package):
variant('gui', default=False, description='Enable GUI') variant('gui', default=False, description='Enable GUI')
variant('ipython', default=False, description='Enable ipython support') variant('ipython', default=False, description='Enable ipython support')
extends('python', ignore=r'bin/nosetests.*$|bin/pbr$|bin/f2py$') extends('python', ignore=r'bin/nosetests.*$|bin/pbr$')
depends_on('py-pyside', when='+gui') depends_on('py-pyside', when='+gui')
depends_on('py-ipython', when='+ipython') depends_on('py-ipython', when='+ipython')

View file

@@ -9,7 +9,7 @@ class PyNumexpr(Package):
version('2.4.6', '17ac6fafc9ea1ce3eb970b9abccb4fbd') version('2.4.6', '17ac6fafc9ea1ce3eb970b9abccb4fbd')
version('2.5', '84f66cced45ba3e30dcf77a937763aaa') version('2.5', '84f66cced45ba3e30dcf77a937763aaa')
extends('python', ignore=r'bin/f2py$') extends('python')
depends_on('py-numpy') depends_on('py-numpy')
def install(self, spec, prefix): def install(self, spec, prefix):

View file

@@ -0,0 +1,17 @@
from spack import *
class PyOpenpyxl(Package):
"""A Python library to read/write Excel 2007 xlsx/xlsm files"""
homepage = 'http://openpyxl.readthedocs.org/'
url = "https://pypi.python.org/packages/source/o/openpyxl/openpyxl-2.4.0-a1.tar.gz"
version('2.4.0-a1', 'e5ca6d23ceccb15115d45cdf26e736fc')
extends('python')
depends_on('py-jdcal')
depends_on('py-setuptools')
def install(self, spec, prefix):
python('setup.py', 'install', '--prefix=%s' % prefix)

View file

@@ -10,7 +10,7 @@ class PyPandas(Package):
version('0.16.1', 'fac4f25748f9610a3e00e765474bdea8') version('0.16.1', 'fac4f25748f9610a3e00e765474bdea8')
version('0.18.0', 'f143762cd7a59815e348adf4308d2cf6') version('0.18.0', 'f143762cd7a59815e348adf4308d2cf6')
extends('python', ignore=r'bin/f2py$') extends('python')
depends_on('py-dateutil') depends_on('py-dateutil')
depends_on('py-numpy') depends_on('py-numpy')
depends_on('py-setuptools') depends_on('py-setuptools')

View file

@@ -7,7 +7,7 @@ class PyScikitImage(Package):
version('0.12.3', '04ea833383e0b6ad5f65da21292c25e1') version('0.12.3', '04ea833383e0b6ad5f65da21292c25e1')
extends('python', ignore=r'bin/.*\.py$|bin/f2py$') extends('python', ignore=r'bin/.*\.py$')
depends_on('py-dask') depends_on('py-dask')
depends_on('py-pillow') depends_on('py-pillow')

View file

@@ -0,0 +1,15 @@
from spack import *
class PyXlrd(Package):
"""Library for developers to extract data from Microsoft Excel (tm)
spreadsheet files"""
homepage = 'http://www.python-excel.org/'
url = "https://pypi.python.org/packages/source/x/xlrd/xlrd-0.9.4.tar.gz"
version('0.9.4', '911839f534d29fe04525ef8cd88fe865')
extends('python')
def install(self, spec, prefix):
python('setup.py', 'install', '--prefix=%s' % prefix)

View file

@@ -151,6 +151,8 @@ def python_ignore(self, ext_pkg, args):
patterns.append(r'setuptools\.pth') patterns.append(r'setuptools\.pth')
patterns.append(r'bin/easy_install[^/]*$') patterns.append(r'bin/easy_install[^/]*$')
patterns.append(r'setuptools.*egg$') patterns.append(r'setuptools.*egg$')
if ext_pkg.name != 'py-numpy':
patterns.append(r'bin/f2py$')
return match_predicate(ignore_arg, patterns) return match_predicate(ignore_arg, patterns)

View file

@@ -101,7 +101,7 @@ def patch(self):
@property @property
def common_config_args(self): def common_config_args(self):
return [ config_args = [
'-prefix', self.prefix, '-prefix', self.prefix,
'-v', '-v',
'-opensource', '-opensource',
@@ -115,7 +115,16 @@ def common_config_args(self):
'-no-openvg', '-no-openvg',
'-no-pch', '-no-pch',
# NIS is deprecated in more recent glibc # NIS is deprecated in more recent glibc
'-no-nis'] '-no-nis'
]
if '+gtk' in self.spec:
config_args.append('-gtkstyle')
else:
config_args.append('-no-gtkstyle')
return config_args
# Don't disable all the database drivers, but should # Don't disable all the database drivers, but should
# really get them into spack at some point. # really get them into spack at some point.
@@ -128,8 +137,7 @@ def configure(self):
'-thread', '-thread',
'-shared', '-shared',
'-release', '-release',
'-fast' '-fast')
)
@when('@4') @when('@4')
def configure(self): def configure(self):

View file

@@ -0,0 +1,11 @@
--- a/configure
+++ b/configure
@@ -40456,7 +40456,7 @@
hwloc_saved_LDFLAGS="$LDFLAGS"
if test "x$with_hwloc" != x; then
CPPFLAGS="-I$with_hwloc/include $CPPFLAGS"
- LDFLAGS="-L$with_hwloc/lib $CPPFLAGS"
+ LDFLAGS="-L$with_hwloc/lib $LDFLAGS"
fi

View file

@@ -16,7 +16,12 @@ class Qthreads(Package):
version('1.10', '5af8c8bbe88c2a6d45361643780d1671') version('1.10', '5af8c8bbe88c2a6d45361643780d1671')
patch("ldflags.patch")
patch("restrict.patch")
patch("trap.patch")
def install(self, spec, prefix): def install(self, spec, prefix):
configure("--prefix=%s" % prefix) configure("--prefix=%s" % prefix,
"--enable-guard-pages")
make() make()
make("install") make("install")

View file

@@ -0,0 +1,12 @@
--- a/include/qthread/common.h.in
+++ b/include/qthread/common.h.in
@@ -84,7 +84,9 @@
/* Define to the equivalent of the C99 'restrict' keyword, or to
nothing if this is not supported. Do not define if restrict is
supported directly. */
+#ifndef restrict
#undef restrict
+#endif
/* Work around a bug in Sun C++: it does not support _Restrict or
__restrict__, even though the corresponding Sun C compiler ends up with
"#define restrict _Restrict" or "#define restrict __restrict__" in the

View file

@@ -0,0 +1,11 @@
--- a/include/qthread/qthread.hpp
+++ b/include/qthread/qthread.hpp
@@ -236,7 +236,7 @@
return qthread_incr64((uint64_t *)operand, incr);
default:
- *(int *)(0) = 0;
+ __builtin_trap();
}
return T(0); // never hit - keep compiler happy
}

View file

@@ -0,0 +1,12 @@
from spack import *
class Raja(Package):
"""RAJA Parallel Framework."""
homepage = "http://software.llnl.gov/RAJA/"
version('git', git='https://github.com/LLNL/RAJA.git', branch="master")
def install(self, spec, prefix):
cmake('.',*std_cmake_args)
make()
make('install')

View file

@@ -0,0 +1,13 @@
from spack import *
class Scons(Package):
"""SCons is a software construction tool"""
homepage = "http://scons.org"
url = "http://downloads.sourceforge.net/project/scons/scons/2.5.0/scons-2.5.0.tar.gz"
version('2.5.0', '9e00fa0df8f5ca5c5f5975b40e0ed354')
extends('python')
def install(self, spec, prefix):
python('setup.py', 'install', '--prefix=%s' % prefix)

View file

@@ -0,0 +1,51 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://github.com/llnl/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
from spack import *
class Serf(Package):
"""Apache Serf - a high performance C-based HTTP client library built upon the Apache Portable Runtime (APR) library"""
homepage = 'https://serf.apache.org/'
url = 'https://archive.apache.org/dist/serf/serf-1.3.8.tar.bz2'
version('1.3.8', '1d45425ca324336ce2f4ae7d7b4cfbc5567c5446')
depends_on('apr')
depends_on('apr-util')
depends_on('scons')
depends_on('expat')
depends_on('openssl')
def install(self, spec, prefix):
scons = which("scons")
options = ['PREFIX=%s' % prefix]
options.append('APR=%s' % spec['apr'].prefix)
options.append('APU=%s' % spec['apr-util'].prefix)
options.append('OPENSSL=%s' % spec['openssl'].prefix)
options.append('LINKFLAGS=-L%s/lib' % spec['expat'].prefix)
options.append('CPPFLAGS=-I%s/include' % spec['expat'].prefix)
scons(*options)
scons('install')

View file

@@ -37,6 +37,7 @@ class Subversion(Package):
depends_on('apr-util') depends_on('apr-util')
depends_on('zlib') depends_on('zlib')
depends_on('sqlite') depends_on('sqlite')
depends_on('serf')
# Optional: We need swig if we want the Perl, Python or Ruby # Optional: We need swig if we want the Perl, Python or Ruby
# bindings. # bindings.
@@ -54,6 +55,7 @@ def install(self, spec, prefix):
options.append('--with-apr-util=%s' % spec['apr-util'].prefix) options.append('--with-apr-util=%s' % spec['apr-util'].prefix)
options.append('--with-zlib=%s' % spec['zlib'].prefix) options.append('--with-zlib=%s' % spec['zlib'].prefix)
options.append('--with-sqlite=%s' % spec['sqlite'].prefix) options.append('--with-sqlite=%s' % spec['sqlite'].prefix)
options.append('--with-serf=%s' % spec['serf'].prefix)
#options.append('--with-swig=%s' % spec['swig'].prefix) #options.append('--with-swig=%s' % spec['swig'].prefix)
configure(*options) configure(*options)

View file

@@ -6,7 +6,9 @@ class SuperluDist(Package):
homepage = "http://crd-legacy.lbl.gov/~xiaoye/SuperLU/" homepage = "http://crd-legacy.lbl.gov/~xiaoye/SuperLU/"
url = "http://crd-legacy.lbl.gov/~xiaoye/SuperLU/superlu_dist_4.1.tar.gz" url = "http://crd-legacy.lbl.gov/~xiaoye/SuperLU/superlu_dist_4.1.tar.gz"
version('4.3', 'ee66c84e37b4f7cc557771ccc3dc43ae') version('5.0.0', '2b53baf1b0ddbd9fcf724992577f0670')
# default to version 4.3 since petsc and trilinos are not tested with 5.0.
version('4.3', 'ee66c84e37b4f7cc557771ccc3dc43ae', preferred=True)
version('4.2', 'ae9fafae161f775fbac6eba11e530a65') version('4.2', 'ae9fafae161f775fbac6eba11e530a65')
version('4.1', '4edee38cc29f687bd0c8eb361096a455') version('4.1', '4edee38cc29f687bd0c8eb361096a455')
version('4.0', 'c0b98b611df227ae050bc1635c6940e0') version('4.0', 'c0b98b611df227ae050bc1635c6940e0')

View file

@@ -22,6 +22,7 @@
# along with this program; if not, write to the Free Software Foundation, # along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA # Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
############################################################################## ##############################################################################
from spack import * from spack import *
class Swig(Package): class Swig(Package):
@@ -33,14 +34,19 @@ class Swig(Package):
code. In addition, SWIG provides a variety of customization code. In addition, SWIG provides a variety of customization
features that let you tailor the wrapping process to suit your features that let you tailor the wrapping process to suit your
application.""" application."""
homepage = "http://www.swig.org"
url = "http://prdownloads.sourceforge.net/swig/swig-3.0.2.tar.gz"
homepage = "http://www.swig.org"
url = "http://prdownloads.sourceforge.net/swig/swig-3.0.8.tar.gz"
version('3.0.8', 'c96a1d5ecb13d38604d7e92148c73c97')
version('3.0.2', '62f9b0d010cef36a13a010dc530d0d41') version('3.0.2', '62f9b0d010cef36a13a010dc530d0d41')
version('2.0.12', 'c3fb0b2d710cc82ed0154b91e43085a4')
version('2.0.2', 'eaf619a4169886923e5f828349504a29')
version('1.3.40', '2df766c9e03e02811b1ab4bba1c7b9cc')
depends_on('pcre') depends_on('pcre')
def install(self, spec, prefix): def install(self, spec, prefix):
configure("--prefix=%s" % prefix) configure('--prefix=%s' % prefix)
make() make()
make("install") make('install')

View file

@@ -0,0 +1,124 @@
from spack import *
import os
import subprocess
class Turbomole(Package):
"""TURBOMOLE: Program Package for ab initio Electronic Structure
Calculations. NB: Requires a license to download."""
# NOTE: Turbomole requires purchase of a license to download. Go to the
# NOTE: Turbomole home page, http://www.turbomole-gmbh.com, for details.
# NOTE: Spack will search the current directory for this file. It is
# NOTE: probably best to add this file to a Spack mirror so that it can be
# NOTE: found from anywhere. For information on setting up a Spack mirror
# NOTE: see http://software.llnl.gov/spack/mirrors.html
homepage = "http://www.turbomole-gmbh.com/"
version('7.0.2', '92b97e1e52e8dcf02a4d9ac0147c09d6',
url="file://%s/turbolinux702.tar.gz" % os.getcwd())
variant('mpi', default=False, description='Set up MPI environment')
variant('smp', default=False, description='Set up SMP environment')
# Turbomole's install is odd. There are three variants
# - serial
# - parallel, MPI
# - parallel, SMP
#
# Only one of these can be active at a time. MPI and SMP are set as
# variants so there could be up to 3 installs per version. Switching
# between them would be accomplished with `module swap` commands.
def do_fetch(self, mirror_only=True):
if '+mpi' in self.spec and '+smp' in self.spec:
raise InstallError('Can not have both SMP and MPI enabled in the same build.')
super(Turbomole, self).do_fetch(mirror_only)
def get_tm_arch(self):
# For python-2.7 we could use `tm_arch = subprocess.check_output()`
# Use the following for compatibility with python 2.6
if 'TURBOMOLE' in os.getcwd():
tm_arch = subprocess.Popen(['sh', 'scripts/sysname'],
stdout=subprocess.PIPE).communicate()[0]
return tm_arch.rstrip('\n')
else:
return
def install(self, spec, prefix):
if spec.satisfies('@:7.0.2'):
calculate_version = 'calculate_2.4_linux64'
molecontrol_version = 'MoleControl_2.5'
tm_arch=self.get_tm_arch()
tar = which('tar')
dst = join_path(prefix, 'TURBOMOLE')
tar('-x', '-z', '-f', 'thermocalc.tar.gz')
with working_dir('thermocalc'):
cmd = 'sh install <<<y'
subprocess.call(cmd, shell=True)
install_tree('basen', join_path(dst, 'basen'))
install_tree('cabasen', join_path(dst, 'cabasen'))
install_tree(calculate_version, join_path(dst, calculate_version))
install_tree('cbasen', join_path(dst, 'cbasen'))
install_tree('DOC', join_path(dst, 'DOC'))
install_tree('jbasen', join_path(dst, 'jbasen'))
install_tree('jkbasen', join_path(dst, 'jkbasen'))
install_tree(molecontrol_version, join_path(dst, molecontrol_version))
install_tree('parameter', join_path(dst, 'parameter'))
install_tree('perlmodules', join_path(dst, 'perlmodules'))
install_tree('scripts', join_path(dst, 'scripts'))
install_tree('smprun_scripts', join_path(dst, 'smprun_scripts'))
install_tree('structures', join_path(dst, 'structures'))
install_tree('thermocalc', join_path(dst, 'thermocalc'))
install_tree('TURBOTEST', join_path(dst, 'TURBOTEST'))
install_tree('xbasen', join_path(dst, 'xbasen'))
install('Config_turbo_env', dst)
install('Config_turbo_env.tcsh', dst)
install('README', dst)
install('README_LICENSES', dst)
install('TURBOMOLE_702_LinuxPC', dst)
if '+mpi' in spec:
install_tree('bin/%s_mpi' % tm_arch, join_path(dst, 'bin', '%s_mpi' % tm_arch))
install_tree('libso/%s_mpi' % tm_arch, join_path(dst, 'libso', '%s_mpi' % tm_arch))
install_tree('mpirun_scripts/%s_mpi' % tm_arch, join_path(dst, 'mpirun_scripts', '%s_mpi' % tm_arch))
elif '+smp' in spec:
install_tree('bin/%s_smp' % tm_arch, join_path(dst, 'bin', '%s_smp' % tm_arch))
install_tree('libso/%s_smp' % tm_arch, join_path(dst, 'libso', '%s_smp' % tm_arch))
install_tree('mpirun_scripts/%s_smp' % tm_arch, join_path(dst, 'mpirun_scripts', '%s_smp' % tm_arch))
else:
install_tree('bin/%s' % tm_arch, join_path(dst, 'bin', tm_arch))
if '+mpi' in spec or '+smp' in spec:
install('mpirun_scripts/ccsdf12', join_path(dst, 'mpirun_scripts'))
install('mpirun_scripts/dscf', join_path(dst, 'mpirun_scripts'))
install('mpirun_scripts/grad', join_path(dst, 'mpirun_scripts'))
install('mpirun_scripts/mpgrad', join_path(dst, 'mpirun_scripts'))
install('mpirun_scripts/pnoccsd', join_path(dst, 'mpirun_scripts'))
install('mpirun_scripts/rdgrad', join_path(dst, 'mpirun_scripts'))
install('mpirun_scripts/ricc2', join_path(dst, 'mpirun_scripts'))
install('mpirun_scripts/ridft', join_path(dst, 'mpirun_scripts'))
def setup_environment(self, spack_env, run_env):
if self.spec.satisfies('@:7.0.2'):
molecontrol_version = 'MoleControl_2.5'
tm_arch=self.get_tm_arch()
run_env.set('TURBODIR', join_path(self.prefix, 'TURBOMOLE'))
run_env.set('MOLE_CONTROL', join_path(self.prefix, 'TURBOMOLE', molecontrol_version))
run_env.prepend_path('PATH', join_path(self.prefix, 'TURBOMOLE', 'thermocalc'))
run_env.prepend_path('PATH', join_path(self.prefix, 'TURBOMOLE', 'scripts'))
if '+mpi' in self.spec:
run_env.set('PARA_ARCH', 'MPI')
run_env.prepend_path('PATH', join_path(self.prefix, 'TURBOMOLE', 'bin', '%s_mpi' % tm_arch))
elif '+smp' in self.spec:
run_env.set('PARA_ARCH', 'SMP')
run_env.prepend_path('PATH', join_path(self.prefix, 'TURBOMOLE', 'bin', '%s_smp' % tm_arch))
else:
run_env.prepend_path('PATH', join_path(self.prefix, 'TURBOMOLE', 'bin', tm_arch))

View file

@@ -17,6 +17,8 @@ class Wget(Package):
def install(self, spec, prefix): def install(self, spec, prefix):
configure("--prefix=%s" % prefix, configure("--prefix=%s" % prefix,
"--with-ssl=openssl") "--with-ssl=openssl",
"OPENSSL_CFLAGS=-I%s" % spec['openssl'].prefix.include,
"OPENSSL_LIBS=-L%s -lssl -lcrypto -lz" % spec['openssl'].prefix.lib)
make() make()
make("install") make("install")

View file

@@ -24,8 +24,8 @@ class XercesC(Package):
""" """
homepage = "https://xerces.apache.org/xerces-c" homepage = "https://xerces.apache.org/xerces-c"
url = "https://www.apache.org/dist/xerces/c/3/sources/xerces-c-3.1.2.tar.gz" url = "https://www.apache.org/dist/xerces/c/3/sources/xerces-c-3.1.3.tar.bz2"
version('3.1.2', '9eb1048939e88d6a7232c67569b23985') version('3.1.3', '5e333b55cb43e6b025ddf0e5d0f0fb0d')
def install(self, spec, prefix): def install(self, spec, prefix):
configure("--prefix=%s" % prefix, configure("--prefix=%s" % prefix,

View file

@@ -1,3 +1,4 @@
import re, os, glob
from spack import * from spack import *
class Zoltan(Package): class Zoltan(Package):
@@ -12,8 +13,13 @@ class Zoltan(Package):
base_url = "http://www.cs.sandia.gov/~kddevin/Zoltan_Distributions" base_url = "http://www.cs.sandia.gov/~kddevin/Zoltan_Distributions"
version('3.83', '1ff1bc93f91e12f2c533ddb01f2c095f') version('3.83', '1ff1bc93f91e12f2c533ddb01f2c095f')
version('3.8', '9d8fba8a990896881b85351d4327c4a9')
version('3.6', '9cce794f7241ecd8dbea36c3d7a880f9')
version('3.3', '5eb8f00bda634b25ceefa0122bd18d65') version('3.3', '5eb8f00bda634b25ceefa0122bd18d65')
variant('debug', default=False, description='Builds a debug version of the library')
variant('shared', default=True, description='Builds a shared version of the library')
variant('fortran', default=True, description='Enable Fortran support') variant('fortran', default=True, description='Enable Fortran support')
variant('mpi', default=False, description='Enable MPI support') variant('mpi', default=False, description='Enable MPI support')
@@ -24,28 +30,49 @@ def install(self, spec, prefix):
'--enable-f90interface' if '+fortran' in spec else '--disable-f90interface', '--enable-f90interface' if '+fortran' in spec else '--disable-f90interface',
'--enable-mpi' if '+mpi' in spec else '--disable-mpi', '--enable-mpi' if '+mpi' in spec else '--disable-mpi',
] ]
config_cflags = [
'-O0' if '+debug' in spec else '-O3',
'-g' if '+debug' in spec else '-g0',
]
if '+shared' in spec:
config_args.append('--with-ar=$(CXX) -shared $(LDFLAGS) -o')
config_args.append('RANLIB=echo')
config_cflags.append('-fPIC')
if '+mpi' in spec: if '+mpi' in spec:
config_args.append('--with-mpi=%s' % spec['mpi'].prefix)
config_args.append('--with-mpi-compilers=%s' % spec['mpi'].prefix.bin)
config_args.append('CC=%s/mpicc' % spec['mpi'].prefix.bin) config_args.append('CC=%s/mpicc' % spec['mpi'].prefix.bin)
config_args.append('CXX=%s/mpicxx' % spec['mpi'].prefix.bin) config_args.append('CXX=%s/mpicxx' % spec['mpi'].prefix.bin)
config_args.append('--with-mpi=%s' % spec['mpi'].prefix)
config_args.append('--with-mpi-compilers=%s' % spec['mpi'].prefix.bin)
# NOTE: Early versions of Zoltan come packaged with a few embedded # NOTE: Early versions of Zoltan come packaged with a few embedded
# library packages (e.g. ParMETIS, Scotch), which messes with Spack's # library packages (e.g. ParMETIS, Scotch), which messes with Spack's
# ability to descend directly into the package's source directory. # ability to descend directly into the package's source directory.
if spec.satisfies('@:3.3'): if spec.satisfies('@:3.6'):
cd('Zoltan_v%s' % self.version) cd('Zoltan_v%s' % self.version)
mkdirp('build') mkdirp('build')
cd('build') cd('build')
config_zoltan = Executable('../configure') config_zoltan = Executable('../configure')
config_zoltan('--prefix=%s' % pwd(), *config_args) config_zoltan(
'--prefix=%s' % pwd(),
'--with-cflags=%s' % ' '.join(config_cflags),
'--with-cxxflags=%s' % ' '.join(config_cflags),
*config_args)
make() make()
make('install') make('install')
# NOTE: Unfortunately, Zoltan doesn't provide any configuration options for
# the extension of the output library files, so this script must change these
# extensions as a post-processing step.
if '+shared' in spec:
for libpath in glob.glob('lib/*.a'):
libdir, libname = (os.path.dirname(libpath), os.path.basename(libpath))
move(libpath, os.path.join(libdir, re.sub(r'\.a$', '.so', libname)))
mkdirp(prefix) mkdirp(prefix)
move('include', prefix) move('include', prefix)
move('lib', prefix) move('lib', prefix)